Time Synchronization
Time synchronization is critical when using multiple devices in a dynamic robotic system: it allows the data from multiple sensors to be fused into a single system state that can be used for planning.
MultiSense cameras support various time synchronization methods with the host.
PTP (Precision Time Protocol)
Note
PTP is only supported on MultiSense S27, MultiSense KS21, and MultiSense S30 cameras
PTP (Precision Time Protocol) is a high-precision, network-based time synchronization protocol that can be used to synchronize the clocks of many devices on a network. PTP can leverage hardware-level timestamping if all networking devices between the camera and the host PC support hardware PTP. Hardware PTP can provide sub-microsecond time sync accuracy. Software PTP is also available for devices that lack hardware PTP support, with slightly degraded accuracy.
PTP can be used to synchronize each MultiSense camera connected on a network interface to the host machine's system clock. This is an active time synchronization method: the MultiSense camera's system time synchronizes with the host machine's PTP master. When PTP is enabled, image timestamps carry the host timestamp directly, without any additional user intervention.
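As an example of a host-side setup, the linuxptp package's ptp4l daemon can act as the PTP master on the camera-facing interface. This is a hedged sketch: the interface name eth0 is an assumption, and linuxptp must be installed on the host.

```shell
# Run a PTP daemon on the camera-facing interface (eth0 is an assumption);
# -m prints status messages to the terminal. Add -S to force software
# timestamping if the NIC lacks hardware PTP support.
sudo ptp4l -i eth0 -m
```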
Enabling on MultiSense
LibMultiSense
To enable PTP in LibMultiSense, call the ptpTimeSynchronization
member function of the LibMultiSense Channel
object.
auto channel_ptr = crl::multisense::Channel::Create("10.66.171.21");
channel_ptr->ptpTimeSynchronization(true);
crl::multisense::Channel::Destroy(channel_ptr);
It can take some time for the MultiSense to synchronize its clock with the host, so the first few frames received from the camera may not be properly timestamped. To check whether an image was stamped with a proper PTP timestamp, use the Channel::getPtpStatus()
function
void image_callback(const crl::multisense::image::Header& header, void* channel_void_ptr)
{
    auto channel_ptr = static_cast<crl::multisense::Channel*>(channel_void_ptr);

    crl::multisense::system::PtpStatus ptp_status;
    const auto status = channel_ptr->getPtpStatus(header.frameId, ptp_status);
    if (status != crl::multisense::Status_Ok || !ptp_status.gm_present)
    {
        std::cout << "Image captured before ptp time sync was established" << std::endl;
        return;
    }

    // Process the image here only if PTP was available
}
ROS 1
To enable PTP when using ROS1, modify the ptp_time_sync camera parameter. See the ROS1 reconfigure section for details on reconfiguring via ROS1.
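For example, the parameter can be set at runtime with the standard dynamic_reconfigure command-line tool. The /multisense node name is an assumption; substitute the name of your camera driver node.

```shell
# Enable PTP time sync on a running driver node (node name is an assumption)
rosrun dynamic_reconfigure dynparam set /multisense ptp_time_sync true
```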
ROS 2
To enable PTP when using ROS2, modify the ptp_time_sync camera parameter. See the ROS2 configuration section for details on reconfiguring via ROS2.
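For example, the parameter can be set at runtime with the ros2 param CLI. The /multisense node name is an assumption; substitute the name of your camera driver node.

```shell
# Enable PTP time sync on a running driver node (node name is an assumption)
ros2 param set /multisense ptp_time_sync true
```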
LibMultiSense Network Time Synchronization
LibMultiSense provides facilities to do a crude NTP-esque time synchronization routine between the MultiSense and the host machine’s LibMultiSense instance.
Note
This feature was developed for user convenience, and should only be used in situations where precise time synchronization is not necessary.
Note
Network Time Synchronization is not compatible with PTP. Do not enable network time sync if PTP is enabled
How It Works
LibMultiSense queries status information from the MultiSense at 1 Hz in a stand-alone status thread. The returned status message contains the current MultiSense system time, which is the same time source used to stamp all transmitted sensor data. LibMultiSense tracks the total round-trip time from transmission of the status request to receipt of the status response, and halves this round-trip time to approximate the one-way network latency. This latency is subtracted from the system time reported in the MultiSense status response message, and the result is averaged with previous offsets to generate a single time offset between the MultiSense and host machine's system clocks. When network time synchronization is enabled, LibMultiSense adds the estimated offset to all timestamps in sensor data sent from the MultiSense.
Enabling on MultiSense
LibMultiSense
To enable network time synchronization in LibMultiSense, call the networkTimeSynchronization
member function of the LibMultiSense Channel
object.
auto channel_ptr = crl::multisense::Channel::Create("10.66.171.21");
channel_ptr->networkTimeSynchronization(true);
crl::multisense::Channel::Destroy(channel_ptr);
ROS 1
To enable network time synchronization when using ROS1, modify the network_time_sync camera parameter. See the ROS1 reconfigure section for details on reconfiguring via ROS1.
ROS 2
To enable network time synchronization when using ROS2, modify the network_time_sync camera parameter. See the ROS2 configuration section for details on reconfiguring via ROS2.
Legacy Time Synchronization
Note
Only legacy MultiSense SL, MultiSense S21, MultiSense ST21, MultiSense S7, and MultiSense S7S cameras support these time synchronization methods
Legacy MultiSense cameras have opto-isolated inputs and outputs which can be used for time synchronization.
For detailed specifications of the opto-isolated inputs and outputs on legacy MultiSense cameras please reference
the MultiSense Opto-Isolation
document.
PPS
The opto-isolated output generates a pulse-per-second (PPS) signal. After generating a PPS pulse, the FPGA sends a PPS timestamp message to the connected client; see PPS source for more information. This PPS signal can be used as an input to various time synchronization tools like chrony to sync the host machine's system time to the MultiSense's system time.
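For example, if the PPS output is wired to a kernel PPS device on the host, a chrony refclock directive can consume it. This is a hedged sketch: the /dev/pps0 device path, refid, and polling interval are assumptions that depend on how the signal is wired into your host.

```
# /etc/chrony/chrony.conf (illustrative; device path and refid are assumptions)
refclock PPS /dev/pps0 refid MPPS poll 2
```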
Hardware Triggering
The opto-isolated input can be used as a 1:1 frame trigger. When triggering the camera externally the camera’s trigger source must be set to either Trigger_External or Trigger_External_Inverted. This can be done via the LibMultiSense setTriggerSource API, or the ROS1/ROS2 configuration mechanisms.
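A minimal sketch of selecting the external trigger source via LibMultiSense follows. The IP address shown is the factory default used elsewhere in this document; adjust it for your network. Running this requires a connected camera.

```cpp
#include <MultiSense/MultiSenseChannel.hh>

int main()
{
    // Connect to the camera (10.66.171.21 is the factory-default address)
    auto channel_ptr = crl::multisense::Channel::Create("10.66.171.21");
    if (channel_ptr == nullptr)
        return 1;

    // Fire one frame per pulse on the opto-isolated input; use
    // Trigger_External_Inverted for an inverted trigger signal
    const auto status = channel_ptr->setTriggerSource(crl::multisense::Trigger_External);

    crl::multisense::Channel::Destroy(channel_ptr);
    return status == crl::multisense::Status_Ok ? 0 : 1;
}
```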