Accurate time synchronization is crucial for multi-sensor fusion, which is widely used in mobile robotics, autonomous driving, and virtual reality. Despite many advancements, precise multi-sensor synchronization remains challenging due to the sensors' internal characteristics, data filtering, mismatched clock references, and transmission delays caused by operating system scheduling. This paper proposes a novel hardware-based synchronization solution that achieves microsecond-level precision. By introducing a Sensor Adaptor board that provides a unified clock reference, the proposed hardware architecture enables high-precision synchronization across multiple sensors. Furthermore, we develop a method for Visual-Inertial time synchronization that actively controls the exposure duration using an ambient light sensor. By managing the IMU clock signal and the exposure trigger, we align the camera's sampling moment with the authentic IMU sampling time and significantly reduce the time discrepancy in the Visual-Inertial system. Experiments, including comparisons with previous work, are conducted to evaluate the effectiveness of the proposed method and system. The results indicate that our method achieves precise time synchronization and can be successfully deployed in multi-sensor systems.
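As a concrete illustration of the mid-exposure alignment described above, the sketch below computes the camera trigger instant so that the midpoint of the exposure window coincides with an IMU sampling instant on a shared clock. This is a minimal sketch under stated assumptions, not the paper's actual firmware: the function names (`clamp_exposure_us`, `trigger_time_us`), the fixed 1 kHz IMU rate, and the single-period exposure clamp are all hypothetical choices made for illustration.

```c
#include <stdint.h>
#include <stdio.h>

/* Assumed for illustration: a 1 kHz IMU produces one sample every 1000 us. */
#define IMU_PERIOD_US 1000u

/* Clamp the exposure suggested by the ambient-light sensor so the exposure
 * window fits inside one IMU period (a simplifying assumption made here,
 * not a constraint stated in the paper). */
static uint32_t clamp_exposure_us(uint32_t exposure_us)
{
    return exposure_us > IMU_PERIOD_US ? IMU_PERIOD_US : exposure_us;
}

/* Compute the camera trigger instant so that the midpoint of the exposure
 * window lands on the next IMU sampling instant:
 *     t_trigger = t_imu_next - exposure / 2
 * All times are microseconds on the unified (Sensor Adaptor) clock. */
static uint64_t trigger_time_us(uint64_t t_imu_next_us, uint32_t exposure_us)
{
    return t_imu_next_us - (uint64_t)(exposure_us / 2u);
}

int main(void)
{
    uint64_t t_imu_next = 5000;                   /* next IMU sample at t = 5 ms */
    uint32_t exposure   = clamp_exposure_us(400); /* 400 us exposure from ALS    */
    uint64_t t_trigger  = trigger_time_us(t_imu_next, exposure);

    printf("trigger at %llu us, mid-exposure at %llu us\n",
           (unsigned long long)t_trigger,
           (unsigned long long)(t_trigger + exposure / 2u));
    return 0;
}
```

Aligning the exposure midpoint, rather than the trigger edge, is the natural target here because an image timestamp conventionally refers to the middle of the integration window; shortening or lengthening the exposure then shifts only the trigger instant, not the effective sampling moment.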