Abstract:
We propose an electronic image stabilization method in which an inertial measurement unit (IMU) is rigidly attached to a camera. The IMU data are used to estimate the camera's attitude changes indirectly and in real time, so that a compensating transformation can be applied to each incoming frame. First, we calibrate the IMU and camera coordinate systems using the quaternion method to align the two coordinate systems. Then, we apply the least squares method to find the minimum-error match between the calibrated IMU and camera measurements, achieving trigger-time synchronization. Finally, we use the IMU attitude information to compensate for camera jitter in real time and thereby stabilize the video. Experimental results show that IMU-based stabilization and scale-invariant feature transform (SIFT) stabilization perform almost identically in ordinary scenes. In scenes with scarce texture, however, the inter-frame correlation of the IMU-stabilized video is approximately 20% higher than that of the SIFT method and 40% higher than that of the gray projection method. The proposed method is therefore not limited to particular application scenes and offers high robustness and practical value.
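To make the attitude-feedback step concrete, the following is a minimal sketch (not the authors' implementation) of rotation-based frame compensation. It assumes the IMU and camera coordinate systems are already aligned and time-synchronized, that the camera intrinsic matrix K is known, and that NumPy and OpenCV are available; all function and variable names are illustrative.

```python
# Illustrative sketch of attitude-feedback stabilization (assumed setup, not the
# paper's exact implementation): warp the current frame back to a reference
# attitude using the rotation-only homography H = K * R_delta * K^-1.
import numpy as np
import cv2

def quat_to_rotmat(q):
    """Convert a unit quaternion (w, x, y, z) to a 3x3 rotation matrix."""
    w, x, y, z = np.asarray(q, dtype=float) / np.linalg.norm(q)
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])

def stabilize_frame(frame, q_ref, q_now, K):
    """Warp `frame`, captured at attitude `q_now`, back to the reference attitude `q_ref`."""
    R_ref = quat_to_rotmat(q_ref)
    R_now = quat_to_rotmat(q_now)
    R_delta = R_ref @ R_now.T            # rotation from the current attitude back to the reference
    H = K @ R_delta @ np.linalg.inv(K)   # image homography induced by a pure rotation
    h, w = frame.shape[:2]
    return cv2.warpPerspective(frame, H, (w, h))
```

A pure-rotation homography is sufficient when the camera translation between frames is negligible, which is the usual assumption for handheld jitter.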