ABSTRACT
Robust 3D pose tracking of objects is a critical technique for many mobile sensing applications. Computer vision-based pose tracking provides a cost-effective solution, but it is sensitive to occlusion and illumination changes. In this work, we propose a novel visual-inertial sensor fusion framework and demonstrate a real-time implementation of a tightly coupled sensor fusion algorithm, the inertial perspective-n-point (IPNP) algorithm. Using measurements from an inertial measurement unit (IMU), the prototype system needs to detect only two keypoints to track all six degrees of freedom of a planar object, e.g., a mobile X-ray detector, a 50% reduction in the number of required keypoints compared with the vision-only perspective-n-point (PnP) algorithm.
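To illustrate why IMU measurements reduce the keypoint requirement, consider the core geometric step: if the IMU supplies the camera rotation R, the projection equation λ·K⁻¹u = R·X + t becomes linear in the translation t, and each 2D-3D correspondence contributes two independent constraints, so two keypoints over-determine the three unknowns of t. The sketch below is a minimal illustration of this rotation-known solve, not the authors' published IPNP implementation; the function name and interface are assumptions for the example.

```python
import numpy as np

def skew(v):
    """Skew-symmetric matrix so that skew(v) @ w == np.cross(v, w)."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def translation_from_two_points(K, R, X, u):
    """Recover camera translation t given a known rotation R (e.g. from an
    IMU) and two 2D-3D correspondences (X[i] in world frame, u[i] in pixels).

    Each correspondence satisfies lambda_i * K^-1 [u_i, v_i, 1]^T = R X_i + t.
    Taking the cross product with the bearing vector eliminates the unknown
    depth lambda_i, leaving linear equations in t.
    """
    Kinv = np.linalg.inv(K)
    A_rows, b_rows = [], []
    for Xi, ui in zip(X, u):
        bearing = Kinv @ np.array([ui[0], ui[1], 1.0])
        S = skew(bearing)                 # cross product removes the depth
        A_rows.append(S)                  # S @ t = -S @ (R @ Xi)
        b_rows.append(-S @ (R @ Xi))
    A = np.vstack(A_rows)                 # 6x3 system for two points
    b = np.hstack(b_rows)
    t, *_ = np.linalg.lstsq(A, b, rcond=None)
    return t
```

With generic (non-collinear) bearings, the stacked 6x3 system has rank 3, so a least-squares solve recovers t uniquely; a vision-only PnP solver would instead need at least four points to estimate rotation and translation jointly.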
Visual and inertial sensor fusion for mobile X-ray detector tracking: demo abstract