Abstract: This paper introduces a geometric Quaternion-based Unscented Particle Filter for Visual-Inertial Navigation (QUPF-VIN) specifically designed for a vehicle operating with six degrees of freedom (6 DoF). The proposed QUPF-VIN technique is quaternion-based, capturing the inherently nonlinear nature of the true navigation kinematics. The filter fuses data from a low-cost inertial measurement unit (IMU) and landmark observations obtained via a vision sensor. The QUPF-VIN is implemented in discrete form to ensure seamless integration with onboard inertial sensing systems. Designed for robustness in GPS-denied environments, the proposed method has been validated through experiments on a real-world dataset involving an unmanned aerial vehicle (UAV) equipped with a 6-axis IMU and a stereo camera, operating with 6 DoF. The numerical results demonstrate that the QUPF-VIN achieves accurate tracking when evaluated against ground truth data. Additionally, a comparative analysis with a standard Kalman filter-based navigation technique further highlights the enhanced performance of the QUPF-VIN.
Abstract: This paper investigates the orientation, position, and linear velocity estimation problem of a rigid body moving in three-dimensional (3D) space with six degrees of freedom (6 DoF). The highly nonlinear navigation kinematics are formulated to ensure a global representation of the navigation problem. A computationally efficient Quaternion-based Navigation Unscented Kalman Filter (QNUKF) is proposed on $\mathbb{S}^{3}\times\mathbb{R}^{3}\times\mathbb{R}^{3}$, imitating the true nonlinear navigation kinematics and utilizing onboard Visual-Inertial Navigation (VIN) units to achieve successful GPS-denied navigation. The proposed QNUKF is designed in discrete form to operate on the data fusion of photographs captured by a vision unit (stereo or monocular camera) and information collected by a low-cost inertial measurement unit (IMU). The photographs are processed to extract feature points in 3D space, while the 6-axis IMU supplies angular velocity and accelerometer measurements expressed with respect to the body-frame. The robustness and effectiveness of the proposed QNUKF have been confirmed through experiments on a real-world dataset collected by a drone navigating in 3D, consisting of stereo images and 6-axis IMU measurements. Additionally, the proposed approach is validated against standard state-of-the-art filtering techniques.

Keywords: Localization, Navigation, Unmanned Aerial Vehicle, Sensor Fusion, Inertial Measurement Unit, Vision Unit.
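For context, a commonly used continuous-time form of the quaternion-based navigation kinematics on $\mathbb{S}^{3}\times\mathbb{R}^{3}\times\mathbb{R}^{3}$ is sketched below. This is a standard formulation given for illustration only; the discrete-time model employed by the proposed filters may differ, and the symbols $R(q)$ (rotation matrix of the unit quaternion $q$), $\omega$, $a$ (body-frame angular velocity and acceleration from the 6-axis IMU), and $g$ (inertial-frame gravity vector) are introduced here as assumptions rather than taken from the paper.
\begin{align*}
\dot{q} &= \frac{1}{2}\, q \otimes \begin{bmatrix} 0 \\ \omega \end{bmatrix}, & q &\in \mathbb{S}^{3},\\
\dot{p} &= v, & p &\in \mathbb{R}^{3},\\
\dot{v} &= R(q)\, a + g, & v &\in \mathbb{R}^{3},
\end{align*}
where $\otimes$ denotes quaternion multiplication, $p$ and $v$ are the position and linear velocity expressed in the inertial frame, and the nonlinearity of the attitude dynamics on $\mathbb{S}^{3}$ is what motivates quaternion-based unscented filtering rather than a linearized (extended Kalman filter) treatment.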