Real-Time Onboard 3D State Estimation of an Unmanned Aerial Vehicle in Multi-Environments Using Multi-Sensor Data Fusion

Blog Article

Estimating the state of an unmanned aerial vehicle (UAV) in real time across multiple environments remains a challenge. Although the global navigation satellite system (GNSS) is widely used, drones cannot estimate their position when the GNSS signal is unavailable or disturbed. In this paper, the problem of state estimation in multi-environments is solved by employing an Extended Kalman Filter (EKF) algorithm to fuse data from multiple heterogeneous sensors (MHS), including an inertial measurement unit (IMU), a magnetometer, a barometer, a GNSS receiver, an optical flow sensor (OFS), Light Detection and Ranging (LiDAR), and an RGB-D camera. Finally, the robustness and effectiveness of the EKF-based multi-sensor data fusion system are verified by field flights in unstructured, indoor, outdoor, and indoor-outdoor transition scenarios.
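The paper does not spell out its filter equations in this abstract, so the following is only a minimal Python sketch of the general EKF predict/update pattern it describes: an IMU-driven prediction step followed by measurement updates from position and altitude sensors. The `SimpleEKF` class, the constant-velocity state model, and all noise values and measurement matrices are illustrative assumptions, not the authors' implementation.

```python
# Minimal EKF-style fusion sketch (assumptions: 3D constant-velocity state
# [px, py, pz, vx, vy, vz], world-frame gravity-compensated IMU acceleration,
# illustrative noise values). Not the paper's actual filter.
import numpy as np

class SimpleEKF:
    def __init__(self, dt=0.01):
        self.dt = dt
        self.x = np.zeros(6)                 # state: position (m), velocity (m/s)
        self.P = np.eye(6)                   # state covariance
        self.Q = np.eye(6) * 1e-3            # process noise (assumed)

    def predict(self, accel):
        """Propagate the state using an IMU acceleration measurement."""
        dt = self.dt
        F = np.eye(6)
        F[0:3, 3:6] = np.eye(3) * dt         # position integrates velocity
        B = np.vstack([0.5 * dt**2 * np.eye(3), dt * np.eye(3)])
        self.x = F @ self.x + B @ accel
        self.P = F @ self.P @ F.T + self.Q

    def update(self, z, H, R):
        """Generic measurement update (GNSS position, barometric altitude,
        optical-flow velocity, etc.)."""
        y = z - H @ self.x                   # innovation
        S = H @ self.P @ H.T + R             # innovation covariance
        K = self.P @ H.T @ np.linalg.inv(S)  # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(6) - K @ H) @ self.P


ekf = SimpleEKF(dt=0.01)
ekf.predict(accel=np.array([0.0, 0.0, 0.1]))            # IMU prediction step
H_gnss = np.hstack([np.eye(3), np.zeros((3, 3))])       # GNSS observes position
ekf.update(z=np.array([0.05, 0.0, 1.2]), H=H_gnss, R=np.eye(3) * 2.0)
H_baro = np.array([[0, 0, 1, 0, 0, 0]])                 # barometer observes altitude
ekf.update(z=np.array([1.15]), H=H_baro, R=np.array([[0.5]]))
print(ekf.x)
```

With a purely linear state model, as sketched here, the filter reduces to a standard Kalman filter; in the paper's setting the EKF form matters because attitude and sensor models (IMU, optical flow, LiDAR, RGB-D) are nonlinear and must be linearized at each step.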
