Enhancing Odometry Estimation through Deep Learning-based Sensor Fusion

Enhancing the Accuracy and Robustness of Odometry through Deep Learning-based Sensor Fusion

This is Nazrul’s MSc project. We explored ways to improve the accuracy and robustness of odometry estimation through multi-modal sensor fusion. Odometry estimation is crucial for the autonomous navigation of mobile robots and autonomous vehicles. Many existing methods rely on deep neural networks but favour cameras as the primary sensor, and they often require human labeling or supervision. In this research, we introduced an automated approach that trains a deep neural network in a self-supervised manner to predict an agent’s trajectory or position from laser range data (LiDAR) and an inertial measurement unit (IMU). In addition, we applied an LSTM to the input sequences to capture temporal features.
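To make the idea concrete, below is a minimal PyTorch sketch of how per-frame LiDAR and IMU streams could be fused and passed through an LSTM to regress relative poses. The layer sizes, encoders, and 6-DoF pose output here are illustrative assumptions, not the exact network or self-supervised training objective used in this project (see the publication below for details).

```python
# Illustrative sketch only: layer sizes, feature extractors, and the pose
# parameterization are assumptions, not the architecture from the paper.
import torch
import torch.nn as nn

class LidarImuOdometryNet(nn.Module):
    """Fuses per-frame LiDAR and IMU features, then models the sequence
    with an LSTM to regress a relative pose per time step."""

    def __init__(self, lidar_feat_dim=256, imu_feat_dim=64, hidden_dim=512):
        super().__init__()
        # Encode each LiDAR scan (here flattened range readings) into a feature vector.
        self.lidar_encoder = nn.Sequential(
            nn.Linear(1024, 512), nn.ReLU(),
            nn.Linear(512, lidar_feat_dim), nn.ReLU(),
        )
        # Encode the IMU readings (accelerometer + gyroscope, 6 channels) per frame.
        self.imu_encoder = nn.Sequential(
            nn.Linear(6, 64), nn.ReLU(),
            nn.Linear(64, imu_feat_dim), nn.ReLU(),
        )
        # LSTM over the fused per-frame features captures temporal structure.
        self.lstm = nn.LSTM(lidar_feat_dim + imu_feat_dim, hidden_dim, batch_first=True)
        # Regress a 6-DoF relative pose per time step: 3 translation + 3 rotation.
        self.pose_head = nn.Linear(hidden_dim, 6)

    def forward(self, lidar_seq, imu_seq):
        # lidar_seq: (batch, time, 1024), imu_seq: (batch, time, 6)
        lidar_feat = self.lidar_encoder(lidar_seq)
        imu_feat = self.imu_encoder(imu_seq)
        fused = torch.cat([lidar_feat, imu_feat], dim=-1)  # (batch, time, feat)
        temporal, _ = self.lstm(fused)
        return self.pose_head(temporal)                    # (batch, time, 6)


if __name__ == "__main__":
    model = LidarImuOdometryNet()
    lidar = torch.randn(2, 10, 1024)  # 2 sequences, 10 frames, 1024 range readings each
    imu = torch.randn(2, 10, 6)       # matching IMU measurements per frame
    poses = model(lidar, imu)
    print(poses.shape)                # torch.Size([2, 10, 6])
```

In a self-supervised setup like the one described above, the training signal is derived from the sensor data itself rather than from human-provided pose labels.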

People

  • Muhammad Nazrul Fitri Bin Hj Ismail
  • Owais Ahmed Malik (Main Supervisor)
  • Ong Wee Hong

Data/Codes

Publications

  • N. Ismail, O. W. Hong and O. A. Malik, “Optimizing Odometry Accuracy through Temporal Information in Self-Supervised Deep Networks,” 2023 20th International Conference on Ubiquitous Robots (UR), Honolulu, HI, USA, 2023, pp. 490-495, doi: 10.1109/UR57808.2023.10202307. (pdf)

Media