Many smartphone applications use inertial measurement units (IMUs) to sense movement, but using these sensors for pedestrian localization is challenging due to their noise characteristics. Recent deep inertial odometry approaches have demonstrated the increasing feasibility of pedestrian inertial navigation. However, they still rely on conventional phone orientation estimates that are assumed to be accurate, when in fact these estimates can be a significant source of error. To address the problem of inaccurate orientation estimates, we present a data-driven inertial localization pipeline that performs both device orientation and position estimation using a commodity smartphone. Our system first estimates device orientation with a deep recurrent neural network; these orientation estimates are then combined with raw IMU measurements in a second deep network that estimates device position. To improve the robustness of the orientation estimates, we introduce a dynamic magnetometer calibration network and apply an Extended Kalman Filter to the calibrated magnetic field to produce accurate orientation estimates. Our proposed model outperforms state-of-the-art methods by 40% in orientation error and by up to 50% in position error on a large dataset we constructed, containing 20 hours of pedestrian motion across 3 large buildings and 15 subjects.
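To illustrate the gyroscope/magnetometer fusion step described above, here is a minimal sketch: a one-dimensional Kalman filter over heading, standing in for the full Extended Kalman Filter in the pipeline. The function name `fuse_heading` and all noise parameters are illustrative assumptions, not the system's actual implementation, which operates on full 3-D orientation and a learned magnetometer calibration.

```python
import numpy as np

def fuse_heading(gyro_z, mag_heading, dt, q=0.01, r=0.5):
    """Scalar Kalman filter fusing integrated gyro yaw rate with
    magnetometer heading observations (a simplified 1-D stand-in
    for the EKF in the abstract; q and r are assumed noise values)."""
    theta, P = mag_heading[0], 1.0   # initialize from first magnetometer fix
    est = [theta]
    for w, z in zip(gyro_z[1:], mag_heading[1:]):
        # Predict: integrate the angular rate, inflate uncertainty.
        theta += w * dt
        P += q
        # Update: correct with the angle-wrapped magnetometer innovation.
        innov = (z - theta + np.pi) % (2 * np.pi) - np.pi
        K = P / (P + r)
        theta += K * innov
        P *= (1 - K)
        est.append(theta)
    return np.array(est)
```

Choosing r larger than the true magnetometer noise, as here, makes the filter lean on the gyroscope and only gently correct drift with the magnetic field; the learned calibration network in the full system plays the complementary role of keeping those magnetic corrections trustworthy indoors.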
Kris Kitani (Advisor)