Visual odometry and mapping for unknown motion dynamics in natural environments using passive sensors
George Terzakis
Plymouth University, UK
Abstract
The real-time recovery of vehicle pose and the estimation of the sparse geometrical structure of the surrounding environment in natural landscapes using monocular vision and passive sensors (excluding GPS) has received little attention in the literature thus far, mainly because it is a very ill-posed problem. The motivation for the current research stems from GPS-denied scenarios involving autonomous vehicle navigation in natural environments (e.g., space exploration rovers, sea-surface vehicles, etc.). The natural cornerstone of vision-based localization algorithms is the detection and tracking of a sparse set of point features throughout a sequence of images. We show that, in natural scenes, features can be tracked more reliably using optic flow estimation than with descriptor/patch matching approaches. We introduce an algorithm for camera pose estimation and 3D mapping in real time which circumvents the typical scaling of execution time with the number of features found in SLAM algorithms by marginalizing the entire map out of the state vector. The rationale behind this approach is to parametrize the 3D locations of the features in terms of the relative pose between the two initial frames in the sequence. This approach fundamentally relies on the assumption that tracking is accurate between the first two frames and that, therefore, the uncertainty of the map is primarily associated with the relative pose. We show that in natural scenes, provided that optic-flow-based tracking and efficient treatment of outliers are employed, real-time odometry is accurate and mapping is fairly reliable in practice.
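The following sketch illustrates, under the assumption of an OpenCV-based front end, the idea summarized above: features are tracked with pyramidal Lucas-Kanade optic flow rather than descriptor matching, outliers are rejected with RANSAC on the essential matrix, and the initial sparse map is triangulated from the relative pose of the first two frames. Function and parameter choices are illustrative only and are not the paper's implementation.

import cv2
import numpy as np

def bootstrap_map(frame0, frame1, K):
    """Track features from frame0 to frame1 with optic flow, estimate the
    relative pose, and triangulate an initial sparse map (hypothetical sketch)."""
    gray0 = cv2.cvtColor(frame0, cv2.COLOR_BGR2GRAY)
    gray1 = cv2.cvtColor(frame1, cv2.COLOR_BGR2GRAY)

    # Detect a sparse set of point features in the first frame.
    pts0 = cv2.goodFeaturesToTrack(gray0, maxCorners=500, qualityLevel=0.01, minDistance=8)

    # Track them into the second frame with pyramidal Lucas-Kanade optic flow
    # (no descriptor/patch matching involved).
    pts1, status, _ = cv2.calcOpticalFlowPyrLK(gray0, gray1, pts0, None)
    good = status.ravel() == 1
    p0 = pts0[good].reshape(-1, 2)
    p1 = pts1[good].reshape(-1, 2)

    # Reject outliers and recover the relative pose (R, t) via RANSAC on the essential matrix.
    E, inliers = cv2.findEssentialMat(p0, p1, K, method=cv2.RANSAC, prob=0.999, threshold=1.0)
    _, R, t, pose_mask = cv2.recoverPose(E, p0, p1, K, mask=inliers)
    keep = pose_mask.ravel() > 0
    p0, p1 = p0[keep], p1[keep]

    # Triangulate: the 3D map points are expressed in terms of this first relative pose,
    # so map uncertainty is tied to (R, t) rather than carried per point in the state vector.
    P0 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
    P1 = K @ np.hstack([R, t])
    X_h = cv2.triangulatePoints(P0, P1, p0.T, p1.T)
    X = (X_h[:3] / X_h[3]).T  # Euclidean 3D points in the first camera frame
    return R, t, X, p1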