Fast Localization and Tracking Using Event Sensors

The success of many robotics applications hinges on the speed at which the underlying sensing and inference tasks are carried out. High-speed applications such as autonomous driving and evasive maneuvering of quadrotors demand runtime performance that traditional cameras can seldom provide. In this paper we develop a fast localization and tracking algorithm using an event sensor, which produces on the order of a million asynchronous events per second at pixels where the luminance changes. Events typically fire at high-gradient pixels (edges), where the luminance changes as the sensor moves. We develop a fast spatio-temporal binning scheme to detect lines from these edge events. We represent the 3D world model with vertical lines, and the sensor pose is estimated from correspondences between 2D event lines and 3D world lines. The inherent simplicity of our method enables us to achieve a runtime performance of 1000 Hz.
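To illustrate the idea of spatio-temporal binning, the sketch below groups asynchronous events into fixed-duration time windows and bins them along the x-axis; a column bin that accumulates events across many distinct rows is reported as a candidate vertical line. This is a minimal, hypothetical sketch for intuition only — the function name, bin sizes, and thresholds are assumptions, not the paper's implementation.

```python
# Illustrative sketch (not the authors' implementation): spatio-temporal
# binning of asynchronous events (x, y, t) to detect near-vertical lines.
from collections import defaultdict

def detect_vertical_lines(events, t_window=1e-3, x_bin=2, min_rows=5):
    """Group events into time windows, then bin by x.

    Column bins that accumulate events across at least `min_rows`
    distinct rows within one window are reported as vertical lines.
    All parameter values here are illustrative assumptions.
    """
    lines = []
    # Partition events into fixed-duration temporal windows.
    windows = defaultdict(list)
    for x, y, t in events:
        windows[int(t / t_window)].append((x, y))
    for _, evs in sorted(windows.items()):
        # Spatial binning along x; count distinct rows per column bin.
        cols = defaultdict(set)
        for x, y in evs:
            cols[x // x_bin].add(y)
        for cx, rows in cols.items():
            if len(rows) >= min_rows:     # enough vertical support
                lines.append(cx * x_bin)  # report column position
    return lines

# Example: a synthetic vertical edge at x = 10 firing within one window.
events = [(10, y, 0.0001 * y) for y in range(8)]
print(detect_vertical_lines(events))  # -> [10]
```

Because each event touches only one time bucket and one column bin, the cost is linear in the number of events, which is what makes this style of binning compatible with a kilohertz update rate.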