TR2010-024

Pose Estimation in Heavy Clutter Using a Multi-Flash Camera


    •  Liu, M.-Y.; Tuzel, C.O.; Veeraraghavan, A.N.; Chellappa, R.; Agrawal, A.K.; Okuda, H., "Pose Estimation in Heavy Clutter Using a Multi-Flash Camera", IEEE International Conference on Robotics and Automation (ICRA), May 2010.
      @inproceedings{Liu2010may,
        author = {Liu, M.-Y. and Tuzel, C.O. and Veeraraghavan, A.N. and Chellappa, R. and Agrawal, A.K. and Okuda, H.},
        title = {Pose Estimation in Heavy Clutter Using a Multi-Flash Camera},
        booktitle = {IEEE International Conference on Robotics and Automation (ICRA)},
        year = 2010,
        month = may,
        url = {http://www.merl.com/publications/TR2010-024}
      }
  • Research Area: Computer Vision


We propose a novel solution to object detection, localization, and pose estimation, with applications in robot vision. The proposed method is especially applicable when the objects of interest are not richly textured and are immersed in heavy clutter. We show that a multi-flash camera (MFC) provides accurate separation of depth edges and texture edges in such scenes. We then reformulate the problem as one of finding matches between the depth edges obtained in one or more MFC images and rendered depth edges that are computed offline from 3D CAD models of the objects. To facilitate accurate matching of these binary depth edge maps, we introduce a novel cost function that respects both the position and the local orientation of each edge pixel. This cost function is significantly superior to the traditional Chamfer cost and leads to accurate matching even in heavily cluttered scenes where traditional methods are unreliable. We present a sub-linear time algorithm to compute the cost function using techniques from 3D distance transforms and integral images. Finally, we also propose a multi-view pose-refinement algorithm to improve the estimated pose. We implemented the algorithm on an industrial robot arm and obtained localization and angular estimation accuracies on the order of 1 mm and 2 degrees, respectively, for a variety of parts with minimal texture.
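The depth-edge separation step relies on the shadows that each flash casts at depth discontinuities: in the ratio of a flash image to the max composite of all flash images, a shadow abutting a depth edge shows up as a sharp bright-to-dark transition along that flash's direction. The sketch below illustrates this idea only in simplified form; the full MFC pipeline traverses epipolar lines from each flash position, whereas here a one-pixel shift along a nominal flash direction stands in, and all function names, parameters, and the threshold are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def depth_edges_from_mfc(images, flash_dirs, thresh=0.8):
    """Simplified multi-flash depth-edge detection (illustrative sketch).

    images: list of grayscale images, one per flash position.
    flash_dirs: per-flash (dy, dx) unit steps approximating the direction
        in which that flash casts shadows across the image.
    """
    # Max composite approximates a shadow-free image of the scene.
    imax = np.maximum.reduce(images) + 1e-6
    edges = np.zeros(images[0].shape, dtype=bool)
    for img, (dy, dx) in zip(images, flash_dirs):
        ratio = img / imax  # near 1 in lit regions, small inside shadows
        # Look one pixel back along the flash direction; a bright-to-dark
        # transition in the ratio marks a shadow cast at a depth edge.
        prev = np.roll(ratio, (dy, dx), axis=(0, 1))
        edges |= (prev > thresh) & (ratio < thresh)
    return edges
```

Texture edges are suppressed automatically here: a reflectance change darkens all flash images alike, so the ratio to the max composite stays near 1 and no transition is detected.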
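The matching cost augments Chamfer matching with edge orientation: each template edge pixel is charged its distance to the nearest scene edge pixel in a joint position-orientation space. A minimal sketch of this idea follows, with per-orientation-channel 2D distance transforms combined by an explicit loop over channels; the paper's sub-linear evaluation via 3D distance transforms and integral images is deliberately simplified away, and all names, the bin count, and the orientation weight `lam` are assumptions for illustration.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def build_dt3(edge_map, edge_orient, n_bins=8, lam=5.0):
    """3D distance transform over (x, y, orientation) for an
    orientation-augmented Chamfer cost (illustrative sketch).

    edge_map: HxW bool scene edge map.
    edge_orient: HxW edge tangent angles in [0, pi).
    """
    n_rows, n_cols = edge_map.shape
    bin_width = np.pi / n_bins
    # 2D Euclidean distance transform per orientation channel.
    dt2 = np.full((n_bins, n_rows, n_cols), np.inf)
    for k in range(n_bins):
        in_bin = edge_map & (np.floor(edge_orient / bin_width).astype(int) % n_bins == k)
        if in_bin.any():
            dt2[k] = distance_transform_edt(~in_bin)
    # Combine channels: DT3[k] = min_j (DT2[j] + lam * circular_dist(k, j)).
    dt3 = np.full_like(dt2, np.inf)
    for k in range(n_bins):
        for j in range(n_bins):
            d = min(abs(k - j), n_bins - abs(k - j)) * bin_width
            dt3[k] = np.minimum(dt3[k], dt2[j] + lam * d)
    return dt3, bin_width

def chamfer_cost(dt3, bin_width, pts, orients):
    """Mean joint position-orientation cost of template edge pixels.

    pts: Nx2 integer (row, col) template edge locations after projection;
    orients: length-N tangent angles in [0, pi).
    """
    bins = np.floor(orients / bin_width).astype(int) % dt3.shape[0]
    return dt3[bins, pts[:, 0], pts[:, 1]].mean()
```

Pose search then amounts to rendering CAD depth edges for candidate poses and keeping the pose whose projected edge pixels minimize this cost; because the 3D distance transform is built once per scene, each candidate is scored by lookups only.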