TR2014-078

Learning to Rank 3D Features


    •  Tuzel, O.; Liu, M.-Y.; Taguchi, Y.; Raghunathan, A.U., "Learning to Rank 3D Features", European Conference on Computer Vision (ECCV), DOI: 10.1007/978-3-319-10590-1_34, ISSN: 0302-9743, ISBN: 978-3-319-10589-5, September 2014, vol. 8689, pp. 520-535.
      @inproceedings{Tuzel2014sep,
        author    = {Tuzel, O. and Liu, M.-Y. and Taguchi, Y. and Raghunathan, A.U.},
        title     = {Learning to Rank 3D Features},
        booktitle = {European Conference on Computer Vision (ECCV)},
        series    = {Lecture Notes in Computer Science},
        volume    = {8689},
        pages     = {520--535},
        month     = sep,
        year      = {2014},
        doi       = {10.1007/978-3-319-10590-1_34},
        issn      = {0302-9743},
        isbn      = {978-3-319-10589-5},
        url       = {http://www.merl.com/publications/TR2014-078}
      }
Research Areas: Computer Vision, Decision Optimization


Representation of three-dimensional objects using a set of oriented point pair features has been shown to be effective for object recognition and pose estimation. Combined with an efficient voting scheme over a generalized Hough space, existing approaches achieve good recognition accuracy and fast operation. However, their performance degrades when objects are (self-)similar or exhibit degeneracies, such as the large planar surfaces common in both man-made and natural shapes, or under heavy object and background clutter. We propose a max-margin learning framework that identifies discriminative features on the surfaces of three-dimensional objects. Our algorithm selects and ranks features according to their importance for the specified task, which improves accuracy and reduces computational cost. In addition, we analyze various grouping and optimization strategies for learning the discriminative pair features. Extensive synthetic and real experiments demonstrate the improved results.
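For context, an oriented point pair feature of the kind the abstract refers to is commonly defined (following Drost et al., CVPR 2010) from two surface points and their normals as a 4-vector: the distance between the points and three angles. The sketch below is an illustrative reimplementation of that standard descriptor, not code from this paper; the function names are our own.

```python
import numpy as np

def angle(a, b):
    # Numerically safe angle in [0, pi] between two unit vectors.
    return np.arccos(np.clip(np.dot(a, b), -1.0, 1.0))

def point_pair_feature(p1, n1, p2, n2):
    """Oriented point pair feature F(m1, m2) in the style of Drost et al.:
    (||d||, angle(n1, d), angle(n2, d), angle(n1, n2)), with d = p2 - p1.
    p1, p2 are 3D points; n1, n2 their unit surface normals."""
    d = p2 - p1
    dist = np.linalg.norm(d)
    if dist == 0.0:
        # Degenerate pair: only the normal-normal angle is defined.
        return np.array([0.0, 0.0, 0.0, angle(n1, n2)])
    dn = d / dist
    return np.array([dist, angle(n1, dn), angle(n2, dn), angle(n1, n2)])
```

In the voting schemes the abstract builds on, such features computed from scene point pairs are matched against a hash table of model features to cast pose votes; the paper's contribution is to learn which of these features deserve high rank (and weight) for a given task.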