TR2020-143

Interactive Tactile Perception for Classification of Novel Object Instances


Abstract:

In this paper, we present a novel approach for classification of unseen object instances from interactive tactile feedback. Furthermore, we demonstrate the utility of a low-resolution tactile sensor array for tactile perception that can potentially close the gap between vision and physical contact for manipulation. We contrast our sensor with high-resolution camera-based tactile sensors. Our proposed approach interactively learns a one-class classification model using 3D tactile descriptors, and thus has an advantage over existing approaches, which require pre-training on objects. We describe how we derive 3D features from the tactile sensor inputs and exploit them to learn one-class classifiers. In addition, because our proposed method uses unsupervised learning, it does not require ground-truth labels. This makes our method flexible and more practical for deployment on robotic systems. We validate our proposed method on a set of household objects, and the results indicate good classification performance in real-world experiments.
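As a rough illustration of the kind of pipeline the abstract describes, the sketch below converts a low-resolution tactile pressure grid into a 3D point-based descriptor and incrementally fits a one-class classifier as new touches of an unlabeled object arrive. All specifics here are illustrative assumptions rather than the authors' implementation: the taxel pitch, the pressure-to-depth proxy, the centroid/covariance descriptor, the helper names, and the use of scikit-learn's OneClassSVM with these hyperparameters.

```python
# Minimal sketch (not the paper's implementation) of interactive one-class
# classification from a low-resolution tactile array.
import numpy as np
from sklearn.svm import OneClassSVM


def tactile_to_points(pressure_grid, taxel_pitch_mm=4.0, pressure_threshold=0.1):
    """Convert a 2D tactile pressure grid into 3D contact points.

    Each taxel above the threshold becomes a point (x, y, z): x and y from the
    taxel's position on the sensor, z a proxy for indentation depth taken from
    the normalized pressure reading (an assumption for illustration).
    """
    rows, cols = np.nonzero(pressure_grid > pressure_threshold)
    z = pressure_grid[rows, cols]
    return np.column_stack([cols * taxel_pitch_mm, rows * taxel_pitch_mm, z])


def descriptor(points):
    """Fixed-length 3D descriptor: centroid, covariance eigenvalues, and
    contact-point count. A stand-in for the paper's 3D tactile features."""
    if len(points) < 2:
        return np.zeros(7)
    centroid = points.mean(axis=0)
    eigvals = np.linalg.eigvalsh(np.cov(points, rowvar=False))
    return np.concatenate([centroid, eigvals, [len(points)]])


class InteractiveOneClassModel:
    """Accumulates descriptors from successive touches of a single (unlabeled)
    object instance and refits a one-class SVM after every touch."""

    def __init__(self):
        self.samples = []
        self.model = OneClassSVM(kernel="rbf", nu=0.1, gamma="scale")

    def add_touch(self, pressure_grid):
        self.samples.append(descriptor(tactile_to_points(pressure_grid)))
        self.model.fit(np.vstack(self.samples))

    def is_same_instance(self, pressure_grid):
        d = descriptor(tactile_to_points(pressure_grid)).reshape(1, -1)
        return self.model.predict(d)[0] == 1  # +1 = inlier, -1 = novel


# Usage with synthetic 16x16 tactile frames (a square contact patch plus noise).
rng = np.random.default_rng(0)
frame = lambda: rng.random((16, 16)) * 0.3 + np.pad(np.ones((4, 4)), 6)
model = InteractiveOneClassModel()
for _ in range(5):
    model.add_touch(frame())
print(model.is_same_instance(frame()))
```

Because the classifier is refit after each touch and needs no labels, it matches the interactive, unsupervised setting described above: the robot can build a model of a novel instance on the fly and later test whether a new contact belongs to it.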


  • Related News & Events

    • NEWS: Radu Corcodel to present invited seminar at NYU on Robot Vision
      Date: May 4, 2022
      MERL Contact: Radu Corcodel
      Research Areas: Computer Vision, Robotics
      Brief: Radu Corcodel, a Principal Research Scientist in MERL's Computer Vision Group, will present an overview of the Robot Perception research published by MERL for advanced manipulation. The talk will mainly cover topics pertaining to robotic manipulation in unstructured environments, such as machine vision, tactile sensing, and autonomous grasping. The seminar will also cover specific perception problems in non-prehensile interactions, such as Contact-Implicit Trajectory Optimization and Tactile Classification, and is intended for a broader audience.