Visual 3D Perception for Interactive Robotic Tactile Data Acquisition


In this paper, we present a novel approach for computing tactile saliency on 3D point clouds of unseen object instances, where we define salient points as those that yield informative tactile sensory signals under robotic interaction. Our intuition is that the local 3D surface geometry of an object carries characteristic texture and shape cues that are discriminative for tactile interaction. Taking a 3D point cloud of an object as input, we develop a geometric approach that computes a tactile saliency map for the object without requiring pre-training. We further formulate the computation of grasps from the tactile saliency map for prehensile probing manipulation. We evaluate our framework on a variety of household objects in real-world experiments. Since a ground-truth tactile saliency measure is difficult to define manually, we evaluate our approach in pilot experiments against saliency annotations provided by a human subject as a baseline. Results show good performance of our algorithm, both in computing tactile saliency and in its usefulness for acquiring informative tactile sensory data with a real-world robot.
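The abstract does not specify the geometric saliency computation itself, but a common pre-training-free proxy for how geometrically "interesting" a point is uses the local surface variation: the smallest-eigenvalue ratio of the covariance of each point's neighborhood, which is near zero on flat patches and larger on curved or textured ones. The sketch below is purely illustrative of that idea (the neighborhood size `k` and the synthetic test cloud are our assumptions, not details from the paper):

```python
import numpy as np

def surface_variation_saliency(points: np.ndarray, k: int = 16) -> np.ndarray:
    """Per-point surface variation: lambda_min / (l1 + l2 + l3) of the
    local covariance over the k nearest neighbours. Values lie in
    [0, 1/3]; ~0 on planar patches, larger on curved/textured geometry."""
    n = len(points)
    saliency = np.empty(n)
    # Brute-force kNN for clarity; a KD-tree is preferable at scale.
    d2 = ((points[:, None, :] - points[None, :, :]) ** 2).sum(-1)
    for i in range(n):
        nbrs = points[np.argsort(d2[i])[:k]]
        evals = np.sort(np.linalg.eigvalsh(np.cov(nbrs.T)))
        saliency[i] = evals[0] / max(evals.sum(), 1e-12)
    return saliency

# Synthetic cloud: a flat plane with a Gaussian bump in the middle.
rng = np.random.default_rng(0)
xy = rng.uniform(-1.0, 1.0, size=(400, 2))
z = 0.5 * np.exp(-(xy ** 2).sum(1) / 0.05)
cloud = np.column_stack([xy, z])

s = surface_variation_saliency(cloud)
bump = (xy ** 2).sum(1) < 0.05
print(s[bump].mean(), s[~bump].mean())  # bump region should score higher
```

On such a cloud, points on the bump receive markedly higher saliency than points on the surrounding plane, matching the intuition that curved local geometry is the tactilely informative part of the surface.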