Manipulation Planning

Object Recognition From Tactile Appearance Information

Tactile force sensors, each consisting of an array of individual pressure sensors, are becoming a standard component of modern manipulation systems. A new robotic hand design is now generally expected to include tactile force sensors embedded in each fingertip, and often along other surfaces of the hand as well. As these sensors become more capable and more ubiquitous, it is increasingly important to extract as much task-relevant information from them as possible.
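For concreteness, such a fingertip array can be treated as a low-resolution "tactile image": a grid of pressure readings normalized per frame before any feature extraction. The minimal sketch below assumes a hypothetical 6x14 taxel grid; the grid size and function names are illustrative and not tied to any particular sensor's API.

    import numpy as np

    def tactile_image(raw_pressures, shape=(6, 14)):
        """Turn one frame of raw taxel readings into a normalized tactile image.

        raw_pressures: flat sequence of per-taxel pressures from the fingertip
        array (the 6x14 grid size is an illustrative assumption).
        """
        img = np.asarray(raw_pressures, dtype=float).reshape(shape)
        span = img.max() - img.min()
        if span == 0.0:                  # no contact: return a blank image
            return np.zeros(shape)
        return (img - img.min()) / span  # scale pressures into [0, 1]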

This research explores the connection between sensor-based perception and planning in the context of haptic object identification. The approach combines (i) object recognition from tactile appearance with (ii) purposeful haptic exploration that gathers appearance information from unknown objects. A bag-of-features framework is developed that uses several tactile image descriptors, some adapted from the vision domain and others novel, to estimate a probability distribution over object identity as an unknown object is explored. Haptic exploration is treated as a search problem, so that planning can be used to guide the exploration of the unknown object and construct its tactile appearance. The planner computes both high-level specifications that define appropriate exploration objectives and the low-level motions needed to achieve those objectives. Simulation experiments with a robot arm equipped with a haptic sensor provide promising validation, showing high accuracy in identifying complex shapes from tactile information alone. The framework is also validated with actual tactile sensors recognizing real objects.
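As a rough illustration of the recognition side, the sketch below implements a generic bag-of-features belief update rather than the specific descriptors used in this work: descriptors extracted from tactile images are quantized against a learned vocabulary, per-object word histograms serve as likelihood models, and the distribution over object identity is updated with Bayes' rule after each contact. All names and shapes here (vocabulary, object_histograms, the smoothing assumption) are illustrative assumptions, not the paper's API.

    import numpy as np

    def quantize(descriptors, vocabulary):
        """Map each descriptor to the index of its nearest vocabulary word."""
        # descriptors: (n, d) array; vocabulary: (k, d) array of word centers
        dists = np.linalg.norm(descriptors[:, None, :] - vocabulary[None, :, :], axis=2)
        return dists.argmin(axis=1)

    def update_belief(prior, descriptors, vocabulary, object_histograms):
        """One Bayesian update of P(object | tactile evidence observed so far).

        object_histograms: (n_objects, k) per-object word frequencies, assumed
        smoothed during training so that no entry is exactly zero.
        """
        words = quantize(descriptors, vocabulary)
        # Log-likelihood of the observed words under each object's word model
        log_like = np.log(object_histograms[:, words]).sum(axis=1)
        log_post = np.log(prior) + log_like
        log_post -= log_post.max()   # subtract the max for numerical stability
        posterior = np.exp(log_post)
        return posterior / posterior.sum()

Exploration can then be driven by this belief, for example by continuing to probe the object until the entropy of the posterior falls below a threshold.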

Related Publications

  • Pezzementi Z, Plaku E, Reyda C, and Hager GD (2011): "Tactile Object Recognition From Appearance Information." IEEE Transactions on Robotics, vol. 27(3), pp. 473–487.  [publisher]  [preprint]