Tactile sensor lets robots gauge objects’ hardness and manipulate small tools

June 23, 2017

A GelSight sensor attached to a robot’s gripper enables the robot to determine precisely where it has grasped a small screwdriver and to remove it from a slot and reinsert it, even when the gripper blocks the camera’s view of the screwdriver. (credit: Robot Locomotion Group at MIT)

Researchers at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) have mounted tactile sensors on the grippers of robot arms to give the robots greater sensitivity and dexterity. The sensor can judge the hardness of the surfaces it touches, enabling a robot to manipulate smaller objects than was previously possible.

The “GelSight” sensor consists of a block of transparent soft rubber — the “gel” of its name — with one face coated with metallic paint. It is mounted on one side of a robotic gripper. When the paint-coated face is pressed against an object, the face conforms to the object’s shape and the metallic paint makes the object’s surface reflective. Mounted on the sensor opposite the paint-coated face of the rubber block are three colored lights at different angles and a single camera.
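
The three colored lights and single camera support a photometric-stereo style of reconstruction: each color channel acts roughly as the response to one known light direction, from which per-pixel surface normals of the deformed gel can be recovered. Below is a minimal sketch of that idea; the light directions, the Lambertian reflectance model, and the absence of calibration are all simplifying assumptions, not the sensor’s actual processing pipeline.

```python
import numpy as np

# Minimal photometric-stereo sketch (illustrative, not the sensor's real
# pipeline). Assumes a Lambertian gel surface and three calibrated, distant
# colored lights; each camera color channel is treated as the response to
# one light.

# Hypothetical unit light directions, one per color channel.
LIGHTS = np.array([
    [0.87,  0.00, 0.50],   # red light
    [-0.43,  0.75, 0.50],  # green light
    [-0.43, -0.75, 0.50],  # blue light
])

def surface_normals(rgb_image):
    """Recover per-pixel unit normals from a 3-channel gel image.

    Under the Lambertian model I_c = albedo * dot(l_c, n), stacking the
    three channels gives a 3x3 linear system per pixel.
    """
    h, w, _ = rgb_image.shape
    intensities = rgb_image.reshape(-1, 3).T        # (3, h*w)
    g = np.linalg.solve(LIGHTS, intensities)        # albedo-scaled normals
    norms = np.linalg.norm(g, axis=0, keepdims=True)
    normals = g / np.clip(norms, 1e-8, None)        # unit normals
    return normals.T.reshape(h, w, 3)
```

Those normals can then be integrated into a height map of the touched surface; the real sensor calibration is considerably more involved than the idealized lights assumed here.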

Humans gauge hardness by the degree to which the contact area between the object and our fingers changes as we press on it. Softer objects tend to flatten more, increasing the contact area. The MIT researchers used the same approach.
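
That cue is straightforward to measure from the sensor’s own images. A minimal sketch, assuming contact pixels can be segmented by comparing each frame against an image of the undeformed gel (the threshold below is illustrative, not a calibrated value):

```python
import numpy as np

def contact_areas(frames, baseline, threshold=25):
    """Per-frame contact area (in pixels) over a pressing sequence.

    frames:    grayscale gel images captured while pressing.
    baseline:  an image of the undeformed gel.
    threshold: deformation level that counts as contact (illustrative,
               not a calibrated value).
    """
    areas = []
    for frame in frames:
        deformation = np.abs(frame.astype(int) - baseline.astype(int))
        areas.append(int(np.count_nonzero(deformation > threshold)))
    return areas  # a soft object's areas grow faster than a hard one's
```

Pressed with the same force, a soft object would show a faster-growing contact area than a hard one.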

A GelSight sensor, pressed against each object by hand, recorded how the contact pattern changed over time, essentially producing a short movie for each object. A neural network then learned correlations between those changes in contact patterns and measured hardness values. The resulting system takes frames of video as input and produces hardness scores with very high accuracy.
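
One plausible way to realize such a frames-to-hardness model is a small per-frame convolutional encoder feeding a recurrent layer, with a linear head that regresses the score. The sketch below is illustrative; the architecture and layer sizes are assumptions, not the network from the paper.

```python
import torch
import torch.nn as nn

class HardnessRegressor(nn.Module):
    """Illustrative frames-to-hardness model (not the authors' exact network).

    A small CNN encodes each frame of the press; an LSTM aggregates the
    frame features over time; a linear head regresses a scalar hardness
    score.
    """

    def __init__(self, feat_dim=128):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, 5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(16, 32, 5, stride=2, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, feat_dim), nn.ReLU(),
        )
        self.temporal = nn.LSTM(feat_dim, 64, batch_first=True)
        self.head = nn.Linear(64, 1)

    def forward(self, frames):                 # frames: (batch, time, 3, H, W)
        b, t = frames.shape[:2]
        feats = self.encoder(frames.flatten(0, 1)).view(b, t, -1)
        _, (h_n, _) = self.temporal(feats)
        return self.head(h_n[-1]).squeeze(-1)  # one hardness score per clip
```

A model like this would be trained with a regression loss against the measured hardness values paired with each recorded press.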

The researchers also designed control algorithms that use a computer vision system to guide the robot’s gripper toward a tool and then turn location estimation over to a GelSight sensor once the robot has the tool in hand.
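
The hand-off itself amounts to a switch on grasp state. A minimal sketch, in which every object and method name is hypothetical:

```python
def localize_tool(camera, gelsight, gripper):
    """Illustrative vision-to-touch hand-off; all names here are hypothetical.

    Before the grasp, the tool pose comes from the camera. Once the tool is
    in hand (and likely occluded by the gripper), pose estimation switches
    to the tactile sensor's contact geometry.
    """
    if not gripper.is_holding():
        return camera.estimate_tool_pose()     # vision guides the approach
    return gelsight.estimate_in_hand_pose()    # touch takes over after grasp
```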

“I think that the GelSight technology, as well as other high-bandwidth tactile sensors, will make a big impact in robotics,” says Sergey Levine, an assistant professor of electrical engineering and computer science at the University of California at Berkeley. “For humans, our sense of touch is one of the key enabling factors for our amazing manual dexterity. Current robots lack this type of dexterity and are limited in their ability to react to surface features when manipulating objects. If you imagine fumbling for a light switch in the dark, extracting an object from your pocket, or any of the other numerous things that you can do without even thinking — these all rely on touch sensing.”

The researchers presented their work in two papers at the International Conference on Robotics and Automation.


Wenzhen Yuan | Measuring hardness of fruits with GelSight sensor


Abstract of Tracking Objects with Point Clouds from Vision and Touch

We present an object-tracking framework that fuses point cloud information from an RGB-D camera with tactile information from a GelSight contact sensor. GelSight can be treated as a source of dense local geometric information, which we incorporate directly into a conventional point-cloud-based articulated object tracker built on signed-distance functions. Our implementation runs at 12 Hz using an online depth reconstruction algorithm for GelSight and a modified second-order update for the tracking algorithm. We present data from hardware experiments demonstrating that the addition of contact-based geometric information significantly improves pose accuracy during contact and provides robustness to occlusion of small objects by the robot’s end effector.
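
The fusion the abstract describes, treating the GelSight patch as dense local geometry added to the camera’s point cloud, can be sketched as back-projecting the tactile depth map through the sensor’s internal camera and transforming the points into the world frame. The intrinsics and pose inputs below are assumptions for illustration, not the paper’s implementation:

```python
import numpy as np

def fuse_points(camera_points, tactile_depth, K_tactile, T_world_sensor):
    """Illustrative fusion of an RGB-D point cloud with a GelSight depth patch.

    camera_points:  (N, 3) points already in the world frame.
    tactile_depth:  (H, W) depth map reconstructed from the gel image.
    K_tactile:      3x3 intrinsics of the sensor's internal camera (assumed).
    T_world_sensor: 4x4 pose of the sensor in the world frame (e.g., from
                    the robot's forward kinematics).
    Returns one combined (N + H*W, 3) point set for the tracker to consume.
    """
    h, w = tactile_depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    pixels = np.stack([u, v, np.ones_like(u)], axis=-1).reshape(-1, 3).T
    rays = np.linalg.inv(K_tactile) @ pixels         # back-project pixels
    local = (rays * tactile_depth.reshape(1, -1)).T  # points in sensor frame
    local_h = np.hstack([local, np.ones((local.shape[0], 1))])
    world = (T_world_sensor @ local_h.T).T[:, :3]    # into the world frame
    return np.vstack([camera_points, world])
```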