Robot gripper with GelSight sensor

Tactile sensors give robot pincers greater dexterity and sensitivity

Image credit: Robot Locomotion Group at MIT

Researchers at Massachusetts Institute of Technology (MIT) have equipped robots with GelSight sensors. These relatively low-tech tactile sensors allow the robots to judge the hardness of objects and handle small tools with greater dexterity.

The GelSight technology – which uses physical contact with an object to build up a detailed 3D map of its shape – was first demonstrated nearly a decade ago by MIT researchers. Now, MIT engineers have returned to the sensor in order to equip robots with new capabilities for handling objects.

A GelSight sensor consists of a block of transparent rubber coated on one side with a reflective metallic paint. When this surface is pressed against an object, it conforms to the object’s shape. Three coloured lights illuminate the metallic coating from different directions, a camera behind the gel records the result, and a computer vision algorithm uses this visual information to build up an impression of the object’s shape.
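
The reconstruction step works along the lines of photometric stereo: because each coloured light arrives from a different direction, the colour recorded at each pixel constrains the local surface normal, and the normals can then be integrated into a height map. The sketch below illustrates that idea only – the light directions, the Lambertian-shading assumption and the synthetic test bump are all made up for the example, not taken from the published GelSight pipeline.

```python
import numpy as np

# Made-up directions for the three coloured lights (one row per light).
L = np.array([
    [ 0.70,  0.00, 0.71],   # "red" light
    [-0.35,  0.61, 0.71],   # "green" light
    [-0.35, -0.61, 0.71],   # "blue" light
])

def normals_from_rgb(rgb):
    """Recover per-pixel surface normals from a 3-channel image, one channel per
    coloured light, assuming simple Lambertian shading (intensity = light . normal)."""
    h, w, _ = rgb.shape
    intensities = rgb.reshape(-1, 3).T               # shape (3, H*W)
    g = np.linalg.solve(L, intensities)              # unnormalised normals
    g /= np.linalg.norm(g, axis=0, keepdims=True) + 1e-8
    return g.T.reshape(h, w, 3)

def height_from_normals(n):
    """Turn normals into a rough height map (crude cumulative-sum integration,
    in pixel units; a real pipeline would use a proper Poisson solver)."""
    p = -n[..., 0] / (n[..., 2] + 1e-8)              # surface slope in x
    q = -n[..., 1] / (n[..., 2] + 1e-8)              # surface slope in y
    return 0.5 * (np.cumsum(p, axis=1) + np.cumsum(q, axis=0))

# Synthetic test: shade a hemispherical bump under the three lights.
y, x = np.mgrid[-1:1:128j, -1:1:128j]
z = np.sqrt(np.clip(0.25 - x**2 - y**2, 0.0, None))
dz_dy, dz_dx = np.gradient(z)
n_true = np.dstack([-dz_dx, -dz_dy, np.ones_like(z)])
n_true /= np.linalg.norm(n_true, axis=2, keepdims=True)
image = np.clip(n_true @ L.T, 0.0, None)             # clipping mimics shadowed pixels

depth = height_from_normals(normals_from_rgb(image))
print("recovered depth map:", depth.shape)
```

A real sensor would calibrate the light directions against a known target rather than assuming them, but the principle is the same.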

In demonstrations presented at the International Conference on Robotics and Automation last week, researchers described how mounting GelSight sensors on robots’ “grippers” (blunt pincers) allowed the robots to gauge the hardness of objects and manipulate smaller tools than before.

According to the MIT researchers, these abilities are crucial if robots are to operate smoothly in households, handling a variety of everyday objects.

Being able to assess an object’s hardness makes it easier for a robot to grasp it securely without damaging it. Grasping – which comes very naturally to humans – has proved a persistent challenge for roboticists.

Previously, robots could only assess the hardness of an object by prodding it and measuring how much it gives. The GelSight sensors, however, allowed the robots to be programmed to measure hardness with a more humanlike approach: by squeezing the object and measuring how much the contact surface area increases.

A soft object such as a sponge flattens when squeezed, increasing the contact area, while a hard object resists changing shape.
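
Extracting that contact-area signal from the sensor image can be done very simply, for instance by comparing each frame with a no-contact reference frame. The threshold and the toy frames below are assumptions for illustration, not the researchers’ method.

```python
import numpy as np

def contact_area(frame, reference, threshold=0.05):
    """Estimate contact area as the fraction of pixels whose colour differs
    noticeably from a no-contact reference frame (values in [0, 1])."""
    diff = np.abs(frame - reference).mean(axis=2)    # average over colour channels
    return float((diff > threshold).mean())          # fraction of sensor in contact

# Toy usage with random "images" standing in for real sensor frames.
rng = np.random.default_rng(0)
reference = rng.random((240, 320, 3)) * 0.05
pressed = reference.copy()
pressed[80:160, 100:220] += 0.3                      # pretend a square region is in contact
print("contact fraction:", contact_area(pressed, reference))
```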

Using the GelSight sensors and a neural network trained to relate changes in the contact pattern to hardness measurements, the robots were able to rank objects by hardness just as well as human participants.
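
The MIT system learned its mapping from the raw tactile image sequence; as a rough stand-in, the sketch below fits a small neural-network regressor to a contact-area trajectory, using entirely synthetic data and a made-up hardness scale.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Synthetic training data: each row is the contact-area fraction sampled at ten
# time steps during a squeeze; softer objects show a faster area increase.
rng = np.random.default_rng(0)
hardness = rng.uniform(0, 1, size=200)               # 0 = soft, 1 = hard (made-up scale)
t = np.linspace(0, 1, 10)
X = np.array([(1 - h) * t + 0.02 * rng.standard_normal(10) for h in hardness])

model = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0)
model.fit(X, hardness)
print("predicted hardness for first three squeezes:", model.predict(X[:3]))
```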

While developing a control system to guide humanoid robots through tasks involving tools such as power drills, another MIT team found that the GelSight sensors were also useful for helping a robot judge the position of an object relative to the robot itself – a problem faced by robots equipped only with conventional vision systems, whose view of a grasped object is often blocked by the gripper. Using the camera-based GelSight sensor alongside the vision system, the robots were able to grasp small tools such as screwdrivers and ‘understand’ their position and orientation.
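
The team’s pose-estimation method is not spelled out here, but a generic way to recover the in-plane orientation of a tool from its tactile imprint is to fit the principal axis of the contact pixels, as in the illustrative sketch below (the synthetic mask stands in for a real imprint).

```python
import numpy as np

def imprint_orientation(contact_mask):
    """Return the centroid and in-plane orientation (radians) of a tool's
    imprint, taken as the principal axis of the contact pixels."""
    ys, xs = np.nonzero(contact_mask)
    pts = np.stack([xs, ys], axis=1).astype(float)
    centroid = pts.mean(axis=0)
    cov = np.cov((pts - centroid).T)
    eigvals, eigvecs = np.linalg.eigh(cov)
    major = eigvecs[:, np.argmax(eigvals)]            # direction of the long axis
    return centroid, float(np.arctan2(major[1], major[0]))

# Synthetic example: a thin diagonal bar standing in for a screwdriver shaft.
mask = np.zeros((64, 64), dtype=bool)
for i in range(10, 54):
    mask[i, i] = True
print(imprint_orientation(mask))
```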
