Robot displays emotion with facial expressions… and texture changes
Image credit: Lindsay France/University Photography
Researchers based at Cornell University have developed a prototype robot capable of expressing its feelings by changing its texture, much as octopuses do.
According to mechanical and aerospace engineering professor Guy Hoffman, the inspiration for the robot came from the animal kingdom, where species with limited facial expressions and vocalisation express their feelings in other ways, such as by changing texture or posture, or through non-human anatomy such as plumage, tails or tentacles.
“I’ve always felt that robots shouldn’t just be modelled after humans or be copies of humans,” said Hoffman. “We have a lot of interesting relationships with other species. Robots could be thought of as one of those ‘other species’, not trying to copy what we do but interacting with us with their own language, tapping into our own instincts.”
Hoffman worked with PhD student Yuhan Hu to design robot skins covered in arrays of goosebumps and spikes. These are connected to fluidic chambers and driven by actuators integrated into texture modules, allowing the skin to shift shape in response to the robot’s emotional state.
Hoffman and Hu experimented with different control systems in order to minimise size and noise, such that the robot could switch smoothly between emotional displays, much like a real animal. The Cornell University pair created a prototype robot based on their findings, which is capable of changing its texture as well as its basic facial expressions.
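The control idea described above can be illustrated with a short sketch. This is not the Cornell team’s code; the emotion names, module names and inflation values are hypothetical, chosen only to show how an emotional state might be mapped to fluidic texture modules and how transitions between displays could be smoothed rather than switched abruptly.

```python
# Illustrative sketch only: mapping a robot's emotional state to
# inflation targets for hypothetical fluidic texture modules.
# All names and values here are assumptions, not the actual design.

# Each emotion maps to a target inflation level per module array
# (0.0 = flat skin, 1.0 = fully raised).
EMOTION_TEXTURES = {
    "calm":    {"goosebumps": 0.0, "spikes": 0.0},
    "excited": {"goosebumps": 0.8, "spikes": 0.2},
    "angry":   {"goosebumps": 0.1, "spikes": 0.9},
}

def texture_targets(emotion: str) -> dict:
    """Return per-module inflation targets for a given emotion."""
    if emotion not in EMOTION_TEXTURES:
        raise ValueError(f"unknown emotion: {emotion!r}")
    return EMOTION_TEXTURES[emotion]

def blend(current: dict, target: dict, step: float = 0.25) -> dict:
    """Move each module part of the way toward its target, so the
    skin eases between emotional displays instead of snapping."""
    return {name: current[name] + step * (target[name] - current[name])
            for name in current}
```

For example, repeatedly calling `blend` with the current state and the targets for `"angry"` would gradually raise the spike array, which is one simple way to approximate the smooth, animal-like transitions the researchers describe.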
“At the moment, most social robots express internal state only by using facial expressions and gestures,” they wrote. “We believe that the integration of a texture-changing skin, combining both haptic and visual modalities, can thus significantly enhance the expressive spectrum of robots for social interaction.”
At present, Hoffman does not have specific applications in mind for the texture-changing robot, although he believes that demonstrating the technology is a good first step towards the design of alternative robots.
“It’s really just giving us another way to think about how robots could be designed,” he commented.
Next, the engineers will work to scale the technology down to fit inside a self-contained robot, and to render the texture-changing technology more immediately responsive to the robot’s changes in feeling.