Robot Jeeves learns to dress people by touch

Image: robot dressing a person (credit: Georgia Tech)

Researchers at the Georgia Institute of Technology have created an assistive robot capable of dressing people in a hospital gown without the use of computer vision.

More than one million Americans already require daily assistance to dress themselves due to injury, disease or advanced age, and this number is likely to rise as the population ages. At present, they rely on friends, relatives and social workers to help them dress. Robotic assistance with dressing could give these people greater independence, although putting on clothing demands a dexterity that humans often take for granted.

To explore how robots might assist with dressing, a team of researchers at the Georgia Institute of Technology set out to teach a robot - the PR2 - to help patients into hospital gowns.

Rather than pre-programming the robot to perform the task, the researchers used a machine-learning approach, feeding a neural network 11,000 simulated examples of a gown being put onto a person's arms. Instead of working by sight – which would require cameras and image recognition – the robot works by touch, sensing the forces it experiences as it pulls the gown over a person's hand and elbow and onto their shoulder.
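As a rough illustration of what learning from such simulated examples might look like, the sketch below assumes each example pairs a short history of force readings at the robot's gripper and a candidate pulling motion with the force subsequently felt on the simulated person's arm, and fits a small neural network to predict that force. The dataset layout, tensor shapes and network architecture are illustrative assumptions, not details of the Georgia Tech system.

```python
# Illustrative sketch only: a small network that learns to map a short
# history of gripper force readings plus a candidate pull to the force
# the gown would exert on the person's arm. Shapes and architecture
# are assumptions, not details from the Georgia Tech work.
import torch
import torch.nn as nn

HISTORY = 10      # timesteps of past force readings (assumed)
FORCE_DIM = 3     # x, y, z force components (assumed)
ACTION_DIM = 3    # candidate end-effector displacement (assumed)

class ForcePredictor(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(HISTORY * FORCE_DIM + ACTION_DIM, 64),
            nn.ReLU(),
            nn.Linear(64, 64),
            nn.ReLU(),
            nn.Linear(64, FORCE_DIM),   # predicted force on the arm
        )

    def forward(self, force_history, action):
        x = torch.cat([force_history.flatten(1), action], dim=1)
        return self.net(x)

def train(model, dataset, epochs=10):
    """dataset yields (force_history, action, measured_arm_force) batches."""
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        for force_history, action, arm_force in dataset:
            pred = model(force_history, action)
            loss = loss_fn(pred, arm_force)
            opt.zero_grad()
            loss.backward()
            opt.step()
    return model
```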

“People learn new skills using trial and error. We gave the PR2 the same opportunity,” said Zackory Erickson, the PhD student who led the project.

“Doing thousands of trials on a human would have been dangerous, let alone impossibly tedious. But in just one day, using simulations, the robot learned what a person may physically feel while getting dressed.”

After a day of simulated practice, the robot was able to put the gown on a person at least some of the time, replicating the process it had studied through the thousands of examples. It had also learned to predict what would happen – sliding, tightening or other consequences – when the gown was manipulated in different ways, and the PR2 used this knowledge when dressing a person.

“The key is that the robot is always thinking ahead,” said Charlie Kemp, an associate professor in the department of biomedical engineering at Georgia Tech and Emory University. “It asks itself, ‘If I pull the gown this way, will it cause more or less force on the person’s arm? What would happen if I go that way instead?’”
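In control terms, this kind of "thinking ahead" can be pictured as a simple predictive planning loop: before moving, the robot scores a handful of candidate motions with its learned force model and picks the one expected to keep forces on the arm low. The sketch below reuses the hypothetical ForcePredictor from the earlier example; the candidate actions, force threshold and scoring rule are illustrative assumptions rather than the researchers' actual method.

```python
# Illustrative planning loop: score candidate pulls with the hypothetical
# learned force predictor and execute the one with the lowest predicted
# force on the person's arm. Threshold and candidates are assumptions.
import torch

def choose_action(model, force_history, candidate_actions, max_force=5.0):
    """Return the candidate action with the smallest predicted arm force,
    or None if every candidate is predicted to exceed max_force (newtons)."""
    best_action, best_force = None, float("inf")
    with torch.no_grad():
        for action in candidate_actions:
            pred = model(force_history.unsqueeze(0), action.unsqueeze(0))
            magnitude = pred.norm().item()
            if magnitude < best_force:
                best_action, best_force = action, magnitude
    if best_force > max_force:
        return None          # no safe move found: stop and replan
    return best_action
```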

While some attempts were perfect – dressing a sitting person smoothly and comfortably within 10 seconds – others used excessive and dangerous force when the clothing caught on the hand or elbow. The robot performed best when given more time to predict the consequences of different actions and plan its strategy.

At present, the robot is only capable of putting a gown on a single arm. Hospital gowns are loose, simple garments that require minimal manipulation. Fully dressing a person - which could involve manipulating laces, buttons and fitted clothing - is, according to the team, still many steps away.
