Bionic hand assesses objects to determine grip in milliseconds
A bionic hand that is capable of seeing objects and instantly deciding what kind of grip to adopt has been developed by scientists at Newcastle University.
The device could lead to a new generation of prosthetic limbs giving the wearer the ability to reach for objects without thinking, researchers say.
A camera fitted to the hand rapidly takes a picture of the object in front of it and feeds the information to an electronic “brain”.
The computer automatically assesses the object’s shape and size and “within milliseconds” triggers the correct movements needed to pick it up, whether a light pinch or firm grip.
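The see-then-grip loop described above can be sketched in a few lines. This is an illustrative sketch only: the article does not publish the team's code, so the names here (`GRIP_FOR_SHAPE`, `choose_grip`) and the specific object-to-grip mapping are hypothetical.

```python
# Hypothetical sketch of the decision step: once the vision system has
# classified the object in front of the hand, look up the grip to trigger.
GRIP_FOR_SHAPE = {
    "cup": "palm wrap",   # whole-hand grasp around a cylinder
    "remote": "tripod",   # thumb and two fingers
    "biscuit": "pinch",   # thumb and first finger
}

def choose_grip(detected_shape: str) -> str:
    """Map a recognised object class to a grip, defaulting to a light pinch."""
    return GRIP_FOR_SHAPE.get(detected_shape, "pinch")

print(choose_grip("cup"))      # palm wrap
print(choose_grip("unknown"))  # pinch (fallback for unrecognised objects)
```

The lookup itself is trivial; the hard part, as the researchers note, is the classification that feeds it, which must run "within milliseconds".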
Dr Kianoush Nazarpour, a senior lecturer in biomedical engineering at the university, said: “Prosthetic limbs have changed very little in the past 100 years.
“The design is much better and the materials are lighter weight and more durable but they still work in the same way.
“Using computer vision, we have developed a bionic hand which can respond automatically.
“In fact, just like a real hand, the user can reach out and pick up a cup or a biscuit with nothing more than a quick glance in the right direction.
“Responsiveness has been one of the main barriers to artificial limbs.
“For many amputees the reference point is their healthy arm or leg so prosthetics seem slow and cumbersome in comparison.
“Now, for the first time in a century, we have developed an ‘intuitive’ hand that can react without thinking.”
Every year in the UK around 600 people lose an upper limb, half of them aged between 15 and 54, while in the US around 500,000 people are living with upper limb loss.
Current prosthetic hands are controlled by myoelectric signals, the electrical activity of the muscles recorded from the skin surface of the stump.
Learning to operate them takes practice, concentration and time, Nazarpour said.
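The conventional myoelectric scheme the article contrasts with can be sketched as a simple threshold test: the prosthesis acts when the muscle signal from the stump is strong enough. The function name, signal values, and threshold below are illustrative assumptions, not the behaviour of any specific device.

```python
# Hedged sketch of threshold-based myoelectric control: rectify one EMG
# reading from the stump and compare it with an activation threshold.
def myoelectric_command(emg_sample: float, threshold: float = 0.4) -> str:
    """Return a hand command based on the rectified EMG amplitude."""
    return "close hand" if abs(emg_sample) >= threshold else "rest"

print(myoelectric_command(0.7))  # strong contraction -> close hand
print(myoelectric_command(0.1))  # weak signal -> rest
```

Producing a clean, deliberate contraction on demand is exactly the practice-and-concentration burden Nazarpour describes, and what the vision-based system aims to remove.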
Developing the new system, based on artificial intelligence (AI), involved “teaching” a computer how to recognise the grip needed for different objects.
Lead researcher PhD student Ghazal Ghazaei said: “We would show the computer a picture of, for example, a stick. But not just one picture: many images of the same stick from different angles and orientations, even in different light and against different backgrounds, and eventually the computer learns what grasp it needs to pick that stick up.
“So the computer isn’t just matching an image, it’s learning to recognise objects and group them according to the grasp type the hand has to perform to successfully pick them up.
“It is this which enables it to accurately assess and pick up an object which it has never seen before, a huge step forward in the development of bionic limbs.”
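Ghazaei's description amounts to training a classifier whose labels are grasp types rather than object identities, so that a never-seen object still falls into a known grasp group. The toy below illustrates that idea with a nearest-centroid rule over hand-made feature pairs; the features, data, and classifier are stand-ins of my own, not the team's deep-learning pipeline, which the article does not detail.

```python
# Toy illustration: several "views" of each object, labelled by the grasp
# needed rather than by what the object is.
from statistics import mean

# toy image features: (width, height) pairs, multiple views per object
training = [
    ((1.0, 9.0), "pinch"),  # stick, upright
    ((9.0, 1.2), "pinch"),  # same stick, lying down
    ((4.0, 4.5), "palm"),   # cup, front view
    ((4.2, 4.0), "palm"),   # cup, tilted
]

def centroids(samples):
    """Average the feature vectors for each grasp label."""
    by_label = {}
    for features, label in samples:
        by_label.setdefault(label, []).append(features)
    return {label: tuple(map(mean, zip(*vecs))) for label, vecs in by_label.items()}

def predict(features, cents):
    """Assign the grasp whose centroid is nearest (squared distance)."""
    def dist(c):
        return sum((a - b) ** 2 for a, b in zip(features, c))
    return min(cents, key=lambda label: dist(cents[label]))

cents = centroids(training)
print(predict((4.1, 4.2), cents))  # a cup-like object never seen before -> palm
```

Because the query object was never in the training set but lands near the "palm" group, it still gets a usable grasp, which is the generalisation step Ghazaei calls a huge step forward.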
The team has programmed the hand to perform four different “grasps”, suited to picking up a cup, holding a TV controller, gripping objects with a thumb and two fingers, and pinching with a thumb and first finger. It has already been tested on a small number of amputees.
“The beauty of this system is that it’s much more flexible and the hand is able to pick up novel objects, which is crucial since in everyday life people effortlessly pick up a variety of objects that they have never seen before,” Nazarpour said.
The research is part of a larger project to develop a bionic hand that can sense pressure and temperature and transmit the information back to the brain.