Smarter prosthetics created using sensors, software, electronics and 3D printing
Image credit: BrainRobotics
Sensors and software are transforming artificial limbs.
At first sight, the LUKE arm looks much like the prosthetic arms that preceded it. But this limb, named after the ‘Star Wars’ hero Luke Skywalker who acquires a robotic arm in ‘The Empire Strikes Back’, is much more responsive than its predecessors.
Ever since the ‘Six Million Dollar Man’ promised prosthetics that could be better, stronger and faster than their flesh equivalents, much of the attention has been on materials. Prosthetics makers have turned to titanium and carbon fibre in place of wood and steel to bring weight down without compromising strength, and have added polymers to limb surfaces to make them softer. But the breakthrough of the LUKE arm lies hidden away - in the sensors that make the arm react to the wearer’s own movements.
Developed by DEKA Research and Development as part of the US Defense Advanced Research Projects Agency (Darpa) Revolutionizing Prosthetics initiative, the arm is being taken to market by start-up Mobius Bionics. It uses electromyogram (EMG) electrodes to detect the electrical signals created when the wearer’s remaining muscles contract. The electrodes transmit the signals to a processor embedded in the prosthetic that controls its movement: opening and closing fingers or changing the grip. Sensors in the wearer’s shoes respond to foot movements, transmitting commands wirelessly to alter the grip.
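The control chain described above - muscle contraction, electrical signal, grip command - can be illustrated with a minimal sketch. This is not DEKA's algorithm; it simply assumes a rectify-and-smooth envelope detector with hypothetical activation thresholds mapping to open/close commands.

```python
def emg_envelope(samples, window=100):
    """Estimate muscle activation: rectify the raw EMG samples and
    smooth them with a moving average over the last `window` samples."""
    rectified = [abs(s) for s in samples]
    envelope = []
    for i in range(len(rectified)):
        lo = max(0, i - window + 1)
        envelope.append(sum(rectified[lo:i + 1]) / (i - lo + 1))
    return envelope

def to_command(envelope, close_threshold=0.5, open_threshold=0.2):
    """Map the latest activation level to a simple grip command.
    Thresholds are illustrative, not taken from the LUKE design."""
    level = envelope[-1]
    if level > close_threshold:
        return "close"
    if level < open_threshold:
        return "open"
    return "hold"
```

A strong, sustained contraction drives the envelope above the close threshold; relaxed muscle lets it fall back below the open threshold.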
Unlike its futuristic movie counterpart, the LUKE arm offers only a limited number of movements, but wearers can perform more complex tasks than were possible with older designs. They can pick up delicate objects, such as an egg or a grape, because pressure sensors in the fingers report how much force the finger motors are applying. Users have been able to turn keys in locks, prepare food and hold hand tools.
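Closing the loop between the fingertip pressure sensors and the motors is what stops an egg being crushed. A minimal sketch of the idea, assuming a simple proportional controller (the gain, limits and toy force model are illustrative, not the LUKE arm's actual control law):

```python
def grip_step(target_force, measured_force, motor_cmd, gain=0.1, cmd_max=1.0):
    """One iteration of a proportional grip controller: nudge the motor
    drive in proportion to the force error, clamped to the motor limits."""
    error = target_force - measured_force
    motor_cmd = motor_cmd + gain * error
    return max(0.0, min(cmd_max, motor_cmd))
```

Run repeatedly against the sensor reading, the drive settles at whatever level holds the measured force at the target, rather than squeezing harder and harder.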
The arm did not need highly specialised sensors, according to Jan-Hein Broeders, European healthcare business development manager at Analog Devices, which supplied the MEMS sensors, digital isolators and dual operational amplifiers used in the arm. The DEKA circuitry used regular building blocks, he adds, including iCoupler digital isolators, which exceed the minimum isolation barrier of 4kV and achieve up to 5kV of isolation to minimise component ageing effects, and the AD8233 integrated signal-conditioning block for EMG measurements. It also uses instrumentation amplifiers to power sensors in each finger, which measure the position of each finger relative to the others. This is needed for gripping objects so the correct pressure can be applied.
Other sensors measure impact, for example when an object is touched or a foot is placed on the ground. The acceleration profiles of these events are quite different, and the electronics need to recognise even weak signals for accurate processing of the data.
Gyroscopes measure rotation, for example of the wrist when pouring a glass of water. It is important to use an accurate gyroscope: by nature, gyroscopes are susceptible to drift of around one degree over a 20-minute period, says Broeders. The LUKE arm also uses capacitive sensing, in which two plates separated by a dielectric material precisely measure movement. Analog Devices embedded the capacitive sensor in an analogue-to-digital converter (ADC) for precise detection of movements, continues Broeders.
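A standard way to tame that slow gyroscope drift - not specific to the LUKE arm, but a common sensor-fusion technique - is a complementary filter, which blends the integrated gyro rate (smooth, but drifting) with a drift-free but noisier angle estimate from an accelerometer:

```python
def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """Blend the gyro-integrated angle (smooth but prone to drift) with
    an accelerometer-derived angle (noisy but drift-free). alpha sets
    how strongly the gyro path is trusted over one time step."""
    return alpha * (angle + gyro_rate * dt) + (1.0 - alpha) * accel_angle
```

With a constant gyro bias of one degree per 20 minutes, pure integration accumulates a full degree of error over that period, while the filtered estimate stays bounded at a small fraction of a degree.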
To monitor the position of objects or fingers, optical sensors use a light source to sense the position of the moving elements of the LUKE arm.
A practical reason to use optical components, points out Broeders, is that the electronics can be sealed - they only need a window of light to operate - meaning that an embedded system in a limb can be water-resistant.
The main considerations in prosthetic design are size, power dissipation and accuracy, Broeders says. Although control is driven by the natural electrical signals generated by the wearer’s muscles, a battery is needed to power the motors and electronics, which creates design challenges for developers. Power dissipation in the motors is inevitable, so the onus is on engineering teams to cut the drain of the multiple sensors to a trickle of current.
MEMS devices such as accelerometers and force sensors can consume large amounts of current over a long period of use. But, much of the time, the arm may not be doing that much. Even while in use an arm typically moves far more slowly than the update rate of the sensor. The electronics can be programmed to wake up and power the various sensors only when motion is detected. This lets much of the sensor array run on mere nanoamps for most of the time.
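The wake-on-motion scheme described above can be sketched in a few lines. The threshold and wake window here are invented for illustration; real MEMS accelerometers implement this in hardware with programmable activity-detection registers.

```python
def awake_fraction(samples, motion_threshold=0.05, wake_window=10):
    """Simulate duty-cycling: the high-power sensors sleep until a
    sample exceeds the motion threshold, then stay awake for a fixed
    window. Returns the fraction of time spent awake."""
    awake = 0
    timer = 0
    for s in samples:
        if abs(s) > motion_threshold:
            timer = wake_window  # motion detected: (re)start wake window
        if timer > 0:
            awake += 1
            timer -= 1
    return awake / len(samples)
```

For a signal that is quiet apart from a brief burst of motion, the sensors are powered for only a small fraction of the total time, which is where the nanoamp average current comes from.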
Broeders says the next step is to lower the power consumed during processing and computing operations, and to limit the power dissipation of the system, which limits measurement range.
The LUKE arm can be made in configurations for shoulder-joint, mid-upper-arm or mid-lower-arm prosthetics.
For amputations at the wrist, BrainRobotics has created a robotic prosthetic hand, introduced at CES 2017. With the aid of EMG sensors, algorithms ‘learn’ the wearer’s intended operations during a series of 30-minute training sessions at a doctor’s surgery; each gesture is learned in a 30-second test cycle. “The algorithms get better at recognising the difference in the signals,” explains Kacper Puczydlowski, robotics engineer at BrainRobotics. “By analysing the difference in lengths of time of the gesture, the algorithms assess if it is a flex movement, or if the signal is more forceful, indicating a control movement.” Software calculates the relationship between finger joint angles to determine the grip, and whether a rotation movement is needed.
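BrainRobotics has not published its classifier, but the train-then-recognise pattern it describes can be illustrated with a deliberately simple nearest-centroid sketch: each gesture's training windows are reduced to one feature (mean absolute EMG amplitude, reflecting how forceful the signal is), and a new window is assigned to the closest gesture. The gesture names and values are hypothetical.

```python
def mean_abs(window):
    """One crude EMG feature: mean absolute amplitude of a sample window."""
    return sum(abs(s) for s in window) / len(window)

def train(gestures):
    """gestures maps a gesture name to a list of EMG training windows.
    Store one average-feature centroid per gesture."""
    return {name: sum(mean_abs(w) for w in windows) / len(windows)
            for name, windows in gestures.items()}

def classify(model, window):
    """Assign a new window to the gesture with the nearest centroid."""
    feature = mean_abs(window)
    return min(model, key=lambda name: abs(model[name] - feature))
```

A production system would use many more features (timing, per-channel amplitude, frequency content) and a proper classifier, but the training-session workflow is the same: record labelled windows, fit, then recognise live signals.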
The hand is built from aluminium with a polymer sleeve, a choice of materials that provides high strength, says Puczydlowski, and it can carry objects weighing up to 20kg. The Boston-based company, which has a factory in China, expects to offer a full limb sleeve with sensing and a fully functioning hand in January 2018, and customisable limbs in two to three years’ time.
As EMG sensors can only provide limited control over a prosthetic, research into neural prosthetics is examining how a modular prosthetic limb can respond to human thought. Researchers at the Applied Physics Laboratory of Johns Hopkins University in Baltimore are investigating how limbs can interpret and convert signals from the nervous system into motion.
A surgical procedure called targeted muscle reinnervation (TMR) enables thought control of a prosthetic limb. The nerves that controlled the original arm and hand are reassigned to muscles remaining in the limb. These nerves send electrical signals that are relayed electronically to controllers in the prosthetic device.
As well as controlling movement, prosthetic devices could feed signals back to the brain, allowing the wearer to experience sensations such as touch through the device. Targeted sensory reinnervation (TSR) re-routes the nerves to make this possible. The surgical operation locates the fibre in the nerve that sends impulses from the fingertips to the brain. This fibre is attached to nerves just below the skin to create touch points for the fingers of a phantom hand on the remaining part of the limb. The brain does not recognise that what it identifies as the thumb is in fact located above the elbow joint. When the touch points are stimulated, wearers can feel the individual fingers, allowing them to feel when someone holds their hand, and to sense and grip objects with the prosthetic limb without looking.
In another project, Darpa is combining TMR and TSR in prosthetics. Trials have confirmed not only that sensations can be received, but also that thought alone can move a prosthetic device that is physically detached and placed away from the test subject.
Michael McLoughlin, chief engineer at Johns Hopkins, says: “The arm learns how to understand what you want to do, rather than you learn to control the arm. This is the fundamental difference of what we are trying to do.”
Other limits in prosthetic limbs are the degrees of freedom and the range of movement they can achieve. Reaching out for a glass of water and taking a drink involves seven degrees of freedom, combining reach, grasp and rotate actions. The next step is to increase the repertoire of movements. Trials of cortical chips carrying recording electrodes, implanted in spinal-cord-injury patients, will test the theory that doubling the capacity to listen to the brain will yield more signals and enable more complex movements.
In 2016, a Darpa-funded research team at the University of Pittsburgh demonstrated that neural implants enabled a 28-year-old quadriplegic, Nathan Copeland, to control a robotic arm and experience a sense of touch.
Two micro-electrode arrays were implanted in the motor cortex of Copeland’s brain, and two in the sensory cortex regions that corresponded to feeling in his fingers and the palm of his hand. He was fitted with a robotic arm, equipped with torque detectors, by the Applied Physics Laboratory at Johns Hopkins University. The detectors enable the arm to detect when pressure is applied to a finger and send electrical signals to the arrays, stimulating sensory neurons to provide the sensation of touch.
Development continues in pursuit of an increasingly intelligent prosthetic device. Researchers at the University of Pittsburgh plan to combine chips with telemetry systems to process recorded data before sending it to the implanted processor that controls the arm. The team is also looking at adding sensory capability, with materials that sense heat, for example, and convey that information to a chip implanted in the part of the brain that processes sensory stimuli.
In parallel, work is progressing on the materials used to construct the prosthetics. Open Bionics has turned to thermoplastic polyurethane for more affordable bionic prosthetics. The material has a rubbery feel and ‘give’, explains Joel Gibbard, CEO. “Most other bionic arms are made of aluminium, or other more rigid materials, for robustness,” he says. “Our combination of manufacturing and materials choice create a robust, flexible, rubbery, rather than a rigid, product. A rigid shell has no give under impact.”
The company also uses polylactic acid (PLA) plastic to create removable covers for the limbs. They are mainly aimed at young wearers, with a ‘snowflake hand’, inspired by the Disney film ‘Frozen’, an Iron Man hand to connect with the character from the Marvel Universe and a ‘Star Wars’ light-sabre hand, designed in collaboration with Lucasfilm’s ILMxLAB. More designs are planned and Gibbard hopes fashion brands will want to collaborate with his company on designs for adult wearers too.
The Bristol-based company currently has a development contract with the NHS to conduct clinical trials of bionic hands for children. Producing bionic prosthetic arms for children as young as eight is only possible through the use of 3D printing, says Gibbard. One advantage of the printing technology is speed: one day from receiving measurements to having a prosthetic ready to wear, replacing three months of visits to a consultant for fitting, of ordering and receiving parts, and of manufacturing, he points out.
Another advantage is weight: the arms are considerably lighter than those currently available, says Gibbard. “3D printing can vary the density of components in critical areas,” he explains. This allows parts to be hollow where rigidity is not essential, which is not possible with other manufacturing processes such as injection moulding or milling.
Instead of buying in mechanical components, Open Bionics uses its 3D-printing facility to produce custom-made parts that fit, and match the size and look of, the other hand. “Open Bionics manufactures the full unit, hand and socket (the arm), with integrated electrodes and battery.” The cost is also reduced. “We are able to create a product with several functions for under £10,000 [compared with £25,000 to £100,000],” says Gibbard. “Prosthetics clinics will buy the hand, buy electrodes and the battery and fit them into the arm.”
The bionic prosthetics are still in development, but the company is in the late stages of testing them.
“We want this technology to move very quickly,” says Gibbard. The company provides open-source software, so that changes can be made easily by developers and shared with the development community. “One way is to make it more accessible for universities to work on this project, hence the Ada and development kits,” he continues.
The Ada robotic hand, named after Ada Lovelace, comes with tutorials and instructions for building a fully articulated robotic hand, printable on a desktop 3D printer and assembled in less than an hour. The Ada hand can be controlled by a computer over a USB connection. It houses all the actuators needed to move the fingers and has a circuit board equipped with a microcontroller that can be programmed using the Arduino programming environment.
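Host-side control of a hand like this typically means writing short command frames over the USB serial link. The frame format below (`F<finger>P<position>`) is invented for illustration - the actual Ada protocol is defined by whatever firmware is flashed to its microcontroller - but it shows the shape of the approach:

```python
def finger_command(finger, position):
    """Build a hypothetical serial command frame for one finger.
    finger: 0-4 (thumb to little finger, by assumption);
    position: 0 (open) to 100 (fully curled), clamped to range."""
    if not 0 <= finger <= 4:
        raise ValueError("finger index must be 0-4")
    position = max(0, min(100, int(position)))
    return f"F{finger}P{position}\n".encode("ascii")

# With the pyserial library (assumed), the frame would be sent like this:
# import serial
# port = serial.Serial("/dev/ttyUSB0", 115200)
# port.write(finger_command(0, 100))  # curl finger 0 fully
```

On the hand itself, an Arduino sketch would parse these frames and drive the corresponding actuator.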
A soon-to-be-announced Brunel robotic hand development kit is faster, lighter and stronger than Ada, and is a full assembly unit that can be placed straight onto a robot to advance prosthetic research projects.
The ease and low cost of 3D printing has been recognised by Mick Ebeling, CEO of Not Impossible Labs. He has set up Project Daniel in Sudan. With support from Intel, he has established a 3D prosthetic printing lab. Laptop computers and 3D printers are used to produce and fit arms to those injured in the war. The lab has been established as a training facility, where local people are able to learn how to measure, design and manufacture the limbs and fit them.
Armed with sensors, nerve interfaces and custom design, it seems that prosthetics are set to change way beyond what we have seen in the movies so far.