The idea is that interactive artificial intelligence will help prostheses better recognize human intent, sense their surroundings, and keep improving over time.
It’s all in the mind: Mind-controlled prostheses
The technology behind upper limb prostheses has come on in leaps and bounds over the past few decades. Using surface electromyography, skin electrodes placed on the residual limb can detect even the smallest muscle movements. These biosignals can be converted into electrical impulses and transmitted to the prosthetic limb.
The wearer thus controls the artificial hand with the muscles of the residual limb. Techniques from pattern recognition and interactive machine learning allow people to teach their prostheses their individual movement patterns as they gesture or move.
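The pattern-recognition step described above typically starts by reducing each short window of raw sEMG signal to a handful of time-domain features. The sketch below is illustrative only, not a description of any specific system; the feature choices (root mean square, mean absolute value, zero-crossing rate) are common in the sEMG literature, and the window sizes and channel counts are assumptions.

```python
import numpy as np

def emg_features(window: np.ndarray) -> np.ndarray:
    """Extract common time-domain sEMG features from one analysis
    window of shape (n_samples, n_channels)."""
    rms = np.sqrt(np.mean(window ** 2, axis=0))      # root mean square
    mav = np.mean(np.abs(window), axis=0)            # mean absolute value
    # fraction of sample-to-sample sign changes per channel
    zc = np.mean(np.diff(np.sign(window), axis=0) != 0, axis=0)
    return np.concatenate([rms, mav, zc])

# Synthetic example: a 200-sample window from 4 electrode channels
rng = np.random.default_rng(0)
window = rng.standard_normal((200, 4))
feats = emg_features(window)
print(feats.shape)  # 3 features per channel -> (12,)
```

Each window is thereby summarized as a fixed-length feature vector, which is what a downstream classifier maps to an intended gesture.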
Advanced robotic prostheses have not yet reached optimal standards of comfort, function, and control, so many people with limb loss still prefer purely cosmetic prostheses, often with no additional function.
Researchers are particularly focused on improving control of both real and virtual prosthetic upper limbs, concentrating on what is known as intent detection. They continue to refine the recording and analysis of human biosignals and to develop machine learning algorithms that detect individual human movement patterns.
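One simple way to picture intent detection from taught examples is a nearest-centroid classifier: the wearer records a few repetitions of each gesture, and the system then assigns new feature vectors to the closest learned gesture. This is a minimal sketch under assumed data shapes, not the algorithm any particular research group uses; the class and gesture names are hypothetical.

```python
import numpy as np

class IntentDetector:
    """Nearest-centroid intent classifier: the wearer 'teaches' each
    gesture by providing a few labelled feature vectors."""

    def __init__(self):
        self.centroids = {}  # gesture label -> mean feature vector

    def teach(self, label: str, examples: np.ndarray) -> None:
        # examples: (n_examples, n_features), recorded while the
        # wearer repeats the gesture a few times
        self.centroids[label] = examples.mean(axis=0)

    def detect(self, features: np.ndarray) -> str:
        # Return the taught gesture whose centroid is closest
        return min(self.centroids,
                   key=lambda g: np.linalg.norm(features - self.centroids[g]))

# Hypothetical usage with synthetic 12-dimensional feature vectors
rng = np.random.default_rng(1)
det = IntentDetector()
det.teach("open_hand", rng.normal(0.0, 0.1, (20, 12)))
det.teach("close_hand", rng.normal(1.0, 0.1, (20, 12)))
print(det.detect(np.full(12, 0.95)))  # prints "close_hand"
```

The interactive element is that `teach` can be called again at any time, so the wearer incrementally adapts the system to their own signals rather than relying on a fixed, pre-trained model.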
Studies involving people with and without disabilities are used to validate the results. In addition, shared autonomy between human and robot is investigated as a way to ensure the system operates safely.
Researchers are exploiting the potential offered by intent detection to control assistive and rehabilitation robots. This includes body-worn robots such as prostheses and exoskeletons, as well as robotic arms and simulations using virtual reality.