Mind-control over bionic limbs involves multiple technologies and approaches. Ultimately, though, we are headed toward a future where bionic limbs are fully integrated into our nervous system and as intuitive to use as our natural limbs.
What is Mind-Control Over Bionic Limbs?
By the simplest definition, every prosthesis is a mind-controlled device. No pirate’s hook ever did anything of its own accord.
But this is not what most people mean when they talk about mind-controlled bionic limbs. What they mean is controlling those limbs purely with thought, intuitively, without intermediate actions, the same way that we control our natural limbs.
With that definition in mind, let’s take a look at the existing control options for bionic limbs, starting with upper limbs.
Dual-Site, Direct Control Myoelectric Systems
If you don’t know what these are, have a quick read of Finding the Right Myoelectric Control System.
Do these systems offer true mind-control? No, they don’t. Have a look at this short video of some of the actions required to use a Hero Arm:
See the explicit intermediate steps? Now, this is a perfectly good bionic hand. In fact, it’s one of the most successful in the marketplace because its manufacturer — Open Bionics — does its best to make it affordable and is completely honest about its capabilities and limitations. But it does not offer true mind control.
Myoelectric Pattern Recognition
What is myoelectric pattern recognition? We have an entire article on the subject, which you can read by clicking the preceding link. Or you can get a quick overview by watching this one-minute video on Myo Plus, a pattern recognition system from Ottobock:
What can you do with this type of control system? Let’s have a look:
As you can see, using this system is more intuitive than using a direct control system. But this is also a commercial, so Wolfgang’s assessment is a little too generous. Systems like this can currently distinguish only a limited number of muscle patterns, restricting the user to a small subset of the actions available to a natural hand.
Also, this particular configuration still appears to use skin-surface sensors, which are prone to errors in certain arm positions, during certain movements, or in excessively hot or cold conditions.
Admittedly, these problems are being solved. Sensors are getting better and can even be implanted in muscles for improved signal clarity. The Artificial Intelligence (AI) capabilities of pattern recognition are constantly improving. So, too, are mobile software applications that allow users to recalibrate their systems on the fly to better match changing circumstances.
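The core idea of pattern recognition, matching a multi-channel muscle signal against gesture templates learned during calibration, can be sketched in a few lines. This is a toy illustration, not any vendor’s algorithm; the channel count, gesture names, and nearest-centroid classifier are all assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: each gesture produces a characteristic pattern of
# mean absolute value (MAV) across 8 surface EMG channels.
GESTURES = ["rest", "open", "close", "pinch"]
centroids = rng.uniform(0.1, 1.0, size=(len(GESTURES), 8))  # "learned" during calibration

def classify(emg_window: np.ndarray) -> str:
    """Assign a window of EMG samples to the nearest trained gesture pattern."""
    mav = np.mean(np.abs(emg_window), axis=0)            # 8-channel feature vector
    distances = np.linalg.norm(centroids - mav, axis=1)  # distance to each template
    return GESTURES[int(np.argmin(distances))]

# Simulate a 200-sample window whose channel activity matches the "close" pattern.
window = centroids[2] + rng.normal(0, 0.02, size=(200, 8))
print(classify(window))
```

Real systems use richer features and machine-learned classifiers, but the recalibrate-then-match workflow is the same: collect labeled muscle data, fit templates, then classify incoming windows in real time.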
But even if all of these improvements achieve their maximum potential, there is still a missing element when it comes to user control…
Agonist-Antagonist Myoneural Interface (AMI)
AMI is a technology solution that revolves primarily around surgically restoring the agonist-antagonist muscle pairings that control our limbs. Basically, severed muscles in the residual limb are reconnected to form such a pairing:
This restores the brain’s natural sense of position and movement for the missing portion of the limb, which is called “proprioception”.
A bionic joint is then calibrated so that its movements correspond to the movements of the restored muscle pairs and, voilà, the brain can better control the bionic limb with thought and also sense its position in space.
All of that is a gross over-simplification, but you can get the full story by reading our complete article on AMI.
Once a bionic limb is set up so that its movements correspond to the movements of the muscle pairs, visual cues reinforce this model: the user attempts to raise the front of his foot toward his shin, the muscle pair responds accordingly, the bionic foot mirrors the movement, and the user’s eyes confirm it. This creates a closed-loop control system.
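As a rough illustration of the control half of that loop, a proportional mapping from the restored muscle pair to a joint command might look like the following. All names, ranges, and the linear mapping are hypothetical, chosen only to show the idea:

```python
# Hypothetical sketch: a bionic ankle's commanded angle derived from the
# balance of the restored agonist-antagonist muscle pair.
def ankle_command(agonist: float, antagonist: float,
                  min_angle: float = -20.0, max_angle: float = 20.0) -> float:
    """Map normalized muscle activations (0..1) to an ankle angle in degrees.

    Positive = dorsiflexion (toes toward shin), negative = plantarflexion.
    """
    balance = agonist - antagonist        # net pull of the pair, range -1..1
    angle = balance * max_angle           # simple proportional mapping
    return max(min_angle, min(max_angle, angle))

print(ankle_command(0.8, 0.2))  # strong agonist pull -> dorsiflexion command
```

The key point is that the same muscle pair that generates the command also stretches and contracts in response, which is what gives the brain its sense of where the joint is.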
The Importance of Sensory Feedback

Imagine having no feeling in your hand, no ability to sense any touch or pressure, even though the hand works perfectly well in every other way.
How would you use it?
If you have a natural hand, use it now to grab something nearby: a pen, computer mouse, or remote control. See how you glance at an object just long enough to take hold of it, then pass off the rest of the task to your sense of touch?
In the absence of this capability, you have to visually guide every detail of your hand’s actions. And I mean every detail, including grip force and fine adjustments.
Have a look at the first 13 seconds of this video:
Notice the broken eggs?
Now, the young lady in this video, Tilly Lockey, is one of the best spokespersons on the planet for those with limb differences. She’s also an experienced bionic hand user who could have avoided breaking the eggs if she wanted. But she was just having a bit of fun in the kitchen, so she wasn’t trying to manage the details of every move of her bionic hands.
But that’s kind of the point. Without sensory feedback, you can’t take your eyes off a task with any bionic hand.
Now consider this next short video featuring a blindfolded man using his PSYONIC Ability Hand to pick up an empty eggshell, which is far more fragile than an intact egg:
The difference is that this hand has pressure sensors in its fingertips. When the fingers come into contact with an object, vibration motors in the socket notify the user. The more force applied, the stronger the vibration.
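The mapping just described, more force means stronger vibration, can be sketched as a simple proportional function. The 20 N saturation force below is an assumption for illustration, not a PSYONIC specification:

```python
def vibration_intensity(force_n: float, max_force_n: float = 20.0) -> float:
    """Map fingertip force (newtons) to a vibration motor duty cycle in 0..1.

    No contact -> no vibration; vibration grows with force and saturates
    at the assumed maximum measurable force.
    """
    if force_n <= 0:
        return 0.0
    return min(force_n / max_force_n, 1.0)

print(vibration_intensity(10.0))  # half the assumed max force
```

Even this crude one-channel feedback is enough to let the blindfolded user modulate his grip, because the brain closes the loop: squeeze, feel the buzz strengthen, ease off.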
Sensory feedback is a key component of mind control. But you ain’t seen nothing yet.
Sensory Feedback with a True Neural Interface
Imagine if you could implant electrodes into your arm to electrically stimulate your nerves and trick your brain into thinking it was feeling what your bionic fingers were feeling.
Turns out, you can. It’s called an Advanced Neural Interface for Bionic Hands.
Here’s what’s truly exciting: in 2024, Atom Limbs is expected to bring the world’s most advanced bionic arm/hand to market — the Atom Touch:
The footage in this video is actually of the Atom Touch’s predecessor, the Modular Prosthetic Limb (MPL), but make no mistake about it: the Atom Touch is another leap forward, especially in the area of sensory feedback.
The Touch will have 200+ sensors that will enable four types of sensory feedback:
This feedback will be conveyed via an advanced neural interface.
This is beyond anything we’ve seen to date, but even it does not come close to the full potential of sensory feedback, which includes sensing qualities like temperature, texture, and shape.
Mind Control for Bionic Legs & Feet
Historically, more attention has been paid to mind-control for upper limbs than for lower limbs.
One reason is that lower-limb devices have been quite successful using local microprocessors to control the actions of bionic knees and ankles completely independently of the user’s mind. Check out this slightly older video on Ottobock’s C-Leg as an example of how this is done for a bionic knee:
Another reason is that, until the introduction of AMI, lower-limb amputations didn’t typically preserve the muscle movements required for mind control. Now they can, which makes mind control much more feasible (note that the surgical process referred to as “Ewing Amputation” in this video is part of AMI):
This is now leading to improved mind-controlled systems for lower limbs:
Ultimately, we believe that lower limbs will use a hybrid control system: mind control to initiate actions and local microprocessors to optimize reactions, such as those required for stumble control.
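A hybrid scheme like this could be sketched as a per-cycle decision: user intent drives the action unless the local microprocessor detects a condition that demands a reflex. The intent labels, the gyro trigger, and the knee modes below are all illustrative assumptions, not any manufacturer’s control scheme:

```python
# Hypothetical hybrid controller: mind control initiates actions;
# the local microprocessor overrides with reflexes like stumble recovery.
def knee_mode(intent: str, gyro_spike: bool) -> str:
    """Choose the knee's behavior for one control cycle."""
    if gyro_spike:                 # sudden forward rotation: likely stumble
        return "max_damping"       # reflex handled locally, faster than thought
    if intent == "sit":
        return "yielding"          # user-initiated: knee yields under load
    if intent == "walk":
        return "swing_assist"      # user-initiated: assist the swing phase
    return "stance_support"        # safe default when intent is unclear

print(knee_mode("walk", gyro_spike=False))
print(knee_mode("walk", gyro_spike=True))   # reflex wins over intent
```

The design choice worth noting is the priority order: safety reflexes run locally and unconditionally, while slower, deliberate commands from the user fill in everything else.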
For a comprehensive description of all current upper-limb technologies, devices, and research, see A Complete Guide to Bionic Arms & Hands.
For a comprehensive description of all current lower-limb technologies, devices, and research, see A Complete Guide to Bionic Legs & Feet.