We are entering a period of bionic arm & hand development that is both exciting and potentially confusing. This article is meant to clarify the most recent options for pattern recognition systems.
A Slowly Unfolding Revolution
It is no secret that dual-site, direct-control myoelectric systems have struggled to deliver adequate user control of bionic hands.
Even though they may deliver a satisfactory solution to some members of the limb-different community, we have yet to meet anyone who thinks that they are the ultimate solution.
Over the past decade, it has become increasingly clear that myoelectric pattern recognition systems offer far more potential for intuitive use. Systems like Coapt Engineering’s Gen2, Infinite Biomedical Technologies’ Sense, and Ottobock’s Myo Plus have all matured into very capable systems.
Most impressively, these systems have made significant strides in addressing the issues that have been so problematic for direct-control devices. They can compensate for different arm positions and for the ever-changing conditions of residual limbs (hot, cold, sweaty, tired, etc.), though the latter typically requires a quick recalibration.
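For readers who want to see what "pattern recognition" actually means in practice, here is a rough Python sketch of a generic pipeline: multi-channel EMG windows are reduced to simple features and fed to a classifier that learns each user's muscle patterns during a short calibration session. The channel count, feature set, classifier, and synthetic data below are common textbook choices, not the proprietary details of Gen2, Sense, or Myo Plus:

```python
# Illustrative sketch only: a generic myoelectric pattern recognition pipeline.
# Real commercial systems are proprietary; everything below is a textbook-style stand-in.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

N_CHANNELS = 8          # assumed electrode count for this example
WINDOW = 200            # samples per analysis window (e.g., 200 ms at 1 kHz)

def time_domain_features(window: np.ndarray) -> np.ndarray:
    """Classic per-channel time-domain features:
    mean absolute value, waveform length, and zero crossings."""
    mav = np.mean(np.abs(window), axis=0)
    wl = np.sum(np.abs(np.diff(window, axis=0)), axis=0)
    zc = np.sum(np.diff(np.sign(window), axis=0) != 0, axis=0)
    return np.concatenate([mav, wl, zc])            # 3 features x N_CHANNELS

# Calibration: the user performs each motion while labelled windows are recorded.
# (A direct-control system has no such step: two electrode sites map straight to
# "open" and "close". Here, a classifier learns the user's patterns instead.)
rng = np.random.default_rng(0)
motions = ["rest", "hand_open", "hand_close", "pinch", "key_grip"]
X_train = np.vstack([
    time_domain_features(rng.normal(scale=1.0 + i, size=(WINDOW, N_CHANNELS)))
    for i, _ in enumerate(motions) for _ in range(40)    # 40 synthetic reps per motion
])
y_train = np.repeat(motions, 40)

clf = LinearDiscriminantAnalysis().fit(X_train, y_train)

# Online use: each new window is classified into a motion/grip command.
new_window = rng.normal(scale=2.0, size=(WINDOW, N_CHANNELS))
print(clf.predict([time_domain_features(new_window)]))

# A "quick recalibration" amounts to collecting a short batch of fresh labelled
# windows when the limb state changes (sweat, fatigue, donning shift) and refitting
# the classifier on the combined data.
```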
Here is an example of such a system (Ottobock’s Myo Plus):
However, there are two problems with these systems:
- They are expensive. Adding a pattern recognition system to a mainstream bionic hand can increase the cost by 30 to 50%, depending on the components involved.
- Just as these solid, reliable pattern recognition systems have begun to enter mainstream use, more revolutionary, proprietary, all-in-one solutions have appeared on the horizon.
The Promise of a More Transformative Revolution
The BrainRobotics Prosthetic Hand Story
In November of 2018, a video appeared on YouTube of a girl playing the piano with a bionic hand:
The hand in question traces its roots to a company called BrainCo. In early 2020, BrainCo won a pair of awards at the Consumer Electronics Show (CES) for its Dexus Prosthetic Hand, which again featured an impressive level of user control over individual fingers.
At that time, the company told us that they hoped to win FDA approval of the hand by the end of 2020.
After this, BrainCo spun off its bionic hand into a company called BrainRobotics. This new company now plans to release two bionic hands: a 2-channel hand in 2022 and a more advanced 8-channel hand sometime after that. Only the 8-channel hand will deliver the kind of control needed to play the piano.
BrainRobotics has recently made considerable progress in sorting all this out, focusing mainly on getting FDA approval for its 2-channel hand, as well as on organizing its website and other media assets to support a commercial launch. But we have yet to see the product on the market.
There is a reason that we are telling you this story. There is a big difference between building breathtaking prototypes and bringing a product to market. We are optimistic about the future of BrainRobotics, and still very much excited about a bionic hand that will eventually allow users to play the piano. However, from a consumer standpoint, caution is in order. Until you can actually buy a hand and take it home, it isn't a viable option.
The Atom Touch Story
In 2006, the Defense Advanced Research Projects Agency (DARPA) launched its Revolutionizing Prosthetics program. The goal of this program was to develop a bionic arm that could mimic natural arm and hand movements for any level of amputation.
In 2015, DARPA pushed the envelope a bit further by launching its Hand Proprioception and Touch Interfaces (HAPTIX) program. The goal of this program was to enable precision control of a bionic hand and sensory feedback via bi-directional nerve implants.
The Modular Prosthetic Limb (MPL), developed by Johns Hopkins University (JHU), became the lead project for this second initiative. Here is a look at the MPL during its first take-home trial in 2018, again with a focus on playing the piano:
Roll forward four years. The MPL has now entered a new phase where its technology is being commercialized as the Atom Touch from Atom Limbs.
The claims of advanced functionality for the Atom Touch, scheduled to launch in 2023, include:
- user control over the hand that mirrors control of a natural hand, including control over individual fingers;
- wrist and elbow components that mirror the abilities of a natural wrist and elbow;
- advanced sensory feedback including the ability to sense contact, force, position, and velocity (it may also eventually include temperature).
We’re not talking about an incremental improvement over existing bionic arms/hands here; we’re talking about a quantum leap.
According to the company, even the artificial intelligence (AI) component goes well beyond that of existing pattern recognition systems. While we can debate the terminology, what we cannot deny is that any system built on data from 200 sensors has the potential to be a lot more sophisticated than one built on 8 or 16.
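As a rough back-of-envelope illustration of that last point, here is how the size of the decoder's input grows with sensor count. The assumption of four time-domain features per channel is ours, a common textbook figure, not a published spec for any of these systems:

```python
# Back-of-envelope only: more sensor channels give the decoder a richer input.
FEATURES_PER_CHANNEL = 4    # assumed; a typical time-domain feature set

for label, channels in [("typical pattern recognition system", 8),
                        ("larger pattern recognition system", 16),
                        ("Atom Touch (as claimed)", 200)]:
    dims = channels * FEATURES_PER_CHANNEL
    print(f"{label}: {channels} channels -> {dims}-dimensional decoder input")
```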
Will they deliver? We happen to think they will because of the caliber of the people involved. But whether they do in 2023 is a bit beside the point. Once those who have lost an arm or hand see something that will restore near-natural capabilities, they're not going to relinquish that dream. And, right now, bionic hands with proprietary pattern recognition systems seem a lot closer to fulfilling those aspirations than generic systems like Gen2 and Sense, and even closer than quasi-generic systems like Myo Plus.
Unfortunately, this leaves potential end-users in a tough spot. Should they go for an expensive but reliable generic system, which purposely limits its capabilities to match those of the component devices it supports? Or should they roll the dice with transformative technology that may have trouble making the leap from the laboratory to the home?
A Possible Middle Ground
Over the past year, we’ve begun to encounter new bionic hands that may offer alternatives to traditional pattern recognition systems.
One of these hands is called Adam’s Hand from BionIT Labs. It automatically adapts to the shape of whatever object it is trying to grasp, which eliminates the need for users to select different grips for different tasks. This, in turn, reduces the need for advanced pattern recognition to determine user intent:
Adam’s Hand doesn’t completely eliminate pattern recognition. Instead, it uses a simplified version of the technology to distinguish between open and close commands, one that requires only a 2-channel sensor system, i.e. the same type of sensor system used for direct control.
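For the curious, here is a minimal sketch of what a 2-channel open/close decoder can look like. BionIT Labs has not published its control firmware, so whether the actual product uses simple thresholding, proportional control, or a small classifier is not something we can confirm; the logic and numbers below are our own placeholder assumptions:

```python
# Illustrative sketch only: a minimal 2-channel open/close decoder.
# Thresholds and envelope values are placeholder assumptions, not BionIT Labs specs.
OPEN_THRESHOLD = 0.15    # assumed activation level (normalized EMG envelope, 0..1)
CLOSE_THRESHOLD = 0.15

def decode(extensor_envelope: float, flexor_envelope: float) -> str:
    """Map two smoothed EMG envelopes (e.g., wrist extensors / flexors) to a command.
    An adaptive grasp mechanism then handles how the fingers wrap around the object,
    so no grip-selection logic is needed here."""
    if extensor_envelope > OPEN_THRESHOLD and extensor_envelope > flexor_envelope:
        return "open"
    if flexor_envelope > CLOSE_THRESHOLD and flexor_envelope > extensor_envelope:
        return "close"
    return "hold"

# Example: simulated envelope pairs over a few control cycles
for ext, flex in [(0.05, 0.04), (0.40, 0.10), (0.08, 0.55)]:
    print(decode(ext, flex))    # -> hold, open, close
```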
To be honest, we were skeptical of this concept at first, but each new video of Adam's Hand shows clear improvement. We also can't argue with its strategy, as simpler is generally better from the user's perspective. Note, also, that BionIT Labs is not some basement startup. It has serious support from various organizations in the European Union.
Another middle-ground contender may be the MeHand from Russia’s MaxBionic. The following video is in Russian with no English translations, but check out the impressive demonstrations of user control for some tasks:
The company claims that it supports the same level of user control as a Myo Plus system but with only 2-channel sensor input. This is a big claim, but the company is already selling its hand in Russia, Germany, and the Commonwealth of Independent States (CIS). It's also going through clinical trials in the U.K., with a similar process to follow in the U.S.
So, why do we refer to these two examples as a possible middle ground? First, their current focus is on the completion of daily tasks, not more advanced capabilities. Second, the complete cost of these devices, including the hand, socket, prosthetist fees, and their 2-channel pattern recognition systems, is likely to be between $30,000 and $40,000 US. That is significantly less than the cost of a middle-of-the-pack bionic hand like the bebionic when paired with one of the mainstream pattern recognition systems.
What’s a User to Do?
Here’s a statement that we’re comfortable making about the existing mainstream pattern recognition companies like Coapt, IBT, and Ottobock (i.e. its Myo Plus product): if you purchase one of these systems and you put the required effort into proper training, you will get what they promise. It won’t be the Atom Touch or the ability to play the piano, but it will be a notable and reliable upgrade over direct-control systems.
If you wait for the Atom Touch, the more advanced version of the BrainRobotics hand, or other advanced devices to reach the market, be aware that some uncertainties remain.
Alternatively, if you have access to new hands like Adam’s Hand or the MeHand, we suggest that you take them on a trial run to see if their claims are true.
We don’t have a horse in this race. The whole point of this article is simply to make you, our readers, aware of your options and to encourage you to do a proper cost-benefit analysis of each. We hope we have done that.
Related Information
For a look at pattern recognition systems in general, see Myoelectric Pattern Recognition for Bionic Arms & Hands.
For a good overview of control systems in general, see bionic arm/hand control systems.
For a complete description of all current upper-limb technologies, devices, and research, see our comprehensive guide.