Advanced Neural Interfaces for Bionic Hands

Neuroprosthetics Cover Image

Our brains are intimately connected to our bodies by a sophisticated network of nerves and touch receptors. Even the simple act of grasping an object involves a fully integrated system of remarkable complexity and nuance. It was only a matter of time before we sought to incorporate our bionic hands into this system. That time has arrived, and our key to success is an advanced neural interface.

The Limitations of Myoelectric Control Systems

Myoelectric arms/hands use sensors placed on the skin’s surface to detect muscle movements in the residual limb. One type of movement causes the bionic fingers to open; another causes them to close. For a complete description of this technology, please see Understanding Bionic Arms and Hands and Finding the Right Myoelectric Control System.

This type of control system is unquestionably useful. For example, consider this young lady, who lost both her hands to meningitis when she was only 15 months old. Here she is pouring a glass of water using her myoelectric Hero Arms from Open Bionics:

This and the many tasks she demonstrates in other videos would be extremely difficult, if not impossible, without this technology.

But here are a couple of gifs demonstrating some myoelectric shortcomings at the Cybathlon event in 2016. In this first gif, the user has difficulty grasping a cone-shaped object:

In this next example, the user is unable to pick up a clothespin in a position that he can use to pin an article of clothing to the clothesline:

You may think these are merely examples of poor electromechanical dexterity, but there is more to it than this. The bionic hand being used here is an i-Limb. It has a feature called “automatic finger stalling”, which causes a finger to stall when it encounters a certain amount of resistance from an object, while allowing the hand’s other fingers to continue closing until they also meet resistance. This is very similar to how our natural hands behave when grasping an irregularly shaped object.
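
If it helps to picture this behaviour, here is a minimal sketch of per-finger stalling logic in Python. The force threshold, the Finger class, and the read_force callback are illustrative stand-ins rather than the i-Limb's actual firmware:

```python
# A sketch of per-finger stalling, assuming each finger reports a force
# reading. The threshold and classes here are illustrative, not the
# i-Limb's actual firmware.

STALL_FORCE_N = 2.0  # assumed force (newtons) at which a finger stops closing

class Finger:
    def __init__(self, name):
        self.name = name
        self.position = 0.0   # 0.0 = fully open, 1.0 = fully closed
        self.stalled = False

    def step(self, measured_force_n, increment=0.05):
        """Close this finger a little more unless it has met resistance."""
        if self.stalled:
            return
        if measured_force_n >= STALL_FORCE_N or self.position >= 1.0:
            self.stalled = True   # this finger stops; the others keep closing
        else:
            self.position = min(1.0, self.position + increment)

def close_hand(fingers, read_force):
    """Drive every finger until each one has stalled against the object."""
    while not all(f.stalled for f in fingers):
        for f in fingers:
            f.step(read_force(f))

fingers = [Finger(n) for n in ("thumb", "index", "middle", "ring", "little")]
close_hand(fingers, read_force=lambda f: 0.0)  # no object: all fingers close fully
```

Notice that the hand can only conform to an object if the user manages to present it with a workable grip in the first place.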

Now, picking up a cone is not easy. But the inability of the user to make a better attempt at this task is in large part a myoelectric control problem. Either she lacks the training to attempt a grip where finger stalling will help her, or she knows what to do but can’t make her bionic hand do it.

The failed attempt to grasp the clothespin is even more telling. When we grab something like a clothespin with our natural hand, we don’t attempt to pick it up so precisely. We don’t need to. As long as we grab it in the right general area, we can use our sense of touch combined with multiple small adjustments to achieve the perfect grip. We don’t even need to look at it as we do this — our sense of touch alone allows us to complete the task.

The motivation for using an advanced neural interface is to connect a bionic hand to the brain in a way that restores this natural intuitive control and sensory feedback.

Upper Limb Neural Interface

Scientists are also trying to make bionic control systems more reliable. The fact is, using skin-surface sensors to detect the electrical signals produced by muscle contractions is incredibly challenging. Inadvertent muscle movements can trigger the wrong commands. Sensors can shift away from the muscles they are trying to monitor or lose contact with the skin. Changes in temperature, humidity, and the state of the residual limb can all impact the quality of myoelectric signals.
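
To see why this is fragile, consider a deliberately oversimplified sketch of threshold-based myoelectric control. The two-channel layout and the thresholds are assumptions made for illustration; commercial controllers are more sophisticated, but they wrestle with the same signal-quality problems:

```python
# An illustration of why surface-EMG control is fragile: a simple threshold
# classifier turns muscle activity into open/close commands, so a shifted
# sensor, sweat, or a stray contraction can push a signal across the wrong
# threshold. The two-channel layout and thresholds are assumptions.

import statistics

OPEN_THRESHOLD = 0.4   # assumed normalized EMG level for the "open" muscle site
CLOSE_THRESHOLD = 0.4  # assumed level for the "close" muscle site

def classify(open_channel_window, close_channel_window):
    """Map two windows of rectified EMG samples to a hand command."""
    open_level = statistics.mean(abs(s) for s in open_channel_window)
    close_level = statistics.mean(abs(s) for s in close_channel_window)
    if open_level > OPEN_THRESHOLD and open_level >= close_level:
        return "OPEN"
    if close_level > CLOSE_THRESHOLD:
        return "CLOSE"
    return "HOLD"

# A clean "open" gesture works as intended...
print(classify([0.6, 0.7, 0.5], [0.1, 0.1, 0.2]))    # -> OPEN
# ...but if the open-site sensor drifts while the close site picks up
# crosstalk, the same intention is misread.
print(classify([0.2, 0.3, 0.25], [0.5, 0.45, 0.6]))  # -> CLOSE
```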

Repeated failures like these shake the user’s confidence in a prosthetic, making it less useful than it should be.

The History of Neural Interfaces

Neural interfaces have been around since 1924, when the first human EEG was recorded. Their first involvement in medicine came with the introduction of the cochlear implant and the heart pacemaker in the 1950s, though it is doubtful that anyone viewed the interfaces used by these devices as separate from the devices themselves.

The modern concept of a neural interface began to emerge in the 1970s with research on brain-computer interfaces (BCI).

The expansion of this concept as a means to control bionic limbs is a much more recent development, only achieving critical mass with funding from programs like Hand Proprioception and Touch Interfaces (HAPTIX), launched in 2015 by the U.S. Government’s Defense Advanced Research Projects Agency (DARPA).

Within a few short years, universities working with DARPA were already creating impressive prototypes, as shown in this lighthearted video by the University of Utah:

How Neural Interfaces are Used in Bionic Hands

We would explain this to you in text, but there happens to be a PBS video that does a great job of this already. It involves an early prototype built by Case Western Reserve University — another of DARPA’s university partners:

As the video shows, the sensors on the bionic hand transmit signals to electrodes that have been surgically embedded in the user’s arm. The electrodes stimulate the nerves that they are attached to, which then transmit the requisite information to the brain.
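
As a rough illustration of that path, here is a minimal sketch in which a fingertip pressure reading is scaled into a stimulation current for an implanted electrode. The sensor range, current limit, and send_pulse() stub are hypothetical; real systems are carefully tuned for each patient:

```python
# A minimal sketch of the sensory path, assuming a fingertip pressure reading
# is scaled into a stimulation current for an implanted electrode. The ranges
# and the send_pulse() stub are hypothetical stand-ins.

MAX_PRESSURE_KPA = 100.0     # assumed full-scale fingertip sensor reading
MAX_STIM_CURRENT_MA = 2.0    # assumed upper bound for stimulation current

def pressure_to_stimulation(pressure_kpa):
    """Scale fingertip pressure into a stimulation current for the electrode."""
    fraction = max(0.0, min(pressure_kpa / MAX_PRESSURE_KPA, 1.0))
    return fraction * MAX_STIM_CURRENT_MA

def send_pulse(current_ma, pulse_width_us=200):
    """Stand-in for the implant's stimulation driver."""
    print(f"stimulate nerve: {current_ma:.2f} mA for {pulse_width_us} us")

# Each time the fingertip sensor is sampled, its reading becomes a nerve
# stimulus that the brain perceives as touch.
send_pulse(pressure_to_stimulation(35.0))
```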

That’s how the sensory feedback portion of the system works. The methods used to exert control over the bionic hand vary. In some cases, a myoelectric control system is still used:

DARPA's Hybrid Neural Interface for Bionic Hands

The advantage of this hybrid model over a traditional myoelectric system is that the user’s attempted actions are better informed by the sensory feedback.

More recent prototypes use the embedded electrodes for two-way communication:

DARPA Two-Way Neural Interface

Because commands can be sent from the brain through the nerves directly to the embedded electrodes, and then from the electrodes to the bionic hand’s control system, this approach should eliminate many of the problems caused by skin-surface myoelectric sensors.
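
Conceptually, the two-way loop looks something like the sketch below. The decode and encode rules are crude stand-ins chosen for illustration, not the algorithms actually used in these research systems:

```python
# A conceptual sketch of the two-way loop: efferent nerve activity is decoded
# into a hand command, and the hand's touch readings are encoded back into
# nerve stimulation. The decode/encode rules are crude stand-ins.

def decode_motor_intent(nerve_samples):
    """Rough stand-in: stronger recorded activity means 'close harder'."""
    activity = sum(abs(s) for s in nerve_samples) / max(len(nerve_samples), 1)
    command = "CLOSE" if activity > 0.5 else "HOLD"
    return command, min(activity, 1.0)   # (command, effort 0..1)

def encode_touch(pressure_fraction):
    """Map normalized fingertip pressure (0..1) to a stimulation level (0..1)."""
    return max(0.0, min(pressure_fraction, 1.0))

def control_cycle(nerve_samples, fingertip_pressure_fraction):
    intent = decode_motor_intent(nerve_samples)          # brain -> hand
    stim = encode_touch(fingertip_pressure_fraction)     # hand -> brain
    return intent, stim

print(control_cycle([0.7, 0.9, 0.6], fingertip_pressure_fraction=0.3))
# -> (('CLOSE', 0.733...), 0.3)
```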

Remaining Challenges

Challenge # 1 — Eliminate Invasive Surgery

Surgery is surgery, with all its attendant risks of scarring and infection. It is also expensive.

Fortunately, researchers at the University of Pittsburgh recently discovered that existing spinal cord stimulators can be used to produce a sense of touch in missing limbs.

Some 50,000 people already receive these stimulator implants every year in the U.S. alone, which means there are doctors all over the world who are already trained in this procedure.

Even more important, it is a simple outpatient procedure, thereby avoiding the potential complications of more invasive surgery.

There is still a fair bit of work to be done before this can be used as a neural interface for a bionic hand, but here is what one female patient said about her experience with the new technique:

Challenge # 2 — Improve Sensory Capabilities

Touch sensors are useful, but to truly take advantage of neural interfaces, we need to expand the sensory capabilities of our bionic hands. One solution might be to use electronic skin, which you can read about in our article: A Quick Look at Electronic Skin.

Challenge # 3 — Improve Signal Processing and Interpretation

Signals sent from bionic sensors to electrodes must be processed and interpreted to determine the correct stimulation of the attached nerves. Similarly, commands sent from the brain to the nerves must be processed and interpreted to determine the signals that need to be sent to the bionic hand’s control system.

All of this is not only processing-intensive; it also requires sophisticated artificial intelligence.
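
One way to picture the interpretation step is as a small pattern-matching problem: each window of recorded activity is boiled down to a few features and compared against per-gesture templates learned during a training session. The features and the matcher below are illustrative assumptions, not a published decoder:

```python
# A toy version of the interpretation step: each window of recorded activity
# is reduced to two features and matched against per-gesture templates learned
# from training examples. The features and matcher are illustrative only.

import math

def extract_features(window):
    """Two simple features: mean absolute value and standard deviation."""
    n = len(window)
    mav = sum(abs(s) for s in window) / n
    mean = sum(window) / n
    std = math.sqrt(sum((s - mean) ** 2 for s in window) / n)
    return (mav, std)

def train_templates(labelled_windows):
    """Average the features of each gesture's training windows."""
    sums, counts = {}, {}
    for label, window in labelled_windows:
        mav, std = extract_features(window)
        acc = sums.setdefault(label, [0.0, 0.0])
        acc[0] += mav
        acc[1] += std
        counts[label] = counts.get(label, 0) + 1
    return {label: (acc[0] / counts[label], acc[1] / counts[label])
            for label, acc in sums.items()}

def interpret(window, templates):
    """Pick the gesture whose template is closest to this window's features."""
    features = extract_features(window)
    return min(templates, key=lambda label: math.dist(features, templates[label]))

training = [("open",  [0.1, -0.2, 0.15, -0.1]),
            ("close", [0.8, -0.9, 0.85, -0.7])]
templates = train_templates(training)
print(interpret([0.75, -0.8, 0.7, -0.9], templates))  # -> close
```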

Currently, the AI routines are still too specific. That is, they require too much task-specific training.

For example, a baseball and an orange are quite similar. The baseball has seams, whereas the orange is heavier and has pitted skin, but both objects are roughly the same size and shape.

When neural interface AI routines have been trained to handle a baseball, they should know roughly how to handle an orange without having to be retrained all over again.

Right now, that is not the case for this and many similar tasks, which imposes too high a training burden on everyone involved.
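
The kind of generalization being asked for can be sketched as reusing the grip learned for the most similar known object as a starting point, rather than training every object from scratch. The feature vectors and grip settings below are invented purely for illustration:

```python
# A sketch of the generalization we want: reuse the grip learned for the most
# similar known object as a starting point instead of retraining from scratch.
# The feature vectors and grip settings are invented for illustration.

import math

# (diameter_cm, weight_g, surface_roughness 0..1) -> learned grip settings
KNOWN_OBJECTS = {
    "baseball": {"features": (7.4, 145, 0.6),
                 "grip": {"aperture_cm": 7.6, "force_n": 6.0}},
    "mug":      {"features": (9.0, 350, 0.2),
                 "grip": {"aperture_cm": 9.2, "force_n": 8.0}},
}

def suggest_grip(features):
    """Start from the grip of the closest known object, then fine-tune."""
    nearest = min(KNOWN_OBJECTS.values(),
                  key=lambda obj: math.dist(features, obj["features"]))
    return dict(nearest["grip"])  # copy, so it can be adjusted in place

# An orange (similar size, a bit heavier, pitted skin) inherits the baseball's
# grip and should only need small adjustments.
print(suggest_grip((7.2, 180, 0.5)))  # -> {'aperture_cm': 7.6, 'force_n': 6.0}
```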

Challenge # 4 — Reduce Costs

In what is a near-universal complaint about bionic technologies, we need to get costs down to make these devices more accessible. There is no easy solution to this problem. We just have to keep pushing.

If we need a little hope on this front, look no further than the cost of traditional myoelectric systems. A few years ago, even the most basic myoelectric arm cost tens of thousands of dollars. Now there are 3D-printed myoelectric devices available for a fraction of that price.

Related Information

For more information on upper limb bionics, see Understanding Bionic Arms & Hands.

For more general information on bionic touch, see Understanding Bionic Touch.

For more general information on thought control, see Mind-Controlled Bionic Limbs.