
Brain Power Moves Virtual Objects

Researchers closer to technology allowing quadriplegics to move arms and legs

Monkeys have been trained to move virtual objects with only their thoughts and an avatar hand. Credit: Katie Zhuang, Duke.

In a first-ever demonstration of a two-way interaction between a primate brain and a virtual body, two monkeys trained at the Duke University Center for Neuroengineering learned to employ brain activity alone to move an avatar hand and identify the texture of virtual objects.

"Someday
in the near future, quadriplegic patients will take advantage of this technology
not only to move their arms and hands and to walk again, but also to sense the
texture of objects placed in their hands, or experience the nuances of the
terrain on which they stroll with the help of a wearable robotic
exoskeleton," said study leader Miguel Nicolelis, MD, PhD,
professor of neurobiology at Duke University Medical Center and co-director of
the Duke Center for Neuroengineering.

Without moving any part of their real bodies, the monkeys used their electrical brain activity to direct the virtual hands of an avatar to the surface of virtual objects and, upon contact, were able to differentiate their textures.

Although the virtual objects employed in this study were visually identical, they were designed to have different artificial textures that could only be detected if the animals explored them with virtual hands controlled directly by their brain's electrical activity.

The texture of the virtual objects was expressed as a pattern of minute electrical signals transmitted to the monkeys' brains. Each of the three object textures corresponded to a different electrical pattern.
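
As a rough illustration of this encoding idea, one can imagine a lookup from texture to stimulation pattern along the lines of the Python sketch below. The pulse parameters and names are invented for the example, not the values used in the study.

    # Hypothetical sketch: one distinct microstimulation pulse pattern per
    # virtual texture. All parameter values are illustrative assumptions.
    from dataclasses import dataclass

    @dataclass
    class PulseTrain:
        frequency_hz: float   # pulse repetition rate
        amplitude_ua: float   # pulse amplitude in microamps
        duration_ms: float    # length of each stimulation burst

    # Three textures, three distinguishable patterns (assumed values).
    TEXTURE_PATTERNS = {
        "texture_a": PulseTrain(frequency_hz=100, amplitude_ua=60, duration_ms=50),
        "texture_b": PulseTrain(frequency_hz=200, amplitude_ua=60, duration_ms=50),
        "texture_c": PulseTrain(frequency_hz=400, amplitude_ua=60, duration_ms=50),
    }

    def stimulation_for(texture: str) -> PulseTrain:
        """Look up the pulse train that signals a given virtual texture."""
        return TEXTURE_PATTERNS[texture]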

Because no part of the animal's real body was involved in the operation of this brain-machine-brain interface, these experiments suggest that in the future patients severely paralyzed due to a spinal cord lesion may take advantage of this technology, not only to regain mobility, but also to have their sense of touch restored, said Nicolelis, who was senior author of the study published in the journal Nature on October 5, 2011.

"This
is the first demonstration of a brain-machine-brain interface (BMBI) that
establishes a direct, bidirectional link between a brain and a virtual
body," Nicolelis said

"In
this BMBI, the virtual body is controlled directly by the animal's brain
activity, while its virtual hand generates tactile feedback information that is
signaled via direct electrical microstimulation of another region of the
animal's cortex."
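
The quote above describes a closed loop. As a purely schematic sketch, assuming hypothetical interface functions (none of these names come from the study), the loop might be organized like this in Python:

    # Schematic sketch of the bidirectional loop described above; every
    # function name here is a hypothetical placeholder, not the authors' code.
    def bmbi_loop(session):
        while session.active():
            spikes = session.read_motor_cortex()        # record motor cortex population activity
            velocity = session.decode_velocity(spikes)  # translate spikes into a hand command
            hand = session.move_avatar_hand(velocity)   # drive the virtual hand

            touched = session.object_in_contact(hand)   # did the hand reach an object?
            if touched is not None:
                pattern = session.texture_pattern(touched)      # texture -> pulse pattern
                session.microstimulate_sensory_cortex(pattern)  # feedback leg of the loop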

"We
hope that in the next few years this technology could help to restore a more
autonomous life to many patients who are currently locked in without being able
to move or experience any tactile sensation of the surrounding world,"
Nicolelis said.

"This
is also the first time we've observed a brain controlling a virtual arm that
explores objects while the brain simultaneously receives electrical feedback
signals that describe the fine texture of objects 'touched' by the monkey's
newly acquired virtual hand," Nicolelis said.

"Such
an interaction between the brain and a virtual avatar was totally independent
of the animal's real body, because the animals did not move their real arms and
hands, nor did they use their real skin to touch the objects and identify their
texture."

"It's
almost like creating a new sensory channel through which the brain can resume
processing information that cannot reach it anymore through the real body and
peripheral nerves."

The combined electrical activity of populations of 50 to 200 neurons in the monkey's motor cortex controlled the steering of the avatar arm, while thousands of neurons in the primary tactile cortex were simultaneously receiving continuous electrical feedback from the virtual hand's palm that let the monkey discriminate between objects based on their texture alone.
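
To make the decoding step concrete: brain-machine interfaces of this kind commonly map a population's firing rates to a movement command with a linear decoder. The following is a minimal illustrative sketch of that idea in Python, with assumed sizes and randomly initialized weights; it is not the decoder used in the study, where the weights would be fit to training data.

    import numpy as np

    # Illustrative only: a simple linear population decoder mapping firing
    # rates from ~100 motor cortex neurons to a 2-D avatar hand velocity.
    rng = np.random.default_rng(0)
    n_neurons = 100

    # Decoder weights (2 velocity components x n_neurons) plus a bias;
    # random here, but fit by least squares in a real interface.
    W = rng.normal(scale=0.01, size=(2, n_neurons))
    b = np.zeros(2)

    def decode_velocity(firing_rates: np.ndarray) -> np.ndarray:
        """Map a vector of per-neuron firing rates (Hz) to hand velocity."""
        return W @ firing_rates + b

    # Example: decode one time bin of simulated population activity.
    rates = rng.poisson(lam=20, size=n_neurons).astype(float)
    vx, vy = decode_velocity(rates)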

"The
remarkable success with non-human primates is what makes us believe that humans
could accomplish the same task much more easily in the near future,"
Nicolelis said.

One monkey learned to select the correct object during each trial after only four attempts; the other needed nine. Several tests demonstrated that the monkeys were actually sensing the objects, not selecting them at random.

The findings provide further evidence that it may be possible to create a robotic exoskeleton that severely paralyzed patients could wear in order to explore and receive feedback from the outside world, Nicolelis said.

Such an exoskeleton would be directly controlled by the patient's voluntary brain activity in order to allow the patient to move autonomously. Simultaneously, sensors distributed across the exoskeleton would generate the type of tactile feedback needed for the patient's brain to identify the texture, shape and temperature of objects, as well as many features of the surface upon which they walk.
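
As a speculative illustration of the feedback half of such a design (none of this describes actual Walk Again Project hardware), per-sensor readings might be translated into per-channel stimulation commands along these lines:

    # Speculative sketch only: map per-sensor pressure readings from an
    # exoskeleton (assumed normalized to 0-1) into stimulation commands,
    # one cortical channel per sensor. All parameters are invented.
    def encode_feedback(pressures: dict[int, float]) -> list[dict]:
        return [
            {
                "channel": sensor_id,              # one stimulation channel per sensor
                "frequency_hz": 50.0 + 150.0 * p,  # stronger pressure -> faster pulses
                "duration_ms": 20.0,
            }
            for sensor_id, p in pressures.items()
        ]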

This overall therapeutic approach is the one chosen by the Walk Again Project, an international, non-profit consortium established by a team of Brazilian, American, Swiss, and German scientists, which aims to restore full-body mobility to quadriplegic patients through a brain-machine-brain interface implemented in conjunction with a full-body robotic exoskeleton.

The international scientific team recently proposed to carry out the first public demonstration of such an autonomous exoskeleton during the opening game of the 2014 FIFA World Cup, which will be held in Brazil.

Other authors include Joseph E. O'Doherty, Mikhail A. Lebedev, Peter J. Ifft, and Katie Z. Zhuang, all from the Duke University Center for Neuroengineering, and Solaiman Shokur and Hannes Bleuler from the École Polytechnique Fédérale de Lausanne (EPFL) in Lausanne, Switzerland.

This work was funded by the U.S. National Institutes of Health.