Monkey Brain Makes Robot Walk

Last year I had the pleasure of seeing Dr. Miguel A. L. Nicolelis give a lecture at Reykjavik University about his research on mind-controlled robotics. His previous work has put him in the spotlight more than once, including enabling a monkey to control a robotic arm with its mind. Now Nicolelis and his team have raised the bar by enabling a monkey to move robotic legs in a walking pattern.

Making a Robot Walk by Thinking

Nicolelis' latest advances are an excellent step forward and a fitting follow-up to our last featured article on entwining biological tissue with machines.

With implanted electrodes monitoring brain activity (as opposed to a non-invasive BCI), a rhesus monkey named Idoya was able to make a humanoid robot move its legs in a walking pattern for three whole minutes! The monkey had independent control of each leg, and the controllable parameters were stride length and speed of movement. The legs, however, did not touch the ground; they were suspended a few centimeters in the air. (You probably won't find that emphasized in many other articles. Leaving it at 'made it walk' sells a lot better.) To make the robot really walk, they'll need to stimulate the monkey's brain so it can feel force from the legs, something the team is currently working on. Still, having the monkey control each leg's stride length and speed independently is a great achievement that brings us a lot closer to remote-controlled exoskeletons, super prosthetics, and general technologies that help us decipher brain activity.
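
To give a rough feel for what those controllable parameters mean on the robot side, here's a toy sketch in Python of a gait generator driven by a decoded stride length and speed for each leg. This is purely my own illustration, not CB's actual controller; every name and number here is made up.

    import math

    def hip_angle(t, stride_length, speed, phase=0.0):
        """Toy gait: a sine wave whose amplitude scales with stride
        length and whose frequency scales with speed (steps/second)."""
        amplitude = 0.5 * stride_length
        return amplitude * math.sin(2 * math.pi * speed * t + phase)

    # Each leg gets its own decoded parameters; the right leg runs
    # half a cycle out of phase so the robot alternates its steps.
    t = 0.25
    left = hip_angle(t, stride_length=0.6, speed=1.2)
    right = hip_angle(t, stride_length=0.6, speed=1.2, phase=math.pi)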

Nicolelis' monkey and Cheng's CB robot

The research is a collaboration between Nicolelis' team at Duke University and Gordon Cheng's at the ATR Computational Neuroscience Laboratories in Kyoto, Japan. The robot is called Computational Brain (CB) and, according to the NY Times, it was chosen for how accurately it mimics human locomotion (strangely, I've never heard of this robot before). It's noteworthy that the monkey was stationed in North Carolina and the robot in Japan, connected via our beloved internet.

When Idoya’s brain signals made the robot walk, some neurons in her brain controlled her own legs, whereas others controlled the robot’s legs. The latter set of neurons had basically become attuned to the robot’s legs after about an hour of practice and visual feedback.

    Idoya cannot talk but her brain signals revealed that after the treadmill stopped, she was able to make CB walk for three full minutes by attending to its legs and not her own.

    Vision is a powerful, dominant signal in the brain, Dr. Nicolelis said. Idoya's motor cortex, where the electrodes were implanted, had started to absorb the representation of the robot's legs, as if they belonged to Idoya herself. [via NYTimes]

Predicting Movements

Duke high density array with 128 microwires in a monkey's motor cortex

The mechanism that predicts, or translates, Idoya's movements is AI software (a cocktail of artificial neural networks and other software, if I remember correctly). The software analyzes the firing patterns of groups of neurons and associates them with the physical motion they generate. Current technology only lets us monitor 250-300 neurons in real time (a human brain has an estimated one hundred billion). Yet despite that relatively low number, the software can predict movements with 90% accuracy 3-4 seconds before they actually happen.
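
For the curious, decoders of this kind are commonly built as regression models from recent firing rates to limb kinematics (Nicolelis' lab has described linear, Wiener-filter-style decoders in earlier publications). Below is a minimal sketch in Python of that general idea, a ridge regression over a sliding window of binned spike counts. The data and dimensions are illustrative, not the team's actual pipeline.

    import numpy as np

    def lagged_features(rates, n_lags):
        """Stack the last n_lags bins of firing rates into one feature
        vector per time step (a Wiener-filter-style design matrix)."""
        T, n = rates.shape
        X = np.zeros((T - n_lags, n_lags * n))
        for t in range(n_lags, T):
            X[t - n_lags] = rates[t - n_lags:t].ravel()
        return X

    def fit_linear_decoder(X, y, ridge=1.0):
        """Least-squares map from neural features to kinematics,
        with a small ridge penalty for numerical stability."""
        XtX = X.T @ X + ridge * np.eye(X.shape[1])
        return np.linalg.solve(XtX, X.T @ y)

    # Illustrative numbers: 100 ms bins from ~250 neurons, decoding
    # two parameters per leg (stride length and speed).
    rng = np.random.default_rng(0)
    rates = rng.poisson(5.0, size=(2000, 250)).astype(float)
    kinematics = rng.normal(size=(2000, 4))
    n_lags = 10  # roughly one second of history

    X = lagged_features(rates, n_lags)
    W = fit_linear_decoder(X, kinematics[n_lags:])
    predicted = X @ W  # decoded kinematics for each time bin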

Practical Use, Human Prosthetics

Nicolelis hopes that his research will prove useful for human prosthetics, where we’ll be able to control artificial limbs using only our minds. They estimate that, within the next year, they’ll start work on robotic leg-brace prototypes intended to help people suffering from paralysis.

Duke’s press release video below leads me to ask you an intriguing question: Why do you think they show a 3D simulation rather than the actual monkey?



[Alternate video location at YouTube]

Previous Research and Online Media

Nicolelis' work is fascinating, and I'm sorry to report that my suggestion to record the lecture he gave here was made too late to act on. Thankfully, I found a similar one, albeit a few months older, titled Actions from Thoughts, given in Aspen in July 2007. You can also check out the brief video coverage in an older Think Artificial post, showing a monkey controlling a robotic arm.

6 Comments

  1. Jim

    “Why do you think they show a 3D simulation rather than the actual monkey?”

    Because the film of a monkey with wires coming out of its head might disturb sensitive people enough that they wouldn't be able to get past it and see how incredible this development is.

  2. Exactly, Jim. In addition, the way they got the monkey to stay on the treadmill might be an additional factor (can't have it falling or ripping out the wires). I'm sure that, for a lot of people, the only thing worse than a monkey with wires coming out of its head is a monkey with wires coming out of its head that's forced to run tied to a treadmill for months.

  3. Elijah

    This is quite amazing.

  4. I’m really interested in this now. It’s one thing to find the corresponding nerves that make your legs move and then read them; it’s another thing altogether to find out that you can learn to will the attached device to move independently.

    This could prove to be really huge. Who’s to say you even -have- to find the correct group of nerves to monitor? What if our brain is able to learn to fire the correct neuronal groups to correspond with any devices we attach to it? You could learn to control much more than just robotic arms and legs at will.

  5. Hi Amaroq.

    Who’s to say you even -have- to find the correct group of nerves to monitor? What if our brain is able to learn to fire the correct neuronal groups to correspond with any devices we attach to it?

    You hit the nail on the head and struck a nerve. That's exactly how this works: The software learns to detect a certain neuronal firing pattern which corresponds to a mental state or visualization the monkeys are trained to produce.

    This could, for example, be a visualization of a leg move—or one of a cow being milked by Mickey Mouse in a penguin suit. All that matters is that a pattern can be generated without too much noise, loss of focus, drifting mind, etc. That pattern can theoretically be associated with any command—a keyboard key of your choice or your DeLorean‘s fuel injector.
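
    To make that concrete, here's a toy sketch in Python (my own illustration, with made-up numbers) of mapping a detected firing pattern to an arbitrary command:

        import numpy as np

        def nearest_pattern(sample, templates):
            """Label of the stored template closest to the sample."""
            labels = list(templates)
            dists = [np.linalg.norm(sample - templates[k]) for k in labels]
            return labels[int(np.argmin(dists))]

        templates = {  # mean firing rates for each trained mental state
            "imagine_walking": np.array([8.0, 2.0, 5.0]),
            "rest":            np.array([3.0, 3.0, 3.0]),
        }
        commands = {"imagine_walking": "press W", "rest": "do nothing"}

        sample = np.array([7.5, 2.2, 4.8])  # new firing-rate snapshot
        print(commands[nearest_pattern(sample, templates)])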

    Emotiv’s EPOC headset for consumers uses a similar approach; it’s a must-see for anyone interested in BCIs!
