Last year I had the pleasure of seeing Dr. Miguel A. L. Nicolelis give a lecture at Reykjavik University about his research on mind-controlled robotics. His previous efforts, which include enabling a monkey to control a robotic arm with its mind, have put him in the spotlight more than once. Now Nicolelis and his team have raised the bar by enabling a monkey to move robotic legs in a walking pattern.
Making a Robot Walk by Thinking
Nicolelis’ latest advances are an excellent step forward and a great continuation of our last featured article on entwining biological tissue with machines.
With implanted electrodes monitoring brain activity (as opposed to a non-invasive BCI), a rhesus monkey named Idoya was able to make a humanoid robot move its legs in a walking pattern for three whole minutes! The monkey had independent control of each leg, and the controllable parameters were stride length and speed of movement. The legs did not touch the ground, however, but were suspended a few centimeters in the air. (You probably won’t find that emphasized in many other articles; leaving it at ‘made it walk’ sells a lot better.) To make the robot really walk, they’ll need to stimulate the monkey’s brain so it can feel force from the legs, which is something they’re currently working on. This step, having the monkey control each leg’s stride length and speed independently, is nonetheless a great achievement that brings us a lot closer to remote-controlled exoskeletons, super prosthetics and general technologies that help us decipher brain activity.
The research is a collaboration between Nicolelis’ team at Duke University and Gordon Cheng’s team at the ATR Computational Neuroscience Laboratories in Kyoto, Japan. The robot is called Computational Brain (CB), and according to the NY Times it was chosen because it accurately mimics human locomotion (strangely, I’ve never heard of this robot before). It’s noteworthy that the monkey was stationed in North Carolina and the robot in Japan, connected via our beloved internet.
When Idoya’s brain signals made the robot walk, some neurons in her brain controlled her own legs, whereas others controlled the robot’s legs. The latter set of neurons had basically become attuned to the robot’s legs after about an hour of practice and visual feedback.
Idoya cannot talk but her brain signals revealed that after the treadmill stopped, she was able to make CB walk for three full minutes by attending to its legs and not her own.
“Vision is a powerful, dominant signal in the brain,” Dr. Nicolelis said. Idoya’s motor cortex, where the electrodes were implanted, had started to absorb the representation of the robot’s legs — as if they belonged to Idoya herself. [via NYTimes]
The mechanism that predicts or translates Idoya’s movements is AI software (a cocktail of artificial neural networks and other software, if I remember correctly). The software analyzes firing patterns of neuronal groups and associates them with the generated, physical motion. Current technology only enables us to monitor 250-300 neurons in real time (a human brain has an estimated one hundred billion). Yet despite the relatively low number, the software can predict movements with 90% accuracy 3-4 seconds before they actually happen.
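The article doesn’t specify how the decoding software works internally, but the general idea behind such population decoders can be sketched with a toy linear model on synthetic data. Everything below is an illustrative assumption on my part — the neuron count, the lag, the ridge-regression decoder and the fake "kinematics" are stand-ins, not Nicolelis’ actual software (which, as noted above, reportedly mixed neural networks and other techniques):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: ~300 monitored neurons, 1000 time bins.
n_neurons, n_bins = 300, 1000
lag = 3  # decode movement from activity `lag` bins earlier (motor cortex leads movement)

# Synthetic ground-truth kinematics per time bin: [stride_length, speed].
t = np.linspace(0, 20, n_bins)
kinematics = np.column_stack([np.sin(t), np.cos(t)])

# Synthetic firing rates that noisily encode the *upcoming* kinematics.
mixing = rng.normal(size=(2, n_neurons))
rates = np.empty((n_bins, n_neurons))
rates[:-lag] = kinematics[lag:] @ mixing
rates[-lag:] = rng.normal(size=(lag, n_neurons))
rates += 0.1 * rng.normal(size=rates.shape)

# Fit a ridge-regularized linear decoder: kinematics ≈ rates @ W.
X, Y = rates[:-lag], kinematics[lag:]
W = np.linalg.solve(X.T @ X + 1e-3 * np.eye(n_neurons), X.T @ Y)

# Decoding quality on the training data (toy example, no held-out set).
pred = X @ W
r2 = 1 - ((Y - pred) ** 2).sum() / ((Y - Y.mean(axis=0)) ** 2).sum()
print(f"decoding R^2: {r2:.3f}")
```

The point of the sketch is the shape of the problem: a matrix of firing rates in, a small vector of movement parameters out, with the decoder trained on paired recordings of brain activity and actual motion — which is why the monkey’s hour of treadmill practice with visual feedback matters.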
Practical Use, Human Prosthetics
Nicolelis hopes that his research will prove useful for human prosthetics, where we’ll be able to control artificial limbs using only our minds. They estimate that, within the next year, they’ll start work on robotic leg-brace prototypes intended to help people suffering from paralysis.
Duke’s press release video below leads me to ask you an intriguing question: Why do you think they show a 3D simulation rather than the actual monkey?
[Alternate video location at YouTube]
Previous Research and Online Media
Nicolelis’ work is fascinating, and I’m sorry to report that my suggestion to record the lecture he gave here came too late to act on. Thankfully I found a similar one, albeit a few months older, titled Actions from Thoughts, held in Aspen in July 2007. You can also check out the brief video coverage in an older Think Artificial post, showing a monkey controlling a robotic arm.
Links & References
- Nicolelis’ Faculty Page at Duke
- DukeMedNews Press Release
- The feature image is a Duke high-density array with 128 microwires in a monkey’s motor cortex. Copyright Nicolelislab.net