Brain-Machine Interface Turns Robots Into Gamers By Tim Hornyak

NOT A PEACE SIGN: A mechanical robot hand can form rock, paper or scissors after being fed data from a player's brain as "read" by an fMRI machine.

MIND READING

To translate thought into robotic actions, Advanced Telecommunications Research and Honda rely on functional magnetic resonance imaging to "read" the mind. But other methods exist. Duke University neurobiologist Miguel Nicolelis, for instance, implanted electrodes in a monkey's brain to manipulate a robot arm. In the hope of helping disabled people communicate, other groups have used electroencephalography (EEG) to translate human thoughts into computer commands via noninvasive means such as caps covered with electrodes. Researchers also experiment with a brain-scanning method using magnetic fields called magnetoencephalography (MEG).

A time-honored way to make decisions in Japan is janken, a local version of the children's game of rock, paper, scissors. Japanese scientists have developed a new twist on this tradition: a machine that can read minds and then form the "weapon" of choice on a mechanical hand, in effect a mind-controlled robot.

The joint project by Kyoto-based Advanced Telecommunications Research (ATR) Institute International and Honda Research Institute Japan is a novel "brain-machine interface." In the ATR-Honda approach, demonstrated this past May, a subject in a functional magnetic resonance imaging (fMRI) machine forms the rock, paper or scissors with his hand. A machine-learning algorithm analyzes the fMRI data on changes in blood flow connected with neural activity in the motor cortex. The decoded data are transmitted to the robot hand, which reproduces the choice within about seven seconds and with 85 percent accuracy.
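At its core, the decoding step amounts to classifying a spatial pattern of motor-cortex voxel activity into one of three labels. The article does not describe ATR's actual machine-learning algorithm, so the following is only an illustrative sketch of the idea, using synthetic data and a simple nearest-centroid classifier:

```python
import numpy as np

# Hypothetical sketch: decode rock/paper/scissors from motor-cortex
# voxel patterns. The synthetic data below merely stands in for fMRI
# blood-flow measurements; it is not the ATR-Honda pipeline.

rng = np.random.default_rng(0)
GESTURES = ["rock", "paper", "scissors"]
N_VOXELS = 50  # voxels in a motor-cortex region of interest

# Assume each gesture evokes a characteristic activity pattern.
prototypes = {g: rng.normal(size=N_VOXELS) for g in GESTURES}

def simulate_scan(gesture, noise=0.5):
    """One noisy fMRI 'scan' taken while the subject forms the gesture."""
    return prototypes[gesture] + rng.normal(scale=noise, size=N_VOXELS)

# Training: average several scans per gesture into a centroid.
centroids = {
    g: np.mean([simulate_scan(g) for _ in range(20)], axis=0)
    for g in GESTURES
}

def decode(scan):
    """Pick the gesture whose centroid is nearest to the scan."""
    return min(GESTURES, key=lambda g: np.linalg.norm(scan - centroids[g]))

# Evaluate decoding accuracy on fresh scans.
trials = [(g, decode(simulate_scan(g))) for g in GESTURES for _ in range(50)]
accuracy = sum(truth == guess for truth, guess in trials) / len(trials)
print(f"decoding accuracy: {accuracy:.0%}")
```

On this clean synthetic data the classifier scores far better than the 85 percent the real system achieves; actual fMRI signals are much noisier and slower, which is why the robot hand needs about seven seconds per guess.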

Although the brain-machine interface might suggest the fantasy of having a personal army of robots controlled simply by mental command, the research is essentially about pattern recognition. "We have been working on methods for decoding brain activity," says cognitive neuroscientist Yukiyasu Kamitani of ATR. "A brain-machine interface is only one of many possible applications of the decoding technique. For instance, if you could decode a person's attitude toward some product, you could use that for marketing." The interface could also lead to a kind of "information terminal," he adds, that would enable thought-guided operation of cell phones and computers. Honda Research Institute's Tatsuhiko Sekiguchi envisions a brain analyzer that would monitor drivers' mental states and warn of drowsiness or inattention.

The ATR-Honda investigators chose functional MRI because it provides the greatest accuracy of all brain-reading strategies, according to Kamitani. "You can get a cross section of the brain," he says. "With MEG [magnetoencephalography] or EEG [electroencephalography], you can get only a map of magnetic or electrical fields on the scalp, and using that you can infer where the current is coming from, but you cannot say for sure where it is. You cannot [directly] get a spatial map." EEG, which is faster but also prone to interference, usually requires intensive user training and special thinking processes to produce activity that can be detected. The fMRI-based approach, though, requires no training, only natural brain activity.

Kamitani's group needs to identify more complex mental activity for its brain-machine interface to become practicable. On the engineering side, the large scanning apparatus would have to be made smaller and lighter, like an electrode cap. Kamitani speculates that by studying accumulated fMRI data, a compact system could be developed that specializes in a certain brain area or mental task. The team is also experimenting with MEG to see if it might be appropriate.

For now, the researchers want to accelerate and improve their decoding technique to detect mere intention in rock, paper, scissors and eliminate the need to physically form the hand shapes. "This game involves some social aspect—you want to win and predict what the other person is doing," Kamitani explains. "We want to decode the intention rather than the actual movement." Perhaps in five to 10 years, suggests Honda Research Institute president Tomohiko Kawanabe, the interface will be good enough to control the automaker's famous Asimo humanoid robot. If so, it's sure to play a mean game of janken.

Tim Hornyak, based in Tokyo, is author of Loving the Machine: The Art and Science of Japanese Robots.
