
Communications of the ACM

ACM TechNews

Paralyzed People Inhabit Distant Robot Bodies With Thought Alone


Capable of hosting us?

The European Union's VERE project has demonstrated the ability of humans to remotely direct the actions of a robot by concentrating on arrows superimposed on a display.

Credit: Hironori Miyata/Camera Press

The European Union's VERE project aims to dissolve the boundary between the human body and a surrogate.

The system was tested with three volunteers, each of whom wore an electroencephalogram (EEG) cap and a head-mounted display showing what a robot in Japan was seeing. The volunteers made the robot move by concentrating on arrows superimposed on the display, each flashing at a different frequency.

A computer detected which arrow a participant was looking at from the EEG response each flicker frequency evoked, and sent the corresponding movement command to the robot. The researchers found the system let participants control the robot in near-real time: they were able to make the robot pick up a drink, move across the room, and put the drink on a table.
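The detection scheme described here is a standard steady-state visual evoked potential (SSVEP) interface: staring at a stimulus flickering at a given frequency produces a measurable EEG peak at that same frequency, so comparing spectral power at each candidate frequency reveals which arrow the user is attending to. The sketch below illustrates the idea on simulated data; the specific flicker frequencies, sampling rate, and arrow labels are illustrative assumptions, not values from the article.

```python
import numpy as np

# Candidate flicker frequencies (Hz) for the on-screen arrows.
# Illustrative values only; the article does not specify them.
ARROW_FREQS = {"forward": 6.0, "back": 8.0, "left": 10.0, "right": 12.0}
FS = 250          # assumed EEG sampling rate in Hz
DURATION = 2.0    # analysis window in seconds

def classify_ssvep(eeg, fs=FS, freqs=ARROW_FREQS):
    """Return the arrow whose flicker frequency carries the most EEG power.

    eeg: 1-D array of occipital EEG samples. The SSVEP shows up as a
    spectral peak at the attended flicker frequency, so we compare FFT
    power at each candidate frequency and pick the largest.
    """
    spectrum = np.abs(np.fft.rfft(eeg)) ** 2
    fft_freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)

    def power_at(f):
        # Power in the FFT bin nearest to frequency f.
        return spectrum[np.argmin(np.abs(fft_freqs - f))]

    return max(freqs, key=lambda arrow: power_at(freqs[arrow]))

# Simulate a window of EEG while the user attends the "left" arrow:
# white background noise plus a small 10 Hz oscillation.
rng = np.random.default_rng(0)
t = np.arange(0, DURATION, 1.0 / FS)
eeg = rng.normal(0.0, 1.0, t.size) + 0.8 * np.sin(2 * np.pi * 10.0 * t)
print(classify_ssvep(eeg))  # prints "left"
```

In a real system the detected arrow would be mapped to a movement command (forward, turn left, and so on) streamed to the robot; the article notes this loop ran in near-real time.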

The researchers then tried to strengthen the sense of embodiment with auditory feedback. Both able-bodied volunteers and those with spinal cord injuries placed the drink closer to a target location when they heard the sound of footsteps as the robot walked, rather than a beep or no sound at all.

The improved control suggests users feel more in tune with the robot itself when there is auditory feedback, says University of Rome researcher Emmanuele Tidoni.

From New Scientist
View Full Article


Abstracts Copyright © 2016 Information Inc., Bethesda, Maryland, USA

