Researchers at the Massachusetts Institute of Technology Computer Science and Artificial Intelligence Laboratory (CSAIL) have invented a system that taps human muscle signals from wearable sensors to pilot a robot.
Conduct-A-Bot uses electromyography (EMG) and motion sensors worn on the user's biceps, triceps, and forearms to measure muscle activity and movement, and processes that data with algorithms that detect gestures in real time.
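The article does not include implementation details; the minimal Python sketch below only illustrates the general idea of turning windowed EMG and gyroscope readings into discrete gesture labels. The gesture names, thresholds, and the classify_gesture helper are hypothetical placeholders, not Conduct-A-Bot's actual classifiers.

```python
import numpy as np

# Hypothetical gesture vocabulary for this sketch; the real system's
# gesture set and classifiers may differ.
GESTURES = ["fist_clench", "arm_tense", "rotate_left", "rotate_right", "relax"]


def emg_envelope(emg_window: np.ndarray) -> float:
    """Rectify and average a short EMG window to estimate muscle activation."""
    return float(np.mean(np.abs(emg_window)))


def classify_gesture(biceps_emg: np.ndarray,
                     forearm_emg: np.ndarray,
                     gyro_z: np.ndarray,
                     emg_thresh: float = 0.4,
                     gyro_thresh: float = 1.0) -> str:
    """Map one window of sensor data to a discrete gesture label.

    The thresholds are illustrative placeholders; a deployed system would
    calibrate or learn them per user rather than hard-code them.
    """
    rotation = float(np.mean(gyro_z))      # mean angular rate about the forearm axis
    if emg_envelope(forearm_emg) > emg_thresh:
        return "fist_clench"               # strong forearm activation
    if emg_envelope(biceps_emg) > emg_thresh:
        return "arm_tense"                 # stiffened upper arm
    if rotation > gyro_thresh:
        return "rotate_left"
    if rotation < -gyro_thresh:
        return "rotate_right"
    return "relax"
```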
The researchers used Conduct-A-Bot with a Parrot Bebop 2 drone, translating user actions like rotational gestures, clenched fists, and tensed arms into drone movement.
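Continuing the illustration, a recognized gesture label could then be mapped to a motion command for the drone. The command fields and the hover-by-default fallback below are assumptions made for this sketch; actual control of a Parrot Bebop 2 would go through the vendor's SDK rather than a dictionary like this.

```python
# Illustrative mapping from gesture labels to motion commands
# (values are arbitrary percentages of maximum speed for this sketch).
GESTURE_TO_COMMAND = {
    "rotate_left":  {"yaw": -20, "pitch": 0,  "roll": 0, "vertical": 0},
    "rotate_right": {"yaw": 20,  "pitch": 0,  "roll": 0, "vertical": 0},
    "arm_tense":    {"yaw": 0,   "pitch": 20, "roll": 0, "vertical": 0},
    "fist_clench":  {"yaw": 0,   "pitch": 0,  "roll": 0, "vertical": 0},  # stop/hover
    "relax":        {"yaw": 0,   "pitch": 0,  "roll": 0, "vertical": 0},
}


def command_for(gesture: str) -> dict:
    """Return the command for a gesture, hovering by default if it is unrecognized."""
    return GESTURE_TO_COMMAND.get(gesture, GESTURE_TO_COMMAND["relax"])


if __name__ == "__main__":
    print(command_for("rotate_left"))  # {'yaw': -20, 'pitch': 0, 'roll': 0, 'vertical': 0}
```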
The drone correctly responded to 82% of roughly 1,500 human gestures when remotely piloted to fly through hoops, and the system correctly identified about 94% of cued gestures when the drone was not being piloted.
From MIT News