Columbia Engineering scientists combined an advanced sense of touch with motor-learning algorithms to create a dexterous robot hand that does not rely on vision.
The hand can rotate an irregularly shaped grasped object to arbitrary orientations while maintaining a stable, secure grip, all without visual feedback.
The hand features five fingers outfitted with touch-sensing technology and 15 independently actuated joints, and uses deep reinforcement learning to learn new tasks through practice.
Physics simulators and highly parallel processors allowed the robot to complete roughly a year's worth of practice in only hours of real time, and that simulated training transferred the desired dexterity to the physical hand.
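The core idea is training a control policy with reinforcement learning across many simulated copies of the task at once, so wall-clock hours cover what would be months or years of real practice. The sketch below is a minimal illustration of that pattern, not the Columbia team's actual code: the toy "environment," the tactile observations, the reward, and all names are hypothetical, and a simple REINFORCE-style policy-gradient update stands in for the deep reinforcement learning used in the research.

```python
# Minimal sketch (hypothetical, not the Columbia system): policy-gradient
# training over thousands of simulated environments stepped in parallel.
import numpy as np

N_ENVS = 4096            # many simulated hands run side by side
OBS_DIM, ACT_DIM = 8, 3  # stand-ins for touch readings / joint commands
rng = np.random.default_rng(0)

# Linear Gaussian policy: action = W @ obs + exploration noise
W = np.zeros((ACT_DIM, OBS_DIM))
lr, sigma = 0.05, 0.1

def rollout(W):
    """One short episode in every simulated environment at once."""
    obs = rng.normal(size=(N_ENVS, OBS_DIM))           # fake tactile observations
    noise = rng.normal(scale=sigma, size=(N_ENVS, ACT_DIM))
    actions = obs @ W.T + noise                         # sampled joint commands
    target = obs[:, :ACT_DIM]                           # toy goal for the controller
    reward = -np.sum((actions - target) ** 2, axis=1)   # higher is better
    return obs, noise, reward

for step in range(200):
    obs, noise, reward = rollout(W)
    adv = reward - reward.mean()                        # baseline for variance reduction
    # REINFORCE gradient for a Gaussian policy: E[adv * (noise / sigma^2) * obs^T]
    grad = (adv[:, None, None] * noise[:, :, None] * obs[:, None, :]).mean(axis=0) / sigma**2
    W += lr * grad
    if step % 50 == 0:
        print(f"step {step:3d}  mean reward {reward.mean():.3f}")
```

Because every update averages experience from thousands of simulated rollouts, the policy improves far faster in wall-clock time than a single physical hand could practice; the real system then runs the learned policy on the hardware.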
Columbia Engineering's Matei Ciocarlie said, "Once we also add visual feedback into the mix along with touch, we hope to be able to achieve even more dexterity, and one day start approaching the replication of the human hand."
From Columbia Engineering News