
Communications of the ACM

ACM TechNews

Closing the Loop for Robotic Grasping


[Image] The robot arm adjusts position to grasp different objects. Credit: Queensland University of Technology

Scientists at Queensland University of Technology (QUT) in Australia have developed a faster, more accurate way for robots to grasp objects, particularly in cluttered and changing environments.

The researchers say this breakthrough could make robots more useful in both industrial and domestic settings.

The new approach, based on a Generative Grasping Convolutional Neural Network (GG-CNN), enables a robot to quickly scan its environment and, from a single depth image, assign a grasp-quality score to every pixel it captures.
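To make the pixel-wise idea concrete, below is a minimal sketch, not the authors' published GG-CNN, of a small fully convolutional network that takes a one-channel depth image and outputs a grasp-quality score for every pixel. The choice of PyTorch, the layer sizes, and the name PixelwiseGraspNet are illustrative assumptions; the published GG-CNN additionally predicts grasp angle and gripper width maps.

import torch
import torch.nn as nn

class PixelwiseGraspNet(nn.Module):
    # Toy fully convolutional network: one-channel depth image in,
    # one grasp-quality score per pixel out (illustrative only).
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv2d(16, 16, kernel_size=5, padding=2), nn.ReLU(),
        )
        self.quality_head = nn.Conv2d(16, 1, kernel_size=1)

    def forward(self, depth):
        features = self.encoder(depth)
        # Sigmoid squashes each pixel's score into [0, 1].
        return torch.sigmoid(self.quality_head(features))

# Usage: score a placeholder 300x300 depth frame and pick the best grasp pixel.
net = PixelwiseGraspNet()
depth_image = torch.rand(1, 1, 300, 300)   # stand-in for a camera depth frame
quality_map = net(depth_image)             # shape (1, 1, 300, 300)
best = torch.argmax(quality_map)
row, col = divmod(best.item(), quality_map.shape[-1])
print(f"Best grasp pixel: ({row}, {col})")

Because the whole quality map is recomputed from every new depth frame, the chosen grasp point can be updated as the scene changes, which is what lets this kind of approach "close the loop" in cluttered, dynamic environments.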

Real-world tests demonstrated accuracy rates of up to 88% for dynamic grasping and up to 92% in static experiments.

"We have been able to program robots, in very controlled environments, to pick up very specific items,” said QUT’s Jurgen Leitner. He added, “Robots need to be able to adapt and work in very unstructured environments if we want them to be effective.”

From Queensland University of Technology

Abstracts Copyright © 2018 Information Inc., Bethesda, Maryland, USA


 
