Communications of the ACM

ACM Careers

Robots Using Tools: Researchers Aim to Create 'MacGyver' Robot



Georgia Tech Assistant Professor Mike Stilman poses with Golem Krang, a humanoid robot designed and built in Stilman's laboratory to study whole-body robotic planning and control.

Credit: Josh Meister / Georgia Tech

Robots are increasingly being used in place of humans to explore hazardous and difficult-to-access environments, but they cannot yet interact with their surroundings as well as humans can. If today's most sophisticated robot were trapped in a burning room by a jammed door, it would probably not know how to locate and use objects in the room to climb over debris, pry open the door, and escape the building.

A research team led by Professor Mike Stilman at the Georgia Institute of Technology hopes to change that by giving robots the ability to use objects in their environments to accomplish high-level tasks. The team recently received a three-year, $900,000 grant from the Office of Naval Research to work on this project.

"Our goal is to develop a robot that behaves like MacGyver, the television character from the 1980s who solved complex problems and escaped dangerous situations by using everyday objects and materials he found at hand," says Stilman, an assistant professor in the School of Interactive Computing at Georgia Tech. "We want to understand the basic cognitive processes that allow humans to take advantage of arbitrary objects in their environments as tools. We will achieve this by designing algorithms for robots that make tasks that are impossible for a robot alone possible for a robot with tools."

The research will build on Stilman's previous work on navigation among movable obstacles, which enabled robots to autonomously recognize and move obstacles that blocked their path from point A to point B.

"This project is challenging because there is a critical difference between moving objects out of the way and using objects to make a way," Stilman says. "Researchers in the robot motion planning field have traditionally used computerized vision systems to locate objects in a cluttered environment to plan collision-free paths, but these systems have not provided any information about the objects' functions."

To create a robot capable of using objects in its environment to accomplish a task, Stilman plans to develop an algorithm that will allow a robot to identify an arbitrary object in a room, determine the object's potential function, and turn that object into a simple machine that can be used to complete an action. Actions could include using a chair to reach something high, bracing a ladder against a bookshelf, stacking boxes to climb over something, and building levers or bridges from random debris.

By providing the robot with basic knowledge of rigid body mechanics and simple machines, the robot should be able to autonomously determine the mechanical force properties of an object and construct motion plans for using the object to perform high-level tasks.
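To make the idea concrete, the sketch below shows, in highly simplified form, how knowledge of one simple machine (an ideal class-1 lever) could turn an object's geometry into a feasibility check of the kind described above. The object fields, force values, and function names are illustrative assumptions for this article, not the team's actual algorithm or software.

```python
from dataclasses import dataclass

# Illustrative sketch only: a toy "is this object usable as a lever?" check.
# All names, fields, and thresholds are assumptions made for this example.

@dataclass
class SceneObject:
    name: str
    length_m: float      # longest rigid dimension
    mass_kg: float
    is_rigid: bool

def usable_as_lever(obj: SceneObject, required_force_n: float,
                    applied_force_n: float, fulcrum_offset_m: float) -> bool:
    """Could this rigid object amplify the robot's push enough to pry
    open a jammed door? (Ideal class-1 lever, no friction or losses.)"""
    if not obj.is_rigid or obj.length_m <= fulcrum_offset_m:
        return False
    # Mechanical advantage of a class-1 lever: effort arm / load arm.
    effort_arm = obj.length_m - fulcrum_offset_m
    advantage = effort_arm / fulcrum_offset_m
    return applied_force_n * advantage >= required_force_n

if __name__ == "__main__":
    debris = [
        SceneObject("pipe", length_m=1.2, mass_kg=3.0, is_rigid=True),
        SceneObject("cushion", length_m=0.5, mass_kg=0.4, is_rigid=False),
    ]
    # Suppose the robot can push with ~150 N and the stuck door needs ~500 N.
    candidates = [o.name for o in debris
                  if usable_as_lever(o, required_force_n=500.0,
                                     applied_force_n=150.0,
                                     fulcrum_offset_m=0.2)]
    print("Objects usable as a pry bar:", candidates)
```

In this toy version, only the rigid pipe passes the check; the actual research must of course infer such properties from perception and reason over many kinds of simple machines, not just levers.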

For example, exiting a burning room with a jammed door would require a robot to travel around any fire, use an object in the room to apply enough force to open the stuck door, and find an object that can support its weight as it climbs out.

Such skills could be extremely valuable in the future as robots work side-by-side with military personnel to accomplish challenging missions.

"The Navy prides itself on recruiting, training and deploying our country's most resourceful and intelligent men and women," says Paul Bello, director of the cognitive science program in the Office of Naval Research. "Now that robotic systems are becoming more pervasive as teammates for warfighters in military operations, we must ensure that they are both intelligent and resourceful. Professor Stilman's work on the 'MacGyver-bot' is the first of its kind, and is already beginning to deliver on the promise of mechanical teammates able to creatively perform in high-stakes situations."

To address the complexity of the human-like reasoning required for this type of scenario, Stilman is collaborating with researchers Pat Langley and Dongkyu Choi. Langley is the director of the Institute for the Study of Learning and Expertise, and is recognized as a co-founder of the field of machine learning, where he championed both experimental studies of learning algorithms and their application to real-world problems. Choi is an assistant professor in the Department of Aerospace Engineering at the University of Kansas.

Langley and Choi will expand the cognitive architecture they developed, called ICARUS, which provides an infrastructure for modeling various human capabilities like perception, inference, performance, and learning in robots.

"We believe a hybrid reasoning system that embeds our physics-based algorithms within a cognitive architecture will create a more general, efficient, and structured control system for our robot that will accrue more benefits than if we used one approach alone," Stilman says.

After the researchers develop and optimize the hybrid reasoning system using computer simulations, they plan to test the software using Golem Krang, a humanoid robot designed and built in Stilman's laboratory to study whole-body robotic planning and control.

This research is sponsored by the Department of the Navy, Office of Naval Research, through grant number N00014-12-1-0143.


 
