In a 30,000-sq.-ft. lab in Seattle, complete with an IKEA-designed kitchen, researchers are working with cobots to perform tasks like picking up an unfamiliar object, moving over to a drawer, opening it, and putting the object inside.
Perhaps you've seen this demonstrated in a video before, and you may be thinking, "so what?" The point is that getting a cobot (a robot that works collaboratively alongside a human) to perform such a task is far from simple; it takes a great deal of training and underlying software for the robot to do something so basic, and then to be able to do it again when the environment is no longer a kitchen.
The lab was recently opened by chipmaker NVIDIA, and "we want to not just focus on the smaller-scale projects, but force ourselves to work toward larger-scale, complete robotic systems so we learn what the challenges are and what the bottlenecks are," says Dieter Fox, senior director of robotics research at NVIDIA and a computer science professor at the nearby University of Washington. The goal is to build software that can be used by other research labs focused on various aspects of training a robot, such as manipulating objects, he says.
"The robot has to be able to see a person and anticipate what the person wants to do,'' says Fox. "So if you say 'bring me the salt', the robot opens the right drawer and brings you the right item. That's getting the cobot to where it understands what you want in natural language, so it can help you in specific tasks" like cooking, Fox explains.
At the same time, the cobot must be able to operate safely next to a person without bumping into them, which requires the ability to anticipate the person's motion, he adds.
Training cobots to mimic human behavior will mean they can eventually perform more than just mundane, repetitive tasks. Ensuring human safety, however, is tricky, and remains one of the key challenges. Industry observers say robots are being designed to work safely near humans, but at present they are not intelligent enough to act beyond the ways in which they have been programmed to move.
Researchers are working on taking cobots to the next level, motivated by a number of factors, including the growing sophistication of robot platforms, and the fact that cobots are no longer a novelty.
Market projections support that. While the cobot market saw less than $500 million in annual global revenue in 2017, that figure is expected to reach nearly $13 billion by 2027, according to market research firm ABI Research.
NVIDIA is not alone in working to train cobots to work safely alongside humans. Researchers at Rice University have developed an algorithm to train a cobot to readjust its trajectory and recalculate the path to its goal when it has been interrupted or given new information.
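Rice's actual method is more sophisticated, but the underlying pattern can be sketched in a few lines of Python. Everything below, including the grid world and the plan() helper, is purely illustrative rather than Rice's code: the robot follows a planned path and recomputes it from its current position whenever new information, such as a person stepping into its way, invalidates the remaining plan.

    from collections import deque

    def plan(start, goal, obstacles, size=8):
        """Breadth-first search on a size x size grid; returns a list of cells."""
        frontier, parent = deque([start]), {start: None}
        while frontier:
            cell = frontier.popleft()
            if cell == goal:
                path = []
                while cell is not None:          # walk back through the parents
                    path.append(cell)
                    cell = parent[cell]
                return path[::-1]
            x, y = cell
            for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
                if (0 <= nxt[0] < size and 0 <= nxt[1] < size
                        and nxt not in obstacles and nxt not in parent):
                    parent[nxt] = cell
                    frontier.append(nxt)
        return None                              # no path found

    # Follow the plan, replanning when a change invalidates the remaining path.
    position, goal = (0, 0), (7, 7)
    obstacles = set()
    path = plan(position, goal, obstacles)
    for step in range(50):
        if step == 3:                            # simulated interruption: a person
            obstacles.add(path[1])               # steps into the robot's next cell
        if any(cell in obstacles for cell in path):
            path = plan(position, goal, obstacles)   # recompute from current state
        position, path = path[1], path[1:]       # advance one cell along the plan
        if position == goal:
            break

The sketch assumes a path always exists; a real system would also handle the case where no safe route remains and wait or ask for help.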
Collaborative robotics today falls along a spectrum, says Rian Whitton, strategic technologies research analyst at ABI Research. At the lower end of the scale, Whitton says, robots working in tandem with "human-in-the-loop offerings can have reasonable range and reliability, but struggle to maintain a high rate of performance and speed."
At the higher end, Whitton says, "More advanced systems run continuously … with limited human intervention and at a high rate, comparable with trained people doing similar tasks." While they run faster, they are not as collaborative, he says.
The aim, he says, is to use advanced cognitive systems, such as machine vision, motion control, haptics, or augmented reality to improve safety so that collaborative systems can become more autonomous.
While cobots can be trained in the same way as non-interactive robots, they require additional operating constraints associated with safety, says James Conrad, an IEEE senior member and professor of electrical and computer engineering at the University of North Carolina, Charlotte. "The training needs to include these constraints, like the speed of motion and stopping motion if a human is in the path of motion," he says.
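Those constraints can be sketched as a simple speed-and-separation rule. The Python fragment below is illustrative only, with made-up distance and speed values rather than figures from any real system: the commanded speed ramps down as a person approaches and drops to zero once the person is inside a protective distance.

    # Hypothetical thresholds, not values from any cited system.
    STOP_DISTANCE = 0.5    # meters: halt if a human is closer than this
    SLOW_DISTANCE = 1.5    # meters: begin reducing speed inside this radius
    MAX_SPEED = 0.25       # meters/second: nominal collaborative-mode speed

    def safe_speed(distance_to_nearest_human: float) -> float:
        """Return the allowed speed given the distance to the nearest human."""
        if distance_to_nearest_human <= STOP_DISTANCE:
            return 0.0                           # human in the path: stop
        if distance_to_nearest_human >= SLOW_DISTANCE:
            return MAX_SPEED                     # workspace clear: full speed
        # Ramp speed linearly between the stop and slow thresholds.
        fraction = ((distance_to_nearest_human - STOP_DISTANCE)
                    / (SLOW_DISTANCE - STOP_DISTANCE))
        return MAX_SPEED * fraction

    print(safe_speed(2.0), safe_speed(1.0), safe_speed(0.3))   # 0.25 0.125 0.0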
The challenges of training cobots are the same as for all industrial robots, Conrad notes: training takes a lot of time, and cobots are not very adaptable.
"One of the major limitations in existing machine learning approaches is the gap … between the training phase and actual operation in the field," says Conrad. "This highly limits the adaptability of cobots to properly react to new situations."
Like the work being done at Rice, Conrad says, "A cobot needs to anticipate the movement of humans based on the observations and experiences the humans have."
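In its simplest form, that anticipation amounts to extrapolating a tracked person's recent positions forward in time; real systems rely on far richer learned models. The sketch below, using hypothetical observations, shows only the shape of the idea.

    def predict_position(observations, horizon):
        """observations: list of (t, x, y) samples; horizon: seconds ahead."""
        (t0, x0, y0), (t1, x1, y1) = observations[-2], observations[-1]
        dt = t1 - t0
        vx, vy = (x1 - x0) / dt, (y1 - y0) / dt  # constant-velocity estimate
        return x1 + vx * horizon, y1 + vy * horizon

    track = [(0.0, 0.0, 0.0), (0.5, 0.2, 0.0), (1.0, 0.4, 0.0)]
    print(predict_position(track, horizon=1.0))  # (0.8, 0.0): where to expect the person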
Sarah Boisvert, author of The New Collar Workforce: An Insider's Guide to Making Impactful Changes to Manufacturing and Training, says she doesn't expect cobots will be making complex decisions for about two decades. "True transformation takes a change of mindset, and industries like manufacturing are conservative," says Boisvert, who is also founder of New Mexico-based startup Fab Lab Hub. "It's also expensive to rip out what you've got and put in all new machines."
NVIDIA's Fox thinks cobots will be able to help with cooking tasks in the kitchen "very robustly and naturally" in about a decade. He adds, "But we hope along the way … there will be pieces that will be useful in other settings, like manufacturing."
Esther Shein is a freelance technology and business writer based in the Boston area.