By turning the brain cell activity underlying fly eyesight into mathematical equations, researchers have found an ultra-efficient method for pulling motion patterns from raw visual data.
Though they built the system, the researchers don’t quite understand how it works. But however mysterious the equations may be, they could still be used to program the vision systems of miniaturized battlefield drones, search-and-rescue robots, automobile navigation aids and other applications where computational power is at a premium.
“We can build a system that works perfectly well, inspired by biology, without having a complete understanding of how the components interact. It’s a non-linear system,” said David O’Carroll, a computational neuroscientist who studies insect vision at Australia’s University of Adelaide. “The number of computations involved is quite small. We can get an answer using tens of thousands of times less floating-point computations than in traditional ways.”
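The article doesn’t spell out the researchers’ equations, but the classic mathematical model of fly motion vision is the Hassenstein-Reichardt “elementary motion detector,” which needs only a delay, a multiplication and a subtraction per pair of photoreceptors. The Python sketch below is an illustration of how cheap such a correlator is, not the team’s actual algorithm; the function name, parameters and one-dimensional layout are all assumptions:

```python
import numpy as np

def reichardt_motion(frames, dt_tau=0.1, offset=1):
    """Illustrative Hassenstein-Reichardt correlator (not the
    researchers' published model).

    frames: array of shape (T, N), a 1-D row of photoreceptor
    signals over T time steps (real detectors tile a 2-D array).
    dt_tau: time step divided by the low-pass filter time constant.
    offset: spatial separation (in pixels) of paired receptors.
    """
    T, N = frames.shape
    delayed = np.zeros(N)            # low-pass-filtered (delayed) signals
    output = np.zeros((T, N - offset))
    for t in range(T):
        # A first-order low-pass filter serves as the delay line.
        delayed += dt_tau * (frames[t] - delayed)
        left, right = frames[t, :-offset], frames[t, offset:]
        d_left, d_right = delayed[:-offset], delayed[offset:]
        # Correlate each receptor's delayed signal with its neighbor's
        # direct signal; subtracting the mirror image gives direction.
        output[t] = d_left * right - d_right * left
    return output  # positive ~ rightward motion, negative ~ leftward
```

Each receptor pair costs only a handful of multiply-adds per frame, which is consistent with O’Carroll’s point about the small number of computations involved.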
The best-known of those traditional methods is the Lucas-Kanade algorithm, which estimates motion by comparing, frame by frame, how every pixel in a visual field changes, and from that can recover quantities such as yaw, a craft’s side-to-side rotation. It’s used for steering and guidance in many experimental unmanned vehicles, but its brute-force approach requires lots of processing power, making it impractical in smaller systems.
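For contrast, a dense Lucas-Kanade estimator solves a small least-squares problem at every pixel of every frame. The sketch below is a textbook single-scale version, not drawn from the article (names and parameters are my own); it shows where the floating-point cost piles up:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def lucas_kanade_flow(prev, curr, win=7, eps=1e-6):
    """Dense, single-scale Lucas-Kanade optical flow (illustrative).

    At each pixel, solves the 2x2 normal equations built from
    windowed sums of image-gradient products over a win x win
    neighborhood, yielding a flow vector (u, v) per pixel.
    """
    Ix = np.gradient(prev.astype(float), axis=1)   # spatial gradients
    Iy = np.gradient(prev.astype(float), axis=0)
    It = curr.astype(float) - prev.astype(float)   # temporal gradient
    # Windowed averages of gradient products: the costly step that
    # touches every pixel of every frame.
    Sxx = uniform_filter(Ix * Ix, win)
    Sxy = uniform_filter(Ix * Iy, win)
    Syy = uniform_filter(Iy * Iy, win)
    Sxt = uniform_filter(Ix * It, win)
    Syt = uniform_filter(Iy * It, win)
    det = Sxx * Syy - Sxy ** 2 + eps   # guard against singular systems
    u = (-Syy * Sxt + Sxy * Syt) / det
    v = (Sxy * Sxt - Sxx * Syt) / det
    return u, v  # horizontal and vertical flow at every pixel
```

Every frame requires gradient images, five windowed sums and a 2x2 solve at each pixel, the kind of per-pixel workload the insect-inspired correlator avoids.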
From Wired.com