Augmented reality technologies have been in development in university labs and small companies for almost 50 years, and emerging consumer products such as Google Glass are drawing attention to wearable electronics. However, the wearable revolution started with computer scientist Ivan Sutherland, who in 1965 first described a head-mounted display that would let the user see a virtual world superimposed on the real one.
Sutherland's work was advanced by the University of Toronto's Steve Mann and Columbia University's Steven Feiner, and now the technology is finally catching up with their concepts. "You need to have technology that is sufficiently comfortable and usable, and a set of potential adopters who would be comfortable wearing the technology," Feiner says.
The components augmented reality requires, such as cameras, computers, sensors, and connectivity, are shrinking in size and falling in price while improving in speed, accuracy, and resolution, to the point where wearable computers will be seen as a cool accessory mediating people's engagement with both analog and digital environments. Feiner says that when such technology is "very small and comfortable, you don't feel weird, but cool."
From CNet