Although Apple has pioneered the mainstream multitouch user interface (UI) through innovations such as the iPhone and iPad, Microsoft could provide the next major UI breakthrough by combining voice, touch, and gesture-based commands, writes Mike Elgan.
Microsoft's Kinect for Windows project, slated for launch early next year, will differ from the Kinect motion-detection gesture controller for the Xbox 360 gaming platform by controlling a PC and recognizing gestures made closer to the screen.
Elgan notes that Microsoft also seeks to engage third-party developers by rolling out a Kinect for Windows PC development kit and by funding and mentoring Kinect-related initiatives. These programs suggest that a Kinect-like gesture UI for Windows PCs, along with applications that exploit it, should become available in 2012.
Elgan is particularly intrigued by the possibility of body-language-interpreting software that can deduce what the user is doing. "Best of all, we can all look forward to a computing environment that's controllable by touch, voice, and in-air gestures," he writes. "This will eventually appear on all of the major platforms. But Microsoft just might get there first."
From Computerworld
Abstracts Copyright © 2011 Information Inc., Bethesda, Maryland, USA