
Communications of the ACM

ACM News

UIST Conference Presents Prototype Interface Projects


FreeD

The FreeD is a handheld, digitally controlled milling device that is guided and monitored by a computer but also preserves the craftsperson's freedom to sculpt and carve.

Credit: MIT Media Lab

User interfaces have moved beyond mice and keyboards to touch screens, voice controls, and visual inputs like the Microsoft Kinect. And researchers who attended the recent 25th ACM Symposium on User Interface Software and Technology (UIST) in Cambridge, MA, are working on even newer ideas, which they presented during Demo Night.

These were rough prototypes, lab demonstrations not ready for commercialization, but intriguing nonetheless. David Kim, of Microsoft Research, had a small infrared laser and IR detector strapped to his wrist, a device that allows gestural input without requiring the user to wear a data glove. The device tracks the position of joints and fingertips and reproduces an image of the user's hand on a screen. Simply by gesturing, Kim could manipulate objects on the screen, pinching to shrink them or pulling to bring them closer. The Digits device, which won second place in the Best Demo awards, could be used in video games, or could let someone manipulate the controls of a smartphone without taking it out of his or her pocket.
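Digits runs on custom hardware and a kinematic hand model, but the core step of turning tracked fingertip positions into a gesture can be sketched in a few lines of Python. Everything below is a hypothetical illustration, not the Microsoft Research code: the 4 cm pinch threshold and the fingertip coordinates are made up.

    import math

    PINCH_THRESHOLD_CM = 4.0  # hypothetical: fingertips this close count as a pinch

    def distance(a, b):
        """Euclidean distance between two 3-D fingertip positions, in cm."""
        return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

    def detect_pinch(thumb_tip, index_tip):
        """Return True when the tracked thumb and index fingertips are close enough to pinch."""
        return distance(thumb_tip, index_tip) < PINCH_THRESHOLD_CM

    # Made-up positions, standing in for output of the wrist-worn IR tracker.
    thumb = (0.0, 0.0, 30.0)
    index = (1.5, 2.0, 31.0)
    if detect_pinch(thumb, index):
        print("pinch: shrink the on-screen object")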

A few tables down, Amit Zoran of MIT's Media Lab was showing off the FreeD, a milling tool that allows even those of us with poor hand-eye coordination to carve objects, perhaps as a supplement to computer-aided design. The pattern for the object is given to the computer, and a magnetic motion tracker determines where the handheld tool is in space. Once you move within a set distance of the material you're carving, the tool starts up. But as you near a boundary defined by the pattern, the tool slows, and then stops before you can drill too far, though an override button will let you keep going.
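The demo didn't spell out FreeD's actual control law, but the behavior described, spin up near the material, slow while approaching the pattern boundary, stop at the boundary unless overridden, maps naturally onto a simple distance-based controller. A minimal sketch, assuming the tracker and the CAD pattern supply two distances; the thresholds are hypothetical:

    START_DISTANCE_MM = 20.0  # hypothetical: spindle spins up within this range of the material
    SLOW_ZONE_MM = 5.0        # hypothetical: speed tapers inside this margin of the boundary

    def spindle_speed(dist_to_material_mm, dist_to_boundary_mm, override=False):
        """Map tracker distances to a spindle speed in [0, 1].

        dist_to_boundary_mm compares the tool tip against the CAD pattern;
        zero or negative means the tip has reached the boundary.
        """
        if override:
            return 1.0  # the override button keeps the tool running
        if dist_to_material_mm > START_DISTANCE_MM:
            return 0.0  # too far from the workpiece: stay off
        if dist_to_boundary_mm <= 0.0:
            return 0.0  # at the pattern boundary: stop before drilling too far
        if dist_to_boundary_mm < SLOW_ZONE_MM:
            return dist_to_boundary_mm / SLOW_ZONE_MM  # taper down near the boundary
        return 1.0  # full speed in the safe zone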

One of the ironies of touch screen technology is that it doesn't actually make good use of the sense of touch; you have to look to see which "button" displayed on the screen you're pressing. A project from Microsoft Research Asia aims to overcome that, using piezo actuators vibrating at an ultrasonic frequency to give the feeling of friction to part of a touch screen. I could slide my finger along the screen and actually feel the image my finger passed over. A separate set of actuators made the screen vibrate slightly and give off a "plink" sound when I pressed the image of a button, so it felt as if I were actually depressing a physical button.
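One plausible mapping for the friction effect (an assumption on my part, not a detail from the demo) relies on the squeeze-film effect: stronger ultrasonic vibration makes glass feel more slippery, so image regions that should feel rough get less vibration. A toy version:

    def friction_amplitude(finger_xy, roughness_map):
        """Convert the image roughness under the finger to a piezo drive level in [0, 1].

        roughness_map is a hypothetical 2-D grid of per-pixel roughness in [0, 1].
        Under the squeeze-film assumption, more vibration means less friction,
        so rough regions get a lower amplitude.
        """
        x, y = finger_xy
        return 1.0 - roughness_map[y][x]

    # Made-up 2x2 roughness grid; a real system would derive this from the image.
    roughness_map = [[0.1, 0.8],
                     [0.3, 0.6]]
    print(round(friction_amplitude((1, 0), roughness_map), 2))  # 0.2: rough pixel, weak vibration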

Another attempt at tactile feedback came from Keio University and the University of Tokyo. The researchers use sound to replicate the feel of a surface. Run a stylus over a rough surface, and an attached microphone records the sound produced and associates it with a photo of the object in question. Then you touch the image of that object, and a playback of the sound vibration gives you a sense of the texture. It didn't exactly reproduce the feeling of a fleece pullover I tried it on, but it was a decent approximation. The idea is that online merchants could let shoppers not just see but also feel an item they were shopping for over the Internet.
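A minimal sketch of that photo-to-recording association, with made-up file names and a stand-in for the actuator API:

    # Hypothetical texture library: each product photo maps to the contact sound
    # recorded while dragging a stylus over the real surface.
    texture_library = {
        "fleece_pullover.jpg": "fleece_contact.wav",
        "denim_jacket.jpg": "denim_contact.wav",
    }

    def on_touch(photo_name, play_vibration):
        """Replay the recorded surface sound through a vibration actuator when
        the shopper touches the matching product photo."""
        recording = texture_library.get(photo_name)
        if recording is not None:
            play_vibration(recording)

    # Stand-in for the actuator API, just to show the flow.
    on_touch("fleece_pullover.jpg", lambda wav: print("vibrating with", wav))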

Whether any of these ideas will see the light of day is unknown, but it’s clear that researchers are generating a lot of new ideas about interfaces.  

 

Neil Savage is a science and technology writer based in Lowell, MA.  


 
