Over the last half-century, enormous gains in processing power, increasingly sophisticated software, and advanced algorithms have unleashed computing capabilities that would once have seemed unfathomable. Suffice it to say that today's mobile devices, including smartphones and tablets, are transforming the way we view the world and sense the environment around us.
Today, smartphones can see, hear, and feel. They possess built-in cameras, microphones, GPS chips, accelerometers, gyroscopes, and other sensors that can detect and react to a dizzying array of environmental factors. "All these chips and sensors make it possible to navigate our world in a profoundly different way," says Michael Morgan, senior analyst for mobile devices, applications and content at ABI Research.
But researchers are working to make smartphones even smarter. They're attempting to add smell and taste to the sensory mix, and to make devices more context-aware. This could usher in an array of new capabilities and lead to devices that make decisions on their own. "Right now we have products available that can detect what a user wants to do, but only given a command," says Karl Volkman, chief technology officer at IT services firm SRV Network Inc. Soon, "we're going to see products that can sense what the user wants without a direct verbal or physical cue."
To be sure, the convergence of different digital technologies unlocks possibilities that would have been unimaginable only a few years ago. "When you overlay a number of functions and combine everything with the right software, it's possible to create sophisticated capabilities that transcend any particular function," Morgan says. "Cameras, microphones, and sensors can work together to dramatically increase the intelligence of the device."
The field is advancing rapidly. San Francisco-based Adamant Technologies is developing a small processor that digitizes smell and taste. The system uses about 2,000 sensors to detect aromas and flavors, says Adamant CEO Samuel Khamis, compared with roughly 400 olfactory receptors in the human nose. The system could detect when a person has bad breath, or is intoxicated and over the legal limit to drive. A digital nose in a smartphone could also one day detect underlying medical conditions or rancid food, Morgan says.
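The article does not describe how such a sensor array would map raw readings to a recognized odor. As a rough illustration only, the sketch below matches a vector of hypothetical chemical-sensor readings against stored odor signatures using cosine similarity; the signature data, channel count, and labels are invented for this example and are not Adamant's actual design.

```python
import math

# Hypothetical odor "signatures": averaged sensor-array readings recorded
# for known smells. All values are invented for illustration.
ODOR_SIGNATURES = {
    "fresh breath":      [0.10, 0.05, 0.02, 0.01],
    "halitosis":         [0.70, 0.40, 0.10, 0.05],
    "ethanol (alcohol)": [0.05, 0.10, 0.85, 0.60],
}

def cosine_similarity(a, b):
    """Similarity between two sensor-reading vectors (1.0 = same direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def classify_odor(reading):
    """Return the stored signature most similar to the current reading."""
    return max(ODOR_SIGNATURES,
               key=lambda name: cosine_similarity(reading, ODOR_SIGNATURES[name]))

# Example: a reading dominated by the channels associated with ethanol.
print(classify_odor([0.04, 0.12, 0.80, 0.55]))  # -> "ethanol (alcohol)"
```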
In addition, context-aware sensing could change the way we use devices. For example, within a few years, smartphones will likely sense when they are in a purse or pocket and adjust settings, including the ring tone level, automatically. This could help lengthen battery life and allow devices to behave more intelligently. A phone tucked into a purse while its owner is running, for instance, might send a call directly to voicemail, since that combination suggests the person is hurrying to an appointment or event and is unavailable.
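The article does not spell out how such context detection would work. One plausible approach, sketched below with hypothetical sensor values and thresholds, is simple rule-based fusion of the ambient-light sensor (dark inside a pocket or purse) and accelerometer activity (motion consistent with running).

```python
from dataclasses import dataclass

@dataclass
class SensorSnapshot:
    ambient_light_lux: float   # light sensor; near zero inside a pocket or purse
    accel_magnitude_g: float   # overall acceleration; ~1.0 g at rest, higher while running

def choose_ring_behavior(s: SensorSnapshot) -> str:
    """Rule-based sketch: pick a ring behavior from simple context cues.

    Thresholds are invented for illustration, not taken from any real handset.
    """
    enclosed = s.ambient_light_lux < 5.0      # phone is probably in a pocket or purse
    moving_fast = s.accel_magnitude_g > 1.8   # user is probably running

    if enclosed and moving_fast:
        return "send to voicemail"        # user likely hurrying and unavailable
    if enclosed:
        return "ring at maximum volume"   # muffled by fabric, so ring louder
    return "ring at normal volume"

print(choose_ring_behavior(SensorSnapshot(ambient_light_lux=2.0, accel_magnitude_g=2.3)))
# -> "send to voicemail"
```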
Multi-sensor devices could also use vibration and movement to provide haptic feedback that allows shoppers to "feel" fabrics and textures they can currently only view on a screen. They could also deliver augmented reality (AR) that lets a user hold up a phone, in front of the Eiffel Tower, for example, and instantly view information about it along with the surrounding environment. A similar app might translate a foreign language by converting speech to text in real time. Already, some rudimentary AR apps exist, and the U.S. Defense Advanced Research Projects Agency (DARPA) is developing language translation technology.
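As a rough sketch of the location half of such an AR feature, the code below finds the nearest known landmark to a phone's GPS fix using the haversine great-circle distance. The landmark list and coordinates are approximate and purely illustrative; a real app would layer compass heading, camera imagery, and an information overlay on top of this lookup.

```python
import math

# Approximate coordinates of a few Paris landmarks (illustrative only).
LANDMARKS = {
    "Eiffel Tower":    (48.8584, 2.2945),
    "Arc de Triomphe": (48.8738, 2.2950),
    "Louvre":          (48.8606, 2.3376),
}

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearest_landmark(lat, lon):
    """Return the closest known landmark and its distance from the given GPS fix."""
    name, (llat, llon) = min(
        LANDMARKS.items(),
        key=lambda kv: haversine_km(lat, lon, kv[1][0], kv[1][1]))
    return name, haversine_km(lat, lon, llat, llon)

# Example: a GPS fix a few hundred meters from the Eiffel Tower.
print(nearest_landmark(48.8570, 2.2970))
```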
In addition, personal wireless networks such as Bluetooth, along with more advanced ports such as Apple's Thunderbolt technology, will provide ways to integrate phones with an even wider array of sensing devices built into clothing, shoes, and more. These might include heart rate monitors, blood sugar monitors, and exertion monitors. In the not-too-distant future, these devices will likely incorporate multiple sensing capabilities, including heart rate, temperature, and perspiration sensors. Such combined readings could help determine when it's time for a runner or cyclist to drink water or consume an energy bar, and when to take a rest.
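The piece does not describe how those readings would be combined. The sketch below shows one simple way rule-based fusion of heart rate, skin temperature, and perspiration might trigger hydration or rest prompts; all thresholds are invented for illustration and are not medical or sports-science guidance.

```python
from dataclasses import dataclass

@dataclass
class WorkoutReading:
    heart_rate_bpm: int
    skin_temp_c: float
    sweat_rate: float   # 0.0-1.0 scale from a hypothetical perspiration sensor

def coaching_advice(r: WorkoutReading, max_hr_bpm: int = 190) -> list[str]:
    """Return coaching prompts derived from a single sensor reading.

    Thresholds are illustrative guesses only.
    """
    advice = []
    if r.sweat_rate > 0.7 or r.skin_temp_c > 38.0:
        advice.append("drink water")
    if r.heart_rate_bpm > 0.9 * max_hr_bpm:
        advice.append("slow down or rest")
    if r.heart_rate_bpm > 0.75 * max_hr_bpm and r.sweat_rate > 0.5:
        advice.append("consider an energy snack")
    return advice or ["keep going"]

print(coaching_advice(WorkoutReading(heart_rate_bpm=176, skin_temp_c=38.4, sweat_rate=0.8)))
# -> ['drink water', 'slow down or rest', 'consider an energy snack']
```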
Make no mistake, smartphones are evolving rapidly, and they will transform the way we interact with our environment in the years ahead. Concludes Morgan: "As developers and engineers become more adept at taking basic sensory data and combining it in new and different ways, smartphones, in some cases connected to external sensing devices, will transform our lives."