Google has launched an app that uses artificial intelligence (AI) to help visually impaired users better perceive their surroundings by aiming their phone at objects and receiving verbal feedback.
The Lookout app functions much like Google Lens, taking in what is captured by the device's rear camera and supplying feedback based on it.
Said Google's Patrick Clary, "Lookout detects items in the scene and takes a best guess at what they are, reporting this to you."
Google said Lookout can help users in situations such as learning about a new environment for the first time, reading text and documents, and completing daily routines like cooking, cleaning, and shopping.
The company recommended wearing the phone on a lanyard around the neck or in a front shirt pocket, although it acknowledged Lookout "will not always be 100% perfect."
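Google has not published Lookout's internals, but the flow Clary describes (a camera frame goes in, best-guess labels are spoken aloud) maps onto a familiar detect-and-announce loop. The sketch below is a rough, hypothetical illustration of that pattern in Python, not Google's code: the detect_objects() model call is a placeholder, the confidence threshold is invented, and it assumes OpenCV and pyttsx3 are available for camera capture and offline speech.

    import cv2          # camera capture (assumed available)
    import pyttsx3      # offline text-to-speech (assumed available)

    CONFIDENCE_THRESHOLD = 0.6  # illustrative cutoff: only announce confident guesses

    def detect_objects(frame):
        """Hypothetical stand-in for an on-device object-detection model.

        Lookout's actual model is not public; this placeholder just returns
        (label, confidence) pairs a detector might produce."""
        return []  # e.g. [("chair", 0.82), ("dog", 0.40)]

    def announce(labels, engine):
        """Speak the detected labels aloud, mimicking verbal feedback."""
        if labels:
            engine.say("I see " + ", ".join(labels))
            engine.runAndWait()

    def main():
        engine = pyttsx3.init()
        camera = cv2.VideoCapture(0)   # a phone would use its rear camera; a webcam here
        try:
            while True:
                ok, frame = camera.read()
                if not ok:
                    break
                detections = detect_objects(frame)
                confident = [label for label, score in detections
                             if score >= CONFIDENCE_THRESHOLD]
                announce(confident, engine)
        finally:
            camera.release()

    if __name__ == "__main__":
        main()

In practice the interesting work sits inside the detection model and in deciding what is worth announcing; the loop itself is simple, which is consistent with Clary's description of Lookout "taking a best guess" and reporting it.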
From ZDNet
Abstracts Copyright © 2019 SmithBucklin, Washington, DC, USA