
Communications of the ACM

ACM TechNews

How Your Device Knows Your Life Through Images


New research in neural networks may let computers identify our daily activities more accurately than current apps that track signals such as GPS location and heart rate.

A new computer model has achieved about 83 percent accuracy in identifying the activities it sees in real-life images, and with just a bit of training it could do this for any user it encounters.

Credit: Tim Peacock

Georgia Institute of Technology (Georgia Tech) researchers have designed an artificial neural network to identify scenes in photographs taken from the point of view of people using wearable cameras or mobile phones.

They have trained the network on a set of about 40,000 images taken over a six-month period by a single individual who manually associated each image with a basic activity, such as driving, watching TV, family time, and hygiene.
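The article does not publish the researchers' actual training code, but the core idea of supervised training on labeled images can be illustrated with a minimal sketch. The toy two-dimensional "features" below stand in for the embeddings a real neural network would extract from each photo, and the nearest-centroid rule stands in for the learned classifier; the activity labels are the ones named in the article.

```python
from statistics import mean

def train_centroids(samples):
    """Average the feature vectors seen for each activity label."""
    by_label = {}
    for features, label in samples:
        by_label.setdefault(label, []).append(features)
    return {label: tuple(mean(dim) for dim in zip(*vecs))
            for label, vecs in by_label.items()}

def classify(centroids, features):
    """Assign the label whose centroid is closest (squared Euclidean distance)."""
    def dist(centroid):
        return sum((a - b) ** 2 for a, b in zip(features, centroid))
    return min(centroids, key=lambda label: dist(centroids[label]))

# Hypothetical "image features" standing in for learned CNN embeddings.
training = [((0.9, 0.8), "driving"), ((0.8, 0.9), "driving"),
            ((0.1, 0.2), "watching TV"), ((0.2, 0.1), "watching TV")]
centroids = train_centroids(training)
```

A real system would learn far richer features from tens of thousands of images, but the workflow is the same: label examples, fit a model, then classify new photos.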

A separate learning algorithm enables the network to learn common associations between activities and make predictions about the user's upcoming schedule. "It can leverage deep learning, and the basic contextual information on daily activities," says Georgia Tech graduate student Steven Hickson.
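The article does not specify how the second algorithm models associations between activities; one simple way to capture "what usually follows what" is a first-order transition model over a chronological activity log, sketched below. The sample log and activity names are illustrative, not the researchers' data.

```python
from collections import Counter, defaultdict

def build_transition_model(activity_log):
    """Count how often each activity is followed by each other activity."""
    transitions = defaultdict(Counter)
    for current, nxt in zip(activity_log, activity_log[1:]):
        transitions[current][nxt] += 1
    return transitions

def predict_next(transitions, current):
    """Return the most frequently observed follow-up activity, if any."""
    if current not in transitions:
        return None
    return transitions[current].most_common(1)[0][0]

# Hypothetical one-day log of recognized activities.
log = ["hygiene", "driving", "work", "driving", "family time",
       "watching TV", "hygiene", "driving", "work"]
model = build_transition_model(log)
```

Given enough labeled days, such a model can guess the user's next likely activity, which is the kind of schedule prediction the researchers describe.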

The team reports the computer model achieved about 83 percent accuracy in identifying activities.

The researchers say the technology has the potential to track daily activities more accurately than current apps and offer more insightful services. For example, an app could use the technology to monitor eating or exercise habits and suggest possible adjustments. The technology also can learn schedules and make intelligent suggestions on the fly.

From Technology Review

 

Abstracts Copyright © 2015 Information Inc., Bethesda, Maryland, USA


 
