DeepMind has created a YouTube dataset of 300,000 video clips and 400 human action classes, designed to train its deep-learning algorithms to better identify human activities and behaviors.
With the assistance of online workers via Amazon's Mechanical Turk service, DeepMind has correctly identified and tagged the actions in the YouTube clips in its Kinetics dataset.
Although algorithms trained on the dataset were about 80% accurate in classifying actions such as "playing tennis," "crawling baby," "cutting watermelon," and "bowling," their accuracy fell to about 20% or less for actions performed by the animated character Homer Simpson.
Meanwhile, a preliminary study found the dataset appears to be fairly gender-balanced; in 340 of the 400 action classes, either neither gender dominated or gender could not be determined.
DeepMind wants outside researchers to suggest new human action classes for the dataset, with the goal of improving the identification accuracy of artificial intelligence systems trained on it.
From IEEE Spectrum
Abstracts Copyright © 2017 Information Inc., Bethesda, Maryland, USA