Scientists at the University of Edinburgh in the U.K. and Adobe Research used deep learning neural networks to help digital characters in video games move more realistically.
The team trained a neural network on a database of motions recorded and digitized from a live performer on a soundstage.
The network can adapt what it learned from the database to most scenarios or settings, so that characters move in natural-looking ways.
The network fills the gaps between a digital character's various poses and motions, intelligently and seamlessly stitching these elements together into a whole.
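To illustrate the general idea of learning to fill gaps between poses, the sketch below trains a small network to predict the next pose from the current pose and a target pose. This is a minimal illustration only, not the researchers' model: the pose dimension, network architecture, and placeholder training data are all assumptions.

```python
# Minimal sketch (not the authors' model): an MLP that predicts the next pose
# from the current pose plus a target pose, trained on recorded motion frames.
# Pose size, network width, and training data here are illustrative assumptions.
import torch
import torch.nn as nn

POSE_DIM = 66  # assumed skeleton layout, e.g., 22 joints x 3 rotation values


class PoseInbetweener(nn.Module):
    """Predicts the next pose given the current pose and a target pose."""

    def __init__(self, pose_dim: int = POSE_DIM, hidden: int = 256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(pose_dim * 2, hidden),
            nn.ReLU(),
            nn.Linear(hidden, hidden),
            nn.ReLU(),
            nn.Linear(hidden, pose_dim),
        )

    def forward(self, current: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
        return self.net(torch.cat([current, target], dim=-1))


# Toy training loop on random stand-in data (a real setup would use the
# digitized motion-capture frames described above).
model = PoseInbetweener()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
frames = torch.randn(1000, POSE_DIM)  # placeholder for captured pose frames

for step in range(100):
    idx = torch.randint(0, len(frames) - 10, (32,))
    current, nxt, target = frames[idx], frames[idx + 1], frames[idx + 10]
    loss = nn.functional.mse_loss(model(current, target), nxt)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# At runtime, the model would be rolled out frame by frame to "fill the gap"
# between the character's current pose and a desired target pose.
```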
From Gizmodo