
Communications of the ACM

ACM TechNews

How 'Bullet Time' Will Revolutionize Exascale Computing



Researchers at Japan's Kobe University say the cinematic special effects technique 'bullet time' could revolutionize how exascale computer simulations are accessed.

Credit: MIT Technology Review

Kobe University researchers have developed a method for compressing exascale simulation output data without losing its essential features. The approach borrows from "bullet time," a Hollywood filming technique that slows down ordinary events while the camera angle changes as if it were flying around the action at normal speed.

The technique involves plotting the trajectory of the camera in advance and then placing many high-speed cameras along the route. The Kobe researchers want to use a similar technique to access exascale computer simulations by surrounding the simulated action with millions of virtual cameras that all record the action as it occurs.
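As a rough illustration of the idea only (not the Kobe team's actual software), the sketch below surrounds a simulated volume with hypothetical virtual cameras spread over a sphere and renders a 2-D snapshot from each viewpoint at every time step. All names here (`render_view`, `simulation_step`, the camera counts) are assumptions for illustration.

```python
# Minimal sketch of in-situ "bullet time" capture around a simulation volume.
# Names and numbers are illustrative assumptions, not the researchers' interfaces.
import math

N_STEPS = 1_000          # simulation time steps to record
N_CAMERAS_DEMO = 8       # keep the demo tiny; scale to millions in practice

def camera_positions(n, radius=1.0):
    """Spread n viewpoints roughly evenly over a sphere (Fibonacci lattice)."""
    golden = math.pi * (3.0 - math.sqrt(5.0))
    for i in range(n):
        y = 1.0 - 2.0 * (i + 0.5) / n            # ranges from ~1 down to ~-1
        r = math.sqrt(1.0 - y * y)
        theta = golden * i
        yield (radius * r * math.cos(theta), radius * y, radius * r * math.sin(theta))

def render_view(state, position):
    """Placeholder: project the 3-D simulation state to a compressed 2-D frame
    as seen from `position`. A real in-situ renderer would do this on the
    compute nodes while the data is still in memory."""
    return b""

def simulation_step(state):
    """Placeholder for one step of the large-scale simulation."""
    return state

state = {}
positions = list(camera_positions(N_CAMERAS_DEMO))
movies = {cam_id: [] for cam_id in range(N_CAMERAS_DEMO)}
for _ in range(N_STEPS):
    state = simulation_step(state)
    for cam_id, pos in zip(movies, positions):
        movies[cam_id].append(render_view(state, pos))
```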

The compression occurs as each camera records a two-dimensional image of a three-dimensional scene. Using this technique, the footage from a single camera can be compressed into a file about 10 megabytes in size, so even if there are 1 million cameras recording the action, the total amount of data they produce is only about 10 terabytes, according to the researchers.
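For a sense of scale, the back-of-the-envelope arithmetic behind that figure is straightforward; the snippet below simply multiplies out the numbers quoted in the article.

```python
# Check of the data volume quoted above: 1 million cameras x ~10 MB per movie.
MB = 10**6
TB = 10**12

per_camera_movie = 10 * MB   # ~10 megabytes of compressed 2-D footage per camera
n_cameras = 1_000_000        # one million virtual cameras

total_bytes = per_camera_movie * n_cameras
print(total_bytes / TB, "terabytes")   # -> 10.0 terabytes
```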

"Our movie data is an order of magnitude smaller," the researchers say. "This gap will increase much more in larger-scale simulations."

From Technology Review

Abstracts Copyright © 2013 Information Inc., Bethesda, Maryland, USA


 
