
Communications of the ACM

ACM TechNews

3D Graphics & Reality Fuse on the Fly



A screenshot of the Parallel Tracking and Mapping (PTAM) software, which can add virtual objects to real-world surfaces.

Credit: University of Oxford

Oxford University researchers have developed Parallel Tracking and Mapping (PTAM), a camera-tracking system for fusing real and three-dimensional (3D) computer-generated visuals. PTAM lets users insert virtual objects or characters into a live video stream so that they appear anchored to real-world surfaces.

"The blending of real and virtual worlds is common enough in films and television, but is usually achieved by extensive processing of the recorded images or by filming in studios with known objects at fixed locations," says Oxford professor David Murray. "The PTAM software allows developers to augment a camera's video stream in real time and in everyday locations."

PTAM builds a map of thousands of features from objects and scenes, tracks them accurately at a standard frame rate, and calculates the camera's viewpoint and angle. The technology could also improve global positioning systems and digital compasses, and provide support for satellite, 3G, and Wi-Fi signals.
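The core idea behind PTAM's name is to split the work into two concurrent tasks: a tracking task that estimates the camera pose against the current map on every frame, and a mapping task that expands and refines the map from occasional "keyframes" in the background. The sketch below illustrates only that thread structure, with placeholder values standing in for real pose estimation and triangulation; the class and function names (SharedMap, mapping_thread, track_frames) are hypothetical, not part of the actual PTAM software.

```python
import threading
import queue

class SharedMap:
    """Map of landmark features, shared between the two threads."""
    def __init__(self):
        self._lock = threading.Lock()
        self._points = []  # 3D feature points (placeholders here)

    def add_points(self, pts):
        with self._lock:
            self._points.extend(pts)

    def snapshot(self):
        with self._lock:
            return list(self._points)

def mapping_thread(keyframes, shared_map, done):
    """Background mapping: consume keyframes and grow the map.
    (Real PTAM triangulates new features and runs bundle adjustment;
    this just adds placeholder points per keyframe.)"""
    while not done.is_set() or not keyframes.empty():
        try:
            kf = keyframes.get(timeout=0.1)
        except queue.Empty:
            continue
        shared_map.add_points([(kf, i) for i in range(3)])
        keyframes.task_done()

def track_frames(frames, shared_map, keyframes):
    """Per-frame tracking: 'estimate' a pose against the current map,
    and promote every fifth frame to a keyframe for the mapper."""
    poses = []
    for t, frame in enumerate(frames):
        visible = shared_map.snapshot()  # features available this frame
        poses.append({"frame": frame, "map_size": len(visible)})
        if t % 5 == 0:                   # crude keyframe heuristic
            keyframes.put(frame)
    return poses

if __name__ == "__main__":
    shared_map = SharedMap()
    keyframes = queue.Queue()
    done = threading.Event()
    mapper = threading.Thread(target=mapping_thread,
                              args=(keyframes, shared_map, done))
    mapper.start()
    poses = track_frames(range(20), shared_map, keyframes)
    keyframes.join()   # wait for the mapper to process all keyframes
    done.set()
    mapper.join()
    print(len(poses), len(shared_map.snapshot()))
```

Because tracking never blocks on map refinement, the camera pose can be updated at full frame rate while the slower mapping work proceeds in parallel, which is what lets the system run live in everyday locations.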

View a video of the Parallel Tracking and Mapping software working on an iPhone.

From University of Oxford

Abstracts Copyright © 2010 Information Inc., Bethesda, Maryland, USA


 
