
Communications of the ACM

ACM News

VR Trauma Surgery 101: Going for the Grisly


Amitabh Varshney (left) and Sarah Murthi at work in the University of Maryland's Augmentarium.

Researchers at the University of Maryland are working to simulate the experience of performing surgery in a shock trauma surgery unit for medical students.

Credit: Joe Dysart

Researchers at the University of Maryland (UM) are working on a new virtual reality (VR) simulation that will enable medical students to experience the controlled chaos of trauma surgery.

The simulation is akin to the VR attractions at many theme parks, in which park-goers don three-dimensional (3D) glasses and enter a giant, circle-shaped room outfitted with a 360-degree video screen. However, instead of experiencing a Disney-esque flight of fancy, users of the UM experience get a grim-and-gritty immersion into the world of bloody scalpels and screaming patients.

“Medical students and young trainees have little or no prior exposure to the chaos of trauma medicine,” said Dr. Sarah Murthi, a trauma surgeon at the University of Maryland Medical Center, and a clinical associate professor at the university’s School of Medicine.  “To immerse them in various virtual-reality scenarios in advance could lessen the initial shock.”

Murthi is co-developing the simulation with Amitabh Varshney, director of the University of Maryland Institute for Advanced Computer Studies, who has decades of experience in VR research.

Varshney created the VR simulator by first shooting video of an actual surgery, using numerous cameras positioned at different angles in a trauma center. “By capturing trauma events using immersive camera arrays, they can be scaled up so that any number of people, from any location, can view this environment as if they were there in real time,” Varshney says.

Next, Varshney developed programming that weaves those videos together to create a 3D interactive surgery experience.
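The article does not detail Varshney's actual stitching pipeline, but the core idea behind fusing overlapping camera views can be illustrated with a toy example: cross-fading ("feathering") the region two adjacent frames share, so the seam between cameras is invisible. The function below is a hypothetical sketch operating on 1-D rows of pixel intensities; real systems also estimate camera geometry and warp each frame before blending.

```python
# Illustrative sketch only -- not the researchers' actual method.
# Feather-blends two overlapping 1-D scanlines from adjacent cameras.

def feather_stitch(left, right, overlap):
    """Stitch two rows of pixel intensities that share `overlap` samples."""
    stitched = list(left[:-overlap])              # region seen only by the left camera
    for i in range(overlap):                      # cross-fade the shared region
        w = (i + 1) / (overlap + 1)               # weight ramps toward the right camera
        stitched.append((1 - w) * left[len(left) - overlap + i] + w * right[i])
    stitched.extend(right[overlap:])              # region seen only by the right camera
    return stitched

row_a = [10, 10, 10, 20, 30]   # last 2 samples overlap with row_b
row_b = [22, 28, 40, 40, 40]   # first 2 samples overlap with row_a
print(feather_stitch(row_a, row_b, overlap=2))
```

In a full 360-degree rig this blending happens per pixel across dozens of warped camera frames, which is why the capture arrays described below require so many synchronized cameras.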

Users immerse themselves in the simulation by donning NVIDIA 3D Vision Pro glasses and entering an ‘Augmentarium’—the same kind of circular, 360-degree viewing room you see at VR theme-park attractions. By any measure, UM's Augmentarium is a serious rig; it features 15 projectors driven by a Mercury GPU 408 4U server, which are fused into a single usable stereoscopic desktop interface with the help of NVIDIA Mosaic software and NVIDIA 3D Vision Pro kits, according to Varshney.

Funding for that tech largesse came in part from a $1.6-million grant from the U.S. National Science Foundation, as well as from the university’s own coffers, according to Varshney.

While not quite ready for prime time, UM’s VR simulator is an important first step in providing students the visceral experience of a real-life trauma unit, according to Varshney and Murthi.

David Laidlaw, a computer science professor working in VR at Brown University, agrees.  Medical students ultimately make “life-altering decisions, often under tremendous time pressure,” he says.  “They need technology that clearly provides a substantial improvement in decision-making or in efficiency.  Amitabh Varshney's work here shows excellent promise.”

With his proof of concept a done deal, Varshney is now waiting on another round of funding, which he needs to shoot more footage for the Augmentarium. The goal of the new shoot: amp up the VR realism by using as many as 1,200 small cameras placed throughout a trauma bay.

“By fusing these large camera displays accurately, we are seeking total immersion into real-life settings,” says Barbara Brawn-Cinani, associate director of UM’s Center for Health-related Informatics and Bioimaging. “This would allow residents to virtually move about within the operating room or trauma bay, and observe from multiple viewpoints various techniques and procedures underway.”

Besides developing a trauma surgery boot camp for students, Varshney and Murthi are also looking to capture rare surgeries for their VR Augmentarium—surgeries that might only occur once a year at a trauma center. “These are presentations that could be invaluable for residents-in-training to experience at a later date in a controlled setting,” Varshney says.

Yet another VR surgery tool Varshney and Murthi are working on is an augmented reality ultrasound viewer. Ideally, the viewer will be able to project an image of an in-utero fetus, for example, on top of a mother’s abdomen.

The researchers are hopeful they can someday enable surgeons to wear Google Glass or similar goggles during a surgery, which they could use to view information and imagery suspended before them in 3D.

Other work going on in medical VR includes:

  • NeuroVR, from CAE Healthcare, a VR neurosurgery simulator already on the market, which enables medical residents and surgeons to practice brain surgery.
  • Laidlaw’s work at Brown University, which includes a number of research projects exploring the use of VR visors to view 3D data produced by magnetic resonance imaging and computed tomography scans.
  • Research underway at the University of Central Florida by Gregory F. Welch, who holds the Florida Hospital Endowed Chair in Healthcare Simulation. Welch is working on virtual patient prototypes, combining projectors and cameras with various sensors and actuators to generate a human-sized patient ‘shell’ that feels like a human and exhibits dynamic visual, tactile, temperature, and aural cues, and even reacts to touch, Welch says.

“We have reached a point in time when researchers can develop application-specific tools and experiences that can be widely experienced via relatively robust and low-cost systems,” Welch says.  “I expect the impact in the coming years to be tremendous.”

Joe Dysart is an Internet speaker and business consultant based in Manhattan, NY.


 
