
Communications of the ACM

ACM Careers

'Smart Machine' Enhances Navy Pilot Training



Navy pilots and other flight specialists soon will have a new "smart machine" installed in flight training simulators that learns from expert instructors to more efficiently train students.

Sandia National Laboratories' Automated Expert Modeling & Student Evaluation (AEMASE, pronounced "amaze") is being provided to the Navy as a component of flight simulators. Components are now being used to train Navy personnel to fly H-60 helicopters, and a complete system will soon be delivered for training on the E-2C Hawkeye aircraft, says Robert G. Abbott, a Sandia computer scientist and AEMASE's inventor. The work is sponsored by the Office of Naval Research.

AEMASE is a cognitive software application that updates its knowledge of expert performance on training simulators in real time, keeping training sessions from becoming obsolete, and automatically evaluates student performance. Both capabilities reduce overall training costs, Abbott says.

"AEMASE is able to adapt and is aware of what's going on," he says. "That's what's driving our cognitive modeling and automated systems that learn over time from the environment and from their interactions with people."

Previous flight simulators have not handled ambiguous or new situations well without time-consuming reprogramming, making it difficult for the military to adapt quickly to changing environments and tactics.

AEMASE bypasses lengthy interviews of instructors and reprogramming once the simulator is running. Instead, instructors fly the simulator themselves to capture their expertise, a feature that works particularly well in ambiguous situations where it's difficult to program a set of explicit rules, Abbott says.

Melissa Walwanis, a senior research psychologist at the Naval Air Warfare Center's Training System Division in Orlando, Fla., says AEMASE will give Navy trainees specific ways to improve performance through machine learning, automated performance measurement and recordings of trainees' voices during the training sessions.

AEMASE will save taxpayers money by improving the training and skills students gain, so the Navy can use limited flight time more efficiently, reducing fuel costs and wear on the aircraft, she says.

Sandia experiments showed that the scores AEMASE gave students agreed with human graders 83 to 99 percent of the time. In a study of students learning to operate the E-2 Hawkeye aircraft's battle space management system, those whose training used AEMASE performed better than those whose simulators lacked the software, say Abbott and Sandia cognitive psychologist Chris Forsythe.

AEMASE grew out of Sandia's research into cognitive systems that started more than a decade ago, Forsythe says. At the time, he says, there were massive computers able to process large amounts of data, but no software that could model how people make decisions.

AEMASE addresses a needle-in-a-haystack problem. Just as search engines find certain words across the Internet, AEMASE scans hundreds of training sessions to find specific actions or scenarios and makes comparisons, Abbott says.

The software is designed for context recognition. It searches until it recognizes a situation it has seen before and determines whether the students are making a desirable decision, Abbott says.

The software recognizes there may be multiple right answers that incorporate different ways of responding to the situation, Forsythe adds. For example, AEMASE tracks certain flight parameters — say distance, the angle of the aircraft from the ground, and velocity — to create vectors that are treated as points within a multidimensional space defined by the parameters. Different "right" answers are expressed as points in the space, but will tend to gather in one area, while poor performance can be measured by a point's distance from the "expert" points.
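The following is a minimal sketch of that distance-from-expert idea. The parameter names, the expert values, and the use of Euclidean distance to the nearest expert point are illustrative assumptions, not AEMASE's actual feature set or metric.

import numpy as np

# Each row is one recorded expert moment: [distance, angle from ground, velocity].
# In practice the parameters would be normalized so no single unit dominates.
expert_points = np.array([
    [1200.0, 3.5, 140.0],
    [1150.0, 4.0, 138.0],
    [1250.0, 3.2, 142.0],
])

def performance_score(student_point):
    """Distance to the nearest expert exemplar; smaller is better."""
    dists = np.linalg.norm(expert_points - np.asarray(student_point), axis=1)
    return float(dists.min())

print(performance_score([1210.0, 3.6, 139.0]))  # near the expert cluster: good
print(performance_score([400.0, 12.0, 180.0]))  # far from the cluster: flagged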

But for instructors, AEMASE's interface is simple. They can flag actions by pushing a one-click thumbs-up button to record good behavior or a thumbs-down button when students fly too low or too close together in the simulation, Abbott says.

AEMASE places those flagged events on a timeline display, so instructors and students can review errors in recordings of student performance. Then AEMASE uses that information to recognize other instances of the errors, helping the instructors become more efficient by automatically flagging errors for them to review with other students.

These flags are the seeds for the model's future development as scenarios and preferred actions evolve over time, Forsythe says.
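As a hedged sketch of how such flags could seed an exemplar-based model, the snippet below stores instructor judgments as labeled points and labels later moments by their nearest flagged exemplar; the event features and the nearest-neighbor rule are assumptions for illustration, not the actual AEMASE implementation.

import numpy as np

flagged_events = []  # (feature_vector, label) pairs collected from instructors

def flag(features, thumbs_up):
    """Record an instructor's one-click judgment of a simulator moment."""
    flagged_events.append((np.asarray(features, dtype=float),
                           "good" if thumbs_up else "error"))

def auto_label(features):
    """Label a new moment by its nearest previously flagged exemplar."""
    x = np.asarray(features, dtype=float)
    dists = [np.linalg.norm(x - f) for f, _ in flagged_events]
    return flagged_events[int(np.argmin(dists))][1]

flag([90.0, 0.4], thumbs_up=False)   # e.g., flying too low and too close
flag([800.0, 2.5], thumbs_up=True)   # safe separation
print(auto_label([100.0, 0.5]))      # "error": queued for instructor review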

AEMASE also incorporates speech recognition technology to assess how effectively teams communicate.

"Are people talking to each other often and using the terminology that we would expect if they know what they're doing or are they hmmm-ing and hawing a lot, using a lot of filler words like uh, ah, or um, which indicates less proficiency?" Abbott says. "That's one of the things we want the system to assess."

The Navy is considering other training uses for AEMASE, but it also could be readily adapted to monitor live operations. For example, it could model operators at the height of their ability, and then alert them when they later fail to take the same actions in similar situations, perhaps due to fatigue or distraction, Abbott and Forsythe say.

Sandia is adapting the software to similar training aids for computer security analysts. Potential applications include driver's education, automating robots, and many other areas, Abbott says.


 
