The grants have led to the research and creation of a technology that would advise pilots of potential accident situations. Leading this project is Lance Sherry, director of Volgenau’s Center for Air Transportation Systems Research, along with John Shortle, professor of systems engineering and operations research. Collaborators from the University of Oregon and University of Iowa are also assisting.
A subset of airline accidents, according to Sherry, results from “controlled flight” into terrain or into a stall. These accidents are not caused by a malfunctioning part or by pilot error, he said, but by interactions between the components of flight automation that are indiscernible to the pilot.
“Technology has facilitated the development of increasingly complex automation,” Sherry said. “There are no natural checks that limit this complexity when complex components interact with other complex components.”
A typical airliner has more than 100 sensors that send data to approximately 36 computers on the aircraft. Those computers then communicate with each other to decide how the plane should respond. In rare circumstances, however, small differences between sensor readings can be misinterpreted: the automation must make a logical decision about which sensor is correct, and a wrong choice can lead to an inappropriate action by the aircraft.
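The kind of sensor-selection decision described above can be illustrated with a simplified sketch. Everything here is hypothetical — the function name, the mid-value-select strategy, and the disagreement threshold are illustrative stand-ins, not details of any actual avionics system:

```python
from statistics import median

DISAGREEMENT_THRESHOLD_KT = 10.0  # hypothetical tolerance, in knots


def select_airspeed(readings):
    """Pick one airspeed from redundant sensors by mid-value select.

    Returns (selected_value, disagreement_flag). When sensors disagree,
    the automation must still commit to a value -- the point of Sherry's
    example is that this choice can quietly be the wrong one.
    """
    selected = median(readings)
    spread = max(readings) - min(readings)
    return selected, spread > DISAGREEMENT_THRESHOLD_KT


# Three redundant sensors; one has drifted.
value, disagree = select_airspeed([152.0, 151.0, 120.0])
# Mid-value select masks the outlier (151.0 is chosen), but the
# disagreement flag records that the sensors no longer agree.
```

In this toy version the outlier is detectable; the accidents Sherry describes involve subtler discrepancies, where a long chain of individually reasonable decisions downstream of the selection produces an unsafe outcome.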
One example of this, according to Sherry: the automation could command a plane to decelerate to its required landing speed in preparation for landing, which a pilot would find normal. However, a long chain of logical decisions made by the automation, based on a discrepancy between sensors, may lead to deceleration well past the landing speed until the plane stalls.
So, how can Sherry and his team prevent these types of accidents from happening?
They started by studying pilot behavior in flight and noticed that the senior captain often provides experience-based advice to the first officer when handing off the controls. That made them wonder whether technology could provide the same kind of guidance in potential accident scenarios.
Now dubbed the “Paranoid Associate,” the technology uses machine-learning algorithms to process massive amounts of flight and weather data, collected by Sherry and his team, about anomalies that occur during flights across the country. These data are then used to create tips or advisories that help pilots avoid potential accident scenarios.
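At its simplest, the advisory idea amounts to matching a flight's current conditions against what past flights experienced in similar conditions. The sketch below is a toy illustration of that matching step only — the records, field names, and tolerance are invented, and the real system learns from far richer flight and weather data:

```python
# Hypothetical anomaly database: each record pairs the conditions
# under which an anomaly occurred with the tip it suggests.
past_anomalies = [
    {"phase": "approach", "wind_kt": 25,
     "tip": "Expect a late runway change; brief the go-around early."},
    {"phase": "approach", "wind_kt": 8,
     "tip": "Watch for automation decelerating below target speed."},
    {"phase": "cruise", "wind_kt": 40,
     "tip": "Turbulence reported ahead; consider an altitude change."},
]


def advisories_for(phase, wind_kt, tolerance_kt=10):
    """Return tips from past flights whose conditions resemble this one."""
    return [
        rec["tip"]
        for rec in past_anomalies
        if rec["phase"] == phase and abs(rec["wind_kt"] - wind_kt) <= tolerance_kt
    ]


# A flight on approach in 20-knot wind matches the first record only.
tips = advisories_for(phase="approach", wind_kt=20)
```

A lookup table like this obviously cannot generalize; the machine-learning framing in the article is presumably about learning which patterns of conditions predict anomalies, rather than requiring an exact-match record for every situation.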
“Our objective is to be a constructive backseat driver,” Sherry said. “A human pilot may only accrue experience from 300 flights a year, but the machine learning algorithm can accrue knowledge from every flight flown by every pilot.”
The technology is still in its early stages. Sherry said that technical issues, such as avoiding nuisance alerts and determining how best to communicate the advisories to the pilot, still need to be solved. One student working on the project is currently experimenting with a Fitbit as a way of notifying the pilot.
In addition, Sherry and the team are applying these ideas to autonomous vehicles.
The project is expected to last three years, and Sherry is optimistic about the benefits it could bring to all automated vehicles, such as ships and cars.
“This is more than research where you write a paper and put it on a shelf,” Sherry said. “This type of technology is eventually going to find its way into consumer products.”