Because humans often need to perform at their best in the toughest of circumstances, psychological research is at the heart of our national interest.
As industry take-up of automation and AI accelerates, high-stakes sectors like defence and aviation are turning to psychology researchers for guidance on designing technology that better suits the humans who will ultimately have to use it in challenging conditions.
Professor Shayne Loft, an Australian Research Council Future Fellow within UWA’s School of Psychological Science, has been working with the Australian Defence Force since he joined the University in 2009. In recent years, his work has become increasingly focused on human-ready technology design and analysis.
Image: Professor Shayne Loft.
“For the foreseeable future, you’ll always need a human to ultimately interact with automation and AI,” Professor Loft says. “So how do you best design for that? How do you help software designers and engineers build technology that will enable humans to make informed decisions and be able to recover when the technology sometimes fails?”
Professor Loft’s research team engages with defence personnel to gather the information needed to conduct experiments in unclassified simulated environments. UWA is home to a medium-fidelity simulation of a submarine control room that broadly represents the types of tasks submariners do.
“Whether it’s purpose-designed technology or off-the-shelf, you have to keep the human in mind,” Professor Loft explains. “Only a human understands the nuances of task context.”
He gives the example of Target Motion Analysis (determining the position of a contact at sea from passive sensor information). “When you work with a human, you can see the sweat on their brow, you can see how certain they are, you can ask them questions, but how do you make the automation of target motion analysis transparent to the human?
“It’s also difficult for humans to recover when automation works 99.9 percent of the time, as it does in some aviation contexts. Do you train new air traffic controllers to detect aircraft conflicts in a raw, manual way, or do you train them to monitor automated conflict detection systems? Expecting humans to intervene in very rare events is asking a lot.”
Another defence activity in which Professor Loft’s research is being applied is the management of uninhabited vehicles. These are vehicles with no human on board, operated from the ground, not unlike a drone. Increasingly, automation decision aids will help humans decide which uninhabited vehicle to deploy for a particular exercise or mission based on a number of complex and interacting factors.
“We’re encouraging engineers to design technology that makes those decisions transparent,” Professor Loft says, “so that a human can see and understand on what basis automation has decided that ‘option A’ is the best one and so that they can then make an informed decision about whether to follow the advice or whether to override it and choose a different uninhabited vehicle.”
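As a loose illustration of what that kind of transparency might look like in software (the vehicle names, factors and weights below are hypothetical, not drawn from any actual defence system), a decision aid can report how much each factor contributed to its recommendation, rather than presenting only the recommendation itself:

```python
# Illustrative sketch of a "transparent" decision aid: it ranks options and
# also reports the per-factor contribution to each score, so a human operator
# can judge whether to accept the advice or override it.
# The vehicles, factors and weights below are hypothetical.

from dataclasses import dataclass

@dataclass
class VehicleOption:
    name: str
    endurance_hours: float      # how long it can stay on task
    sensor_suitability: float   # 0-1 match to the sensors the mission needs
    risk_of_loss: float         # 0-1 estimated chance of losing the vehicle

# Hypothetical mission weights: what matters most for this tasking.
WEIGHTS = {"endurance": 0.4, "sensors": 0.4, "risk": 0.2}

def score(option: VehicleOption) -> tuple[float, dict[str, float]]:
    """Return an overall score and the per-factor contributions behind it."""
    contributions = {
        "endurance": WEIGHTS["endurance"] * min(option.endurance_hours / 24.0, 1.0),
        "sensors": WEIGHTS["sensors"] * option.sensor_suitability,
        "risk": WEIGHTS["risk"] * (1.0 - option.risk_of_loss),
    }
    return sum(contributions.values()), contributions

options = [
    VehicleOption("UV-Alpha", endurance_hours=18, sensor_suitability=0.9, risk_of_loss=0.3),
    VehicleOption("UV-Bravo", endurance_hours=30, sensor_suitability=0.6, risk_of_loss=0.1),
]

for opt in sorted(options, key=lambda o: score(o)[0], reverse=True):
    total, parts = score(opt)
    breakdown = ", ".join(f"{name}={value:.2f}" for name, value in parts.items())
    print(f"{opt.name}: {total:.2f}  ({breakdown})")
```

Reading the breakdown, an operator can see which factors drove the top recommendation and decide whether those priorities actually match the mission at hand.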
Pushing tin: air traffic control
A similarly high-stakes application of Professor Loft’s research is in the field of air traffic control, where automation is increasingly being introduced to handle increases in air traffic load. He says en-route airspace is a good example of what can be expected when automation is introduced to existing human-driven systems.
“Basically, people change how they allocate their attention and what information they look at,” Professor Loft explains. “So if you’ve got a reliable system that is automatically detecting conflicts, for example, then you’re paying less attention to certain things about the aircraft approaching your controlled sector. And automation is sometimes wrong, because nothing is perfect, and then the human operator has to recover. One of the things we study is how you design automation in such a way that humans have the situation awareness to recover if they need to intervene.
“Another challenge in any sort of high-stakes work environment is how to remember to do things in the future when a task is deferred. An air traffic controller may know a particular flight comes in every morning, but a thunderstorm is approaching so they need to assign it a different altitude when it arrives; that task of remembering lies in the future while they’re focused on what’s happening now.
“In aviation, there have been accidents where people have simply forgotten to do deferred tasks and planes have collided, so we’ve been studying what is known as prospective memory. How do people remember to do things in the future and how do they recover from being interrupted, as often happens in the context of air traffic control?”
One of Professor Loft’s proudest research achievements is a workload prediction tool for air traffic controllers that is still in use today. “If you can take the flight plans for the day and the computational models of how humans do their tasks, then you can come up with a prediction tool for workloads, and that’s what we created for Airservices Australia to help them make decisions about team composition and staffing,” he says.
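As a rough sketch of the underlying idea, rather than the tool actually built for Airservices Australia, workload prediction can combine a day’s flight plan with modelled per-task handling times; the task times and flights below are invented for illustration:

```python
# A loose sketch of workload prediction from flight plans: each flight entering
# a sector implies a set of routine controller tasks, each with a modelled
# handling time; summing those times per hour gives a predicted workload
# profile that can inform staffing decisions.
# Task times and flight plans below are hypothetical, not the real model.

from collections import defaultdict

# Modelled seconds of controller time per routine task (assumed values).
TASK_SECONDS = {"accept_handoff": 20, "monitor": 60, "issue_clearance": 30, "transfer": 20}

# Simplified flight plan: (callsign, hour of day the flight enters the sector).
flight_plan = [
    ("QF401", 7), ("VA672", 7), ("JQ110", 8), ("QF583", 8), ("QF9", 8), ("VA1310", 9),
]

def predict_workload(flights):
    """Return predicted controller task-minutes for each hour of the day."""
    minutes_per_hour = defaultdict(float)
    for _callsign, hour in flights:
        total_seconds = sum(TASK_SECONDS.values())  # routine tasks per flight
        minutes_per_hour[hour] += total_seconds / 60.0
    return dict(minutes_per_hour)

for hour, minutes in sorted(predict_workload(flight_plan).items()):
    print(f"{hour:02d}:00  predicted workload of about {minutes:.1f} task-minutes")
```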
Partnering with the US
Professor Loft’s research has recently attracted a significant grant from AUSMURI, the nine-year $25 million investment program encouraging Australian universities to collaborate with US researchers on high-priority projects for future defence capabilities.
He explains: “We’re working with neuroscientists and mathematical psychologists to understand human attention control, which is essentially the ability to select what you need to attend to and how to prioritise. The objective is to build targeted training tasks that people in the military can do to improve attention control and then to test that training in simulated military tasks to see if it works. Ultimately, this type of research may inform personnel selection, retention and also work design.”
Professor Loft says his professional work is driven by a personal desire to make a practical difference. “Altruistically,” he says, “I believe Australia needs to defend itself and if I can play a small role in that, make a tiny contribution, then that is very satisfying. Part of the appeal of doing meaningful research is to make society safer, while also achieving economic growth.
“With greater economic productivity, by doing important things better, we then have more assets to help people.”
Media references
Annelies Gartner (UWA Media Advisor) 08 6488 6876