The autonomous system uses visual cues from cameras to control landings. Because it does not rely on outside technology such as laser-range sensors, radio beacons, or GPS signals, the system could improve passenger safety: there is no external signal to block or hack.
“It is totally independent of GPS signals, which can be blocked or hacked, and is a start for aircraft to independently understand their surroundings,” says Saul Thurrowgood, who works in the Neuroscience of Vision and Aerial Robotics laboratory at the University of Queensland.
To create the system, Thurrowgood took cues from bee biology.
“Bees use optic flow for their descent—using the rate of motion of the ground beneath them to guide their landing—and recent testing shows that they may also use stereo vision, judging distance with two eyes, for their touchdown,” Thurrowgood says.
“We have incorporated both of these techniques in our automatic landing system, but modified them for use in a fixed-wing aircraft.”
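The two bee-inspired cues can be sketched in code. The toy model below is an illustrative assumption, not the Queensland team's flight software: it shows the classic "constant optic flow" descent (holding ventral image motion, forward speed divided by altitude, at a fixed target makes altitude decay exponentially toward a gentle touchdown) and the standard stereo relation for judging distance from the disparity between two cameras. All function names and parameters are hypothetical.

```python
# Toy sketch of the bee-inspired landing cues described above --
# NOT the actual flight system; names and values are illustrative.

def constant_flow_descent(h0=20.0, target_flow=0.5, dt=0.05, h_touchdown=0.05):
    """Descend while holding ventral optic flow constant.

    Ventral optic flow is forward_speed / altitude (rad/s). Commanding a
    sink rate that keeps this flow at a fixed target makes the sink rate
    proportional to altitude, so altitude decays exponentially and the
    descent slows automatically as the ground approaches.
    """
    h = h0
    altitudes = [h]
    while h > h_touchdown:
        sink_rate = target_flow * h  # flow-regulating control law
        h -= sink_rate * dt
        altitudes.append(h)
    return altitudes


def stereo_depth(baseline_m, focal_px, disparity_px):
    """Standard pinhole stereo relation: depth = baseline * focal / disparity."""
    return baseline_m * focal_px / disparity_px


altitudes = constant_flow_descent()
# The descent rate shrinks with altitude, so the final approach is gentle.
```

The key property of the constant-flow strategy is that no absolute altitude measurement is needed: regulating the image motion alone produces the exponentially softening descent.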
The system uses cameras mounted on the front of an aircraft with a two-meter wingspan. “The plane used the visual system to guide itself, sense its altitude, control its throttle, and shut itself off when it landed,” Thurrowgood says.
“All commercial aircraft need to have backup systems, and this research provides the option of having different types of sensing. If one isn’t working, the pilot has something else to fall back on.”
The Australian Research Council, the US Army Research Office, Boeing Research and Technology Australia, and a Queensland Premier’s Fellowship funded the study, which is published in the Journal of Field Robotics.