AI research offers more eyes and ears to search and rescue missions
Drone assistance in natural disaster response is currently simplistic at best and faces a number of hurdles. New research led by Purdue University professors aims to use artificial intelligence and learning algorithms to create a platform that allows multiple drones to communicate and adapt as mission conditions change.
Shaoshuai Mou and Dan DeLaurentis, professors in aeronautics and astronautics, are leading the research, which received three-year funding from Northrop Grumman Corp. as part of the Real Applications of Learning Machine consortium.
“For the system, we focused on a multi-agent network of vehicles, which are diverse and can coordinate with each other,” Mou said. “Such local coordination will allow them to work as a cohesive whole to accomplish complicated missions such as search and rescue.”
“There are challenges in this area. The environment may be dynamic, for example, with the weather changing. The drones have to be adaptive and must be capable of real-time environment perception and online autonomous decision-making.”
Distributed control, human-machine mixed autonomy, lifelong learning and artificial intelligence will be key enablers of the proposed research, Mou said.
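To make the idea of local coordination concrete, here is a minimal sketch of a distributed averaging rule in which each drone refines its estimate of a target location using only its neighbors' estimates; the network layout, variable names and step size are illustrative assumptions, not details of the Purdue platform.

```python
# Minimal sketch of neighbor-only coordination (distributed averaging consensus).
# Each drone nudges its estimate of a target location toward the mean of its
# neighbors' estimates, so the group converges without a central controller.
# The topology and numbers here are illustrative assumptions only.

import numpy as np

def consensus_step(estimates, neighbors, step_size=0.2):
    """One round of local averaging: each drone moves its estimate
    toward the mean of its neighbors' estimates."""
    updated = estimates.copy()
    for i, est in enumerate(estimates):
        if not neighbors[i]:
            continue
        neighbor_mean = np.mean([estimates[j] for j in neighbors[i]], axis=0)
        updated[i] = est + step_size * (neighbor_mean - est)
    return updated

# Four drones with noisy initial guesses of a survivor's position (x, y).
estimates = [np.array(p, dtype=float) for p in [(10, 2), (12, 5), (9, 4), (11, 3)]]
neighbors = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}  # line topology: who can talk to whom

for _ in range(50):
    estimates = consensus_step(estimates, neighbors)

print([e.round(2) for e in estimates])  # estimates converge toward a common value
```

Because each update uses only neighbor-to-neighbor communication, no single vehicle needs a global view of the mission, which is what lets the network keep operating as members join, drop out or move.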
“For complex situations, we still need to involve humans in the loop and try to do mixed autonomy consisting of machines and humans,” Mou said.
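One rough way to picture that mixed autonomy is a confidence gate: the drone acts on its own when it is sure of a detection and hands the decision to a human operator when it is not. The threshold and function names below are assumptions for illustration, not details from the Purdue project.

```python
# Sketch of human-in-the-loop mixed autonomy: high-confidence detections are
# handled autonomously, low-confidence ones are routed to a human operator.
# The 0.85 threshold and all names are illustrative assumptions.

def decide(detection, confidence, threshold=0.85):
    """Act autonomously if confident, otherwise ask the operator."""
    if confidence >= threshold:
        return f"autonomous: dispatch rescue marker to {detection}"
    return request_operator_review(detection, confidence)

def request_operator_review(detection, confidence):
    """Stand-in for a human review step (here just a console prompt)."""
    answer = input(f"Possible survivor at {detection} (confidence {confidence:.2f}). Confirm? [y/n] ")
    if answer.strip().lower() == "y":
        return f"operator-confirmed: dispatch rescue marker to {detection}"
    return "operator-rejected: continue search"

if __name__ == "__main__":
    print(decide((41.3, -86.9), 0.92))   # handled autonomously
    print(decide((41.4, -86.8), 0.55))   # routed to the human operator
```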
In the mission scenarios, a ground vehicle with powerful onboard processing will communicate with air, ground or aquatic drones that together can cover a wide area.
“The utilization of the combination of heterogeneous vehicles should be a key to so many complicated problems,” Mou said.
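As a rough illustration of such a heterogeneous setup, the sketch below has a ground station assign search sectors to air, ground and aquatic drones based on terrain and battery level; the class names and matching rule are assumptions for illustration, not the project's software.

```python
# Illustrative sketch of the heterogeneous-vehicle setup described above:
# a ground station with more computing power assigns search sectors to
# air, ground and aquatic drones according to terrain. All names and the
# matching rule are assumptions, not the Purdue team's implementation.

from dataclasses import dataclass

@dataclass
class Drone:
    drone_id: str
    domain: str        # "air", "ground", or "aquatic"
    battery_pct: float

@dataclass
class Sector:
    name: str
    terrain: str       # "open", "rubble", or "flooded"

TERRAIN_TO_DOMAIN = {"open": "air", "rubble": "ground", "flooded": "aquatic"}

def assign_sectors(drones, sectors, min_battery=30.0):
    """Ground-station logic: match each sector to a suitably equipped,
    sufficiently charged drone. Returns {sector name: drone id}."""
    assignments = {}
    available = [d for d in drones if d.battery_pct >= min_battery]
    for sector in sectors:
        needed = TERRAIN_TO_DOMAIN[sector.terrain]
        for drone in available:
            if drone.domain == needed:
                assignments[sector.name] = drone.drone_id
                available.remove(drone)
                break
    return assignments

fleet = [Drone("uav-1", "air", 80), Drone("ugv-1", "ground", 65), Drone("usv-1", "aquatic", 90)]
area = [Sector("north field", "open"), Sector("collapsed mall", "rubble"), Sector("river bend", "flooded")]
print(assign_sectors(fleet, area))
```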
Mou and DeLaurentis are joined on the project by faculty from the University of Illinois at Chicago and the University of Massachusetts Amherst.
Source: Purdue News Room