
Pitts & team awarded $5.5M by NSF for human-machine interaction research


An unmanned aerial systems research and test facility at Purdue will be the test bed of virtual and augmented reality experiments on human interactions with drones and other aircraft. (Purdue University photo/Darcy Bullock)
To help machines recognize and adapt to someone’s skill level, researchers are studying how much people trust machines as they use them. (Purdue University photo/Marshall Farthing)
Experiments using this driving simulation in the NHanCE Lab will help researchers develop algorithms that allow an autonomous vehicle to understand and predict human behavior. (Purdue University photo/DeEtte Starr)
Collaborative, multi-institute research is helping intelligent machines adapt to a person’s skill level.

Vehicles make more decisions for people than we might realize – and they have been making them for a while. Just nine years after the Wright brothers made the first airplane flight in 1903, autopilot was invented. Cruise control came along in 1948.

But as both air and road travel witness the introduction of more automated features and shift toward highly autonomous systems, people will need more help learning how to use them.

A multi-institute team of researchers has recently been formed to develop algorithms that would allow a system to recognize when a human doesn’t know how to use it, and then adapt to that person’s skill level.

The researchers believe that these systems would not only train users faster, but also increase safety.

“We’re already asking humans to interact with intelligent machines and autonomous systems all the time, but we need to do it much better than we currently do,” said Neera Jain, an assistant professor of mechanical engineering at Purdue University.

The work, co-led by the University of New Mexico and Purdue in collaboration with the University of Colorado and the University of Texas at Austin, is part of a $5.5 million grant awarded by the National Science Foundation Cyber-Physical Systems (CPS) program. The project is titled “Cognitive Autonomy for Human CPS: Turning Novices into Experts.”

Participants’ performance will be studied as they interact with advanced driving simulations in the lab of Brandon Pitts, a Purdue assistant professor of industrial engineering, and with virtual and augmented reality environments, such as a 20,000-square-foot unmanned aerial systems research and test facility at Purdue scheduled for completion this spring. In these settings, the Purdue team will conduct experiments on how humans interact with complex machines such as planes, drones and autonomous cars.

To study how much participants trust these systems, the team will use psychophysiological sensors to collect data on changes in heart rate, blood pressure, eye movement and other metrics. Jain and Tahira Reid, a Purdue associate professor of mechanical engineering, developed models in 2018 that use such measurements to help a system estimate a human’s level of trust.

“Imagine an autopilot system that can identify your experience level and then gradually relinquish control as you improve. It could significantly reduce the amount of time it takes to train a pilot,” said Inseok Hwang, a professor of aeronautics and astronautics and principal investigator for Purdue on this project. Hwang’s lab will conduct research on human interactions with drones and other aircraft.

University of New Mexico researchers, led by Meeko Oishi, a professor of electrical and computer engineering and the project principal investigator, will use data from these experiments to develop theories of how humans best learn while using these machines. The collaboration as a whole will develop and test algorithms for translation into software that enables machines to understand, predict and adapt to human behavior.

Anthropology researcher Tryphenia Peele-Eady, the director of the University of New Mexico’s College of Education Multicultural Education Center, will oversee the program and evaluate its effectiveness. The project also aims to broaden participation in engineering and computing through research and mentoring opportunities for underrepresented and underserved communities. A new program at the University of New Mexico, called the Summer Intensive Research Institute, will recruit underrepresented undergraduate students and provide opportunities for them to join research projects in cyber-physical systems at each of the four collaborating universities. The program will engage students in professional development activities to prepare them for careers in cyber-physical systems.

The team hopes the work will not only grow the field by developing machines that are responsive to human behavior, but also inform ways to avoid common pitfalls in how humans use autonomous systems.

“There are three things that the field wants to avoid fundamentally in human-machine interaction – misuse, disuse and abuse of the technology. We’re designing new algorithms to make advances toward overcoming each of those barriers,” Jain said.

The work aligns with Purdue's Giant Leaps celebration, acknowledging the university’s global advancements made in artificial intelligence, algorithms and automation as part of Purdue’s 150th anniversary. This is one of the four themes of the yearlong celebration’s Ideas Festival, designed to showcase Purdue as an intellectual center solving real-world issues.

Writer: Kayla Wiles, 765-494-2432, wiles5@purdue.edu

Sources:

Inseok Hwang, 765-494-0687, ihwang@purdue.edu

Neera Jain, 765-496-0436, neerajain@purdue.edu

Brandon Pitts, 765-494-0062, bjpitts@purdue.edu

Tahira Reid, 765-494-7209, tahira@purdue.edu

Related Link: https://www.purdue.edu/newsroom/releases/2019/Q4/one-day,-a-plane-could-give-you-flying-lessons.html