Barrett Caldwell and doctoral student Jordan Hill have performed NASA-funded research among the sharp volcanic shards and rugged terrain of “Mars analogs” on Earth.
“Actually, I still have some scars from it,” says Caldwell, professor of industrial engineering.
Their mission was to help NASA develop systems that monitor an astronaut’s vital signs and allow humans and machines to interact seamlessly in space. When it comes to Mars, a central challenge is dealing with a communication delay of up to 20 minutes, depending on the planet’s orbital distance from Earth.
“Trying to figure out how to get teams in sync with that much communication lag is really hard,” he says. “You certainly can’t have people on the ground say, ‘Well, do this and wait until the next time we have something to say to you,’ because it’s like a 40-minute delay roundtrip. And then you’ve got the astronauts just twiddling their thumbs. Worse yet, if there’s a really crucial issue or there is a threat to astronaut health and safety, by the time Earth finds out, the astronaut could be already dead.”
The results of their research will help determine the requirements for a system to monitor an astronaut’s physiology using machine-learning algorithms.
“You’ve got to be able to have the capabilities on Mars, either with the other crew members or with automation systems onboard, because you can’t wait for the ground to do it for you,” he says. “Is an astronaut about to pass out because they are overworked? Are they not recognizing that they are too fatigued and need to stop?”
Commercial products have recently been developed for remote monitoring, alerting health care providers when measurements are abnormal or concerning. However, these products are largely intended for use in a home or other controlled environment and are rarely designed for continuous data transmission over long periods, Caldwell says.
He and Hill were part of a team that performed research on volcanic terrain in Idaho and Hawaii, where they integrated off-the-shelf products.
“The team would go out and collect real geological samples with a mission profile that simulated a Mars communication delay and look at how that influences the communication process between the science team on Earth and the habitat crew and the extravehicular crew,” Caldwell says.
The research was part of NASA’s Biologic Analog Science Associated with Lava Terrains, or BASALT, program. It was carried out at the Craters of the Moon National Monument and Preserve in Idaho, and Hawaii Volcanoes National Park. The Purdue researchers made several trips to the Mars analogs, most recently in November 2017.
The lava terrains of Earth analogs are considered a “hostile setting.”
“We would be out for one to two weeks and go out every day,” he says. “These lava terrains were very rough, very sharp because of the glassy formations. It’s black, so imagine it’s out in the sun all the time, so you are out there on a day when the air temperature is 85 or 90 degrees. The rock actually gets to be too hot to sit on.”
Research findings were published in July 2018 in IISE Transactions on Healthcare Systems Engineering, published by the Institute of Industrial and Systems Engineers. A new paper will be presented during a Human Factors and Ergonomics Society conference in October 2019.
“We were basically demonstrating that you could do live real-time physiological monitoring and that you could both store it locally and transmit it in a way that people in the habitat could monitor it,” Caldwell says. “And then we figured out as we were doing this that, yes, you can get it to the habitat and you can put it on a computer screen, but the astronauts in the habitat are way too busy to be looking at it all the time.”
To make such a system effective, elements of automation should be integrated, freeing humans for other, more productive activities.
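The idea described above can be sketched in a few lines: every reading is stored locally for later downlink, but only flagged readings are surfaced to the habitat crew, so no one has to watch a screen continuously. This is a minimal illustration, not the team's actual software; the field names and the alert threshold are hypothetical.

```python
from collections import deque

# Illustrative threshold only -- a real system would use per-crew-member
# baselines rather than a single fixed number (see below in the article).
HEART_RATE_ALERT_BPM = 160

def monitor(readings, local_log, alert_queue):
    """Store every reading locally; queue an alert only when flagged."""
    for reading in readings:
        local_log.append(reading)              # full record kept for later downlink
        if reading["heart_rate_bpm"] > HEART_RATE_ALERT_BPM:
            alert_queue.append(reading)        # only anomalies reach the busy crew

# Simulated stream from a chest-worn monitor (hypothetical data)
stream = [
    {"t": 0, "heart_rate_bpm": 88},
    {"t": 1, "heart_rate_bpm": 172},           # simulated overexertion
    {"t": 2, "heart_rate_bpm": 95},
]
log, alerts = [], deque()
monitor(stream, log, alerts)
print(len(log), len(alerts))  # 3 readings stored, 1 alert raised
```

The separation between the complete local log and the sparse alert queue is the point: the full data can wait out a 20-minute (or 40-minute round-trip) link to Earth, while the rare alert demands immediate onboard attention.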
Role-playing on Mars analogs
Team members playing the role of astronauts wore off-the-shelf monitors that fastened around the chest.
“We didn’t have to put electrodes on people, but we had to set up a backpack communication network that would allow us to send a signal through a dedicated network,” Caldwell says.
A fundamental limitation of commercial biomonitoring products is that they are designed for the population at large.
“But astronauts aren’t going to look like the physical parameters of the normal population,” he says. “While the average person might be getting to 130 beats per minute while exercising, if you are a highly trained athlete your resting pulse might be 55, so what looks like working hard for someone else might not for you.”
A potential solution is machine-learning algorithms that recognize patterns. “If you can have algorithms that can recognize each individual astronaut’s patterns, you no longer have to have a crew member sitting there staring at their vital signs the way you would have a doctor or a nurse in a hospital monitoring someone’s vital signs.”
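As a rough illustration of what "learning each astronaut's patterns" could mean, here is a minimal sketch that builds a personal baseline from one crew member's own resting heart-rate samples and flags only deviations from that baseline. The class name, the z-score rule, and the sample values are assumptions for illustration, not the project's actual algorithm.

```python
from statistics import mean, stdev

class PersonalBaseline:
    """Learn one crew member's heart-rate pattern, then flag readings
    that deviate from *their* baseline rather than a population norm."""

    def __init__(self, z_threshold=3.0):
        self.samples = []
        self.z_threshold = z_threshold

    def train(self, resting_bpm_samples):
        self.samples.extend(resting_bpm_samples)

    def is_anomalous(self, bpm):
        mu, sigma = mean(self.samples), stdev(self.samples)
        return abs(bpm - mu) > self.z_threshold * sigma

# A highly trained athlete whose resting pulse sits near 55 bpm (per the article)
athlete = PersonalBaseline()
athlete.train([54, 55, 56, 55, 53, 57, 55, 54])
print(athlete.is_anomalous(130))  # True: routine for others, alarming for this crew member
print(athlete.is_anomalous(56))   # False: within this individual's normal range
```

The same 130 bpm reading that would be unremarkable against a population threshold stands far outside this individual's learned baseline, which is exactly the distinction Caldwell describes.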
Researchers in his lab are determining the requirements for such a system. “What would be the most effective way for automation to communicate with humans in the same way that humans can collaborate with each other?” Caldwell asks.
Machine-learning requirements
The research laid the groundwork for proper “function allocation.”
“In other words, who needs to be doing what and how would you do it, so that when people start writing the machine-learning algorithms, they know what’s important to code in the algorithms,” Caldwell says. “One of the challenges is that everyone’s physiology looks different, and everyone’s physiology looks different based on which tasks they are doing. In order to effectively support the astronauts while they are doing their EVAs, you can’t just have a human monitor, and you can’t have a simple population algorithm off-the-shelf that does it because that’s based on the norms of lots of people. You are making critical mission decisions based on individual data, so basically you need algorithms that can learn each crew member as an individual.”
In related research, Megan Nyre-Yu, a doctoral student working in Caldwell’s Group Performance Environments Research (GROUPER) Lab, is tackling the challenges of human-system integration for cybersecurity.
“In both cases you have the concern that the computer system is able to collect a lot of data, store a lot of data and in some cases analyze a lot of data,” Caldwell says, “but what’s really hard is to be able to communicate that in a way that a human analyst, either an astronaut or cybersecurity professional, can make sense of and respond effectively to.”
Grant support for this research was provided by a NASA Planetary Science and Technology Through Analog Research (PSTAR) Program (NNH14ZDA001N-PSTAR) grant (14-PSTAR14_2-0007) to D. Lim, with additional support from a NASA SSERVI FINESSE grant to J. Heldmann. Barrett Caldwell was the Purdue PI for both grants (FINESSE: NNX14AF37A; BASALT: NNX15AM05A).
Photo at top:
Jocelyn Dunn (Ph.D. IE ’16) was part of a six-member crew that spent eight months in a domed habitat on a Hawaiian volcanic landscape that mimicked life on a Martian outpost. Dunn currently works as a human performance engineer at the Human Physiology, Performance, Protection, and Operations (H-3PO) Laboratory at NASA Johnson Space Center.
Purdue University image – Jocelyn Dunn
“You are making critical mission decisions based on individual data, so basically you need algorithms that can learn each crew member as an individual.”
— Barrett Caldwell