Human-Robot Teaming
Evaluating workers' perceptions of interacting with AI on the jobsite
Artificial intelligence and machine learning have already begun to transform the construction site. Robotic bricklaying machines can place up to 3,000 bricks per day. AI-powered cameras and sensors can alert the crew of potential safety hazards. Drones can deliver tools and materials on site.
But there’s still a human component to human-robot teaming, and little research has evaluated how an individual’s reaction to technology affects their ability to interact with it. Woei-Chyi Chang, a PhD candidate and graduate researcher working with Sogand Hasanzadeh, assistant professor of civil and construction engineering, sought to better understand the worker’s perspective on interacting with AI on the job. Their multidisciplinary research converges at the nexus of civil engineering, AI and human factors.
“We expect to see a lot of different AI technologies incorporated in future worksites,” said Chang. “Through this research, I wanted to evaluate three primary components essential for successful human-robot teaming: trust, communication and safety.”
The researchers created two different environments to simulate future job sites and evaluate how workers react to human-robot teaming. In a mixed-reality bricklaying scenario, participants donned a virtual reality (VR) headset that depicted a large building construction project. In the immersive environment, participants held physical tools while moving around the space, interacting with a bricklaying robot and virtual bricks. Periodically, a virtual drone would fly into the scene to deliver a fresh bucket of mortar or communicate messages and change orders.
“The mixed reality environment contains passive haptics, which are physical objects in the space that represent virtual objects participants see with the VR headset,” Hasanzadeh said. “Combined, these elements create an immersive environment of the construction site. When the drone approaches the worker, they can hear the blades spinning very fast.”
To determine whether a malfunction of the technology would affect workers’ trust, Chang programmed the virtual drone to fly into the participant’s body. Although participants were not physically harmed, the VR environment created the perception that the drone made contact.
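In practice, a scripted failure like this is usually implemented as a timed event inside the simulation loop. The sketch below is a hypothetical illustration of that pattern only; the event names, timings and structure are assumptions, not the study’s actual VR code.

```python
import random
from dataclasses import dataclass

@dataclass
class DroneEvent:
    time_s: float   # when the event fires, in experiment time (seconds)
    kind: str       # e.g. "deliver_mortar", "message", "contact"

# Assumed event script: routine drone tasks plus one scripted "contact"
# (the drone flying into the participant) to probe trust after a malfunction.
script = [
    DroneEvent(60.0, "deliver_mortar"),
    DroneEvent(180.0, "message"),
    DroneEvent(random.uniform(240.0, 300.0), "contact"),  # malfunction moment
]

def update(now_s: float, pending: list[DroneEvent]) -> list[str]:
    """Return the events due at this frame and drop them from the pending list."""
    fired = [e.kind for e in pending if e.time_s <= now_s]
    pending[:] = [e for e in pending if e.time_s > now_s]
    return fired
```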
“It can take time for workers to build their trust with AI, but once that trust is lost due to malfunction of the system, how long will it take to reestablish their trust?” Hasanzadeh said. “To achieve a safe, inclusive construction site of the future, AI systems must be able to anticipate, track and predict all types of worker behavior to team effectively and prevent jobsite injuries.”
In the worker-AI collaborative bricklaying experiment, workers interact with a bricklaying robot, drones and an AI assistant under varied conditions, such as technology malfunction, time pressure and multitasking.
The second scenario simulated a residential roofing job where participants perched on a physical sloped platform meant to replicate roof decking. For this augmented virtual reality experiment, participants wore a harness, just as they would on an actual job site, and interacted with a physical hammer to install actual asphalt shingles.
An extended reality headset displayed a full virtual environment extending beyond the physical platform while still allowing participants to see the roof decking, materials and their own hands in the physical world. Once again, participants interacted with a virtual drone that inspected their performance, delivered messages or asked questions about work progress.
In the roofing experiment, Chang and Shiva Pooladvand (PhD CE’24) controlled the temperature and humidity settings to replicate typical summer working conditions as closely as possible. The environmental chamber housing the roofing setup, located in Herrick Laboratories, registered a heat index of 107 degrees Fahrenheit.
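For context on that figure: the heat index folds air temperature and relative humidity into a single “feels like” value. The sketch below uses the standard National Weather Service (Rothfusz) regression; the chamber settings shown are illustrative assumptions, since the article reports only the resulting heat index of roughly 107 degrees.

```python
def heat_index_f(temp_f: float, rel_humidity: float) -> float:
    """Approximate heat index (deg F) via the NWS Rothfusz regression.

    Reasonable for hot, humid conditions (results above roughly 80 F).
    """
    t, rh = temp_f, rel_humidity
    return (
        -42.379
        + 2.04901523 * t
        + 10.14333127 * rh
        - 0.22475541 * t * rh
        - 6.83783e-3 * t * t
        - 5.481717e-2 * rh * rh
        + 1.22874e-3 * t * t * rh
        + 8.5282e-4 * t * rh * rh
        - 1.99e-6 * t * t * rh * rh
    )

# Illustrative (assumed) chamber settings: about 95 F at 52% relative humidity
# produces a heat index near the 107 F reported for the chamber.
print(round(heat_index_f(95.0, 52.0)))  # -> 107
```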
The researchers collected subjective data from participants, asking them questions to evaluate their level of trust and acceptance of technology. They also collected objective data through eye tracking, brain activation, heart rate and skin conductance, which is a measure of nervous system arousal in response to stimuli. They analyzed the data to determine how trust, communication and safety affect worker acceptance of technology.
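Skin conductance is typically quantified by counting the brief conductance rises (skin conductance responses) that follow a stimulus: more frequent or larger responses indicate higher arousal. The snippet below is a minimal, assumed sketch of that idea using a moving-average detrend and a peak count; it is not the research team’s actual analysis pipeline, and the sampling rate, window and threshold are illustrative.

```python
import numpy as np
from scipy.signal import find_peaks

def count_scrs(signal_us: np.ndarray, fs: float, min_amp_us: float = 0.05) -> int:
    """Count skin conductance responses (SCRs) in a conductance trace (microsiemens).

    Simple approach: remove the slow tonic drift with a moving average, then
    count peaks in the remaining phasic component that exceed a minimum
    amplitude. More SCRs per minute suggests higher nervous-system arousal.
    """
    window = int(fs * 4)                      # ~4 s window for the tonic level
    kernel = np.ones(window) / window
    tonic = np.convolve(signal_us, kernel, mode="same")
    phasic = signal_us - tonic
    peaks, _ = find_peaks(phasic, height=min_amp_us, distance=int(fs * 1.0))
    return len(peaks)

# Hypothetical usage: a 60 s trace sampled at 32 Hz with three simulated responses
fs = 32.0
t = np.arange(0, 60, 1 / fs)
trace = np.full_like(t, 5.0)                  # tonic level around 5 microsiemens
for onset in (10, 25, 40):                    # three simulated stimulus responses
    trace += 0.5 * np.exp(-0.5 * ((t - onset) / 1.0) ** 2)
print(count_scrs(trace, fs))                  # expected: 3
```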
“Construction is already a physically and mentally demanding field,” Chang said. “The demands of productivity can create high-pressure situations where workers over-trust technology or do not pay enough attention to it. When that technology malfunctions, it becomes problematic and can cause safety risks. It is important for workers to have effortless communication with AI so they are not overtaxed.”
The next step is to devise strategies for designing construction technology so that human workers can successfully interact with AI on the job site. Despite advances in technology, Hasanzadeh doesn’t envision a future where AI takes over the construction industry entirely.
“The construction environment is very complicated and dynamic,” Hasanzadeh said. “AI can help improve safety and enhance efficiency, but it cannot work alone. We cannot remove humans from the process. Robots can ease some of the physical demands of a construction site, but for rational problem solving with changing variables, we need human input. Workers should not be the passive recipients of these technologies; they should be active members of a team.”