
Researchers use drone-mounted microphones to suss out the shapes of rooms


Can a drone determine the shape of a room using microphones alone? That’s what Purdue University researchers set out to discover in a study that will be published next week in the SIAM Journal on Applied Algebra and Geometry. In it, they propose a drone-mounted microphone array that measures sound emitted by a loudspeaker to inform a digital reconstruction of the physical surroundings.

Normally, the microphones wouldn’t be able to tell which wall each measured distance corresponds to, so the team designed a method to label each distance with the wall that produced it, a process they call “echo sorting.” As it turns out, echo sorting requires only a rigid configuration of non-coplanar microphones, that is, mics that don’t all lie in the same plane.
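
The non-coplanarity requirement is straightforward to check in practice. Below is a minimal sketch, not the researchers’ code; the function name, the tolerance, and the example positions are assumptions. It tests whether four microphone positions span a tetrahedron of nonzero volume, which is exactly what non-coplanar means.

```python
# Minimal sketch (assumed names, not from the paper): four points are
# non-coplanar exactly when the tetrahedron they span has nonzero volume.
import numpy as np

def is_non_coplanar(mics, tol=1e-9):
    """mics: four microphone positions, each (x, y, z) in meters."""
    p0, p1, p2, p3 = np.asarray(mics, dtype=float)
    # Determinant of the three edge vectors = 6 * signed tetrahedron volume.
    volume = np.linalg.det(np.stack([p1 - p0, p2 - p0, p3 - p0]))
    return abs(volume) > tol

# Example: three mics near the drone body plus one raised mic -> non-coplanar.
mics = [(0.0, 0.0, 0.0), (0.3, 0.0, 0.0), (0.0, 0.3, 0.0), (0.1, 0.1, 0.25)]
print(is_non_coplanar(mics))  # True
```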

“The microphones listen to a short sound impulse bouncing on finite planar surfaces — or the ‘walls,’” said Mireille Boutin, a professor of mathematics and electrical and computer engineering at Purdue University. “When a microphone hears a sound that has bounced on a wall, the time difference between the emission and reception of the sound is recorded. This time difference corresponds to the distance traveled by the sound during that time.”
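
The conversion Boutin describes is simply time of flight multiplied by the speed of sound. A minimal sketch follows, assuming dry air at roughly 20 °C; the 343 m/s figure and the function name are assumptions for illustration, not details from the study.

```python
# Sketch only: turn the recorded delay between emission and reception
# of the impulse into a propagation distance.
SPEED_OF_SOUND = 343.0  # m/s in dry air at about 20 degrees C (assumed)

def echo_distance(t_emit, t_receive, c=SPEED_OF_SOUND):
    """Distance traveled by the sound between emission and reception."""
    return c * (t_receive - t_emit)

# Example: an echo arriving 20 ms after emission traveled about 6.9 m.
print(echo_distance(0.0, 0.020))  # 6.86
```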

Boutin, Technical University of Munich professor Gregor Kemper, and colleagues arranged four microphones in a “rigid configuration” to pick up echoes, both those that bounced off only one wall (first-order echoes) and those that bounced off one wall and then another (second-order echoes). The time delays of the first-order echoes provided a set of distances from the microphones to mirror images of the source, which the researchers leveraged to model each bounced sound as if it came from a virtual source behind the corresponding wall rather than from the loudspeaker itself.
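
The image-source idea behind that step can be sketched as follows; the wall geometry, variable names, and NumPy usage are assumptions for illustration, not the paper’s implementation. Reflecting the source across a wall’s plane gives the virtual source, and the first-order echo’s path length from source to wall to microphone equals the straight-line distance from the microphone to that virtual source.

```python
# Sketch of the image-source model (assumed setup, not the paper's code).
import numpy as np

def mirror_source(source, wall_point, wall_normal):
    """Reflect a source position across the plane of a wall."""
    n = np.asarray(wall_normal, dtype=float)
    n = n / np.linalg.norm(n)
    s = np.asarray(source, dtype=float)
    return s - 2.0 * np.dot(s - np.asarray(wall_point, dtype=float), n) * n

source = np.array([1.0, 2.0, 1.5])   # loudspeaker position (assumed)
mic = np.array([2.5, 1.0, 1.2])      # one microphone position (assumed)
# A wall in the plane x = 4; the wall point and normal are assumptions.
virtual = mirror_source(source, wall_point=[4.0, 0.0, 0.0], wall_normal=[1.0, 0.0, 0.0])
# Length of the source -> wall -> mic echo path equals |mic - virtual|.
echo_path_length = np.linalg.norm(mic - virtual)
print(virtual, echo_path_length)
```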

Concretely, the team’s echo-sorting technique checked whether the four microphone-measured distances lay on the “zero set” of a polynomial (a mathematical expression built from variables and coefficients) in four variables, derived through symbolic computations performed by a computer algebra system. A nonzero value revealed that the four distances couldn’t have come from echoes off the same wall, while a value of zero indicated that they could have.
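
The shape of that test can be sketched as below. The placeholder polynomial is purely illustrative; the actual polynomial in the study is produced by symbolic computation in a computer algebra system and depends on the microphones’ rigid configuration.

```python
# Sketch of the zero-set membership test only. The placeholder polynomial
# below is NOT the one derived in the paper.
def could_share_wall(distances, polynomial, tol=1e-6):
    """distances: measured echo distances (d1, d2, d3, d4) for the four mics."""
    value = polynomial(*distances)
    # On the zero set -> consistent with a single wall; clearly nonzero -> ruled out.
    return abs(value) <= tol

# Placeholder polynomial in four variables, purely for illustration.
placeholder = lambda d1, d2, d3, d4: d1**2 - d2**2 + d3**2 - d4**2

print(could_share_wall((3.0, 3.0, 2.0, 2.0), placeholder))  # True
print(could_share_wall((3.0, 2.5, 2.0, 1.0), placeholder))  # False
```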

The researchers caution that certain drone placements can cause the mathematical model to fail, but they note that their work has a range of applications beyond the scope of the study. Their paper describes scenarios with indoor sound sources as well as sources placed on vehicles that rotate and translate as they move.

“These microphones can be placed inside a room or on any vehicle, such as a car, an underwater vehicle, or a person’s helmet,” Kemper said. “A moving car is different from a drone or an underwater vehicle in an interesting way. Its positions have only three degrees of freedom — the x-axis, the y-axis, and orientation — whereas a drone has six degrees of freedom. Our work indicates that these six degrees of freedom are sufficient to almost always detect the walls, but this does not necessarily mean that three degrees will also suffice. The case of a car or any surface-based vehicle is the subject of ongoing research by our group.”

The researchers leave the search for less computationally intensive techniques that achieve the same results to future work.
