July 27, 2020

ECE researchers to study adversarial attacks on deep neural networks

Purdue researchers recently received an award from the Army Research Office (ARO) to conduct theoretical and experimental studies on the adversarial robustness of deep neural networks. Over the past decade, deep neural networks have made substantial progress in areas such as computer vision and speech recognition. However, studies have also shown that these networks are highly vulnerable to adversarial attacks: artificial patterns created intentionally to fool them. This has raised concerns that if the networks can fail so easily, these powerful machine learning methods may be unreliable in practice.
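To make the idea concrete, the sketch below shows the fast gradient sign method (FGSM), a classic digital-domain attack that nudges each pixel of an image in the direction that most increases a classifier's loss. It is a minimal illustration in PyTorch of the general concept, not the physical-world method under study in this project; the model, image, and label are placeholder inputs assumed for the example.

    # Minimal FGSM sketch: a digital-domain adversarial attack.
    # `model`, `image`, and `label` are hypothetical inputs for illustration.
    import torch
    import torch.nn.functional as F

    def fgsm_attack(model, image, label, epsilon=0.03):
        # image: (1, C, H, W) tensor with values in [0, 1]
        # label: (1,) tensor holding the true class index
        image = image.clone().detach().requires_grad_(True)
        loss = F.cross_entropy(model(image), label)
        loss.backward()
        # Step each pixel in the direction that most increases the loss,
        # then clip back to the valid pixel range.
        adversarial = image + epsilon * image.grad.sign()
        return adversarial.clamp(0.0, 1.0).detach()

Even such a small, nearly imperceptible perturbation can flip a network's prediction, which is exactly the fragility that motivates the project.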

Purdue’s research will focus on adversarial attacks in real physical environments, using computational imaging techniques to perturb the appearance of a 3D object so that a camera system misclassifies it. Stanley Chan, assistant professor of electrical and computer engineering and statistics, is the Principal Investigator on the project.

Figure: A schematic diagram highlighting the concept of the newly funded Army Research Office project. The goal is to analyze the feasibility of attacking a camera system using computational imaging techniques: a projector casts an adversarial attack pattern onto an object in a real physical environment, and when the camera sees the perturbed object, it misclassifies it as another object. The research will study the feasibility of such an attack modality and ways to defend against it. Image courtesy: Prof. Stanley Chan

“While most of the existing results in the literature focus on attack and defense in the digital domain, such as images and videos, only a handful of studies have shown results in real physical environments,” says Chan. “Our goal is to fill the gap by theoretically analyzing the interactions between the attack and the environment. We hope that our conclusions can shed new light on future research in adversarial robustness.”

The Army Research Office competitively selects and funds basic research proposals from educational institutions, nonprofit organizations and private industry that will enable crucial future Army technologies and capabilities.
