Towards Computational Acoustic Cameras: Neural Deconvolution and Rendering for Synthetic Aperture Sonar
| Event Date: | April 19, 2024 |
|---|---|
| Time: | 10:00 am |
| Location: | WANG 1004 |
| Priority: | No |
| School or Program: | Electrical and Computer Engineering |
| College Calendar: | Hide |
Suren Jayasuriya
Arizona State University
Abstract
Acoustic imaging leverages sound to form visual products, with applications including biomedical ultrasound and sonar. In particular, synthetic aperture sonar (SAS) has been developed to generate high-resolution imagery of both in-air and underwater environments. In this talk, we explore the application of implicit neural representations and neural rendering to SAS imaging and highlight how such techniques can enhance acoustic imaging for both 2D and 3D reconstruction. Specifically, we discuss the challenges of applying neural rendering to acoustic imaging, particularly in handling the phase of reflected acoustic waves, which is critical for achieving high spatial resolution in beamforming. We present two recent works: enhanced 2D circular SAS deconvolution in air, and a general neural rendering framework for 3D volumetric SAS. This research is a starting point for realizing the next generation of acoustic cameras for a variety of applications in air and underwater environments.
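To give a concrete sense of why phase is critical for spatial resolution in beamforming, the following minimal sketch (not from the talk; all parameters are illustrative) performs narrowband delay-and-sum focusing over a synthetic aperture. Echoes from a point scatterer carry a round-trip phase; compensating that phase at the true target location makes the contributions add coherently, while at a nearby off-target point the phases disagree across the aperture and the sum collapses.

```python
import numpy as np

# Illustrative sketch: narrowband delay-and-sum beamforming over a
# synthetic aperture. Parameters are hypothetical, chosen for an
# in-air 40 kHz sonar; they are not from the works discussed above.
c = 343.0               # speed of sound in air (m/s)
f = 40e3                # transmit frequency (Hz)
k = 2 * np.pi * f / c   # wavenumber (rad/m)

# 64 sensor positions along a 0.5 m synthetic aperture on the x-axis.
sensors = np.stack([np.linspace(-0.25, 0.25, 64), np.zeros(64)], axis=1)
target = np.array([0.0, 1.0])   # point scatterer 1 m broadside

# Received echoes: phase encodes the two-way travel distance.
dists = np.linalg.norm(sensors - target, axis=1)
echoes = np.exp(-1j * 2 * k * dists)

def beamform(pixel):
    """Coherently sum echoes after compensating the round-trip phase
    expected from a candidate pixel location (normalized magnitude)."""
    d = np.linalg.norm(sensors - pixel, axis=1)
    return np.abs(np.sum(echoes * np.exp(1j * 2 * k * d))) / len(sensors)

on_target = beamform(target)                  # phases align coherently
off_target = beamform(target + [0.02, 0.0])   # 2 cm away: phases disagree
print(on_target, off_target)
```

Running this, the on-target response is unity while a point only 2 cm away (a few wavelengths at 40 kHz) yields a much smaller response, illustrating how phase coherence across the aperture, rather than echo magnitude alone, provides the fine cross-range resolution that neural rendering methods for SAS must preserve.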
Bio
Hosts
Stanley Chan, stanchan@purdue.edu, and Qi Guo, qiguo@purdue.edu