AAE Associate Professor Jonathan Poggie Wins DoE INCITE Award

Event Date: November 19, 2015
AAE Associate Professor Jonathan Poggie has won a grant of 150 million CPU-hours per year for two years on Argonne National Laboratory's supercomputer Mira, the fifth largest in the world.

Prof. Jonathan Poggie, new to Purdue this fall, has won a grant of 150 million CPU-hours per year for two years on Argonne National Laboratory’s supercomputer Mira, the fifth largest in the world. The grant of supercomputer time comes under the INCITE Program (Innovative and Novel Computational Impact on Theory and Experiment) of the US Department of Energy’s Office of Science.

With his collaborators at the Air Force Research Laboratory, Poggie will investigate unsteady separation in compressible, turbulent flow. Unsteady separation is characterized by long time-scale (1–100 ms), low-frequency (10–1000 Hz) pressure fluctuations. These fluctuations lie in a regime near the typical resonant frequency of aircraft panels, and thus lead to severe structural fatigue loading. A key open scientific question is why such low-frequency oscillations exist.
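As a quick back-of-the-envelope check, the quoted time scales and frequencies are simply reciprocals of one another (f = 1/T); a minimal Python sketch of the conversion:

    # Convert the quoted unsteady-separation time scales to frequencies via f = 1/T.
    for t_ms in (1.0, 100.0):          # time scales quoted above, in milliseconds
        f_hz = 1.0 / (t_ms * 1e-3)     # corresponding frequency in hertz
        print(f"T = {t_ms:g} ms  ->  f = {f_hz:g} Hz")   # 1 ms -> 1000 Hz, 100 ms -> 10 Hz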

The disparity of length and time scales between fine-grain turbulence and large-scale flow unsteadiness makes computational simulation of these flows inherently challenging. With the extraordinary computational power of the DoE’s supercomputers, Poggie’s project will attack the problem through massively parallel direct numerical simulations.

For more information, please visit http://www.anl.gov/articles/incite-grants-awarded-56-computational-research-projects

 

Photo caption: Sample slice of the instantaneous density field in a Mach 2.3 turbulent boundary layer computed by Prof. Poggie. The computational mesh for this direct numerical simulation contained over 33 billion cells, and the simulation ran on up to 102,400 cores under a DoD HPCMP Frontier Project. For full details of the computations, see Poggie et al., Computers & Fluids, v. 120, pp. 57-69, 2015.


Publish date: November 19, 2015