
Seminar: High-Fidelity Simulations on Complex Configurations using Massively Parallel Supercomputers

Dr. Nicholas Bisek, Air Force Research Laboratory


N056 Scott Lab
201 W. 19th Ave.
Columbus, OH 43210
United States

Since the dot-com boom of the late 1990s, computing architectures and performance have continued to grow and evolve at an exponential rate. These advances have allowed computational engineers to simulate and predict complex flowfields on ever larger and more complex computer systems. For example, in 2000 Adams et al. published one of the largest Computational Fluid Dynamics (CFD) predictions of its time, containing nearly 20 million grid points (running for 750 hours on 8 vector processors, or 6,000 CPU-hours). A decade later, members of Stanford's Center for Turbulence Research used the DoE IBM BlueGene/Q Sequoia supercomputer to apply CFD to a grid with over 3 trillion points, running on more than 1.6 million processors simultaneously. Advances like these have not only enabled higher-density grids, allowing lower-fidelity models to be replaced with high-fidelity approaches that remove many limiting assumptions, but have also made it possible to apply CFD to entire geometries rather than individual components or sub-systems.

This presentation will review some of the latest high-fidelity CFD research being performed in the Basic Sciences branch of the High-Speed Division within the Air Force Research Laboratory's Aerospace Systems Directorate (AFRL/RQHF). Specifically, the talk will review progress made under the first DoD HPCMP Frontier Project, the largest single computational award ever given within the DoD HPCMP. The overall goal of the project is to apply near-wall-resolved implicit large-eddy simulation (ILES) to a full-scale hypersonic geometry at realistic flight conditions. The geometry considered is the sixth vehicle in the Hypersonic International Flight Research Experimentation program (HIFiRE-6). The talk will cover the steps required to achieve this feat: the first attempts at simulations with grids in excess of 2^31 points, the challenges of running massively parallel jobs for many months to collect the long time histories needed to resolve the full range of time scales present in the system, and finally (and probably most importantly), managing, compressing, and extracting useful information from the petabytes of data generated by each simulation.
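As an aside on why 2^31 points is a meaningful threshold: a grid of that size can no longer be addressed with a signed 32-bit integer, so global point and cell identifiers must be promoted to 64-bit types throughout the solver and its I/O. The short C sketch below is only illustrative of that indexing limit (the numbers and variable names are not taken from the solver discussed in the talk):

    #include <stdint.h>
    #include <stdio.h>

    int main(void) {
        /* A grid "in excess of 2^31 points" no longer fits in a signed
           32-bit index (maximum 2,147,483,647), so global point and cell
           IDs must use 64-bit integers.  Numbers here are illustrative. */
        int64_t npoints = (int64_t)1 << 31;   /* 2,147,483,648 points */
        printf("INT32_MAX   = %d\n", INT32_MAX);
        printf("grid points = %lld\n", (long long)npoints);
        printf("fits in a 32-bit index? %s\n",
               npoints <= INT32_MAX ? "yes" : "no");
        return 0;
    }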

About the Speaker

Dr. Nicholas Bisek received his PhD from the University of Michigan (2010), investigating the effects of plasma-based flow control devices for hypersonic concepts. He continued his research in magnetohydrodynamics (MHD) during a post-doctoral appointment at AFRL/RB under the supervision of Dr. Jon Poggie, before transitioning to a civilian appointment in the Computational Sciences Center at the end of 2010. He subsequently began exploring unsteady supersonic/hypersonic turbulent flows under the mentorship of Dr. Donald Rizzetta. Of particular interest were low-frequency oscillations of the shock foot, which can be detrimental to high-speed systems. Using high-fidelity near-wall-resolved large-eddy simulations, Dr. Bisek not only successfully reproduced these events, but also implemented a plasma-based controller that demonstrated a significant reduction in flow separation induced by the shock/boundary-layer interaction (SBLI) and, with it, an ability to shift the low-frequency oscillations to higher, safer frequencies. For the last few years, Nick has been a co-Principal Investigator on the DoD HPCMP AFRL Frontier Project, applying high-fidelity numerical methods to increasingly complex configurations. He recently completed the first near-wall-resolved implicit large-eddy simulation (ILES) of the full-scale HIFiRE-6 flight vehicle at actual flight conditions, a feat that would not have been possible without a very large commitment of computational resources on the largest supercomputers available in the DoD. He has authored or co-authored over 10 journal articles and 25 conference papers in the areas of computational fluid dynamics and magnetohydrodynamics. He is an active member of the APS, ASME, and AIAA, where he is a member of the Fluid Dynamics Technical Committee and chair of the LES Discussion Group.