Research
Computational Astrophysics
Professor Paul Woodward has developed a series of numerical methods for multifluid gas dynamics that enable his research and that are also in use in a variety of community codes, such as FLASH and ENZO. As the director of the University’s Laboratory for Computational Science & Engineering (LCSE) within the Digital Technology Center, Woodward has led a number of collaborations over the years with national laboratories and industry that have advanced computing technologies and the ability of simulation codes to exploit them. Woodward chaired the Science and Engineering Team Advisory Committee (SETAC) for the NSF’s Blue Waters Track-1 computing project from 2013 to 2019. NSF has now shifted its premier national computing project to Frontera, at the Texas Advanced Computing Center, and Woodward has moved his simulation work to that center. In his recent research, Woodward has used the entire Frontera machine to simulate the interaction between two overlying convection zones in a massive star, separated by only a very thin layer of stably stratified material (cf. www.lcse.umn.edu). How a star behaves in this situation has been an outstanding problem in stellar evolution theory for decades. Running on 7020 nodes of the Frontera machine at over 4 Pflop/s, Woodward’s latest simulation code, PPMstar, updates a description of the star (see figure) on a grid of 56.6 billion cells 16 times per second. Running at this scale and speed enables simulation of important stages in stellar evolution that are brief, but not as brief as an explosion. The advent of still larger and faster machines, such as Argonne National Lab’s Aurora in 2021, will further expand our ability to simulate such events in stellar evolution in detail and with confidence that the results are accurate. These new machines pose substantial challenges to simulation code design because of their integration of advanced CPU and GPU technologies as well as their capacity to generate datasets exceeding 1 petabyte per simulation. One focus of Woodward’s research in computational astrophysics is how to meet these challenges and thus harness the tremendous power of the coming generation of exascale computing systems.
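To put these performance figures in perspective, the following minimal back-of-envelope sketch in Python combines the numbers quoted above; the derived cell-update rate and flops-per-cell-update values are inferences from those quoted figures, not published measurements.

    # Back-of-envelope check of the PPMstar run described above,
    # using only the figures quoted in the text.
    cells = 56.6e9           # grid cells in the simulation
    updates_per_sec = 16     # full-grid updates per second
    flops = 4e15             # sustained rate, just over 4 Pflop/s

    cell_updates_per_sec = cells * updates_per_sec        # ~9.1e11
    flops_per_cell_update = flops / cell_updates_per_sec  # ~4.4e3

    print(f"cell updates per second: {cell_updates_per_sec:.2e}")
    print(f"flops per cell update:   {flops_per_cell_update:.0f}")

At these rates the code performs roughly 9 x 10^11 cell updates per second, which implies on the order of a few thousand floating-point operations per cell per time step.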


Stars, Planetary Systems & Their Evolution
Professor Paul Woodward uses large-scale computer simulations of deep stellar interiors to investigate brief events in stars, lasting from days to months, for which the one-dimensional descriptions of stellar evolution codes are inadequate. This work, in collaboration with Professor Falk Herwig of the University of Victoria, focuses on details of mixing processes at the boundaries of convection zones that can profoundly influence the evolution of the star by carrying nuclear fuels into regions where they can burn semi-explosively. Such events occur in late evolutionary stages; for massive stars, they have the potential to strongly affect the ultimate supernova explosion. Recent application of Woodward’s simulation codes to massive main-sequence stars, with their spherical core-convection regions, addresses the ingestion of additional hydrogen fuel into the central burning region, which prolongs the star’s life on the main sequence. These more quiescent simulations (see figure) allow us to predict the spectrum of gravity-wave disturbances, driven by the convection, that reach the star’s surface, where they are observed by exoplanet-seeking satellites such as TESS. This asteroseismology opens a window on the behavior and structure of the deep interior of stars. The simulations can connect the observations to the interior structure of the star, providing a check on the correctness of the stellar models.
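To illustrate how a predicted gravity-wave spectrum might be extracted from such a simulation, the Python sketch below computes a frequency power spectrum from a time series; the signal here is synthetic, and every parameter (sampling interval, mode frequencies, noise level) is a placeholder rather than a value taken from the actual PPMstar analysis.

    # Hypothetical sketch: recover oscillation frequencies from a
    # synthetic stand-in for a surface radial-velocity trace.
    import numpy as np

    dt = 60.0                 # sampling interval in seconds (assumed)
    n = 4096                  # number of samples
    t = np.arange(n) * dt

    rng = np.random.default_rng(0)
    signal = (np.sin(2 * np.pi * 3.0e-4 * t)          # ~0.3 mHz mode
              + 0.5 * np.sin(2 * np.pi * 8.0e-4 * t)  # ~0.8 mHz mode
              + 0.2 * rng.standard_normal(n))         # noise floor

    # One-sided power spectrum via the FFT.
    freqs = np.fft.rfftfreq(n, d=dt)                  # Hz
    power = np.abs(np.fft.rfft(signal))**2 / n

    # Report the two strongest peaks, skipping the zero-frequency bin.
    peaks = freqs[1:][np.argsort(power[1:])[-2:]]
    print("dominant frequencies (mHz):", np.sort(peaks) * 1e3)

In an actual analysis, the synthetic signal would be replaced by a velocity or luminosity trace sampled from the simulation near the stellar surface, and the resulting peak frequencies compared against those measured by satellites such as TESS.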

