Project name: Supernova Science Center

Responding PI: Stan Woosley

A. Understanding the Primary Goals and Objectives of the Application:

We are seeking to understand how supernovae work, that is, how some stars die in the most energetic explosions in the universe. We will accomplish this goal by numerical simulation, chiefly in multi-dimensions, of the complex interplay of nuclear physics, particle physics, (magneto)hydrodynamics, turbulence, and the equation of state in these events. We also want to understand the origin of the elements, most of which have been synthesized in supernovae, and to be able to match, with our simulations, the rich detail of supernova observations.

We will also provide the community with a refereed archive of standard nuclear reaction rates for astrophysical applications, in a computer-friendly format.

A website for our program, still under vigorous development, is http://www.supersci.org

Major Project Milestones:

Year 1:

  • Library of presupernova models of massive stars - 11 to 60 solar masses, various metallicities (UCSC)
  • Studies of turbulent flame propagation (in 2D) relevant to the Type Ia supernova problem. First-cut 2D Type Ia supernova models using the FLASH code (UCSC)
  • Studies of 3D stellar convection coupled to nuclear burning in massive stars, with and without rotation, using anelastic hydrodynamics (UCSC)
  • Library of standard nuclear data for astrophysics on web. Version 1 (LLNL)
  • First 3D supernova calculations using smoothed particle hydrodynamics - calculations of both the explosion and the mixing after the explosion; begin studies of asymmetric explosions (LANL)
  • Integrated Astrophysical Modeling and Visualization System: parallel programming and debugging facility, run-time and execution infrastructure. First stages (Arizona).
  • The all-purpose Monte Carlo code to calibrate the 1D, 2D, and 3D neutrino and photon Boltzmann solvers. Complete coding and initial testing. Gamma-ray line spectra of multi-D models. (LANL)
  • 1D implicit neutrino radiation-hydro code, employing the method of short characteristics to solve the Boltzmann equation and Winkler's adaptive-grid hydrodynamics methodology. Code development (Arizona)
  • The atomic-data-language parser, to be augmented with extensions to the atomic database, used for NLTE spectral synthesis. (Arizona)

Year 2:

  • 3D supernova models using SPH. Incorporate flux limited radiation diffusion. First studies of rotation and kicks. (LANL)
  • Nucleosynthesis library from simulated 1D explosions (UCSC)
  • First 2D SN Ia models using the FLASH code (UCSC, MPA, Chicago)
  • First nucleosynthesis studies in 2D and 3D models (UCSC/LANL/Ariz)
  • First 3D MHD calculations of rotating stellar evolution using anelastic hydro. Studies of the onset of the thermonuclear runaway in Type Ia supernovae. Build up capability to model protoneutron star formation (UCSC)
  • Complete astrophysical data archive. Begin computation of improved theoretical rates (LLNL)
  • SCALE, a parallel 2D and 3D short-characteristics Boltzmann solver for neutrino transport/transfer, coupled to an ALE hydrodynamics code (with both Newtonian and post-Newtonian gravity). This code will be used to simulate the core-collapse supernova mechanism. Coding finished by the end of year 2 (Arizona)
  • A parallel version of an NLTE radiation solver (from EDDINGTON), coupled to the 1D, 2D, and 3D variants of SCALE, to perform fully resolved simulations of light curves and simple spectra. Code finished in year 2 (Arizona)
  • 1D implicit neutrino radiation-hydro code, employing the method of short characteristics. GR and Newtonian versions used to simulate spherical core collapse and explosion. (Arizona)
  • 3D Monte-Carlo calculations of polarization (LANL)
  • Integrated Astrophysical Modeling and Visualization System - continue development and assist efforts at Arizona, UCSC, and LANL
  • Comparison of results - LANL, Arizona, MPA

Year 3:

  • 3D Calculations of core collapse supernovae with SCALE (Arizona).
  • 2D and possibly 3D light curves and spectra (Arizona)
  • 3D-GR models using SPH (LANL)
  • Production 3D models of Type Ia supernovae (UCSC/MPA/Chicago)
  • 3D MHD studies of neutron star formation; 3D Monte-Carlo studies of radiation transport in all models (LANL)
  • Nucleosynthesis studies in all models (UCSC/LLNL/Arizona/LANL)
  • Improved theoretical nuclear reaction rates (LLNL)
  • Integrated Astrophysical Modeling and Visualization System - continue development and assist efforts at Arizona, UCSC, and LANL

B. Understanding the Initial State of the Computational Application:

    For the core part of our project, we shall employ a suite of eight codes, several of which are already at a mature stage of development. These codes will calculate the presupernova evolution of stars, their explosion, and radiation transport in the resultant debris. In general, these codes operate sequentially: the output of one (e.g., the presupernova code) is used as input for another (e.g., the supernova explosion code), which in turn provides input to a radiation transport calculation. Some codes have duplicate functions to allow for cross-comparison. A sketch of this pipeline appears below.
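
    As a minimal sketch of this sequential structure (the stage names, executables, and file names below are illustrative placeholders, not our actual codes), a simple Python driver might chain the stages as follows:

        # Hypothetical driver: each stage consumes the previous stage's output.
        import subprocess

        # (stage name, command, file it is expected to produce) - all names
        # here are placeholders, not our actual codes or files.
        STAGES = [
            ("presupernova", ["presn_code", "star25.in"], "star25_presn.dat"),
            ("explosion", ["explosion_code", "star25_presn.dat"], "star25_expl.dat"),
            ("transport", ["transport_code", "star25_expl.dat"], "star25_spectra.dat"),
        ]

        for name, cmd, product in STAGES:
            print("running stage:", name)
            subprocess.run(cmd, check=True)  # stop the chain if a stage fails
            print("  produced:", product)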

    In response to specific questions in this section:

    1. The hydro codes to be studied and optimized run as one large system.

    2. Our approach is to optimize the code using analytical and/or simulation models of the code modules; these models analyze execution at run time and enable us to reconfigure the parallel execution to optimize its performance. A minimal illustration follows.
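
    As a minimal illustration of the idea, assuming a simple analytic performance model T(p) = W/p + c log2(p) for one module (the work W and communication coefficient c are hypothetical values here; at run time they would be fitted from the monitors), the run-time system could pick the process count the model predicts to be fastest:

        import math

        def predicted_time(work, comm_cost, nprocs):
            # assumed model: perfect work sharing plus logarithmic communication
            return work / nprocs + comm_cost * math.log2(nprocs)

        def best_process_count(work, comm_cost, choices=(1, 2, 4, 8, 16, 32, 64)):
            # pick the configuration the model predicts to be fastest
            return min(choices, key=lambda p: predicted_time(work, comm_cost, p))

        # W and c would come from the monitors at run time; values here are made up.
        print(best_process_count(work=3600.0, comm_cost=40.0))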

    3. We will use a combination of functional-parallel and data-parallel programming models, as sketched below.
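
    A minimal sketch of the distinction, written in Python with mpi4py purely for illustration: a functional split assigns disjoint groups of processes to different physics modules, while a data-parallel split gives each process within a group its own slab of the same grid.

        from mpi4py import MPI
        import numpy as np

        comm = MPI.COMM_WORLD
        rank, size = comm.Get_rank(), comm.Get_size()

        # functional split: first half of the ranks run "hydro", the rest "transport"
        role = "hydro" if rank < size // 2 else "transport"
        group = comm.Split(color=0 if role == "hydro" else 1, key=rank)

        # data-parallel split within each group: each rank owns one slab of the
        # same grid and applies the same operation to it (sizes are illustrative;
        # a real decomposition would handle grids that do not divide evenly)
        n = 64
        slab = np.zeros((max(n // group.Get_size(), 1), n))
        slab += 1.0  # same operation everywhere, each rank on its own data

        print("rank %d: role=%s, slab shape=%s" % (rank, role, slab.shape))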

    4. We will use our own monitoring codes as well as other available tools such as NetLogger, NWS, etc.

    5. The FLASH code runs on the ASCI machines. The UCSC codes run on a combination of ASCI machines and local Beowulf clusters with 16 and 32 CPUs (AMD, 1.2 GHz). In fall 2001, the UCSC codes will run on a new 256-CPU (AMD, 1.2 GHz) Beowulf cluster on which the co-Is will have 25% of the time. At LANL, the codes run on Avalon, a local Beowulf cluster developed by collaborator Mike Warren. At Arizona, the codes will initially use an SGI Origin and a cluster of workstations connected by a high-speed network (gigabit Ethernet). Beginning this fall, we expect to be major users at NERSC.

    6. Yes, we need wide-area network access in order to run our codes on the IBM SP (seaborg), the T3E, and the ASCI machines.

    Part C: Understanding Your Code Development Plans for Meeting Your Objectives

    1. Hydro codes: ZEUS-MP, ASCI FLASH, an anelastic MHD code, a 2D/3D radiation-hydro code using the method of short characteristics, an implicit Monte Carlo code, a 3D SPH code, and ZATHRAS. Revisions to be made: the codes will run faster using our optimization techniques; we will develop models that enable us to optimize module execution at run time; and we will develop a DEVS-based run-time system to assist in controlling, managing, and visualizing the results produced by our simulations.

    2. Yes, we expect to face multilanguage integration issues. (a) The languages will be C, Fortran 77, Fortran 90, C++, and Java. (b) We will use a hybrid programming model (data parallel and functional parallel). One such language boundary is sketched below.
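
    A minimal sketch of one such boundary, shown in Python for brevity (the routine name, file names, and library are hypothetical): a Fortran 77 rate routine compiled into a shared library can be called through its lowercase, underscore-suffixed symbol, with all arguments passed by reference.

        # Build the library first, e.g.:  g77 -shared -fPIC rates.f -o librates.so
        import ctypes

        lib = ctypes.CDLL("./librates.so")  # hypothetical shared library

        # SUBROUTINE RATE(T9, RHO, R) is exported as the symbol "rate_";
        # Fortran 77 passes every argument by reference.
        lib.rate_.argtypes = [ctypes.POINTER(ctypes.c_double)] * 3
        lib.rate_.restype = None

        t9 = ctypes.c_double(1.5)     # temperature in 10^9 K
        rho = ctypes.c_double(1.0e7)  # density in g cm^-3
        r = ctypes.c_double(0.0)      # rate, returned by the routine
        lib.rate_(ctypes.byref(t9), ctypes.byref(rho), ctypes.byref(r))
        print("rate =", r.value)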

    3. We are not yet certain which numerical algorithms each module will require.

    4. We have our own programming environments that we have developed, such as ADViCE, CATALINA, and DEVS. In addition, we will use other available tools and programming environments as needed.

    5. Small jobs will run on local Beowulf clusters, to several of which we have access. These typically use AMD processors and the PGI compilers. The core of our research, however, requires running on the largest computers available. We have requested time at NERSC and may continue to have access to all the unclassified ASCI machines.

    6. Network speeds at our four sites vary from about 50 to 100 Mbps.

    7. Arizona: ECE Building (ECE 224) and the Astronomy Department; UCSC: Interdisciplinary Sciences Building.

    8. We have a strong requirement for visualization. For steering and controlling the visualization, we will use mobile agents being developed at the University of Arizona and also the DEVS real-time environment.

    Part D: Identifying the Most Critical Applied Math and Computer Science Needs for Meeting Your Key Objectives

    We will interact with the following ISIC centers:

    1. Solvers - reactive chemistry: we are interested in reactive nuclear chemistry and in solving elliptic, hyperbolic, and Poisson equations, especially in the context of general relativity. A minimal example of the kind of elliptic solve involved appears below.
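
    For concreteness, here is a minimal (and deliberately naive) example of such an elliptic solve: Jacobi iteration for the 2D Poisson equation with homogeneous Dirichlet boundaries. A production code would instead use multigrid or Krylov methods from the ISIC solver libraries.

        import numpy as np

        def jacobi_poisson(f, h, tol=1.0e-8, max_iter=20000):
            # solve laplacian(phi) = f with phi = 0 on the boundary
            phi = np.zeros_like(f)
            for _ in range(max_iter):
                new = phi.copy()
                new[1:-1, 1:-1] = 0.25 * (phi[2:, 1:-1] + phi[:-2, 1:-1]
                                          + phi[1:-1, 2:] + phi[1:-1, :-2]
                                          - h * h * f[1:-1, 1:-1])
                if np.max(np.abs(new - phi)) < tol:
                    return new
                phi = new
            return phi

        n = 65
        h = 1.0 / (n - 1)
        f = np.ones((n, n))  # uniform source term on the unit square
        phi = jacobi_poisson(f, h)
        print("phi at center:", phi[n // 2, n // 2])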

    2. Locally Structured Grid Methods - we are very interested in combustion physics; nuclear and chemical combustion have many similarities, especially flame propagation in the presence of turbulence, distributed burning, transition to detonation, etc. The sketch below illustrates the simplest version of the flame-propagation problem.
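
    A minimal sketch of that simplest version: a 1D reaction-diffusion (KPP-type) flame, whose front advances at roughly 2*sqrt(D/tau). The parameters are illustrative, not astrophysical.

        import numpy as np

        D, tau = 1.0e-3, 0.1  # diffusivity and burning timescale (illustrative)
        n, L = 400, 1.0
        dx = L / n
        dt = min(0.2 * dx * dx / D, 0.1 * tau)  # respect both stability limits
        Y = np.zeros(n)
        Y[:20] = 1.0          # ignite the left edge

        for _ in range(2000):
            lap = (np.roll(Y, -1) - 2.0 * Y + np.roll(Y, 1)) / dx**2
            lap[0] = lap[-1] = 0.0  # crude no-flux boundaries
            Y += dt * (D * lap + Y * (1.0 - Y) / tau)

        front = np.argmax(Y < 0.5) * dx  # first point where burning is half done
        print("front at x =", front, "; expected speed ~", 2.0 * (D / tau) ** 0.5)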

    3. Terascale Simulation Tools and Technologies - most of our codes are grid-based. Only one uses adaptive mesh refinement at the present time.

    4. Performance - the design of efficient algorithms is also one of our core mission goals.

    5. Common Component Architecture - our codes are mostly Fortran, C, and C++ (the LANL codes are chiefly C and C++; the others, Fortran 77 and Fortran 90).

    6. Scalable Systems Software - we expect our programs to run efficiently on a variety of parallel platforms from small Beowulfs to the largest ASCI machines.

    7. Data Management - we will have to manage the very large data sets generated by the 3D calculations. One possible approach is sketched below.
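
    One possible approach, sketched here under the assumption that we adopt HDF5 (via the h5py package; this is an illustration, not a commitment of the proposal): write each 3D variable as a chunked, compressed dataset so that analysis tools can read subvolumes without loading an entire dump.

        import numpy as np
        import h5py

        n = 128  # modest size for illustration; real dumps are far larger
        rho = np.random.rand(n, n, n).astype(np.float32)

        # chunked, compressed storage lets analysis read subvolumes cheaply
        with h5py.File("checkpoint_0001.h5", "w") as f:
            dset = f.create_dataset("density", data=rho,
                                    chunks=(32, 32, 32), compression="gzip")
            dset.attrs["time"] = 0.0  # simulation time of this dump

        # later analysis can read just one corner without loading the whole cube
        with h5py.File("checkpoint_0001.h5", "r") as f:
            sub = f["density"][:32, :32, :32]
        print(sub.shape)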