Archive of SciDAC Discovery Highlights

Researchers use PFLOTRAN on Jaguar to study CO2 sequestration

CO2 fingering

Fingering of dissolved CO2 during subsurface carbon sequestration enhances the rate of dissipation of supercritical CO2. The color scale represents the mole fraction of dissolved CO2.

The SciDAC project "Modeling Multiscale-Multiphase-Multicomponent Subsurface Reactive Flows using Advanced Computing," led by Peter Lichtner of Los Alamos National Laboratory, is using the PFLOTRAN code to study CO2 sequestration. Injecting supercritical (heated and pressurized) CO2 into subsurface geologic formations is one type of carbon sequestration that has been proposed as a way to mitigate the atmospheric accumulation of greenhouse gases released by the burning of fossil fuels.

The researchers are running simulations based on the CO2 output of a 1,000-MW gas-fired power plant. The simulation domain covers an area of 7 x 7 kilometers with a thickness of 250 meters, with permeability and porosity typical of sandstone. A CO2 volume corresponding to roughly 75% of the plant's output is injected for 20 years, and the simulation then continues (without further injection) for 300 years, allowing scientists to see how the CO2 might dissipate over time.

Gas-fired plants are an important area of study. In 2000, 19% of U.S. electricity was provided by gas-fired plants, and 91% of the plants added from 2001 to 2005 were gas-fired. In 2006 there were more gas-fired generators than coal-fired and hydroelectric generators combined, accounting for 41% of total U.S. capacity.

PFLOTRAN, a next-generation reactive flow and transport model, has been demonstrated to scale to 27,580 processors on ORNL's Cray XT4, "Jaguar". PFLOTRAN is built on the PETSc parallel libraries, which enable highly efficient solution of partial differential equations using domain decomposition. The code demonstrates strong scaling, approaching petaflop performance.
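PFLOTRAN itself is a Fortran code built on PETSc, but the domain-decomposition idea behind its scalability can be illustrated with a short stand-alone sketch (a hypothetical toy example, not PFLOTRAN or PETSc code): each MPI rank owns a slab of the grid, trades one layer of ghost cells with its neighbors each step, and applies a purely local update.

    # Toy 1-D domain decomposition with ghost-cell exchange (illustration only;
    # PFLOTRAN uses PETSc's distributed meshes and implicit Krylov solvers).
    import numpy as np
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()

    n_local = 100                            # interior cells owned by this rank
    u = np.zeros(n_local + 2)                # plus one ghost cell on each side
    left = rank - 1 if rank > 0 else MPI.PROC_NULL
    right = rank + 1 if rank < size - 1 else MPI.PROC_NULL

    for step in range(5000):
        if rank == 0:
            u[1] = 1.0                       # fixed "injection" value at one end
        # swap ghost cells with neighboring subdomains
        comm.Sendrecv(u[1:2], dest=left, recvbuf=u[-1:], source=right)
        comm.Sendrecv(u[-2:-1], dest=right, recvbuf=u[0:1], source=left)
        # explicit diffusion update on locally owned cells only
        u[1:-1] += 0.25 * (u[:-2] - 2.0 * u[1:-1] + u[2:])

The point of the pattern is that communication grows with the surface of each subdomain while the work grows with its volume, which is what lets codes like PFLOTRAN keep scaling to tens of thousands of processors.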

Blue Gene/P Simulations Shed Light on Key Process in Type Ia Supernovae

turbulence simulations

Results from a simulation of a buoyantly unstable flame front.
Left: flame surface in fully-developed, self-regulated state.
Right: volume rendering of vorticity magnitude.

 

In their study of Type Ia supernovae, among the brightest and most powerful exploding stars in the universe, University of Chicago researchers have addressed a critical question about buoyancy-driven turbulent nuclear combustion, a key physical process in these explosions.

Using the FLASH code on the IBM Blue Gene/P supercomputer at the Argonne Leadership Computing Facility, researchers addressed the question, "Is buoyancy-driven turbulent nuclear combustion due primarily to large-scale or small-scale features of the flame surface?" They used more than 40 million processor-hours on the BG/P to run a grid of simulations for different physical conditions. The research team also developed parallel processing tools needed to analyze the large amounts of data produced by the FLASH simulations of buoyancy-driven turbulent nuclear combustion. Preliminary analysis of these results showed that the flame surface is complex at large scales and smooth at small scales.
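The team built its own parallel tools for this analysis, but the kind of question being asked here can also be illustrated with a generic scale-dependent measure such as box counting. The sketch below is a hypothetical, serial example of that technique on a synthetic 2-D interface; it is not the FLASH analysis code.

    # Toy box-counting of an interface to quantify complexity versus scale.
    # Generic illustration only, not the FLASH team's analysis tools.
    import numpy as np

    def box_count(mask, box):
        """Number of box-by-box cells containing any part of the interface."""
        n = mask.shape[0] // box
        blocks = mask[:n * box, :n * box].reshape(n, box, n, box)
        return int(blocks.any(axis=(1, 3)).sum())

    # Synthetic "flame surface": cells crossed by a wrinkled front.
    N = 512
    x = np.linspace(0.0, 2.0 * np.pi, N)
    X, Y = np.meshgrid(x, x)
    front = np.pi + 0.5 * np.sin(3 * X) + 0.1 * np.sin(17 * X)
    interface = np.abs(Y - front) < (2.0 * np.pi / N)

    for box in (4, 8, 16, 32, 64):
        print(box, box_count(interface, box))
    # The slope of log N(box) versus log(1/box) gives an effective dimension:
    # near 1 where the front is smooth, larger where it is strongly wrinkled.

For a smooth curve the count simply scales like 1/box; extra wrinkling shows up as a steeper slope over the range of scales where the surface is complex.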

The results have been published in the SciDAC 2008 conference proceedings. These findings will be used to treat buoyancy-driven turbulent nuclear combustion more accurately in the whole-star, three-dimensional simulations of Type Ia supernovae at the DOE NNSA ASC/Alliance Flash Center at the University of Chicago.

CEDPS success with caBIG and APS beamline
Add a little gRAVI "and science just happens"

caBIG network
Advanced Photon Source

Under the leadership of Ravi Madduri, a member of the SciDAC CEDPS project, Argonne software developers have designed and implemented several innovative technologies in the caBIG architecture and have developed a collaborative information network to enable interoperability among biomedical databases and analytical tools.

The tool called gRAVI (grid remote application virtualization interface, pronounced "gravy"), along with the Introduce toolkit (part of the caGrid toolkit), provides a framework for fast and easy creation of Globus-based grid services while hiding all "grid-ness" from the developer. It addresses service support features such as advertisement, discovery, invocation, and security (authentication/authorization). It allows researchers to wrap executables and applications as Grid services without writing a single line of code, reducing the barrier to entry for researchers to expose their applications as services and promoting reuse.

The caBIG initiative is a four-year project, funded by the National Cancer Institute, with the mission of linking the more than 60 cancer centers across the U.S. into an integrated distributed-computing system. There are over 900 caBIG participants accessing 45 different services through caGrid.

The new infrastructure for caBIG, called caGrid, uses gRAVI for creating, registering, discovering, and invoking analytical routines as Grid services. Authorized researchers nationwide can invoke these services and compose multiple services into workflows for individual applications. The infrastructure also allows one to create a common gateway service between caGrid and the TeraGrid, which integrates high-performance computers, data storage, and high-end experimental facilities around the country. This new gateway service bridges caGrid authentication and authorization processes to the TeraGrid security services, so users can access TeraGrid resources without having to modify their applications.

The toolset is finding fame in application areas outside of SciDAC, from the caBIG cancer researchers, who have given several awards to the Argonne project team, to experimental researchers like Brian Tieman of the Advanced Photon Source.

At a recent workshop on lightweight tools for collaborative science, Tieman reported that using gRAVI has shortened his development time for new services from over a month to around two days. Tieman is generating services that control a beamline experiment and the data analysis, visualization, and modeling that follow. In addition to the shorter development time, Tieman says gRAVI provides security for jobs and tracks when his remote jobs finish, "and science just happens."

For more details on the gRAVI project, see the wiki; for more about the caBIG awards, see the Argonne highlight.

Breakthrough Fusion Simulation with SciDAC Code
GTC consumes 93 percent of 263TF Jaguar (Cray XT4)


artist's rendering of ITER

A team of researchers from the University of California-Irvine (UCI), working with staff at Oak Ridge National Laboratory's National Center for Computational Sciences (NCCS), reports the largest run in fusion simulation history.

The team, led by Yong Xiao and Zhihong Lin of UCI, used 93 percent of the NCCS's flagship supercomputer Jaguar, a Cray XT4, with the classic fusion code GTC (Gyrokinetic Toroidal Code), the key production code of two fusion SciDAC projects (GPS-TTBP and GSEP).

The researchers discovered, among other things, that for a device the size of ITER the confined plasma will exhibit gyroBohm scaling, meaning that the heat transport level is inversely proportional to the device size. In other words, the simulation supports the ITER design: a larger device will lead to more efficient confinement.
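For reference, the scaling can be written out using the standard textbook definitions (our gloss, not a formula quoted from the article): gyroBohm transport is Bohm-like transport reduced by the normalized gyroradius,

    \chi_{\mathrm{gB}} \;=\; \rho_* \,\chi_{\mathrm{Bohm}} \;\propto\; \frac{\rho_i}{a}\,\frac{T}{eB}, \qquad \rho_* \equiv \frac{\rho_i}{a},

so at fixed temperature T and magnetic field B the diffusivity falls as the minor radius a grows, which is why gyroBohm scaling favors a large device such as ITER.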

"The success of fusion research depends on good confinement of the burning plasma," said Xiao. "This simulation size is the one closest to ITER in terms of practical parameters and proper electron physics."

However, the huge amounts of data produced by fusion simulations can create I/O nightmares: a single GTC run can produce tens of terabytes of data (in this case 60 TB). To address this potential bottleneck, the team used ADIOS (the Adaptable I/O System), a library that allows for easy and fast file input and output, developed mainly by the NCCS's Scott Klasky and Chen Jin and Georgia Tech's Jay Lofstead and Karsten Schwan.

For more details, see the NCCS article on HPCwire's "off the wire".

SciDAC Team Develops Petascale-Ready Version of CCSM’s Atmospheric Model


Cubed-sphere spectral element grid

The SciDAC project “Modeling the Earth System” is focused on creating a first-generation Earth system model based on the Community Climate System Model (CCSM). Because these improvements will require petascale computing resources, the project is also working to ensure that CCSM is ready to fully utilize DOE’s upcoming petascale platforms. The main bottleneck to petascale performance in Earth system models is the scalability of the atmospheric dynamical core. Team members at Sandia, ORNL, and NCAR have thus been focusing on integrating and evaluating new, more scalable dynamical cores (based on cubed-sphere grids) in the atmospheric component of the CCSM. The first model successfully integrated uses a new formulation of the spectral element method that locally conserves both mass and energy and has positivity-preserving advection.

This dynamical core allows the CCSM atmospheric component to use true two-dimensional domain decomposition for the first time, leading to unprecedented scalability demonstrated on LLNL’s BG/L system. The model scales well out to 96,000 processors with an average grid spacing of 25 km. Even better scalability will be possible when computing at a global resolution of 10 km, DOE’s long-term goal (DOE ScaLeS Report, 2004). As part of the project’s model verification work, a record-setting one-year simulation was recently completed on 64,000 processors of BG/L. This initial simulation used prescribed surface temperatures and omitted the CCSM land and ice models. Coupling with the other CCSM component models is the team’s current focus.

Unified Programming Environment for Quantum Chromodynamics

In these snapshots of gluon fields from a supercomputer QCD simulation, the gluon fields start in a nonuniform, chaotic state (left) and quickly diffuse into the full volume of space simulated on the computer (middle and right).

According to the Standard Model of Particles and Interactions, the fundamental constituents of subatomic particles, such as protons and neutrons, are quarks and gluons. The equations governing the forces among quarks have been known for decades. These forces are mediated by particles called gluons, in much the same way that electromagnetic forces are mediated by photons. However, unlike the forces of electricity and magnetism, they become stronger as quarks are pulled apart; this remarkable behavior, which is responsible for the permanent confinement of quarks, is not captured by other force or field theories. The part of the Standard Model that describes this strong interaction, or color force, between quarks and gluons is called Quantum ChromoDynamics (QCD).

Only large-scale numerical simulations have allowed us to calculate, to high precision, QCD quantities such as the masses and lifetimes of particles containing quarks (e.g., protons and neutrons). In QCD, quark and gluon fields are defined on a four-dimensional space-time grid called a lattice, and the quantum fluctuations of these fields are calculated by Monte Carlo methods.

Under its SciDAC grants, the U.S. QCD Collaboration (www.usqcd.org) has created a unified programming environment (www.usqcd.org/software.html) for large-scale simulations of lattice QCD. With it, they have performed a wide variety of calculations, including investigations at unprecedented precision of the properties of strongly interacting matter at high temperatures and densities, investigations of the structure and interactions of hadrons, and determinations of the fundamental parameters of the Standard Model, which encompasses our current knowledge of the forces of nature.
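Production lattice QCD codes evolve SU(3) gauge links with highly tuned algorithms, but the Monte Carlo idea itself, sampling field configurations with probability proportional to exp(-S), is compact. The sketch below applies a Metropolis update to a toy scalar field on a small 2-D lattice; it illustrates the generic method, not the USQCD software stack.

    # Metropolis Monte Carlo for a toy scalar lattice field theory (phi^4 in 2-D).
    # Real lattice QCD replaces the scalar field with SU(3) gauge links and
    # includes the quark determinant, but samples configurations the same way.
    import numpy as np

    rng = np.random.default_rng(0)
    L, kappa, lam = 16, 0.25, 0.02           # lattice size and couplings
    phi = rng.normal(size=(L, L))

    def local_action(phi, i, j, value):
        """Terms of the action that involve site (i, j) holding `value`."""
        nbrs = (phi[(i + 1) % L, j] + phi[(i - 1) % L, j] +
                phi[i, (j + 1) % L] + phi[i, (j - 1) % L])
        return -2.0 * kappa * value * nbrs + value**2 + lam * (value**2 - 1.0)**2

    for sweep in range(200):
        for i in range(L):
            for j in range(L):
                old = phi[i, j]
                new = old + rng.normal(scale=0.5)
                dS = local_action(phi, i, j, new) - local_action(phi, i, j, old)
                if dS < 0 or rng.random() < np.exp(-dS):
                    phi[i, j] = new          # accept with probability min(1, e^-dS)
        # observables such as <phi^2> would be accumulated here after thermalization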

New Methods for Accelerating Molecular Dynamics Simulations

Silver (100) surface at T=600K in contact with a Lennard-Jones model liquid

Using an extension of the parallel-replica dynamics method, the surface diffusion of a silver adatom was accurately accelerated by a factor of 6.5 on 8 processors. Using this same approach with more processors will give much larger boost factors for systems at lower temperatures.

The main challenge in the SciDAC project "Hierarchical Petascale Simulation Framework for Stress Corrosion Cracking" is developing a computational methodology that can simultaneously treat the vast range of scales in time (picoseconds to seconds and beyond) and length (angstroms to millimeters) necessary for accurately simulating the technologically critical process of stress corrosion cracking. As part of this multi-institution project (involving University of Southern California, Harvard, Purdue, California State University at Northridge, and Los Alamos and Lawrence Livermore national laboratories), researchers at Los Alamos National Laboratory are developing a method for accelerating molecular dynamics simulations at the solid-liquid interface.

In the parallel-replica dynamics method, time is parallelized to achieve longer simulations for infrequent-event processes, such as the diffusion of atoms on a surface, or, as is relevant to this project, the activated processes that advance a stress-loaded crack tip. Because stress corrosion cracking often involves a liquid phase in contact with the crack tip, the parallel-replica dynamics method is being extended so that it can be used to accelerate the dynamics at a solid-liquid interface. Initial results look promising for obtaining significant parallel speedup in time for this much more complex system, which heretofore was limited to time scales accessible to direct molecular dynamics.
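The bookkeeping behind parallel-replica dynamics is simple even though the molecular dynamics it wraps is not: independent replicas of the same system run in parallel, and when any replica sees a transition, the simulated time accumulated by all replicas is credited to a single long trajectory. The sketch below is a schematic of that bookkeeping with a random-event stand-in for real MD, not an actual accelerated-dynamics code.

    # Schematic of parallel-replica dynamics bookkeeping (toy stand-in for MD).
    # M replicas advance independently; all of their simulated time counts,
    # so the boost over one trajectory approaches M for rare events.
    import random

    random.seed(1)
    M = 8                      # number of replicas (processors)
    dt = 1.0                   # picoseconds of MD per block per replica
    p_event = 1.0e-3           # chance of an infrequent event per block (toy value)

    def run_block():
        """Stand-in for one block of MD; returns True if a transition occurred."""
        return random.random() < p_event

    t_simulated = 0.0          # physical time credited across all replicas
    wall_blocks = 0            # wall-clock blocks actually executed
    transitions = 0

    while transitions < 10:
        wall_blocks += 1
        events = [run_block() for _ in range(M)]   # replicas advance in parallel
        t_simulated += M * dt                       # every replica's time counts
        if any(events):
            transitions += 1
            # a real code would now replicate the new state and re-dephase

    print("boost over a single trajectory ~", t_simulated / (wall_blocks * dt))

The ideal boost equals the number of replicas; dephasing after each transition and correlated events reduce it in practice, which is why the measured factor above is 6.5 rather than 8 on 8 processors.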

Topological Analysis Provides Deeper Insight into Hydrodynamic Instabilities


different stages of the mixing process [animation (mov)]

 

Valerio Pascucci of Lawrence Livermore National Laboratory, working with members of the SciDAC Visualization and Analytics Center for Enabling Technology (VACET), has developed the first feature-based analysis of extremely high-resolution simulations of turbulent mixing. The focus is on Rayleigh-Taylor (RT) instabilities, which are created when a heavy fluid is placed above a light fluid and tiny vertical perturbations in the interface create a characteristic structure of rising bubbles and falling spikes. RT instabilities have received much attention because of their importance in understanding many phenomena, ranging from the rate of formation of heavy elements in supernovae to the design of capsules for inertial confinement fusion. However, systematic, detailed analysis has been difficult because of the extremely complicated features found in the mixing region.

This novel approach, based on robust Morse-theoretical techniques, systematically segments the envelope of the mixing interface into bubble structures and represents them with a new multi-resolution model, allowing a multi-scale quantitative analysis of the rate of mixing based on bubble count. The analysis has enabled new insights and a deeper understanding of this fundamental phenomenon by highlighting, and providing precise measures for, four stages in the turbulent mixing process that scientists could previously only observe qualitatively. more
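The published analysis builds a Morse complex of the bubble envelope and simplifies it hierarchically. As a far cruder stand-in that conveys what a "bubble count" is, one can simply threshold a mixing field and count connected components; the snippet below is a hypothetical example on synthetic data using scipy, not the VACET tools.

    # Crude stand-in for bubble counting: threshold a field and label connected
    # components. The VACET analysis instead segments the interface with a
    # Morse complex and simplifies it across resolutions.
    import numpy as np
    from scipy import ndimage

    rng = np.random.default_rng(0)
    # Synthetic 3-D "mixing" field standing in for simulation output.
    field = ndimage.gaussian_filter(rng.random((64, 64, 64)), sigma=3)

    bubbles = field > field.mean()              # regions taken to be bubbles
    labels, n_bubbles = ndimage.label(bubbles)  # face-connected component labels
    sizes = ndimage.sum(bubbles, labels, index=range(1, n_bubbles + 1))

    print("bubble count:", n_bubbles)
    print("largest bubble volume (cells):", int(sizes.max()))

Unlike simple thresholding, the Morse-complex segmentation is robust to noise and can be coarsened consistently, which is what allows the bubble count to be tracked quantitatively through the four stages of the mixing process.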

Code Shows Radio Waves Are Hot Enough for ITER

 
AORSA simulation of 3D radio-wave electric field
propagating in the ITER plasma

Using his AORSA application on the Cray XT4 Jaguar supercomputer, ORNL physicist Fritz Jaeger has performed 3D simulations of radio-wave heating in fusion reactors. The simulations demonstrate that radio-wave heating should work effectively both for present experiments and for the multibillion-dollar ITER fusion reactor. ITER is being developed as a cooperative effort among nations in Europe and Asia, as well as the United States, to demonstrate the scientific and technological feasibility of fusion power. The reactor will use radio waves to heat the ionized gas (plasma) to temperatures ten times hotter than the sun, thereby causing atoms in the gas to fuse and release energy.

Analytical theory, one- and two-dimensional simulations, and experiments have provided an understanding of the relative success of radio-wave heating on medium-scale experiments and its relative inefficiency on smaller ones. Jaeger’s simulations verified that radio waves tend to heat the edge of the plasma instead of the center on smaller experiments. However, he also demonstrated that radio-wave heating should work efficiently on the much larger ITER reactor, which measures more than 12 meters across and will hold more than 840 cubic meters of plasma.
SciDAC project page

SciDAC Research in National Geographic:

 
The March issue of National Geographic includes a report on the work of Stan Woosley and Adam Burrows, members of SciDAC’s Computational Astrophysics Consortium, to learn what triggers a star to explode as a supernova.

According to the article, Woosley’s scientific career began when he started mixing chemicals (often with explosive results) as a teenager in Texas. As head of the SciDAC project, Woosley and others now simulate supernova explosions on DOE supercomputers. While supernovae are common, and provide most of the heavy elements in the universe, scientists still haven’t determined exactly what causes the stars to explode.

The article also describes how Burrows, of the University of Arizona, and others are investigating whether sound waves may provide the boost of energy that causes an unstable star to finally explode in a burst brighter than all other stars.

The issue includes a gallery of images, several of which were created under SciDAC’s Supernova Science Center.
read the National Geographic article

Materials by Design Takes a Closer Look at the Structure of Water


Electronic Density of Water in a Carbon Nanotube

  
Although the structure of water has been probed for more than 100 years, scientists still do not agree on its electronic structure and detailed atomic structure. Understanding water's structure could lead to revolutionary new insights in biology, chemistry, and other fields. Using an INCITE allocation on the Blue Gene/L at the Argonne Leadership Computing Facility, members of the Quantum Simulations of Materials and Nanostructures (Q-SIMAN) SciDAC project are using first-principles methods to examine the structure of water in confined and compressed states. Led by Giulia Galli of the University of California, Davis, the Q-SIMAN project is using the Qbox code to study the behavior of water molecules in nanotubes and between graphene sheets. As Prof. Galli commented in a 2006 Post-Gazette article, "interactions with the wall of the [nano]tube might change water's structure."

The team is also studying hydration of benzene and hexafluorobenzene. Their 2007 paper in J. Phys. Chem. B reports that the electronic structure of interfacial water molecules differs from that of bulk water as a result of the interaction with the aromatic solute. These results indicate that the solvation of aromatic species is determined by subtle but important charge-transfer and dipole-redistribution effects, and they cast some doubt on the validity of nonpolarizable models for studying these systems. The findings also indicate that electronic structure information, as contained in ab initio MD simulations, is an important component of a microscopic description of aromatic hydration.

read the Hydrophobic Hydration article in J. Phys. Chem. B
read the PNAS Hydrophobicity commentary by Giulia Galli

SciDAC Results in Nature Letters: Pulsar spins from an instability in the accretion shock of supernovae

   

Researchers John M. Blondin and Anthony Mezzacappa of the Terascale Supernova Initiative (alumni SciDAC project: Shedding New Light on Exploding Stars) have a Letter published in the January 2007 issue of Nature. They report a robust instability of the stalled accretion shock in core-collapse supernovae that is able to generate a strong rotational flow in the vicinity of the accreting proto-neutron star (PNS).
read the Nature article (pdf)

The flow vectors highlight two strong rotational flows. On the right the flow is moving clockwise along with the shock pattern, whereas at the bottom left the post-shock flow is being diverted into a narrow stream moving anticlockwise, fueling the accretion of angular momentum onto the PNS.


New Project added to SciDAC-2

   
A new project, "Building a universal energy density functional," has been added to SciDAC-2. It will bring advanced computing resources to bear on the physics of atomic nuclei. Knowledge in this area is needed for pure science as well as for engineering.

Scientists need a better theory to understand astrophysical processes, particularly the creation of elements in stars. Engineering applications include the design of next-generation power reactors, reactors to burn nuclear waste, and simulations to obviate the need for nuclear weapons testing. The theoretical methods to be applied will make extensive use of "density functional theory", a tool that has been spectacularly successful in chemistry and in materials science for predicting the properties of molecules and material systems. Because of the many computational challenges involved in constructing the theory, the project calls for a collaborative effort between computer scientists and nuclear physicists. It is anticipated that this five-year project will produce a theory and codes that will dramatically improve the accuracy and reliability of predictions of nuclear properties.

Professors George Bertsch and Aurel Bulgac
The project team is a consortium of 8 universities and 6 national laboratories with funding of $15M. It is led by Professors George Bertsch (far right) and Aurel Bulgac (near right) in the Institute for Nuclear Theory and the Department of Physics at the University of Washington.
more about the project


SciDAC Project Pushing the Frontiers of Chemistry

 
In most first-year chemistry classes at universities, students begin by learning how atoms of carbon and other elements form bonds with other atoms to form molecules, which in turn combine to form ourselves, our planet and our universe.

Understanding the properties and behavior of molecules, or better yet, being able to predict that behavior, is the driving force behind modern chemistry. In principle, quantum mechanics allows all the properties of molecules to be predicted. The problem is that the equations are too complex to solve exactly, even using the most powerful supercomputers. Predicting the behavior of just one molecule with one electron requires 1,000,000 calculations, while doing the same for an atom with 20 electrons would require 1,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000 calculations (a 1 followed by 63 zeros). And since scientists are typically interested in atomic systems with at least a few hundred electrons, a better method is needed.
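A rough way to see where numbers like these come from (an illustrative estimate of the scaling, not the article's exact accounting): representing a many-electron wavefunction directly on a grid with M points per coordinate requires

    N_{\text{values}} \;=\; M^{\,3 N_e}

values for N_e electrons in three dimensions, so the cost grows exponentially with the number of electrons. This curse of dimensionality is what forces chemists to trade the exact equations for systematically approximate methods such as coupled-cluster theory.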

To meet this need, the Advanced Methods for Electronic Structure: Local Coupled Cluster Theory project was designed to develop new methods that strike novel compromises between accuracy and feasibility.

SciDAC Tools Fuel Groundbreaking Combustion Research

Using methods developed by SciDAC’s Algorithmic and Software Framework for Applied Partial Differential Equations (APDEC) project, computational and combustion scientists at the U.S. Department of Energy’s Lawrence Berkeley National Laboratory have created an unparalleled computer simulation of turbulent flames. The research was featured as the cover article of the July 19, 2005, issue of the Proceedings of the National Academy of Sciences. The work produced a three-dimensional combustion simulation of unmatched accuracy, one that closely matches conditions found in laboratory combustion experiments. The code allows the researchers to model a flame about 12 cm in height whose chemistry involves 80 chemical species and more than 300 chemical processes. (MORE) - July 2005

Simulations were computed on the IBM SP at NERSC.


Better Understanding of Ignition Front Propagation May Lead to Cleaner Engines

Researchers from the Terascale High-Fidelity Simulations of Turbulent Combustion with Detailed Chemistry (TSTC) project are seeking a better understanding of inhomogeneous autoignition. Numerical experiments on the effect of thermal stratification on controlling burn rate, under homogeneous charge compression ignition (HCCI) engine conditions, show that increasing thermal stratification promotes more flame-like structures and that the zonal model deteriorates with increased stratification. MORE - May 2005


Terascale Supernova Initiative Discovers a New Way Neutron Stars May Be Spun Up

A series of 3D hydrodynamic simulations shows the flow in a stellar explosion developing into a strong, stable rotational flow (streamlines wrapped around the inner core). The flow deposits enough angular momentum onto the inner core to produce a core spinning with a period of only a few milliseconds. (MORE) - May 2005

Simulations were computed on the Cray X1 in the Leadership Computing Facility at ORNL.


 

