WSRC-MS-2001-00091

A DOE Computer Code Toolbox: Issues and Opportunities

K. R. O’Kula
Westinghouse Safety Management Solutions LLC
Aiken, SC 29804

D. Y. Chung
U.S. Department of Energy
Germantown, MD 20878

P. R. McClure
Los Alamos National Laboratory
Los Alamos, NM 87545

This document was prepared in conjunction with work accomplished under Contract No. DE-AC09-96SR18500 with the U.S. Department of Energy.

DISCLAIMER

This report was prepared as an account of work sponsored by an agency of the United States Government. Neither the United States Government nor any agency thereof, nor any of their employees, makes any warranty, express or implied, or assumes any legal liability or responsibility for the accuracy, completeness, or usefulness of any information, apparatus, product or process disclosed, or represents that its use would not infringe privately owned rights. Reference herein to any specific commercial product, process or service by trade name, trademark, manufacturer, or otherwise does not necessarily constitute or imply its endorsement, recommendation, or favoring by the United States Government or any agency thereof. The views and opinions of authors expressed herein do not necessarily state or reflect those of the United States Government or any agency thereof.


Abstract

The initial activities of a Department of Energy (DOE) Safety Analysis Software Group to establish a Safety Analysis Toolbox of computer models are discussed. The toolbox is to be a DOE Complex-wide repository of verified and validated computer models that are configuration-controlled and made available for specific accident analysis applications. The toolbox concept was recommended by the Defense Nuclear Facilities Safety Board staff as a mechanism to partially address software quality assurance (SQA) issues. Toolbox candidate codes have been identified through review of a DOE survey of software practices and processes, and through consideration of earlier findings of the Accident Phenomenology and Consequence Evaluation program sponsored by the DOE National Nuclear Security Administration/Office of Defense Programs. Planning is described to collect these high-use codes, apply tailored SQA specific to the individual codes, and implement the software toolbox concept. While issues such as resource allocation and the interface among code developers, code users, and toolbox maintainers remain open, significant benefits can be achieved through a centralized toolbox and subsequent standardized applications.

Introduction

In January 2000, the Defense Nuclear Facilities Safety Board (DNFSB) issued Technical Report 25 (TECH-25), Quality Assurance for Safety-Related Software at Department of Energy Defense Nuclear Facilities (DNFSB, 2000). TECH-25 identified issues regarding the state of software quality assurance (SQA) in the Department of Energy (DOE) Complex for software used to make safety analysis decisions and to control safety-related systems. Instances were noted in which computer codes were either inappropriately applied or executed with incorrect input data. Of particular concern were inconsistencies in the exercise of SQA from site to site and from facility to facility, and the variability in guidance and training in the appropriate use of accident analysis software.

One of the corrective measures recommended in TECH-25 is the development and maintenance of a computer code toolbox of accident analysis and consequence codes. A toolbox of this nature would, in principle, contain a set of appropriately quality-assured, configuration-controlled safety analysis codes that are managed and maintained for DOE-wide application.

This paper summarizes key considerations for establishing a toolbox for accident analysis computer codes, identifies characteristics of the constituent codes, and provides the initial results of a DOE Complex-wide survey of software use. Current planning for the toolbox development is then discussed, along with summary of the current outstanding issues and approaches for resolution.

Considerations for Establishing a Software Toolbox

As noted in the subject DNFSB report, software quality assurance (SQA) is a process for the systematic development, testing, documentation, maintenance, and execution of software. When applied to accident analysis codes, specifically those used as estimation tools to quantify source terms (from fire, spill, and explosion events) and the subsequent atmospheric dispersion and consequences, robust SQA processes help assure the quality of facility safety documentation.

A collection of software applied extensively for DOE facility safety, and configuration-controlled by an independent organization, is a key recommendation of TECH-25 for enhancing SQA throughout the DOE safety analysis community. Such a collection, a "toolbox" of accident analysis software, eliminates redundancy in error reporting and correction and in the release of new code versions and accompanying documentation, and provides a single authority responsible for code improvements. The toolbox maintainer is best conceptualized as a central software organization, composed of software quality experts and accident analyst code users, charged with keeping user groups aware of the physical and mathematical assumptions and limitations of a particular code. The maintainer would not necessarily be responsible for directly implementing software changes, but would work closely with the individual code owners so that user group issues are effectively resolved.
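To make the maintainer's configuration-control role concrete, the following minimal sketch (in Python; all field and method names are hypothetical illustrations, not drawn from DOE guidance) shows the kind of per-code record a central toolbox organization might keep:

```python
# Illustrative sketch only: a minimal registry record a central toolbox
# maintainer might keep for each code. Field names are hypothetical.
from dataclasses import dataclass, field
from typing import List

@dataclass
class ToolboxEntry:
    code_name: str            # e.g., "MACCS2"
    version: str              # the configuration-controlled release
    owner: str                # organization responsible for code changes
    sqa_status: str           # e.g., "tailored SQA plan in place"
    known_limitations: List[str] = field(default_factory=list)
    open_error_reports: List[str] = field(default_factory=list)

    def report_error(self, description: str) -> None:
        """Log a user-reported error once, centrally, so that every site
        sees the same defect status (avoiding redundant reporting)."""
        if description not in self.open_error_reports:
            self.open_error_reports.append(description)
```

The central point of the design is that error reports, limitations, and version identity live in one record per code, rather than being duplicated site by site.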

Candidate Software for the Toolbox

DOE has outlined a process for identifying initial codes for the safety analysis toolbox in a response plan to the DNFSB (DOE, 2000b). The process is directed by a Safety Analysis Software Group (SASG). The SASG was formed in early 2001 and is composed of representatives from National Nuclear Security Administration – Defense Programs (NNSA-DP) national laboratories and sites, and several line organizations within DOE. The primary task of the SASG is to review responses from a survey of SQA practices, processes, and procedures from DOE contractors. A portion of the survey data identifies those codes broadly used for facility accident analysis at DOE laboratories and sites. Once this set is known and the individual codes are screened against minimum threshold criteria and minimum SQA standards, tailored programs will be proposed to bring toolbox constituent codes into an acceptable state of SQA readiness for accident analysis applications.

Several steps in this process have been completed. The results of the DOE safety contractor survey of SQA practices, processes, and procedures have been compiled; several software packages are clearly in wide use for accident analysis support; and an initial screening has been performed to identify software for inclusion in the toolbox.

Before these results are discussed, it is helpful to outline the precursor study, the Accident Phenomenology and Consequence Methodology Evaluation project (O’Kula, 1997), since much of the context for the current SASG effort follows from that earlier program and its process.

Accident Phenomenology and Consequence Evaluation Program

The Accident Phenomenology and Consequence (APAC) Methodology Evaluation Program originated in 1995 at the request of the DOE Office of Defense Programs. Its purpose was to evaluate the accident analysis approaches (i.e., computer models and input assumptions) applied to support authorization basis documentation. The program was formed to address the increasing disparity in the application of codes and engineering methods among DOE sites, as documented in Safety Analysis Reports (SARs), Bases for Interim Operation (BIOs), and hazard analysis documents. An Executive Committee directed the program and guided the evaluation process of software for source term development, in-facility transport, and dispersion/consequence analysis.

APAC Working Groups were formed through identification of the major areas required in accident analysis found in most DOE SARs written following guidance in DOE-STD-3009-94, and related documentation used in concert with the DOE Source Term Handbook, DOE-HDBK-3010-94 (DOE, 1994a and 1994b). The Handbook provided a context for examining source term and dispersion/consequence analysis areas using six working groups:

Source Term Analysis
  1. Fire Working Group (FWG)
  2. Explosions and Energetic Events (EEE)
  3. Spills Working Group (SWG)

In-Facility Transport Analysis
  4. In-Facility Transport (IFT)

Atmospheric Dispersion/Consequence Analysis
  5. Radiological Dispersion/Consequence (RDC)
  6. Chemical Dispersion and Consequence Assessment (CDCA).

Each Working Group reviewed currently used computer models in its technical area. Computer models were identified by reviewing SARs and other safety basis documentation from DOE facilities, as well as through direct consultation with numerous accident and consequence analysts. Table 1 indicates the number of computer models evaluated by each group, the staff participating, and the depth of review. In total, on the order of fifty analysts supported APAC in some capacity, and nearly two hundred models were noted, reviewed, or evaluated in depth during the course of the evaluation.

As noted in Table 1, as many as ten safety analysts, model developers, and other technical staff comprised each working group. Although approximately fifty individuals served in computer model evaluation, Working Group coordination, Executive Committee, or review capacities, the time-averaged support amounted to about three full-time equivalents per year. The program required approximately four years to complete.

Table 1. Computer Model Evaluation by Working Group

Working Group                                      Personnel   Tier 1   Tier 2   Tier 3
                                                              (decreasing depth of review →)
1. Fire                                                6          5        -        -
2. Explosions and Energetic Events                     9          -        -       10 (models and methods)
3. Spills                                              6          5        6       13
4. In-Facility Transport                               6          5        -        -
5. Radiological Dispersion/Consequence                10         11        4        -
6. Chemical Dispersion & Consequence Assessment        7         14       11      110
Total staff and total computer models by tier         44         40       21      133
Total evaluated models: 194



Generalized Evaluation Process and Code/Model Recommendations

The generalized evaluation process conducted by each Working Group followed these steps:

    1. Review of the regulatory documentation;
    2. Identification of best practices, i.e., methods, assumptions, and parameter values to be followed as a norm;
    3. Development of evaluation criteria, including (a) general software quality and user interface characteristics, (b) technical model adequacy, and (c) application environment (source term and range of applicability);
    4. Selection of candidate codes;
    5. Development of test problems (several groups also conducted a limited ranking, i.e., quantitative "scoring" of computer models against detailed criteria);
    6. Execution of the process on each code until all were evaluated; and
    7. Documentation, with required rework of any of the preceding phases, concluding the program.

A Tier 1 evaluation consisted of a thorough examination of code attributes and software documentation (approximately steps 1 through 3 above) and the running of one or more test problems. A Tier 2 evaluation was more limited and did not include running the code against test problems. Several groups also performed "Tier 3" reviews; in these cases the review acknowledged the availability of a computer model for the particular application, but was based on an abbreviated assessment that omitted both Tier 1 test case execution and the Tier 2 level of evaluation.
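The tier definitions above can be summarized in a short sketch; the activity names below paraphrase the steps listed earlier and are illustrative rather than APAC terminology:

```python
# Hedged sketch: the three APAC review tiers encoded as data, with
# activity sets inferred from the description above.
TIER_ACTIVITIES = {
    "Tier 1": {"regulatory review", "best-practices identification",
               "evaluation criteria", "test problem execution"},
    "Tier 2": {"regulatory review", "best-practices identification",
               "evaluation criteria"},          # no test problems run
    "Tier 3": {"availability acknowledgment"},  # abbreviated assessment only
}

def review_depth(tier: str) -> int:
    """Rank tiers by depth of review (higher means more thorough)."""
    return ["Tier 3", "Tier 2", "Tier 1"].index(tier)

assert review_depth("Tier 1") > review_depth("Tier 2") > review_depth("Tier 3")
```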

Source Term Generation and In-Facility Transport Model Recommendations

The FWG concluded that FIRAC/FIRIN was the best overall computer model of the five fire analysis models reviewed for DOE safety analysis applications, based on its ability to calculate not only the fire characteristics but also the potential radioactive and fire by-product source terms. However, because of the many limitations and errors identified in FIRIN, and the better fire compartment modeling capabilities of CFAST, a combination of CFAST/FIRAC, if developed, would be an optimal tool. FPETool was found readily applicable where no source term calculations are needed but order-of-magnitude estimates are adequate; it is useful in supporting fire hazard analyses performed to meet regulatory requirements. The FWG determined that while CFAST was the best code for implementing the mass and energy conservation equations within a compartment or zone, it was limited in its ability to model mechanical ventilation systems. Table 2 summarizes a "first-order" assessment of the applicability of the evaluated fire models in meeting DOE fire hazard requirements by consequence location, specifically local workers, collocated workers and the general public, and facility impacts.

The Explosions and Energetic Events (EEE) Working Group concentrated on methodology recommendations rather than computer model preferences. Explosions can generate combustion products and toxicological and radiological source terms that are important in the overall determination of consequences for safety basis documentation; detonation and deflagration events can also cause blast wave overpressures, thermal radiation, debris and projectiles, and potentially knock-on effects. The emphasis on methodology was chosen in part because of this phenomenological complexity and its impact on scenario development, in part because of resource limitations on the project and the focused needs at most DOE sites, and in recognition that the many special-purpose methodologies already developed would demand more time to investigate thoroughly than the APAC program allotted. A survey of DOE Complex hazards and historical data indicated that the Working Group should devote most of its time to methods amenable to treating solid-phase detonation, gas-phase detonation, gas-phase deflagration, and stored-energy releases.

The EEE Working Group recommended using a model that is consistent with the level of sophistication required in the analysis and for the facility in question. Methodologies reviewed ranged from simple handbook-prescribed engineering calculations to sophisticated hydrodynamic/computational fluid dynamics models. It was the consensus of the working group members that engineering calculations provide the safety analyst with sufficient information to assess the potential consequences associated with explosions and energetic events in DOE facilities. Engineering calculational methods could be applied at all levels in the graded approach for assessing facility hazards, with the possible exception of some cases involving high hazard facilities and specialized analysis needs, where computational fluid dynamics codes may be required. Table 3 lists recommended modeling approaches for specific explosion types.

Table 2. Fire Model "First-Order" Assessment

Computer Model   Zone (Z) or   Local Worker    Collocated Workers, General Public,   Facility (or
                 Field (F)     Consequences    and Environmental Consequences        Mission) Impacts
FIRAC/FIRIN          Z         YES             YES                                   YES
FPETool              Z         YES (Note 1)    NO                                    YES (Note 4)
COMPBRN III          Z         YES (Note 2)    NO                                    YES
CFAST                Z         YES (Note 3)    NO                                    YES
VULCAN               F         NO              NO                                    YES

Note 1. Excludes radiological consequences
Note 2. Has very limited capabilities
Note 3. User must manually input initial airborne source terms and energy functions within the fire room.
Note 4. Must input compartment dimensions, vent locations, fire load, and fire load density.

The Spills Working Group recommended that more sophisticated models generally be reserved for scenarios in facilities with higher hazards. The recommended models are identified in Table 4, listed according to spill type and the facility hazard category to which they most reasonably apply. In general, the Working Group concluded that engineering calculations and look-up tables are preferred over computer models for most simple problems. Hand calculations may be particularly useful for phenomena such as resuspension, where only a limited number of codes are available. For radiological spills, the SWG recommended use of the DOE source term handbook, DOE-HDBK-3010-94 (DOE, 1994b).

The In-Facility Transport Working Group confined its review to the CONTAIN, KBERT, and MELCOR codes from Sandia National Laboratories (SNL), and to FIRAC and GASFLOW from Los Alamos National Laboratory (LANL). Based on the evaluation performed, the Working Group concluded that CONTAIN offered no advantages over MELCOR for in-facility transport problems, and therefore recommended MELCOR for applications in which agglomeration of aerosols is an important phenomenon. The aerosol models in MELCOR have been assessed and validated against experimental data, and can provide a benchmark for the aerosol models in codes, such as GASFLOW, that do not include agglomeration. GASFLOW is also recommended for modeling situations where multidimensional effects are important, for example: (1) when concentration profiles are of interest in one or more rooms under varying states of ventilation function; and (2) when local lower flammability limits (LFLs) are required rather than cell-averaged flammability limits. KBERT was advised for analyses supporting functional classification, and in particular for in-facility worker assessment; however, the safety analyst must input time-varying flow rates in scenarios where the ventilation status is changing.

Table 3. Explosion Model Guide

Condensed-Phase Explosion (TNT-type)
  - Use the TNT model with a yield factor of 1 and no limit placed on near-field overpressure.
  - Use the hydrodynamic codes, if necessary.

Physical Explosion
  - Compute the "stored energy" and select the TNT, Baker-Strehlow, or TNO model to calculate the explosion overpressure effects.
  - Use the hydrodynamic codes, if necessary.

BLEVE
  - See "Physical Explosion" guidance above.
  - Use fire modeling to calculate the thermal radiation effects of BLEVEs of vessels containing flammable material.

Confined Explosion
  - Use discharge and dispersion models to calculate the mass of material in the cloud. Use the TNT, Baker-Strehlow, or TNO model to calculate the explosion overpressure effects; the TNO and Baker-Strehlow methods are particularly appropriate.
  - As necessary, use the hydrodynamic codes.

Vapor Cloud Explosion
  - Use discharge and dispersion models to calculate the mass of material in the cloud. Use the TNT, Baker-Strehlow, or TNO model to calculate the explosion overpressure effects.
  - As necessary, use the hydrodynamic codes.

Table 4. Spills Working Group Model Recommendations

Facility Hazard       Liquid Chemical Spills   Pressurized Liquid/   Solid Spills and                Resuspension of Material
Category              and Evaporation          Gas Releases          Resuspension/Sublimation        from Spilled Liquids
Low/Category 3        TSCREEN, ADAM, ALOHA     TSCREEN, ALOHA        HOTSPOT, KBERT                  HOTSPOT
Moderate/Category 2   ADAM, ALOHA, CASRAM,     ALOHA, CASRAM,        HOTSPOT, KBERT                  HOTSPOT
                      HGSYSTEM                 HGSYSTEM
High/Category 1       CASRAM, HGSYSTEM         CASRAM, HGSYSTEM      HOTSPOT, KBERT                  HOTSPOT


Atmospheric Dispersion and Consequence Analysis

The Radiological Dispersion/Consequence Working Group evaluated MATHEW/ADPIC (the ARAC system) as the best overall computer model of the fifteen considered, with MACCS/MACCS2, COSYMA, and TRAC RA/HA grouped in the next-best ranking. However, of these four models, only MACCS/MACCS2 has the proven availability, performance record, and portability attributes required for consequence analysis at most DOE sites. While it was concluded that these four models cover most dispersion and dose modeling needs, only MACCS/MACCS2 has sufficient application experience. The Working Group ranked the following models highest in the major evaluation categories:

  1. Software Quality Assurance/User Interface: MACCS, RSAC-5, MATHEW/ADPIC
  2. Technical Model Adequacy: MATHEW/ADPIC, COSYMA, TRAC RA/HA
  3. Source Term Applicability: MATHEW/ADPIC, MACCS2, GENII
  4. Overall Use Recommendations: MATHEW/ADPIC, MACCS, GENII

Table 5 lists the Working Group recommendations for source term types, specifically, detonations/deflagrations, fires, momentum/buoyancy-driven, spills/evaporation, criticality, and tritium-based releases. The ordering of models is based on the overall ranking and in general, identifies those codes that can adequately represent the initial input and transport conditions associated with the release type.

Table 5. Radiological Consequence Model by Source Term Type

Explosions       Fires           Momentum/         Spills/          Criticality   Tritium-Based
                                 Buoyancy-Driven   Evaporation
ERAD             MACCS2          MATHEW/ADPIC*     MATHEW/ADPIC*    MACCS2        UFOTRI
MATHEW/ADPIC*    COSYMA**        BNLGPM*           GXQ              HOTSPOT       COSYMA**
HOTSPOT          MATHEW/ADPIC*   GXQ               AXAIRQ*          RSAC-5        AXAIRQ*
                 HOTSPOT         RSAC-5            COSYMA**                       HOTSPOT
                 UFOTRI***                         MACCS2                         MACCS2
                 TRAC RA/HA**                      MATHEW/ADPIC*

* Portability limits availability of computer model.
** Limited experience with computer model in the U.S.
*** Fire-driven source terms containing tritium only.

The recommendations of the CDCA Working Group are summarized in Table 6. In arriving at these recommendations, the Working Group cited the need to apply a graded modeling approach to chemical dispersion modeling in support of safety basis documentation. The CDCA made its determinations based on review and evaluation of each of the Tier 1 and Tier 2 codes, as well as insights gained from the Tier 1 test scenarios. EPIcode (Homann Associates) was noted for its widespread use and capabilities, but was not selected since it is a proprietary computer model.

Table 6. Chemical Dispersion and Consequence Assessment WG Model Recommendations

Conditionally   Conditionally Recommended     Conditionally Recommended    Not Recommended for
Recommended*    for Special-Purpose Safety    for Further Review and       Safety Basis
                Basis Applications            Evaluation                   Applications
ALOHA           ADAM                          CASRAM-SC                    VLSTRACK
DEGADIS         CALPUFF                       HOTMAC/RAPTAD
HGSYSTEM        FEM3C                         SCIPUFF
SLAB            INPUFF
TSCREEN

* Codes in boldface have dense gas modeling capabilities.

Summary of APAC Evaluation Working Group Computer Code Recommendations

The SASG used the APAC evaluation process to identify candidate computer codes for toolbox consideration. These models, as noted in the discussion and tables above, are restated in Table 7.

A screening process will then be applied to rebaseline the APAC-recommended models in light of more recent regulatory requirements and of modifications made by several code organizations.

Table 7. Final APAC Methodology Evaluation Working Group Model Recommendations

APAC Phenomenology Area                        Computer Models or Methodologies Appropriate for Accident
                                               and Consequence Analysis for Safety Basis Documentation
1. Fire                                        FIRAC/FIRIN, FPETool, CFAST
2. Explosion                                   TNT, Baker-Strehlow, or TNO models; hydrodynamic codes as necessary
3. Spill                                       ALOHA, CASRAM, HGSYSTEM, HOTSPOT, KBERT
4. In-Facility Transport                       MELCOR, CONTAIN, GASFLOW
5. Radiological Dispersion and Consequence     MATHEW/ADPIC, MACCS, GENII, UFOTRI (tritium)
6. Chemical Dispersion and Consequence
   Assessment                                  ALOHA, DEGADIS, HGSYSTEM, SLAB, EPIcode (Tier 2 model)



Rebaselining APAC Program Results: DOE SQA Survey Integration

Since the APAC Methodology Evaluation Program concluded in the late 1990s, the computer models used for accident analysis in the DOE Complex, and their usage patterns, have changed. The SASG decided that although the earlier, comprehensive APAC program provides a useful starting point, it would be necessary to establish a new baseline, i.e., to calibrate the older results against current-day needs and usage patterns. The responses to the DOE Survey of Software Quality Processes, Practices, and Procedures (henceforth, the DOE SQA Survey) were used in conjunction with "reasonable interpretation" of the APAC evaluations.

The DOE SQA Survey was transmitted in mid-2000 to site and laboratory safety contractors responsible for nuclear facility safety. The response was nearly 100%, and included eighteen organizations at eleven DOE sites. The results identified clear trends in the use of computer models for facility safety basis purposes. However, to best apply the Survey results in light of the earlier APAC Program, it was deemed appropriate to develop a simple set of screening criteria.

The SASG defined screening criteria in two categories. The first, a "go/no-go" set of threshold criteria, determines whether a code is in sufficiently broad use and meets minimum requirements. If a computer model passes the threshold determination, it is then evaluated against a second set of criteria, quantitative measures of the adequacy and quality of the software that draw heavily from the APAC evaluations. The two sets of criteria are listed in Tables 8 and 9, respectively.

Results from the DOE SQA Survey were binned by general categories of accident phenomena, loosely equivalent to those used in the APAC Methodology Evaluation effort: radiological dispersion and consequence, chemical dispersion and consequence, fire, in-facility transport, chemical release and spills, criticality, explosion, and other special-purpose software.

Upon review of the currently available software reported in the DOE SQA Survey, the mode of application of this software, and the analysis requirements imposed on accident analysts for DOE safety basis documentation, several categories were determined to be outside the current scope and will not be reviewed further by the SASG. The criticality software area is not central to the safety analysis concerns articulated in TECH-25. Furthermore, the SQA Survey results showed that this area is addressed mostly with engineering calculations, review of NUREG/CR-6410 (USNRC, 1998), and calculations using the MCNP and/or SCALE systems of computer codes; both of these code systems have appropriate SQA programs. It was also concluded that DNFSB Recommendation 97-2, and the corresponding DOE response and implementation plans, cover the criticality area to a greater depth than the SASG could.

Table 8. Computer Code Screening Criteria – "Threshold Requirements Met"

Criterion                            Pass/Fail Level
Code usage in the DOE Complex        Fail – one or no sites use the code
                                     Pass – two or more DOE sites use the code
Code meets minimum requirements*     Fail – code does not meet the minimum model requirements for its code category
                                     Pass – code meets the minimum model requirements for its code category


* Example for radiological dispersion (note that the SASG would have to define these minimum requirements for each code category):

In the fire hazard and risk analysis area, the orders and standards that cover fire modeling include:

Table 9. Quantitative Ranking Criteria (100 points maximum)

Capability and Versatility of Code (maximum 40 points)
  Assign a score of 1 to 40 based on the detailed scoring scheme in the APAC Code Evaluation Project (1995–1998) reports for the applicable code category, or have the SASG score each code qualitatively based on input from the group.

Number of Code Users (maximum 20 points)
  Assign a score based on the SASG survey: 20 if five or more sites use the code; 15 if three or four sites; 10 if two sites; 5 if one site.

APAC-Recommended Code (maximum 10 points)
  Assign 10 if the code was recommended for use by APAC; 0 if not.

QA Status (maximum 10 points)
  Assign 0 for no or little documented QA; 5 for moderate QA; 10 for good to excellent QA.

User Interface (maximum 10 points)
  Based on a qualitative evaluation of the code's user interface, assign 0 for a poor interface; 5 for a good interface; 10 for a very good interface.

Origin of Code (maximum 5 points)
  Assign 5 if the code was developed by a DOE site; 2 for partial DOE site origin; 0 otherwise.

Code Sponsor (maximum 5 points)
  Assign 5 if the code is sponsored, at least in part, by DOE; 0 if not.
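To illustrate how the two-stage screen of Tables 8 and 9 fits together, the following sketch encodes the threshold test and the point assignments. The weights follow Table 9, but the input structure and helper names are hypothetical, since the SASG scoring worksheets themselves are not reproduced in this paper:

```python
# Illustrative sketch of the two-stage screen described in Tables 8 and 9.
from dataclasses import dataclass

@dataclass
class SurveyResponse:
    sites_using: int              # number of DOE sites reporting use
    meets_min_requirements: bool  # per the category's minimum requirements
    capability_score: int         # 1-40, from APAC data or SASG judgment
    apac_recommended: bool
    qa_level: str                 # "none", "moderate", or "good"
    interface_level: str          # "poor", "good", or "very good"
    doe_origin: str               # "full", "partial", or "none"
    doe_sponsored: bool

def passes_threshold(r: SurveyResponse) -> bool:
    """Table 8: the code must be used at two or more sites AND meet the
    minimum model requirements for its code category."""
    return r.sites_using >= 2 and r.meets_min_requirements

def user_base_points(sites: int) -> int:
    """Table 9 'Number of Code Users' criterion (maximum 20 points)."""
    if sites >= 5:
        return 20
    if sites >= 3:
        return 15
    if sites == 2:
        return 10
    return 5 if sites == 1 else 0

def quantitative_score(r: SurveyResponse) -> int:
    """Table 9 ranking; the criterion weights sum to 100 points."""
    qa = {"none": 0, "moderate": 5, "good": 10}[r.qa_level]
    ui = {"poor": 0, "good": 5, "very good": 10}[r.interface_level]
    origin = {"full": 5, "partial": 2, "none": 0}[r.doe_origin]
    return (r.capability_score                 # up to 40
            + user_base_points(r.sites_using)  # up to 20
            + (10 if r.apac_recommended else 0)
            + qa + ui + origin
            + (5 if r.doe_sponsored else 0))
```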


The second category omitted as outside the purview of the SASG is that of explosion (deflagration and detonation) computer codes. Several sites, notably Pantex and Lawrence Livermore National Laboratory, use specific codes to determine blast effects, explosive yield, and overpressure, among other consequences. However, no single workhorse code is used at enough locations to justify inclusion in the DOE toolbox. It should be noted that the APAC Working Group likewise avoided recommending specific computer codes because of the complexity of this area. The SASG expects each site requiring an explosion code to support its safety basis to demonstrate that the software has sufficient SQA measures in place.

Software Candidates for DOE Safety Analysis Toolbox

The high-use computer models determined from the DOE SQA Survey, and those recommended by the APAC Methodology Evaluation program, are listed in Table 10. If a computer model was high-use (two or more sites) and met minimum performance requirements as interpreted from the regulatory standards for its area, it was evaluated using the criteria discussed in Table 9. Examples of the applicable regulatory standards include Appendix A of DOE-STD-3009-94 for radiological dispersion and consequence codes; for fire zone modeling, DOE G-420.1/B-0 and G-440.1/E-0, the Implementation Guide for use with DOE Orders 420.1 and 440.1, Fire Safety Program, together with associated ASTM standard guides, provide the regulatory requirements for the analysis. (Other regulatory guidance may be determined for the remaining areas; it is planned to document all guidance in the subject report later this year.) The specific scores by criterion are not shown here, but will be documented later by the SASG.

The quantitative criteria are used to identify strengths and weaknesses of the software potentially earmarked for the DOE toolbox, so that any remedial actions identified by the SASG and recommended to the software owners can be prioritized. The quantitative evaluation process also facilitates impartial comparison of the APAC-recommended codes; apart from the Radiological Dispersion/Consequence Working Group, the APAC working groups did not previously perform quantitative evaluations. Finally, the quantitative criteria allow computer codes that were not reviewed by the APAC program to be considered in the SASG process.
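Continuing the sketch above with purely hypothetical sub-scores (the SASG's actual criterion-by-criterion scores are not published in this paper), a candidate code would be screened and scored as follows:

```python
# Hypothetical example only; the sub-scores below are illustrative,
# not the SASG's actual evaluation of any code.
candidate = SurveyResponse(
    sites_using=5, meets_min_requirements=True,
    capability_score=35, apac_recommended=True,
    qa_level="moderate", interface_level="good",
    doe_origin="partial", doe_sponsored=True,
)
if passes_threshold(candidate):
    print(quantitative_score(candidate))  # 35+20+10+5+5+2+5 = 82
```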

By phenomenological area, Table 10 highlights the codes that are candidates for the DOE Safety Analysis Toolbox: MACCS/MACCS2, GENII, ALOHA, EPIcode, CFAST, and MELCOR.

Note that the current list contains six models, since both ALOHA and EPIcode have source term and subsequent dispersion modules. Under the special-purpose category, the proprietary code FLUENT is used at multiple locations to examine mixing phenomena and multidimensional effects; it is not yet clear whether there is sufficient justification to include FLUENT in the initial collection of computer models.

Table 10. Survey Code Screening

Computer Code     Threshold Criteria                            Quantitative
                  Usage              Meets Minimum              Criteria Total
                  (≥2 sites = 1)     Requirements (Yes = 1)

Radiological Dispersion and Consequence Analysis
MATHEW/ADPIC           0                  1                          80
MACCS/MACCS2           1                  1                          86
GENII                  1                  1                          69
HOTSPOT                1                  0                          64
UFOTRI                 0                  1                          32

Chemical Dispersion and Consequence Analysis
ALOHA                  1                  1                          75
DEGADIS                0                  1                          50
HGSYSTEM               0                  1                          67
SLAB                   0                  1                          59
EPIcode                1                  1                          65

Fire (Zone Models)
CFAST                  1                  1                          65
FIRAC/FIRIN            0                  1                          70
FPETool                0                  1                          40

In-Facility Transport
CONTAIN                0                  1                          40
MELCOR                 1                  1                          80
GASFLOW                0                  1                          65

Chemical Release & Spills
ALOHA                  1                  1                          70
CASRAM                 0                  1                          50
EPIcode                1                  1                          65
HGSYSTEM               0                  1                          57

Other Special Purpose
FLUENT                 1                 N/A                      See text.

Path Forward for Development of a DOE Safety Analysis Toolbox: Issues

The issues cited in TECH-25 are Complex-wide, and thus any response, including organization of a SASG and the steps toward formulating a toolbox, must also have widespread support among DOE, its laboratories, and nuclear facility contractors. With this in mind, the SASG has been formed with representation from key NNSA-DP sites, laboratories, and DOE field and line organizations.

Initial work has reviewed the results of the DOE SQA Survey and rebaselined the earlier findings of the DOE Accident Phenomenology and Consequence program against updated screening criteria. Six computer codes, MACCS/MACCS2, GENII, ALOHA, EPIcode, CFAST, and MELCOR, have been identified through this process as high-use software applied to satisfy minimum requirements for accident analysis. These computer codes are the initial candidates for the DOE Safety Analysis Toolbox.

Additional steps still must be taken. Included are:

  1. Extensive examination of the SQA status of each of the six codes, and completion of the documentation containing justification for code designation as toolbox software.
  2. Formal establishment of the DOE Safety Analysis Toolbox, with protocol for accessing codes, maintaining configuration control, addressing user issues, and updating and correcting software.
  3. Determination of whether the SASG should be made into a standing body, transitioned to another status, or discontinued.

The above steps are to be completed by the end of fiscal year 2001. However, it is recognized that several significant issues must still be resolved through cooperation among the SASG, code maintainers, DOE NNSA-DP organizations, and nuclear facility safety contractors. The major issues concern satisfying minimum SQA, augmenting training, and costs/resources.

Process for Satisfying Minimum SQA: Each of the computer programs nominated for the toolbox is lacking, at least in part, in software adequacy, and most of the associated SQA was added after, rather than during, initial development. While a supplemental SQA program may demand more resources than can readily be borne by a single sponsor or maintainer organization, the SASG advises a targeted effort focusing on key areas. Such an approach is recommended in TECH-25 and can be guided by the a posteriori activities defined in ANSI and IEEE software quality assurance standards (ANSI/ANS, 1987; IEEE, 1984); both standards contain sections addressing software developed outside software "good practices."
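As a simple illustration of such a targeted, after-the-fact effort, the sketch below tracks outstanding SQA activities for a legacy code. The activity names are examples of the kinds of items the cited standards address, not quotations from ANSI/ANS-10.4 or the IEEE standard:

```python
# Illustrative only: a checklist for "a posteriori" SQA on a legacy code.
# Activity names are hypothetical examples, not quoted from any standard.
LEGACY_SQA_ACTIVITIES = [
    "reconstruct software requirements documentation",
    "verify models against benchmark/test problems",
    "place source and executables under configuration control",
    "establish error reporting and corrective action tracking",
    "document assumptions, limitations, and range of applicability",
]

def sqa_gap_report(completed: set) -> list:
    """Return the SQA activities still outstanding for a toolbox candidate."""
    return [a for a in LEGACY_SQA_ACTIVITIES if a not in completed]

# e.g., a code with only configuration control already in place:
print(sqa_gap_report({"place source and executables under configuration control"}))
```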

Augmenting Training on Use of Safety Analysis Software: The DOE Safety Analysis Working Group conducts an annual workshop and has increased the attention given to training. The emphasis has been particularly strong in providing code-specific training in uses and applications appropriate to many of the computer models reviewed earlier by the APAC program, and more recently by the SASG. However, nearly all training modules lack the formalism of enabling objectives and graded testing; the SASG will evaluate the need and requirements for formal training later this year.

Resources and Organizational Interaction Relative to Toolbox Maintenance: Once institutionalized, the toolbox codes will require one to two full-time equivalents per year to maintain and to oversee timely disposition of user questions and issues. The manner of interaction between the toolbox maintainer and the code owners, and the allocation of resources, will deserve particular attention, since the codes of interest span the full spectrum from public domain to proprietary.

Path Forward for Development of a DOE Safety Analysis Toolbox: Opportunities

The Safety Analysis Toolbox concept, under the oversight of the SASG, will provide significant cost and resource savings when viewed in the context of safety analysis and associated authorization basis documentation at numerous DOE laboratories and sites. While startup and initial activities may prove difficult, the payoff is appreciable; the concept represents a fundamental paradigm shift from the current general pattern of site-by-site code control.

Chief among the benefits will be:

Most of the SASG activities will conclude within a year of its formation. Initial toolbox code identification and preliminary plans for code maintenance should be in place by the end of CY 2001.

Release to and Use by Third Parties

As it pertains to releases of this document to third parties, and the use of or reference to this document by such third parties in whole or in part, neither WSMS, WSRC, DOE, nor their respective officers, directors, employees, agents, consultants or personal services contractors (i) make any warranty, expressed or implied, (ii) assume any legal liability or responsibility for the accuracy, completeness, or usefulness of any information, apparatus, product or process disclosed herein, or (iii) represent that use of the same will not infringe privately owned rights. Reference herein to any specific commercial product, process, or service by trademark, name, manufacturer, or otherwise does not necessarily constitute or imply endorsement, recommendation, or favoring of the same by WSMS, WSRC, DOE or their respective officers, directors, employees, agents, consultants or personal services contractors. The views and opinions of the authors expressed herein do not necessarily state or reflect those of the United States Government or any agency thereof.

References

  1. ANSI/ANS (1987). American National Standards Institute/American Nuclear Society, Guidelines for the Verification and Validation of Scientific and Engineering Computer Programs for the Nuclear Industry, ANSI/ANS-10.4-1987, La Grange Park, IL (1987).
  2. DNFSB, Defense Nuclear Facilities Safety Board (2000). Quality Assurance for Safety-Related Software at Department of Energy Defense Nuclear Facilities, Technical Report DNFSB/TECH-25 (January 2000).
  3. DOE, U.S. Department of Energy (1994a). Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, DOE-STD-3009-94 (1994).
  4. DOE, U.S. Department of Energy (1994b). Airborne Release Fractions/Rates and Respirable Fractions for Nonreactor Nuclear Facilities, DOE-HDBK-3010-94 (1994).
  5. DOE, U.S. Department of Energy (2000a). Appendix A, Evaluation Guideline, DOE-STD-3009-94, Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports (January 2000).
  6. DOE, U.S. Department of Energy (2000b). Quality Assurance for Safety-Related Software at Department of Energy Defense Nuclear Facilities, DOE Response to TECH-25, letter and report (October 2000).
  7. IEEE (1984). IEEE Standard for Software Quality Assurance Plans, ANSI/IEEE Std 730-1984, New York (1984).
  8. O’Kula, K. R., et al. (1997). Evaluation of DOE Accident Phenomenology & Consequence Methodologies: Findings, Recommendations, and Path Forward (U), Proceedings of the Energy Facility Contractors Group (EFCOG) Safety Analysis Workshop, Oakland, CA (1997).
  9. USNRC (1998). Nuclear Fuel Cycle Accident Analysis Handbook, NUREG/CR-6410, U.S. Nuclear Regulatory Commission, Washington, DC (1998).