U.S. Department of Energy GUIDE
Washington, D.C.
DOE G 413.3-9
9-23-08

U.S. DEPARTMENT OF ENERGY PROJECT REVIEW GUIDE FOR CAPITAL ASSET PROJECTS

[This Guide describes suggested non-mandatory approaches for meeting requirements. Guides are not requirements documents and are not to be construed as requirements in any audit or appraisal for compliance with the parent Policy, Order, Notice, or Manual.]

FOREWORD

This Department of Energy Guide is for use by all DOE elements. This Guide provides acceptable approaches for implementing the project review requirements and criteria of DOE O 413.3A, Program and Project Management for the Acquisition of Capital Assets, dated 7-28-07, related to the development, methodology, and implementation of a project review and evaluation process. This Guide describes suggested non-mandatory approaches for meeting requirements. DOE Guides are part of the DOE Directives System and are issued to provide supplemental information and additional guidance regarding the Department's expectations of its requirements as contained in rules, Orders, Notices, and regulatory standards. Guides may also provide acceptable methods for implementing these requirements. Guides are not substitutes for requirements, nor do they replace technical standards that are used to describe established practices and procedures for implementing requirements.

1.0 Purpose

This guide is a tool for Federal Project Directors (FPDs) and Federal Program Managers in planning and preparing for appropriate reviews, as discussed below, as an integral part of the project planning and implementation cycle. Reviews and evaluations are essential to maintain confidence that project management processes and technical efforts are integrated and effectively coordinated for the Department of Energy (DOE).
Reviews provide peer review and subject matter expert opinion and feedback on a project's readiness to proceed to the next stage in the project decision-making process (see the Critical Decision Process in DOE O 413.3A). Reviews and evaluations are performed by several levels of management at various points in the life cycle of a project, including the project's Initiation, Definition, Execution, and Transition/Closeout phases. Reviews and evaluations should be planned and structured using a tailored, graded approach that considers project-specific attributes, including review/decision objectives, scope, project size, cost, and technical complexity; findings from previous reviews; and emerging or intervening issues that may not have been previously present, defined, or evaluated. Reviews and evaluations conducted during the Initiation and Definition phases verify that projects support the Department's mission goals and strategic plans and can be successfully performed within the funding range given applicable conditions such as site and installation conditions, safety and security requirements, and the content of NEPA Site-wide Environmental Impact Statements (SWEIS). During these phases the review process also evaluates technology alternatives and maturation levels before granting approval for the project to proceed into design and execution. It is also the time to conduct reviews that verify the project scope is mature and well defined before proceeding to baseline and execute the project.
The purposes of reviews and evaluations during the Execution and Transition/Closeout phases are to: (a) support validation of the project technical scope, cost, and schedule baseline; (b) ensure that the project is being successfully executed according to plans and within established cost baselines; (c) ensure understanding of the project risks and management strategies; (d) provide recommendations for improving the project's technical scope, schedule, and cost performance; (e) support the project process by developing recommendations and the supporting data needed to decide whether to proceed with subsequent project life-cycle phases; and (f) ensure agreed-upon project products and deliverables are being provided. All aspects of the review and evaluation (assessment) process allow opportunities for continuous improvement through a feedback process occurring at critical decision points. Information feedback is typically provided on topics including the adequacy of project controls, opportunities for improving work definition and planning, the adequacy of risk management evaluation and planning, the applicability of lessons learned across the Department, plans for and the probability of internal and external independent oversight, the impacts and likelihood of possible regulatory enforcement actions, and the likelihood of proposed project funding profiles given planning and budgeting for competing interests in each proposed funding year.

2.0 Scope

This guide addresses the various types of project reviews that are conducted during the life cycle of a project based on the stage, complexity, and duration of the project. It describes typical reviews for DOE projects, the purpose of each review, the timing during the project life-cycle stages, lines of inquiry, and the documentation. This guide also includes a general discussion of the review process, review participants, and the qualification of reviewers.
The guide incorporates lessons learned from recent project reviews and recommendations from recent studies of DOE's project management processes by the National Research Council, Government Accountability Office (GAO), and the Inspector General. One or more of the following types/categories of reviews are performed in support of DOE projects:

• Regular/Periodic. Involve project status, trends, quality assurance, and design and construction progress for systems and interfaces. These reviews include monthly reviews, quarterly reviews, peer reviews for development work, and so forth. All are an integral part of ongoing project activities.

• Areas of Special Concern/Risk-Driven Reviews. Involve critical technology assessments, hazards, special procurements, high risks, etc. Some of these reviews can be planned and budgeted in advance; others are conducted on an as-needed basis.

• Event-Driven/Decision Point. Involve mission validation, safety analysis, conceptual design review and assessment, baseline validation, and critical decision readiness reviews. These reviews are necessary to obtain approval to proceed to follow-on project phases. They are an integral part of a project and are planned in advance; most are performed by independent entities.

• Unscheduled. Could involve the Government Accountability Office, Nuclear Regulatory Commission (NRC), Defense Nuclear Facilities Safety Board (DNFSB), DOE Headquarters, or the user. Generally performed on projects with high congressional visibility or projects that experience schedule or cost difficulties. For large, visible projects, these reviews may be anticipated and planned, and should include both schedule and cost components.

• Status Reviews. Performed to determine the current condition of a project activity, for example, progress toward completion, compliance status, or readiness to proceed.
Reviews could cover items (project baselines, requirements, subsystems, project end products) or activities (planning, design, or construction). These reviews can involve management and/or the user. Products from these reviews include review plans, review reports, action item lists, and action item resolution reports.

• Design Reviews. Design reviews determine whether a product (drawings, analyses, or specifications) is correct and will perform its intended functions and meet requirements. Design reviews are an integral part of the project. Beginning at Critical Decision-1 (CD-1), Alternative Selection and Cost Range, and continuing through the life of the project, as appropriate, design reviews are performed by individuals internal and external to the project. Design reviews should be conducted for all projects and should involve a formalized, structured approach to ensure the reviews are comprehensive, objective, and documented.

This guide provides an overview of various project reviews within the aforementioned types of reviews that may occur during the project life cycle. These reviews include:

1. Performance Reviews
2. Independent Project Reviews (IPRs)
3. External Independent Reviews (EIRs)
4. Independent Cost Reviews
5. Technical Reviews
6. Operational Readiness Reviews and Readiness Assessments

3.0 Discussion of Project Reviews

Reviews are essential for the FPD to maintain confidence that project systems, processes, and technical efforts are integrated, effectively coordinated, and provide the required information. Reviews also help ensure that the project is progressing at an effective and acceptable rate, particularly against established baselines. Although reviews are required at key junctures by the Critical Decision process, FPDs may also recommend project reviews at their discretion at any time if they encounter conditions that warrant special review. Each project has phases through which it evolves.
A clear understanding of these phases permits better control and use of resources in achieving goals. Regardless of size and complexity, project phases consist of Initiation, Definition, Execution, and Transition/Closeout. The following sections provide an overview of the various reviews.

3.1 Regular/Periodic and Status Reviews—Performance Reviews

Project management performance reviews presented to senior leadership are performed at least quarterly through the project phases or life cycle, or more frequently (such as monthly) when the project complexity, cost, or concerns warrant. These reviews provide a forum to communicate status and ensure continued support from senior executives within the Department. The reviews provide both an information exchange and more detailed information than that provided in status reports. After approval of CD-0, the Acquisition Executive or designee should begin holding quarterly progress reviews, continuing through the approval of CD-4, for projects with a Total Project Cost or Environmental Management Total Project Cost greater than or equal to $5M. The Secretarial Acquisition Executive may delegate quarterly reviews for Major System Projects to the Under Secretaries. For Environmental Management Clean-Up Projects, quarterly reviews may be delegated to the Program Secretarial Officer. A performance review can take many forms; generally, the Federal Project Director presents the current program/project status. Performance reviews allow improved communication of detailed project information. These meetings provide opportunities to respond to questions, discuss future activities, identify needed support, and discuss actions by external entities influencing the project (e.g., the Office of Management and Budget, the U.S. Environmental Protection Agency, Congress, the Defense Nuclear Facilities Safety Board, and others).
Finally, these meetings are a forum for identifying, discussing, and resolving issues (or assigning actions) before they become problems. Performance reviews should use a graded approach tailored to project-specific attributes, review/decision objectives, project status, size, and complexity.

3.2 Event Driven Reviews—Independent Project Reviews (IPRs) and External Independent Reviews (EIRs)

3.2.1 Independent Project Reviews

The overall purpose of Independent Project Reviews is to determine, through a non-proponent body, whether the scope of programs, projects, or activities; the underlying assumptions regarding technology and management; the technical, cost, and schedule baselines; safety and security; and the contingency provisions are valid and credible within the budgetary and administrative constraints under which DOE functions. IPRs assist in the process of risk management by ensuring that existing and potential problems are identified in a timely manner so that an adequate resolution is possible with minimum adverse impact to the project baselines. IPRs may also be scheduled to meet a specific objective, such as a budget validation, nuclear safety and security, a technology readiness assessment, or a Critical Decision request. Some IPRs are mandatory, as delineated in DOE O 413.3A (see Section 5.0, Table 3). The scope of an IPR depends on the type of review, the cost and complexity of the project, and the project's current status. Another form of IPR is the Technical Independent Project Review, conducted to reduce technical risk and uncertainty. These are discussed in more detail in Section 3.4, Areas of Special Concern and Design Reviews/Risk Driven Reviews—Technical IPRs. Non-proponents of the project, outside of the specific program and the project being reviewed, conduct an Independent Project Review.
The Deputy Secretary as the SAE, the Program Secretarial Officer, the Operations/Field Office Manager, Program Managers, and Federal Project Directors can authorize Independent Project Reviews. IPRs can be conducted on all projects under DOE O 413.3A, that is, projects with a TPC equal to or greater than $5M.

3.2.1.1 Project Definition Assessments/Project Definition Rating Index (PDRI). The Project Definition Rating Index (PDRI) is a project management tool that the National Nuclear Security Administration (NNSA) and the Office of Environmental Management (EM) use when conducting IPRs prior to CD-2 to assess how well the project scope is defined. The tool is designed to assist project planners in increasing the likelihood of project success by improving project definition. OECM will use the same methodology to supplement the CD-2 EIR process prior to validating the project baseline, to assess whether a consistent and sufficient level of front-end planning has occurred before a project baseline is established. The PDRI tool was originated by the Construction Industry Institute to improve the success ratio of its projects, that is, to achieve project objectives within the original baseline budget and schedule. Essentially, the tool uses a numeric assessment that rates (i.e., scores) a wide range of project elements to determine how well the project is defined. For a better understanding of the implementation of the tool, the following references are provided: NNSA, Office of Project Management and System Support, NNSA Project Definition Rating Index (PDRI) Manual, Revision 0, June 2008; DOE, Office of Environmental Management, Project Definition Rating Index (EM-PDRI) Manual, Revision 1, February 2001. The principal purpose of the PDRI tool is to assist the IPTs by identifying key engineering and design elements that are critical to a well-defined scope at various phases of a project.
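The weighted numeric assessment used by PDRI-style tools can be sketched as follows. This is a minimal illustration only: the element names, weights, and the 0-5 rating convention here are hypothetical and are not taken from the NNSA or EM PDRI manuals, which define their own element lists, weights, and scoring scales.

```python
# Illustrative PDRI-style scoring sketch. Element names, weights, and the
# 0 (fully defined) to 5 (incomplete/undefined) rating scale below are
# hypothetical placeholders, not values from the NNSA or EM-PDRI manuals.
# A lower total score indicates a better-defined project scope.

def pdri_score(ratings, weights):
    """Weighted sum of element ratings; lower means better definition."""
    missing = set(weights) - set(ratings)
    if missing:
        raise ValueError(f"unrated elements: {sorted(missing)}")
    return sum(weights[element] * ratings[element] for element in weights)

# Hypothetical scope elements and weights, for illustration only.
WEIGHTS = {
    "Design Criteria": 8,
    "Hazard Analysis": 10,
    "Safeguards and Security": 6,
    "Constructability": 5,
}

ratings = {  # 0 = fully defined ... 5 = incomplete/undefined
    "Design Criteria": 1,
    "Hazard Analysis": 2,
    "Safeguards and Security": 1,
    "Constructability": 3,
}

score = pdri_score(ratings, WEIGHTS)
worst_possible = pdri_score({e: 5 for e in WEIGHTS}, WEIGHTS)
```

A review team comparing `score` against the worst possible total (or against a program-defined threshold) gets a rough, repeatable indicator of scope definition to discuss at a pre-CD-2 IPR.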
For example, it assists in identifying staffing requirements at each project phase; reporting progress on project definition at Quarterly Progress Reviews; assessing readiness for internal and external project reviews; and supporting the Acquisition Executive in approving Critical Decisions. The PDRI review tool focuses on key project scope elements (e.g., Design Criteria, Design Drawings, Safeguards and Security, Hazard Analysis, Quality Assurance requirements, Heat and Material Balances, Constructability, etc.) that the PDRI review team believes will significantly improve project definition if adequately developed, particularly during the up-front planning performed prior to CD-1.

3.2.2 External Independent Reviews

The purpose of an EIR, which is conducted prior to approval of CD-2 (Approve Performance Baseline), is to validate that the project can be executed to the proposed performance baseline (scope, cost, and schedule). Performance Baseline EIRs were first mandated by congressional language in 1998, and numerous times thereafter, to ensure the validity of DOE's performance baselines prior to construction budget requests. As directed by DOE O 413.3A, for all projects with a Total Project Cost (TPC) or Environmental Management TPC greater than or equal to $100M, OECM is responsible for conducting the EIR.
A recent protocol approved by OECM and the Office of Environmental Management (Reference: Memorandum, Protocol for Environmental Management Cleanup Projects, dated April 24, 2007) tailors this requirement for EM cleanup as follows: "An EIR will be conducted on the near-term baseline (see Section 6.1 for the tailored concept near-term baseline) if its cost is equal or greater than $250M, otherwise an IPR will be conducted." Independent Project Reviews are conducted by the Project Management Support Office to validate the Performance Baseline for projects with a TPC less than $100M and greater than $5M (for EM cleanup projects, the IPR applies to projects with a TPC less than $250M and greater than $5M). EIRs are more than just a review of cost and schedule (the same applies to an equivalent IPR). They support validation that:

• costs and schedules are firmly supported by sound underlying planning and technical assumptions;
• designs are sufficiently mature (which includes technology readiness, nuclear safety, security, and quality assurance);
• the number, skill set, and effectiveness of the Integrated Project Teams (IPTs) are appropriate to successfully execute the project; and
• the Acquisition Strategy is appropriate and enhances project delivery.

A second common type of EIR is the Construction/Execution Readiness External Independent Review that supports Critical Decision-3 for Major System Projects. DOE O 413.3A requires that OECM perform an EIR for all Major System Projects prior to authorization of CD-3. The purpose of this EIR is to assess readiness for construction and/or execution and to re-confirm the completeness and accuracy of the performance baseline. In addition to the review elements of the performance baseline review, this EIR focuses on the final drawings, specifications, and construction/execution planning.
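The CD-2 baseline-validation routing described above (EIR at or above $100M, or $250M for EM cleanup under the April 2007 protocol; IPR between $5M and those floors) can be summarized in a short sketch. The function name and return strings are illustrative, not part of any DOE system.

```python
# Sketch of CD-2 performance-baseline review routing per DOE O 413.3A and
# the April 2007 EM cleanup protocol, as described in this guide. The
# function name and return values are illustrative placeholders.

def baseline_validation_review(tpc_millions, em_cleanup=False):
    """Return which review validates the performance baseline, by TPC ($M)."""
    if tpc_millions < 5:
        return None  # below the DOE O 413.3A project threshold
    eir_floor = 250 if em_cleanup else 100  # EM cleanup uses the $250M floor
    if tpc_millions >= eir_floor:
        return "EIR conducted by OECM"
    return "IPR conducted by the Project Management Support Office"
```

For example, a $120M standard project draws an OECM EIR, while a $120M EM cleanup project draws a PMSO IPR under the tailored protocol.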
A similar Construction/Execution Readiness IPR must be performed by the appropriate Program Secretarial Officer for Non-Major System Projects unless justification is provided and a waiver is granted by the Acquisition Executive.

3.3 Event Driven Reviews—Front End Planning Reviews, Operational Readiness Reviews (ORR), and Readiness Assessments

Front-End Planning Reviews may be requested by the Acquisition Executive (AE) and/or the Program. The purpose of these reviews, normally conducted sometime between CD-0 and CD-2, is to evaluate project assumptions, validate project requirements, evaluate project scope definition, assess the project risk analysis, verify the IPT's project knowledge and participation, and foster quality assurance, to name a few areas. At times it is appropriate to conduct these types of reviews in advance of a Performance Baseline Review (particularly for a Major System Acquisition) to provide a readiness assessment for conducting a CD-2 EIR. An Operational Readiness Review is an in-depth, independent evaluation of the readiness of completed facilities, systems, equipment, procedures, personnel, and supporting and interfacing systems and organizations to begin facility operation. In the case of a facility project, the review focuses on the readiness details associated with turning the facility over to the user, including but not limited to final startup and the testing and balancing of mechanical systems. Because of the importance of this activity, Operational Readiness Review planning is initiated early in a project's life cycle. Readiness Assessments (RAs) are conducted on nuclear projects that do not require an ORR. An ORR or RA is conducted in accordance with DOE O 425.1C, Startup and Restart of Nuclear Facilities, and DOE STD 3006-2000, Planning and Conduct of Operational Readiness Reviews.
3.4 Areas of Special Concern and Design Reviews/Risk-Driven Reviews—Technical Independent Project Reviews

Technical Independent Project Reviews are one of the measures that can be taken to ensure the timely resolution of engineering, system integration, technology readiness assessment, design, quality assurance, operations, maintenance, and nuclear safety issues. The purpose of a technical review is to reduce technical risk and uncertainty. Technical risk reduction increases the probability of successful implementation of the technical scope. Design Reviews may be conducted independently of IPRs, and the qualified technical personnel involved can be internal or external to the project. The results of Design Reviews are examined by IPRs and Technical IPRs when relevant to the scope of the specific review being conducted, such as in support of the Critical Decision Process. Technical reviews are necessary when uncertainty exists concerning the outcome of a key project decision. As part of the Design Review requirements and the Graded Approach for Quality for high-risk, high-hazard, and Hazard Category 1, 2, and 3 nuclear facilities, DOE O 413.3A requires that a Technical Independent Project Review be conducted prior to CD-1, the focus of which is to determine that the safety documentation is sufficiently conservative and bounding to be relied upon in the next phase of the project. DOE-STD-1189 was developed to provide the Department's expectations for incorporating safety early into the design process for new, or major modifications to, DOE Hazard Category 1, 2, and 3 nuclear facilities, whose intended purpose involves the handling of hazardous materials, both radiological and chemical, in a way that provides adequate protection for the public, workers, and the environment.
The approach of the Technical Independent Project Review for these facilities must ensure that the safety aspects of the design are thoroughly investigated (it should not be permissible to tailor the technical review for these types of facilities to exclude safety-related aspects of the design; otherwise, the objectives of DOE-STD-1189 could not be met). Another example where a technical review can be useful is when a process technology or special equipment (such as one-of-a-kind equipment in development from concept to engineering design) is untried or unproven and no standards exist against which judgments regarding viability can be made; in that case, an in-depth review by appropriately trained and knowledgeable peers is in order. Other types of Technical Independent Project Reviews can include:

• Alternative Systems
• Constructability
• Functions and Requirements
• Project Definition (Scope) Assessment
• Design (at all stages of design status)
• Technology Readiness Assessment (TRA)
• System Verification (as part of System Engineering)
• Physical Configuration
• Test Readiness
• Safety and Security
• Functional Configuration
• Operability and Reliability, Availability, and Maintainability

3.4.1 Technology Readiness Assessments (TRAs) and Technology Maturation Plans (TMPs)

Technical Independent Project Reviews can include TRAs to assess the maturity level of a proposed new technology prior to its insertion into the project design and execution phases, reducing technical risk and uncertainty. A TRA provides a snapshot in time of the maturity of technologies and their readiness for insertion into the project design and execution schedule. A TMP is a planning document that details the steps necessary to develop technologies that are less mature than desired to the point where they are ready for project insertion.
TRAs and TMPs are effective management tools for reducing technical risk and minimizing the potential for technology-driven cost increases and schedule delays. A TRA evaluates technology maturity using the Technology Readiness Level (TRL) scale that was pioneered by the National Aeronautics and Space Administration (NASA) in the 1980s. The TRL scale ranges from 1 (basic principles observed) through 9 (total system used successfully in project operations). See Figure 1 for a schematic of the meaning of the TRLs in the context of DOE EM projects. In 1999, the General Accounting Office (GAO/NSIAD-99-162) recommended that the DoD adopt NASA's TRLs as a means of assessing technology maturity prior to transition. In 2001, the Deputy Under Secretary of Defense for Science and Technology issued a memorandum endorsing the use of TRLs in new major programs. Subsequently, the DoD developed detailed guidance for performing TRAs using TRLs in the 2003 DoD Technology Readiness Assessment Deskbook (updated in May 2005 [DOD 2005]). Legislation in 2006 specified that the DoD must certify to Congress that a technology has been demonstrated in a relevant environment (TRL 6) prior to the transition of weapons system technologies to design, or justify any waivers. TRL 6 is also often used by NASA as the level required for technology insertion into design. In March 2007, the GAO recommended that DOE adopt the NASA/DoD methodology for evaluating technology maturity. In March 2008, the DOE Office of Environmental Management issued a process guide to assist individuals and review teams involved in conducting TRAs and TMPs on environmental restoration projects where new technologies are proposed for insertion into the project (Reference: DOE, Office of Environmental Management, Technology Readiness Assessment (TRA)/Technology Maturation Plan (TMP) Process Guide, March 2008).
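A TRA's core check, whether every Critical Technology Element (CTE) has reached the TRL required for the upcoming milestone, can be sketched in a few lines. The CTE names and assessed levels below are hypothetical; only the 1-9 scale endpoints reflect the TRL definitions given above.

```python
# Illustrative TRL gate check for a TRA. The scale endpoints follow the
# 1-9 TRL scale summarized in this guide; the CTE names and assessed
# levels are hypothetical examples, not data from any real project.

TRL_SCALE_ENDPOINTS = {
    1: "basic principles observed",
    9: "total system used successfully in project operations",
}

def immature_ctes(cte_trls, required_trl):
    """Return the CTEs whose assessed TRL is below the required level."""
    return sorted(name for name, trl in cte_trls.items() if trl < required_trl)

# Hypothetical CTE assessments for a notional waste-processing project.
ctes = {"melter feed system": 6, "off-gas treatment": 4, "remote handling": 7}

# The guide recommends TRL 6 before insertion into detailed design (CD-2).
gaps_at_cd2 = immature_ctes(ctes, required_trl=6)
```

Any CTE returned by such a check would become the subject of a TMP entry detailing the maturation steps needed before insertion into the design.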
3.4.1.1 The Relationship of TRAs and TMPs to the DOE Critical Decision Process

The TRA/TMP process can be employed in a variety of situations requiring a determination of the state of technology development. In project management, TRAs and TMPs can be used to reduce the technical and cost risks associated with introducing new technologies into a project. For example, this management tool can support the Critical Decision Process when assessing the readiness and maturity level of a new technology before the project proceeds to its next development phase. While the TRA/TMP process is not currently required by DOE O 413.3A, it can serve as one of the tools that help make the Critical Decisions required by DOE O 413.3A effective. The five Critical Decisions are major milestones approved by the Secretarial Acquisition Executive or Acquisition Executive that establish the mission need, recommended alternative, Acquisition Strategy, Performance Baseline, and other essential elements required to ensure that the project meets applicable mission, design, security, and safety requirements. Each Critical Decision marks an increase in the commitment of resources by the Department and requires successful completion of the preceding phase or Critical Decision. Collectively, the Critical Decisions affirm the following:

• There is a need that cannot be met through other than material means [CD-0];
• The selected alternative and approach is the optimum solution [CD-1];
• The proposed scope, schedule, and cost baseline is achievable [CD-2];
• The project is ready for implementation [CD-3]; and
• The project is ready for turnover or transition to operations [CD-4].
The recommended guidance is to conduct TRAs during the conceptual design and preliminary design processes, and at least 90 days prior to CD milestones. The review process should follow a systems engineering approach to assess the proper integration of systems with new technologies into the project (systems within systems rather than a piecemeal review). Figure 2 shows how TRAs and other key reviews support each of the CDs. (There are numerous additional requirements for each CD. See Table 2 of DOE O 413.3A for a complete listing.)

Figure 1 – Schematic of DOE/EM Technology Readiness Levels (see the figure in the PDF file)

Figure 2 – Suggested Technology Assessments and Other Review Requirements for Critical Decisions (see the figure in the PDF file)

Note: The technology reviews, design reviews, and ORR are conducted in advance of the CD milestone to support the milestone decision. The TRL values (in parentheses) at each Critical Decision point are recommended values.

CD-0, Approve Mission Need: Identification of a mission-related need and translation of this gap into functional requirements for filling the need. The mission need is independent of a particular solution and should not be defined by equipment, facility, technological solution, or physical end item (DOE O 413.3A). The focus for technology assessment at this stage is on a clear statement of the requirements for the input and the desired output of the process. For waste processing, this would include characterization of the waste as well as definition of requirements for the processing and the waste form. A Technology Requirements Review would assess the adequacy of the requirements definition and characterization information and determine any additional work necessary, including an assessment of technology unknowns that need further evaluation. If additional work is necessary to adequately define the technical scope of the project, a plan should be developed detailing its scope and schedule.
CD-1, Alternative Selection and Cost Range: Identification of the preferred technological alternative, preparation of a conceptual design, and development of initial cost estimates. A TRA and a TMP should be performed during conceptual design to support the CD-1 approval process. The TRA and TMP should be linked to the project risk assessment process as a whole. Prior to CD-1 approval, it is recommended that all Critical Technology Elements (CTEs) of the design have reached at least TRL 4 and that a TMP have been prepared that details the strategies for bringing all CTEs to TRL 6, if possible.

CD-2, Performance Baseline: Completion of preliminary design and development of a performance baseline that contains a detailed scope, schedule, and cost estimate. The process of technology development, in accordance with the approved TMP, should support all CTEs reaching TRL 6; attainment of TRL 6 is preferable and indicates that the technology is ready for insertion into detailed design.

CD-3, Start of Construction: Completion of essentially all design and engineering and the beginning of construction, implementation, procurement, or fabrication. A TRA is required only if there is significant technology modification as detailed design work progresses. If substantial modification of a technology occurs, the TRA should be performed and a focused TMP developed or updated to ensure that the modified technology has attained TRL 6, if possible, prior to its insertion into the detailed design and baseline.

CD-4, Start of Operations: Readiness to operate and/or maintain the system, facility, or capability. Successful completion/operation corresponds to attainment of TRL 7/8.

3.5 Unscheduled/Stakeholder Driven Reviews—Independent Cost Reviews (ICRs)

Independent Cost Reviews are used primarily to verify project cost and schedule estimates and to support the Critical Decision-2 process in establishing project performance baselines.
Independent Cost Reviews are part of the Performance Baseline External Independent Review. However, an Independent Cost Review, or even an Independent Cost Estimate, may be requested at other times for other reasons. DOE O 413.3A requires the development of an Independent Cost Estimate (ICE) or an Independent Cost Review for Major System Projects as part of the Performance Baseline Validation EIR performed by the Office of Engineering and Construction Management. An ICE should be performed where complexity, risk, cost, or other factors create a significant cost exposure for DOE. For Major System Projects, the ICR or ICE required by DOE O 413.3A should be developed by OECM, in coordination with the DOE Office of Cost Analysis, as part of the performance cost baseline validation process. Also under the category of Unscheduled/Stakeholder Driven Reviews, the Acquisition Executive or PSO/Deputy Administrator can request any type or category of review for the project as deemed appropriate. These reviews could also be driven by stakeholders such as the Government Accountability Office, NRC, DNFSB, DOE Headquarters, or the user.

4.0 Overview of the EIR/IPR Planning Process

The EIR/IPR review process is normally collaborative, although the overall coordination of all review activities resides with OECM or the Project Management Support Office (PMSO), depending on the type and purpose of the review. During the planning phase, project background information is assembled for the review committee or team. The IPR team is always led by a DOE Federal employee. For an EIR, the team is led by an OECM representative (a Federal employee), but the review is usually conducted by an external group or contractor. Key project team points of contact at DOE Headquarters and in the field are identified. The proposed scope of the review is planned in coordination with the DOE proponent (program office), the program manager, and the Federal Project Director.
Because EIRs are meant to be "external" and "independent," the final scope of work is ultimately determined by OECM and its external stakeholders after receiving scope input from appropriate project stakeholders. After the scope of the review is determined, the subject matter expertise needed to staff the review team can be identified. OECM or the PMSO identifies and arranges for appropriate personnel to staff each review team, in consultation with the requesting organization (personnel are matched with the lines of expertise required). The review team develops a Review Plan to help coordinate its activities as it executes the review (see Appendix D for an example of a tailored EIR Review Plan and Appendix E for an example of an IPR Review Plan Template). Table 1 shows the Review Plan's main elements.

Table 1 – Review Plan Main Elements

Purpose – Primary reason for the review and type of review.
Background – Relationship of the project to the sponsoring DOE program element; project description and status.
Objectives – Scope of review and lines of inquiry. The scope of the review (tailored by the type of review and stage of the project) should address technical, cost, and schedule baselines, including the management factors and acquisition approach used to justify the project and develop its scope. It should also address risk, safety, and security management approaches.
Problem/Issues – Key obstacles the project faces.
Review Logistics – Review process, including dates and site visit schedule.
Resources/Team Members – Size and configuration of the review team; biographies of team members and assignments for the review.
Documentation – Project documentation to be prepared and made available prior to or at the review.
Budget – Possible budget issues for funding the review (not included in EIR plan).
Detailed Schedule – Primary activities in the review, with completion dates for these activities.
The typical activities in a review schedule and their associated timing are shown in Table 2. The standard deliverables of a review, in addition to the Review Plan, include the Closeout Report (exit briefing), the Draft Review Report, and the Final Report.

Team Member Selection

Each review team is configured to satisfy the unique purpose of the review. It is critical that the individuals selected to perform independent reviews be technically qualified/certified and credible and possess indisputable integrity and independence (contractors can be employed). All members of the review team should have excellent track records of performance in the technical areas assigned for review and recognized professional credibility. The reviewers should have no affiliation with the project being reviewed and should be as independent as possible. In addition, reviewers should not be drawn from the responsible program office within the Program Secretarial Office, related contractors from the project office, or a related funding office. The EIR/IPR review chairperson is always a DOE Federal employee, usually from OECM or the Project Management Support Office, as applicable. Team members can be selected from experts at national laboratories, universities, and private industry (contractors), and from Federal employees at other sites or offices. EIR team members should be independent of the project to be reviewed (i.e., have no interest and/or equity in it). The range of disciplines involved may include project-relevant technical disciplines, project management, contract systems, cost engineering, and Environment, Safety, Quality, and Health.

Table 2 – Typical Timeline for a Performance Baseline External Independent Review

See the table in the PDF file.

5.0 Required and Optional Project Reviews

This section provides the scope and project documentation for reviews that may occur during a DOE project's life cycle.
Table 3 provides a summary of the reviews, in the order they normally occur, and the responsible organizations.

Table 3 – Types of Project Reviews in Support of Critical Decision Milestones (DOE O 413.3A)

See the table in the PDF file.

Note 1: The Acquisition Executive may request an EIR in lieu of an IPR through OECM, and must do so if the Acquisition Executive has no Project Management Support Office to perform the review.

5.1 Mission Need Review

A Mission Need Review, conducted at the direction of the PSO/Deputy Administrator, assesses whether the Mission Need Statement translates into functional requirements the performance gap between current capabilities and the capabilities required to achieve the goals in the strategic plan. The Mission Need Statement should describe the general parameters of the project and why it is critical to the overall accomplishment of the Department's mission, including the benefits to be realized. The mission need is independent of a particular solution and should not be defined by equipment, facility, technological solution, or physical end-item, so as to allow exploration of a variety of potential solutions. This review should be conducted for Major System Projects.

A. Scope of Review

Key review elements for a Mission Need Review include:

• Mission Need Statement. Assess the adequacy of documentation confirming that the new project provides a specific capability that the Department currently lacks to meet its assigned mission.
• Program/Mission Requirements. Assess whether high-level requirements are sufficiently defined to identify potential alternatives (to be analyzed in the next phase) that are both applicable and capable of meeting project goals.
• Total Project Cost and Schedule Ranges. Review the basis of the rough order of magnitude cost range and assess whether this range reasonably bounds the cost and schedule of the alternatives to be analyzed in the next project phase.
Review the basis of the schedule range and assess whether the schedule is consistent with strategic requirements for when the project is required. Also, for projects closely linked to other projects, assess whether the schedule results in appropriate integration.

B. Required Documentation

The required documentation is prescribed by the Review Team as tailored to the specific project. A suggested, though not all-inclusive, list follows:

• Mission Need Statement
• Program Requirements Document
• Rough order of magnitude cost ranges and schedule
• Tailoring Strategy (if required)
• Pre-conceptual Integrated Safety Documentation as prescribed by DOE Standard 1189 – Integration of Safety into the Design Process (if applicable).

5.2 Alternative Selection and Cost Range (CD-1) Reviews

Except for Technical IPRs for Nuclear Facilities, which are required, CD-1 Reviews are optional IPRs that focus on the analysis supporting the selection of the preferred alternative, ensuring the system functions and requirements are defined, and ensuring the preliminary cost and schedule range is reasonable and justified. The Acquisition Strategy is also an integral part of the review. Technical IPRs for Nuclear Facilities at CD-1 should also focus on the safety documentation to determine that it is sufficiently conservative and bounding to be relied upon for the next phase of the project. A Design Review of the conceptual design should have been conducted by a qualified technical team, composed of members internal and external to the project, whose primary function is to determine whether the product (drawings, analyses, or specifications) is correct and will perform its intended functions and meet requirements.

A. Scope of Review

Key review elements for an Alternative Selection and Cost Range Review include:

• Alternative Analysis.
Assess whether the alternative selection process evaluates a range of appropriate attributes for each alternative, including cost, maintainability, safety, technology requirements, risks, and regulatory requirements. Assess whether the analysis process for recommending a preferred alternative is reasonable and provides best value to the government. Assess the results of the Design Review of the conceptual design to determine whether the design meets its intended function and will support the Critical Decision 1 process.
• System Functions. Assess whether functions and requirements are provided in sufficient detail that preliminary design can be initiated with an unambiguous statement of work.
• Acquisition Strategy. Assess whether the acquisition strategy has considered the full range of acquisition alternatives to achieve project objectives within specified constraints. Assess the Acquisition Strategy's ability to meet the mission need in the most effective, economical, and timely manner.
• Risk Management. Assess whether the key risks for the recommended alternative have been identified, with mitigation steps defined. Assess whether the preliminary cost and schedule estimates reflect the cost contingency and schedule contingency needed to address risks.
• Hazard Analysis. Assess whether the hazard analysis is comprehensive and identifies the key hazards and corresponding safety measures to be incorporated into the preliminary design. Technical IPRs and Design Reviews for Hazard Category 1, 2, and 3 facilities should identify substantial safety issues early in the design process, determine a satisfactory resolution for these issues, and ensure this work is accomplished prior to CD-1.
• Preliminary Cost and Schedule Estimates. Review the basis of the preliminary cost and schedule estimates for reasonableness and executability. Assess whether the preliminary cost and schedule estimates include cost contingency and schedule contingency appropriate for the project.

B.
Required Documentation

The required documentation is prescribed by the Review Team as tailored to the specific project. A suggested, though not all-inclusive, list follows:

• Conceptual Design Report (including Alternative Analysis, Hazard Analysis, site selection criteria, NEPA documentation, system functions and requirements, and preliminary cost and schedule estimates)
• Risk Management Assessment
• Acquisition Strategy
• Integrated safety documentation (as required by DOE Standard 1189 – Integration of Safety into the Design Process). Technical IPRs for Hazard Category 1, 2, and 3 nuclear facilities should not be tailored to exclude safety-related aspects of the design; otherwise, the objectives of DOE-STD-1189 could not be met.
• High Performance Sustainable Building considerations
• Preliminary Security Vulnerability Assessment Report
• Preliminary Hazard Analysis Report
• Environmental Documents
• Quality Assurance Plan

5.3 Performance Baseline Review (CD-2) and Baseline Change Proposals

This is an External Independent Review performed by OECM for projects with a TPC of $100M or greater, and by the PMSO for projects with a TPC less than $100M (unless otherwise requested by the Acquisition Executive). The primary purpose of this review is to support validation of the Performance Baseline and to provide reasonable assurance that the project can be successfully executed. For projects with a TPC of $100M or greater, OECM is responsible for developing and finalizing the scope of the External Independent Review. The draft scope of work (Review Plan) will be provided to the FPD and the project team one week after receipt of the required documentation, allowing the project team to identify specific activities to be covered in the review. The project team at the site will then have one week to review, comment, and provide recommendations on the scope of the review for OECM's consideration.
Below is a discussion of the 21 core elements that generally form the scope of the Performance Baseline review, as well as the required documentation for this review. Additional elements or lines of inquiry beyond those presented in this document may be included in the scope of the Performance Baseline review based on unique aspects of the project being reviewed. It is important to recognize that both the scope and the required documentation may vary for specific projects, depending on the type of project and any tailoring applied to the EIR (see Section 6 for further discussion of tailoring). On a project-by-project basis, one or more of the 21 core elements may also be deleted from the review. The focus areas will vary with each project.

A. Scope of Review

The following are the normal elements and standard Lines of Inquiry (LOIs) that an EIR team should address. Elements may be added or deleted during the EIR scoping process, and LOIs will be further clarified and documented in the Review Plan.

(1) Basis of Scope (as defined in the Work Breakdown Structure (WBS) and System Functions and Requirements)

• Assess whether the Work Breakdown Structure (WBS) and WBS dictionary incorporate all project work scope, and whether the defined work scope and system requirements are derived from and consistent with the approved Mission Need.
• Assess whether the Resource Loaded Schedule is consistent with the WBS for the project work scope.
• Assess whether the WBS represents a reasonable breakdown of the project work scope and whether it is product oriented.
• Identify and assess the basis for and reasonableness of key programmatic, economic, and project scope assumptions as related to the quality and completeness of the WBS, technical and design requirements, and risk management planning and contingency requirements.
• Identify all underlying technical assumptions and assess whether they are sound and/or appropriately addressed within the Risk Management Plan and adequately supported with funded contingency, particularly for new technologies that have never been developed and/or prototyped in the proposed environment.
• Assess whether it is reasonable to divide the work scope presented into more than one discrete project. If applicable, identify the basis for managing such discrete projects as an integrated program.
• Review the Program Requirements Document (PRD), or equivalent, and assess whether project planning reflects the PRD and is consistent with the Mission Need.
• Assess whether "design-to" functions are complete and have a sound technical basis. (The EIR Team should include in its assessment safety and external requirements such as permits, licenses, and regulatory approvals.)
• Assess whether the requirements have been defined well enough to establish a firm performance baseline.
• Assess whether the CD-4 (project completion) activities and requirements, and the project's key performance parameters (KPPs), are clearly defined in the Requirements Document. Assess whether these activities and requirements are sufficiently defined, under change control and not expected to change, quantified, measurable, and capable of being reasonably determined as complete. Identify the CD-4 requirements/activities/KPPs in a separate table in the EIR report, including summary analysis results.
• Assess the adequacy and completeness of standards and requirements, including DOE Directives (e.g., Policies, Orders, Standards, and Guides, including DOE O 413.3A and DOE-STD-1189) identified as applicable and appropriate to the project due to either the nature of the project or contract requirements. Identify any areas of noncompliance with the identified standards and requirements.
(2) Basis of Cost (as defined in the Resource Loaded Schedule (RLS))

• For selected Work Breakdown Structure (WBS) elements (typically, those constituting significant cost and/or risk), evaluate the detailed basis for the cost estimate.
• Assess the method of estimation and the strengths/weaknesses of the estimates for each WBS element reviewed.
• Identify and assess the basis for and reasonableness of key programmatic, economic, and project cost assumptions as related to the quality of estimates for each WBS element, and to risk management planning and contingency requirements.
• Develop an Independent Cost Estimate (ICE) or perform an Independent Cost Review (ICR) for Major System Projects as part of the EIR baseline validation process, as required by DOE O 413.3A (the ICR or ICE should be developed by OECM in coordination with the DOE Office of Cost Analysis).
• Assess the amount of and basis for economic escalation.
• Assess the reasonableness of resource loading, including what resources are loaded.
• Develop summary baseline cost tables of the proposed costs (i.e., PED, TEC, OPC, TPC, PMB, MR, Fee, DOE Direct Costs, and Contingency) for the EIR report. Verify whether the estimated costs for the project are reasonable based on professional expertise, parametric estimates, historical data, etc.
• Verify that the cost value of schedule contingency is included in the TPC.

(3) Basis of Schedule (as defined in the Resource Loaded Schedule)

• For the selected Work Breakdown Structure (WBS) elements, evaluate the detailed basis of the schedule estimate. Assess the total schedule for reasonableness and completeness.
• Assess the method of estimation and the strengths/weaknesses of the estimates.
• Review and assess schedule assumptions and evaluate their reasonableness as related to the quality of estimates for each WBS element.
• Develop summary baseline schedule tables of the proposed milestones (i.e., Critical Decision dates and other significant or critical project dates) for the EIR report. Identify whether the estimated schedule for the project is reasonable based on professional expertise, parametric estimates, historical data, etc.
• Determine whether schedule contingency is derived quantitatively and whether the calculated duration is placed between the end of the last project critical path activity and the "Submit Request for CD-4" milestone.

(4) Funding Profile and Budget

• Review and assess the basis for the Funding Profile (e.g., the latest Project Data Sheet). Compare the annual budget with the cost requirements, and assess whether the costs and budget are reasonably linked. Evaluate any significant disconnects between the Performance Baseline requirements and budget/out-year funding. Evaluate the reasonableness of the Budget Authority versus Budget Obligation profiles.

(5) Critical Path

• Assess whether the Critical Path is reasonably defined, whether it reflects an integrated schedule, and whether schedule durations are reasonable.
• Review the duration between the Critical Path completion date and the Project Completion date (CD-4). Assess whether the schedule contingency (float) is reasonable for this type of project.
• Evaluate whether there is a clearly defined critical path leading to submission of the CD-4 request.
• Assess the critical path schedule for level-of-effort activities.
• Verify that "near critical paths" are clearly identified.

(6) Risk and Contingency Management

• Review the approach used to identify project risks and assess the adequacy of this approach.
• Assess the adequacy and completeness of both DOE and contractor risk management planning, including the method(s) used to identify risks, and whether a reasonably complete list of potential risks was developed for analysis.
o Review key risks (e.g., programmatic, economic, those resulting from assumptions, and technical risks, including those associated with use of critical technologies) and the risk rankings, and provide the EIR Team's assessment of the risk evaluation process and conclusions.
• Assess whether appropriate risk handling strategies/actions, including accepted risks and residual risks, have been incorporated into the performance baseline.
• Review and assess cost and schedule contingency (both contractor and DOE).
o Provide an assessment of whether the analysis for and basis of contingency is reasonable for this type of project and its associated risks.
o Ensure contingency analysis and allowances are tied to risk assessments.
o Ensure contingency accounts for estimate uncertainty, which is directly tied to design maturity and the estimating methodologies used.
• Assess the adequacy of the qualitative analysis and rating (high, medium, or low) of current risks (including site-specific factors such as availability of contractors) for probability of occurrence and for consequence of occurrence.
• Evaluate the extent and adequacy of quantitative risk analysis.
• Evaluate whether the risk watch list and risk assessment sheets appear to be complete.
• Evaluate the adequacy of the management control process for risk status/updating.

(7) Hazards Analysis/Safety

• Review the functional makeup of the hazards analysis/safety project team, and provide an assessment of the overall staffing mix and expertise of the team.
• Assess whether the hazards identified and the accident scenarios represent a reasonably comprehensive list. Determine whether controls are capable of mitigating the defined accidents and whether confinement/containment of radioactive material is addressed.
• Assess expectations for facility-level systems, structures, and components (SSCs).
Determine whether SSCs for worker and public safety, and safety class/safety significant (SC/SS) equipment and components, have been incorporated into the design and Performance Baseline.
• Review the Integrated Safety Management System implementation and assess whether safety has been appropriately addressed throughout the lifecycle of the project.
• Assess the relevant change control process relative to required documentation and necessary SSCs.
• Assess the hazard analysis process, including the use of internal and external safety reviews.
• As applicable, review any Defense Nuclear Facilities Safety Board (DNFSB) and/or Nuclear Regulatory Commission (NRC) interface and discuss the status of their involvement. Assess whether DNFSB/NRC issues have been reasonably considered and addressed. If not, review the outstanding issues, assess when they will be resolved, and determine what risks they pose.
• Assess the status and resolution of corrective actions by the contractor, including incorporation of any additional identified safety requirements.

Note: The following Lines of Inquiry (LOIs) are applicable to Hazard Category 1, 2, and 3 nuclear facilities.

• Review whether the hazard analysis incorporates expectations from the Safety Design Strategy (SDS).
• Review the Preliminary Safety Design Report (PSDR), SDS, and Fire Hazards Analysis (FHA). Assess whether these documents are complementary, reflect continuously refined analyses based on evolving design and safety integration activities during preliminary design, address all required elements in accordance with DOE-STD-1189, and have been evaluated by appropriate individuals and organizations.
• Assess whether the SDS addresses the following three main attributes of safety integration as the project progresses through planning and execution:
o the guiding philosophies or assumptions to be used to develop the design;
o the safety-in-design and safety goal considerations for the project; and
o the approach to developing the overall safety basis for the project.
• Ensure a Preliminary Safety Validation Report (PSVR) has been completed, and assess whether it adequately addresses the required review of the PSDR or Preliminary Documented Safety Analysis (PDSA).

(8) System Functions and Requirements; Basis of Design

• Assess whether "design-to" functions are complete and have a sound technical basis. (The review team should include safety and external requirements such as permits, licenses, and regulatory approvals.)
• Review all underlying technical assumptions and assess whether they are sound and/or appropriately addressed within the Risk Management Plan and adequately supported with funded contingency, particularly for new technologies that have never been developed and/or prototyped in the proposed environment.
• Assess whether the requirements have been defined well enough to establish a firm performance baseline.
• Assess whether system requirements are derived from and consistent with the Mission Need.
• Assess whether the CD-4 (project completion) activities are clearly defined in the Requirements Document, and whether these activities are quantified and measurable, or can otherwise be reasonably determined as complete.
• Review the basis of design and assess the reasonableness of the design requirements and output for each function/operation. Review each unit operation, the design parameters, and the basis of the design parameters, and assess whether the design basis is reasonable.
• Review the technology readiness assessment and technology maturation plans and assess the status of the technology to support baseline validation.
• Ensure safety requirements resulting from review of safety documents (e.g., PSDR and PSVR) are incorporated into the design and baseline.
• Review the surrogate tests, as applicable, and assess whether the surrogate composition reasonably represents the full range of feed streams and whether the design basis incorporates the results of the tests.
• Review process and material balance flow sheets to assess the reasonableness of the input and output parameters for each unit operation, and their adequacy to support environmental permitting, licensing, and other regulatory decisions.
• Ensure that the design addresses the results of reliability, availability, maintainability, and inspectability (RAMI) analyses.

(9) Preliminary Design Review and Comment Disposition

• Assess whether the design has progressed far enough (is "mature" enough) to support the proposed performance baseline.
• Confirm that a Design Review has been performed by a qualified team to ensure the adequacy of the preliminary design, including the adequacy of the drawings and specifications, and assess whether they are consistent with system functions, requirements, and key performance parameters.
o Review the disciplines and experience of the Project Design Review Team. Assess whether the Design Review team had appropriate experience and technical disciplines.
• Review the Design Review comments and responses. Based on a reasonable sample, assess whether these comments have been incorporated into the design, and whether the costs and schedule associated with design changes have been incorporated into the Performance Baseline.

(10) Start-Up Planning and Operations Readiness

• Assess whether the start-up test plan identifies how tests will be determined to be successful, and evaluate whether the associated equipment and instrumentation have been included in the preliminary design.
• Review the startup and operational readiness test requirements and plans and assess whether they represent:
o the acceptance and operational system tests required to demonstrate that the system meets design performance specifications, safety requirements, and key performance parameters; and
o sufficient scope definition to enable reasonable estimates of cost, schedule, and resources.
• Assess the traceability of functional, operational, and safety requirements into the start-up test plan.
• Determine and evaluate any exceptions taken by a potential construction contractor or project consultants in meeting startup test specifications.
• Assess whether the cost, time, and resource estimates to accomplish the required startup activities are reasonable and have been included in the performance baseline.
• Assess whether the start-up plan has been fully integrated with existing functional organizations, including security.
• Assess whether the results of tests (e.g., equipment tests, process tests, surrogate tests) have been factored into startup and operational readiness planning.
• Assess whether the risks of the test program are captured in the project risks and whether sufficient cost and schedule contingency is assigned, as appropriate, for problems with tests and equipment failure during start-up testing.

(11) Project Controls/Earned Value Management System (EVMS)

Note: The EIR Team's review of a contractor's Earned Value Management System does not constitute an EVMS Certification Review or Surveillance Review unless made part of the EIR scope during the scoping meeting.

• Assess the status of the contractor's project control system, including the Earned Value Management System, relative to the requirements of the contract and DOE O 413.3A.
• Assess whether project control systems and reports are being used to report project performance, whether the data are being analyzed by the Federal IPT and contractor management, and whether management action is taking place as an outcome of the analysis.
• Assess the project control process to determine how the project incorporates formal changes, conducts internal re-planning, and adjusts present and future information to accommodate changes. Determine whether changes, including acceptable retroactive changes (correcting errors, routine accounting adjustments, or improving the accuracy of performance measurement data), are documented, justified, and explained.
• If the project contractor has a certified Earned Value Management System, assess whether a surveillance system is in place to maintain the system for continued compliance with the American National Standards Institute EVMS Standard (ANSI/EIA-748-A-1998).
o Review the contractor's EVMS/project control system description.
o Assess the contractor's surveillance program.
• If the project contractor does not have a certified EVMS, assess the likelihood of the EVMS being certified prior to CD-2 and no later than CD-3.
o Determine whether an EVMS certification review is scheduled to occur within sufficient time to permit EVMS certification, and assess the status of efforts and management focus on ensuring the EVMS is ready for certification review.
o If a certification review is in process, assess the status of efforts and management focus on resolving open issues to obtain certification within sufficient time preceding the baseline Critical Decision dates.

(12) Quality Control/Assurance

• Assess the applicability, completeness, adequacy, and flow-down of the Project Quality Assurance Program, including software quality assurance (SQA), based on DOE Order 414.1C and 10 CFR 830 Subpart A.
The review team will review the record of QA audits performed on the project and the disposition of the audit findings.
• Ensure that the QA/QC Plan and implementing procedures address personnel training and qualifications, quality improvement programs, document and record management, work processes, management and independent assessments, acceptance test planning and implementation, and the process for disposition of field changes.
• Assess QA/QC requirements for construction planning.
• Assess whether QA requirements (NQA-1, if applicable) have been appropriately incorporated into the "design-to" functions, and whether the associated costs, time, and resources have been adequately estimated and included in the baseline.

(13) Value Management/Engineering

• Assess the applicability of Value Management/Engineering, and whether a Value Management/Engineering (VM/E) analysis has been performed with the results incorporated into the baseline.
• Assess the Value Management/Engineering process for this project, including whether the VM/E team had a reasonable skill mix and experience background.
• Assess whether life cycle cost analysis was reasonably performed as part of the trade-off studies and the various alternatives reviewed.

(14) Project Execution Plan

• Review the Project Execution Plan (PEP) and determine whether it establishes a plan for successful execution of the project, whether the project is being managed and executed in accordance with the PEP, and whether it is consistent with other project documents. Determine whether it has been reviewed by appropriate site and Headquarters organizations and comment resolution agreed to.
• Determine whether there is a program for integrated regulatory oversight. Assess whether applicable Federal, state, and local government permits, licenses, and regulatory approvals, including the strategies and requirements necessary to construct and operate a facility or to initiate and perform project activities, are identified and will be obtained when needed to continue project execution on the established schedule or milestone dates. Determine whether the schedule for receipt of authorization from regulators is realistic and based on experience, and whether requirements and milestone dates are updated as necessary and kept current.
• Assess key inter-site coordination issues and determine whether they are identified, addressed, and resolved, or whether appropriate plans are in place to accomplish resolution.
• Assess key on-site coordination issues and determine whether they are identified, addressed, and resolved, or whether appropriate plans are in place to accomplish resolution.
• Determine whether all stakeholders are identified, and assess whether their relationship to the project is evaluated, project impacts on them and their interests are identified, and required interfaces with external organizations or authorities are addressed.
• Determine whether an appropriate Public Participation Plan is in place, based on available stakeholder information and the size and scope of the project, and whether specific stakeholder group issues are addressed relative to project goals and objectives, technical issues, project risk, and environmental strategies.
• Identify applicable GAO, IG, and other oversight body reports and determine whether issues or concerns have been resolved or otherwise adequately addressed. Similarly, identify and assess relevant Congressional language in authorization and appropriation bills.

(15) Acquisition Strategy

• Review the Acquisition Strategy (AS) to determine whether it is consistent with the way the project is being executed.
• Assess whether there are provisions in the AS for adequate contractor incentives (and disincentives) to enhance project execution.
• Identify any changes from the previously approved AS, and assess whether the changes have been properly reviewed and authorized and if the current AS still represents the best value to the Government.
(16) Integrated Project Team (IPT)
• Review Federal and contractor IPT Charters and determine if all appropriate disciplines are included.
• Confirm that the FPD is certified to manage this project.
• Assess both Federal and contractor project management staffing in terms of number of personnel, skill set, effectiveness, quality, organizational structure, division of roles/responsibilities, and processes for assigning work and measuring performance. (Differentiate between full and part-time IPT members.)
• Assess whether the federal and contractor project teams can successfully execute the project.
• Ensure IPT membership includes appropriate safety experts. Identify if the Federal IPT nuclear safety expert is validated as qualified by the Chief of Nuclear Safety/Chief of Defense Nuclear Safety in accordance with DOE O 413.3A.
• Assess the span of control (in terms of not only supervisory responsibility but also management of dollars and project issues) of key project management personnel, including the FPD, to determine whether they can successfully perform their duties.
• Evaluate any deficiencies in the federal or contractor IPTs that could hinder successful execution of the project.
(17) Sustainable Design
• Assess whether the project has identified sustainable design features, in accordance with the Energy Policy Act of 2005, Executive Order 13423, and DOE O 450.1 chg 3, and that these features have been properly accounted for within the performance baseline.
• Assess whether the project is eligible for Leadership in Energy and Environmental Design (LEED) certification.
(18) Safeguards and Security
• Assess whether a Preliminary Security Vulnerability Assessment Report as defined in DOE M 470.4-1 has been updated as required by DOE O 413.3A.
• Assess the completeness and accuracy of the applicable safeguards and security requirements, the methods selected to satisfy those requirements, and any potential risk acceptance issues applied to the project and their incorporation into the project.
• Assess adequacy of incorporation of Design Basis Threat requirements into the baseline.
• Review the Performance Baseline to ensure that cost, schedule, and integration aspects of safeguards and security are appropriately addressed.
• Assess whether all feasible risk mitigation has been identified and that the safeguards and security concerns for which explicit line management risk acceptance will be required are appropriately supported.
(19) New Technology and Technology Readiness
• Review all technology decisions that have been made to date and determine whether the project is incorporating new technologies or existing technologies in new applications.
• Assess the plans for and results of tests of new technologies or new applications of existing technology. Determine if the scale of the test is adequate to mitigate risks and/or safety concerns.
• Assess whether the identified technologies are at a sufficient level of maturity to be incorporated into the design and baseline. To the extent possible, provide an analysis of the Technology Readiness Level for the applicable technologies identified [Government Accountability Office Report 07-336, Major Construction Projects Need a Consistent Approach for Assessing Technology Readiness to Help Avoid Cost Increases and Delays, March 2007].
• Assess whether the current baseline adequately provides for sufficient cost and schedule to accomplish required research, development, testing, and implementation of these new technologies or new applications of existing technologies.
• Determine if the Risk Management Plan accounts for risks associated with new technologies or new applications of existing technologies, and that adequate contingency has been included.
(20) Contract Management
• Assess the current contract including cost, schedule and work scope against the proposed baseline and identify any potential contract and project integration issues.
o Determine whether the terms of the current contract support the project as currently planned and identify any gaps between the current contract and the planned performance baseline.
o Assess effectiveness of integrated change control and use of change control boards by both federal and contractor organizations.
o Likewise, assess any planned contract modifications and requests for equitable adjustments relative to the proposed performance baseline.
• Evaluate the status of contract management and, if applicable, plans and schedule to bring the contract up to date.
• Assess project plans to self-perform construction and operations readiness versus subcontracting that work.
• Assess draft documents to be provided to the services (e.g., construction) and product (e.g., purchased materials and equipment) subcontractors, including submittal of documents by the subcontractors required before notice to proceed (e.g., design requirements, EVMS, and systems testing and turnover requirements).
(21) Documentation and Incorporation of Lessons Learned
• Assess whether the project team is documenting and sharing lessons learned from their project internally and externally.
• Assess whether the project team is reviewing and incorporating lessons learned from this and other projects.
B. Required Documentation
In general, the following documents (or equivalents) are normally required for the Performance Baseline External Independent Review (EIR). Other associated material may be requested by OECM and the Review Team to ensure a complete and accurate review is performed. Documentation required can also be tailored to the EIR plan.
• CD-0 Documents (e.g., Mission Need Statement, Approval of Mission Need)
• CD-1 Documents (e.g., Approval of Alternative Selection and Cost Range)
• Work Breakdown Structure (WBS) and WBS Dictionary
• Detailed Cost Estimate, including Basis of Estimate for selected elements
• Detailed Resource Loaded Schedule (in native format and pdf format)
• Project Data Sheet; identified/approved funding profile
• Program Requirements Document (or equivalent)
• Critical Path and Near Critical Path Schedules
• System Functions and Requirements Document (also referred to as the "Design-to" requirements or Design Criteria)
• Results of and Responses to Project Design Reviews and Technical Independent Project Reviews (copy of reports); Design Reviews and Technical Reviews personnel and brief resumes
• Design documents including drawings, specifications and design lists; process flow diagrams, material balance(s)
• Design change logs
• Process test plan including plans for surrogate testing; proof-of-process tests
• Technology Readiness Assessment(s) and Technology Maturation Plan
• Conceptual Design Report
• Project Execution/Management Plans
• Preliminary Construction Execution/Management Plans
• Integrated Project Team Charter (assignment letters as appropriate)
• Documented IPT Processes
• FPD Certification status and Integrated Project Team qualifications (resumes as appropriate)
• Start-up Test Plan and other operations readiness plans (as appropriate)
• Hazards Analysis/Hazard Analysis Report
• DNFSB and NRC Reports and correspondence identifying any project issues and resolution
• Responses to DNFSB and NRC reports
• Preliminary Safety Design Report (Hazard Category 1, 2, or 3 nuclear facilities)
• Preliminary Security Vulnerability Assessment Report
• Preliminary Safety Validation Report (Hazard Category 1, 2, or 3 nuclear facilities)
• Final National Environmental Policy Act documentation
• Risk Management Assessment(s)/Risk Management Plan(s) with identified
federal/contractor contingency
• Risk Watch List
• Contingency/Monte Carlo Analyses and Contingency Plan
• Acquisition Strategy
• Value Management/Engineering Report(s)
• Quality Control/Assurance Plan
• Interface Documentation (procedures, MOU/MOA with site M&O)
• Reports and Corrective Action Plans (CAPs) from previous internal and external project reviews (if applicable)
• Sustainable design documentation including LEED certification eligibility
• EVMS or project controls reports and project monthly/quarterly reports (last three)
• Trend reports
• Project Control System description
• Change Control Process
• Monthly and Quarterly Progress reports for past year
• Contracts applicable to the project
• Contract Management Plan
• Pending contract modifications/Requests for Equitable Adjustment
• Project Funding Profile (Program budget/planning office should identify if this profile is within the Program target budget profile)
5.4 Approve Start of Construction Review (CD-3)
The purpose of the Construction or Execution Readiness Review is to assess the readiness for construction or execution and to confirm the completeness and accuracy of the Performance Baseline. An External Independent Review is performed by the Office of Engineering and Construction Management on Major System Projects to verify execution readiness. A similar Independent Project Review should be performed by the appropriate Program Secretarial Office for Non-Major System Projects unless justification is provided and a waiver is granted by the Acquisition Executive. The scope of review for an EIR in support of CD-3 has several elements specific to construction readiness, but retains many of the elements contained in the Performance Baseline Review. Below is a discussion of 19 core elements that will generally form the scope of the Construction or Execution Readiness Review, as well as the required documentation for this review.
Additional elements or lines of inquiry beyond those presented in this document may be included in the scope of the Construction or Execution Readiness Review based on unique aspects of the project being reviewed. It is important to recognize that both the scope and required documentation may vary for specific projects depending on the type of project and any tailoring that may be applied to the EIR. On a project-by-project basis, one or more of the core elements may be deleted from the review while other areas are added. The focus areas may also vary if partial CD-3 phases (e.g., CD-3A, CD-3B) for long lead procurements or early site work are being reviewed and approved in advance of the complete CD-3 EIR. In addition, if the project is requesting a CD-3A at the time of CD-2, applicable elements and LOIs from the following list should be included in the scope and Review Plan for a combined CD-2/CD-3A EIR.
A. Scope of Review
The following are the normal elements and standard Lines of Inquiry (LOIs) that an EIR Team should address. Elements may be added or deleted during the EIR scoping process, and LOIs will be further clarified and documented in the review plan.
(1) Basis of Scope (As Defined in the Work Breakdown Structure, the Final Drawings and Specifications, the Final Design Functions and Requirements, and the Site Final Design Review)
• Identify and assess any changes to the project mission need, scope, or Work Breakdown Structure (WBS) since CD-2.
• Identify and assess any changes to the basis for and reasonableness of key programmatic, economic and project scope assumptions as related to the quality and completeness of the WBS, technical and design requirements, and risk management planning and contingency requirements since CD-2.
• Identify and assess any changes to the CD-4 (project completion) activities and requirements and project KPPs since CD-2.
• Assess completeness and quality of drawings and design specifications.
Review selected construction elements or systems, including the key project elements posing the more difficult construction challenges.
• Assess whether bid packages are sufficiently clear and well defined as to be ready for bid.
• Assess whether all final design functions and requirements are reflected in the Performance Baseline, including safety SSCs and external requirements such as permits, licenses, and regulatory approvals.
• Assess whether all required changes from the Site Final Design Review are incorporated into the Performance Baseline, and assess whether the technical scope elements of the Performance Baseline remain consistent with that approved at CD-2.
• Assess whether any design activities planned to be performed during construction are appropriate as to the type and amount of design to be performed. Assess the design basis for this additional work, what organization will perform the design work, and the basis to proceed with construction.
• Assess the technology readiness level to proceed with design, and determine if any process testing is planned and its potential impact to the design.
(2) Basis of Cost and Schedule (As Defined in the RLS)
• Identify and assess substantive changes to the RLS since CD-2 relative to its consistency with the approved Performance Baseline (TPC, CD-4 completion schedule).
• For selected WBS elements (typically, those constituting significant cost, schedule and/or risk), evaluate the detailed basis for the cost or schedule estimate. Assess strengths/weaknesses of the estimates reviewed.
• Identify and assess any changes since CD-2 to the basis for and reasonableness of key programmatic, economic and project cost assumptions as related to the quality of estimates, and risk management planning and contingency requirements.
• Assess the amount of and basis for economic escalation. Identify changes since CD-2.
• Assess reasonableness of resource loading, including what resources are loaded.
Determine if resource requirements factor in project performance since CD-2 or performance of other similar projects in execution.
• Assess whether approved changes are included in the project master cost and schedule, and evaluate if trends or planned changes are adequately identified to provide a firm basis to proceed with construction.
(3) Construction/Execution Planning
• Assess adequacy of construction/project execution planning.
o Review the adequacy of constructability reviews to assess whether construction documents have been reviewed for accuracy, completeness, and systems coordination issues.
o Assess status of logistics including interface with operating facilities and maintenance organizations, infrastructure interfaces, adequacy of lay-down areas, temporary construction facilities, security and badging readiness, and other logistical elements.
o Assess potential coordination issues, missed details, time delays, potential liability, or inter-contractor coordination items.
• Assess adequacy of the Federal IPT, Site M&O/Prime Contractor and/or Construction Management Organization (as applicable), and construction contractor staffing for construction execution to ensure adequate oversight of the work, including safety, performance, and quality.
o Assess oversight and management of the construction contractor by the IPT and site prime contractor.
• Assess the updated Startup Test Plan to ensure that sufficient information is provided for contractor commissioning tests and system/facility acceptance tests, including cold and hot testing and plans for ORR/RA, as appropriate.
(4) Funding Profile and Budget
• Review and assess the basis for the Funding Profile (e.g., latest Project Data Sheet).
• Compare the annual budget with the cost requirements, and provide an assessment of whether the costs and budget are reasonably linked. Identify any significant disconnects between the Performance Baseline requirements and budget/out-year funding.
Assess the reasonableness of the Budget Authority versus Budget Obligation profiles.
(5) Critical Path
• Assess whether the Critical Path is reasonably defined. Identify any changes since CD-2.
• Review the Resource Loaded Schedule relative to the Critical Path and assess whether the Critical Path reflects an integrated schedule and schedule durations which are reasonable.
• Assess the duration between the Critical Path completion date and the Project Completion Date (CD-4). Assess whether the schedule contingency (float) is reasonable for this type of project.
• Assess if there is a clearly defined critical path leading to submission of the CD-4 request.
• Assess the critical path schedule for level of effort activities.
• Verify that “near critical paths” are clearly identified.
(6) Hazards Analysis/Safety
• Identify changes to the hazards analysis and safety basis since CD-2. Assess whether these changes are reflected in the performance baseline scope, cost and schedule.
• Review the functional make-up of the hazards analysis/safety project team, and provide an assessment of the overall staffing mix and expertise of the team.
• Assess the Hazard Analysis (HA) process, including the use of internal and external safety reviews.
• As applicable, review any Defense Nuclear Facilities Safety Board (DNFSB) and/or Nuclear Regulatory Commission (NRC) interface and discuss with the local representatives the status of their involvement. Assess whether DNFSB/NRC issues have been reasonably considered and addressed. If not, identify the outstanding issues, assess when they will be resolved and determine what risks they pose.
• Review the Integrated Safety Management System and assess whether safety has been appropriately addressed throughout the lifecycle of the project.
• Assess whether the hazards identified and the accident scenarios represent a reasonably comprehensive list.
Determine if controls are capable of mitigating defined accidents and if confinement/containment of radioactive material is addressed.
• Assess expectations for facility level systems, structures, and components (SSCs). Determine whether SSCs for worker and public safety, and safety class/safety significant (SC/SS) equipment and components, have been incorporated into the design and Performance Baseline.
• Assess the relevant change control process relative to required documentation and necessary SSCs.
• Assess status of and resolution of corrective actions by the contractor, including incorporation of any additional identified safety requirements.
Note: The following LOIs are applicable to Hazard Category 1, 2, and 3 nuclear facilities.
• Identify if the HA incorporates expectations from the Safety Design Strategy (SDS).
• Review the Preliminary Documented Safety Analysis (PDSA) and SDS. Assess whether these documents are complementary, reflect continuously refined analyses based on evolving design and safety integration activities during preliminary design, address all required elements in accord with DOE-STD-1189, and have been evaluated by appropriate individuals and organizations.
• Assess whether the SDS addresses the following three main attributes of safety integration as the project progresses through project planning and execution:
o The guiding philosophies or assumptions to be used to develop the design;
o The safety-in-design and safety goal considerations for the project; and
o The approach to developing the overall safety basis for the project.
• Ensure a Safety Evaluation Report (SER) has been completed and assess whether it adequately addresses the required review of the PDSA.
(7) Risk Management
• Identify and assess any substantive changes to the Federal and contractor risk and contingency management plans or processes since CD-2.
• Assess whether the risk assessment and management plan have been updated, as appropriate, to address any new risks identified in final design, and evaluate the adequacy of the management control process for risk status/updating.
• Evaluate whether the risk watch list appears to be complete.
• Assess whether all appropriate risk handling strategies/actions, including accepted risks and residual risks, have been incorporated into the performance baseline, including cost and schedule contingency.
• Identify and assess cost and schedule contingency. Provide an assessment of whether the basis of contingency is reasonable for this type of project and its associated risks, and whether cost and schedule contingency, including value/cost associated with schedule contingency, remains sufficient for project risks.
• Assess MR/contingency drawdown and utilization history for reasonableness, and determine if sufficient contingency remains.
(8) Value Management/Engineering
• Assess the application of Value Management/Engineering during Final Design, and if results have been incorporated into the Performance Baseline.
(9) Acquisition Strategy
• Review the Acquisition Strategy to determine if there have been any significant changes and if the acquisition approach continues to represent the best value to the government.
(10) Project Execution Plan
• Review the Project Execution Plan and assess if the project is being managed and executed in accordance with it. It should be updated to reflect any changes as a result of Final Design and be consistent with the other project documents.
• Identify and assess any changes to the integrated regulatory oversight program since CD-2.
Assess if applicable Federal, state, and local government permits, licenses, and regulatory approvals, including strategies and requirements necessary to construct and operate a facility or to initiate and perform project activities, are being obtained when needed to continue project execution on schedule or milestone dates established. Assess if the schedule for receipt of authorization from regulators is updated and kept current.
• Identify and assess any changes since CD-2 to key inter-site or on-site coordination issues, or stakeholder relationships. Assess if they are identified, addressed and resolved, or appropriate plans are in place to accomplish resolution.
• Identify and assess if any new GAO, IG, or other oversight body reports are available since CD-2 and determine if issues or concerns are adequately addressed. Similarly, identify and assess relevant Congressional language in authorization and appropriation bills.
(11) Project Controls/Earned Value Management System
• Assess the status of the contractor’s project control system, to include the Earned Value Management System, relative to the requirements of the contract and DOE O 413.3A.
• Assess whether project control systems and reports are being used to report project performance, whether the data is being analyzed by the Federal IPT and contractor management, and that management action is taking place as an outcome of the analysis function.
• Evaluate the control process whereby projects incorporate formal changes, conduct internal re-planning, and adjust present and future information to accommodate changes. Determine if changes, including acceptable retroactive changes (correcting errors, routine accounting adjustments, or improving accuracy of the performance measurement data), are documented, justified, and explained.
• If the project has a certified Earned Value Management System, assess whether a surveillance system is in place to maintain the system for continued compliance with the American National Standards Institute EVMS Standard (ANSI/EIA-748).
o Review the project’s EVMS system/project control description.
o Assess the project’s surveillance program.
• If the project does not have a certified EVMS, but a certification review is in process, assess the status of efforts and management focus on resolving open issues to obtain certification consistent with the baseline CD-3 date.
(12) Integrated Project Team
• Review Federal and contractor IPT Charters and determine if all appropriate disciplines are included.
• Confirm that the FPD is certified to manage this project.
• Assess both Federal and contractor project and construction management staffing in terms of number of personnel, skill set, effectiveness, quality, organizational structure, division of roles/responsibilities, and processes for assigning work and measuring performance. (Differentiate between full and part-time IPT members.)
• Assess whether the federal and contractor project and construction management teams can successfully execute the project.
• Ensure IPT membership includes appropriate safety experts. Identify if the Federal IPT nuclear safety expert is validated as qualified by the Chief of Nuclear Safety/Chief of Defense Nuclear Safety in accordance with DOE O 413.3A.
• Assess the span of control (in terms of not only supervisory responsibility but also management of dollars and project issues) of key project management personnel, including the FPD, to determine whether they can successfully perform their duties.
• Assess any deficiencies in the federal or contractor IPTs that could hinder successful construction or project execution.
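The earned value data that the Project Controls/EVMS element above asks the review team to examine rests on a handful of standard formulas. The following is a minimal illustrative sketch, not part of this Guide; the formulas are the conventional EVMS indicators, but the function name and all dollar figures are hypothetical examples.

```python
# Minimal earned-value indicator sketch using conventional EVMS formulas.
# All input figures are hypothetical examples, not DOE data.

def ev_metrics(bcws, bcwp, acwp, bac):
    """Compute basic earned-value indicators from cumulative values ($M).

    bcws: budgeted cost of work scheduled (planned value)
    bcwp: budgeted cost of work performed (earned value)
    acwp: actual cost of work performed (actual cost)
    bac:  budget at completion
    """
    cv = bcwp - acwp      # cost variance (negative = overrunning cost)
    sv = bcwp - bcws      # schedule variance (negative = behind schedule)
    cpi = bcwp / acwp     # cost performance index (<1.0 = inefficient)
    spi = bcwp / bcws     # schedule performance index (<1.0 = behind)
    eac = bac / cpi       # estimate at completion (simple CPI method)
    vac = bac - eac       # variance at completion
    return {"CV": cv, "SV": sv, "CPI": cpi, "SPI": spi, "EAC": eac, "VAC": vac}

# Hypothetical cumulative status: $10.0M planned, $9.0M earned,
# $9.5M actually spent, against a $100M budget at completion.
m = ev_metrics(bcws=10.0, bcwp=9.0, acwp=9.5, bac=100.0)
print(m)
```

A review team assessing whether "management action is taking place as an outcome of the analysis function" would expect the project to explain negative CV/SV trends and any divergence between the reported EAC and the independent estimate.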
(13) Safeguards and Security
• Assess whether a Preliminary Security Vulnerability Assessment Report as defined in DOE M 470.4-1 has been updated as required by DOE O 413.3A.
• Assess the completeness and accuracy of the applicable safeguards and security requirements, to include Design Basis Threat requirements, the methods selected to satisfy those requirements, and any potential risk acceptance issues applied to the project and their incorporation into the project.
• Assess whether all feasible risk mitigation has been identified and that the safeguards and security concerns for which explicit line management risk acceptance will be required are appropriately supported.
• Assess any changes to safeguards and security requirements since CD-2 and whether there is any impact to the project’s performance baseline.
(14) Contract Management
• Assess the current contract including cost, schedule and work scope relative to the baseline at CD-3 and identify any potential contract and project integration issues.
o Assess whether the terms of the current contract support the project as currently planned and identify any gaps between the current contract and the planned performance baseline.
o Assess effectiveness of integrated change control and use of change control boards by both federal and contractor organizations.
• Assess any planned contract modifications and requests for equitable adjustments relative to the performance baseline at CD-3.
• Evaluate the status of contract management and, if applicable, plans and schedule to bring the contract up to date.
• Assess project plans to self-perform construction and operations readiness versus subcontracting that work.
• Assess draft documents to be provided to the services (e.g., construction) and product (e.g., purchased materials and equipment) subcontractors, including submittal of documents by the subcontractors required before notice to proceed (e.g., design requirements, EVMS, and systems testing and turnover requirements).
(15) Start-Up Planning and Operational Readiness
• Identify and assess any changes to the start-up and operations readiness plan since CD-2.
• Ensure the start-up test plan identifies how tests will be determined to be successful, and that associated equipment and instrumentation have been included in the preliminary design.
• Review the startup and operational readiness test requirements and plans and assess whether they represent:
o The acceptance and operational system tests required to demonstrate that the system meets design performance specifications and safety requirements, and
o Sufficient scope definition to enable reasonable estimates of cost, schedule, and resources.
• Ensure traceability of functional, operational, and safety requirements into the start-up test plan.
• Identify and assess any exceptions taken by the potential construction contractor or project consultants in meeting startup test specifications.
• Assess whether cost, time and resource estimates are defensible to accomplish the required startup activities and have been included in the performance baseline.
• Assess whether there is sufficient cost and schedule contingency for test and equipment failure during start-up testing.
• Assess whether the start-up plan has been fully integrated with existing functional organizations, including security.
• Assess whether results of tests (e.g., equipment tests, process tests, surrogate tests, etc.) have been factored into startup and operational readiness planning.
(16) Quality Control/Assurance
• Identify and assess any changes to the Quality Control and Quality Assurance plan since CD-2.
• Assess the applicability, completeness, adequacy, and flow-down of the Project Quality Assurance Program, including software quality assurance (SQA), based on DOE Order 414.1C and 10 CFR 830 Subpart A. Review and assess the record of QA audits performed on the Project and the disposition of the audit findings.
• Assure that the QA/QC Plan and implementing procedures address personnel training and qualifications, quality improvement programs, document and record management, work processes, management and independent assessments, acceptance test planning and implementation, and the process for disposition of field changes.
• Assess QA/QC requirements for construction planning.
• Assess whether QA requirements (NQA-1 if applicable) have been appropriately incorporated into the “Design-to” functions, and costs, time and resources adequately estimated and included in the baseline.
(17) Sustainable Design
• Identify and assess any changes to sustainable design requirements and plans since CD-2.
• Assess whether the project team has identified sustainable design features, in accordance with the Energy Policy Act of 2005, Executive Order 13423, and DOE O 450.1 chg 3, and that these features have been properly accounted for within the proposed performance baseline.
• Assess whether the project is eligible for LEED certification.
(18) New Technology and Technology Readiness
• Identify and assess any changes to technology readiness since CD-2.
• Assess whether the identified technologies are at an increased and sufficient level of maturity to be included in construction. To the extent possible, provide an analysis of the Technology Readiness Level for the applicable technologies identified [Government Accountability Office Report 07-336, Major Construction Projects Need a Consistent Approach for Assessing Technology Readiness to Help Avoid Cost Increases and Delays, March 2007].
• Assess whether the performance baseline adequately provides for sufficient cost and schedule to implement these new technologies or new applications of existing technologies.
• Assess if the Risk Management Plan accounts for risks associated with new technologies or new applications of existing technologies, and that adequate contingency has been included.
(19) Documentation and Incorporation of Lessons Learned
• Assess whether the project team is documenting and sharing lessons learned from their project internally and externally.
• Assess whether the project team is reviewing and incorporating lessons learned from this and other projects.
B. Required Documentation
In general, the following documents (or equivalents) are normally required for the Construction or Execution Readiness Review. Other associated material may be requested by OECM and the Review Team to ensure a complete and accurate review is performed.
• CD-0 Documents (e.g., Mission Need Statement, Approval of Mission Need)
• CD-1 Documents (e.g., Approval of Alternative Selection and Cost Range)
• CD-2 Documents (e.g., Approval of Performance Baseline)
• Work Breakdown Structure (WBS) and WBS Dictionary
• Program Requirements Document (or equivalent)
• All Baseline Change Proposal and disposition documentation
• Final Design Documents (including drawings, specifications, design lists); construction bid packages
• Design Review Team resumes
• Conceptual Design Report
• Results of and Responses to project Design Reviews and Technical Independent Project Reviews
• Construction Execution/Management Plans
• Project Execution/Management Plans
• Detailed Resource Loaded Schedule
• Detailed bottom-up Cost and Schedule Estimates based on the completed design (includes bases of estimate and assumptions)
• Contingency Analysis/Contingency Plan
• Critical Path and Near Critical Path Schedules
• System Functions and Requirements Document (also referred to as the "Design-to" requirements or Design Criteria)
•
Integrated Project Team Charter (assignment letters as appropriate) • Documented IPT Processes • FPD Certification status and Integrated Project Team qualifications (resumes as appropriate) • Risk Management Plan/Process • Risk Watch List • Risk Assessment • Contingency/Monte Carlo Analyses and Contingency Plan • Safety Documentation including: o Preliminary Documented Safety Analysis Report o Safety Evaluation Report o Hazards Analysis/Hazard Analysis Report o Preliminary Safety Design Report (Hazard Category 1, 2, or 3 nuclear facilities) o Preliminary Safety Validation Report (Hazard Category 1, 2, or 3 nuclear facilities) o Construction Project Safety and Health Plan • Preliminary Security Vulnerability Assessment Report • DNFSB and NRC Reports and correspondence • Responses to DNFSB and NRC reports • Acquisition Strategy • Value Management/Engineering Report • Start-up Test Plan and other operations readiness plans (as appropriate) • Final National Environmental Policy Act documentation • Quality Control/Assurance Plan • Interface Documentation (procedures, MOU/MOA with site M&O) • Reports and CAPs from previous internal and external project reviews (if applicable) • Project Control System description • Change Control Process • Change Log; trend logs or reports • Monthly and Quarterly Progress reports for past year • Contracts applicable to the project • Contract Management Plan • Pending contract modifications/Requests for Equitable Adjustment • Project Data Sheets • Project Funding Profile (Program budget/planning office should identify if this profile is within the Program target budget profile) 5.5 Technical Reviews Technical Reviews are required at CD-1 for high risk, high-hazard, Hazard Category 1, 2, and 3 nuclear facilities. Other types of technical reviews may be required by a stakeholder if a design includes cutting edge technology and standards for the design do not exist, then a review by appropriately trained and knowledgeable experts is in order. 
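The "Contingency/Monte Carlo Analyses" item in the required documentation list above is inherently quantitative. As a minimal, hypothetical sketch of what such an analysis involves (the WBS elements, dollar figures, and triangular cost distributions below are invented for illustration and are not drawn from this Guide or any project):

```python
import random

# Hypothetical risk-adjusted cost model: for each WBS element, a point
# estimate plus a (low, likely, high) triangular distribution, all in $K.
wbs_estimates = {
    "site prep":    (1200,  1100, 1250,  1600),
    "construction": (8500,  8200, 8800, 11000),
    "startup":      (900,    850,  950,  1400),
}

def simulate_tpc(trials=10_000, seed=42):
    """Return simulated total costs, sampling each WBS element's triangular distribution."""
    rng = random.Random(seed)
    totals = []
    for _ in range(trials):
        totals.append(sum(rng.triangular(low, high, likely)
                          for _, low, likely, high in wbs_estimates.values()))
    return totals

point_estimate = sum(pe for pe, _, _, _ in wbs_estimates.values())
totals = sorted(simulate_tpc())
p80 = totals[int(0.80 * len(totals))]   # 80th-percentile simulated cost
contingency = p80 - point_estimate      # contingency needed for ~80% confidence

print(f"Point estimate: ${point_estimate:,.0f}K")
print(f"P80 cost:       ${p80:,.0f}K")
print(f"Contingency:    ${contingency:,.0f}K")
```

A real analysis would use the project's risk register, correlations between elements, and schedule as well as cost distributions; this sketch only shows the basic mechanics a review team would expect to see documented.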
A. Scope of Review

Specific types of reviews can include the following lines of inquiry (the list is not all-inclusive):
• CD-0, CD-1, and CD-2 documentation (as applicable, depending on project status)
• Functions and Requirements
• WBS and WBS Dictionary
• Alternative Systems
• Basis of Scope, Cost and Schedule
• IPT membership and qualifications, to include nuclear safety experts, as applicable
• Basis for Design
• Preliminary Design
• Detailed Design
• Hazard Analysis/Safety
• Project Definition Readiness Assessment
• Technology Readiness Assessment
• Risk Assessment/Management
• System Verification
• Physical Configuration
• Constructability
• Test Readiness
• Nuclear Safety Integration
• Functional Configuration
• Reliability, availability, and maintainability considerations for maintenance

B. Documentation

The documentation required for Technical Reviews is tailored to the specific technical review to be performed by the Review Team. The documentation required depends on the size and complexity of the project, the particular stage of the project lifecycle, and any project issues that need to be evaluated.

6.0 Tailoring

Tailoring is an essential component of EIRs and IPRs and can apply to any project's EIR or IPR. Tailoring will also be used by OECM when validating Performance Baselines as part of a Baseline Change Proposal following a deviation. The Technical IPR or EIR for high-hazard and Hazard Category 1, 2, and 3 nuclear facilities should not be tailored to exclude safety-related aspects of the design; otherwise, the objectives of DOE-STD-1189 could not be met. A key consideration when tailoring the Performance Baseline EIR is to ensure that the EIR supports OECM validation of the Performance Baseline. Tailoring strategies should be well defined and supportable.
Tailoring may include:
• Documented EIR/IPR Tailoring Strategy
• Use of summary-level Resource Loaded Schedules
• Use of summary cost and schedule supporting documents
• Conducting limited interviews with selected members of the Integrated Project Team by telephone or videoconference
• Reviewing key documents with a minimal site visit, if any
• Reducing the scope of review requirements
• Inclusion of additional review elements and/or lines of inquiry

6.1 Tailored EIRs for the Office of Environmental Management's (EM) Cleanup Projects

The nature of environmental restoration and closure activities performed by the Office of Environmental Management (EM) requires a tailored approach to EIRs. This tailored approach has been formalized between OECM and EM into a protocol that governs the review and validation of the baseline for EM cleanup projects. These projects are referred to as EM operations-funded project baselines (PBSs, Performance Baseline Summaries), and they present unique challenges not typically found in capital asset construction projects. The durations of many of the operations-funded project baselines are in excess of 10 years, with many continuing to the year 2035 and beyond. Cost estimates for cleanup activities and operations that far into the future are highly dependent upon a set of assumptions about escalation rates, current and emerging technologies, regulatory issues, and the success of near-term activities. The cost and schedule basis of near-term work typically has a higher confidence level than that of work far into the future. For EM PBSs, the EIR team will review the performance baseline for the near term of each PBS and for the life-cycle closure period of the PBS. The near term is generally defined as a minimum of five years, or as the contract performance period if the current contract exceeds five years. For projects that end within a couple of years after the five-year period, the near term should encompass the entire remaining lifecycle baseline. The EIR team will provide separate recommendations for the near-term assessment and for the life-cycle closure assessment. A more detailed discussion of the scope, lines of inquiry, and required documentation for these reviews is found in the DOE OECM External Independent Review Standard Operating Procedure (SOP), dated July 2008.

Appendix A: Glossary

1. Acquisition Executive. The individual designated by the Secretary of Energy to integrate and unify the management system for a program portfolio of projects and implement prescribed policies and practices. He/she is the approving authority for a project's Critical Decisions, per DOE O 413.3A.

2. External Independent Review. A project review conducted by individuals outside DOE. The Office of Engineering and Construction Management selects the appropriate contractor to perform these reviews. One of the most common types of External Independent Reviews is the Performance Baseline External Independent Review, which is used to support validation of the Performance Baseline for Critical Decision-2. A second common type is the Construction/Execution Readiness External Independent Review, which supports Critical Decision-3, Approve Start of Construction, for Major System Projects.

3. Independent Cost Estimate. A documented independent cost estimate prepared by an entity outside the proponent program and project being reviewed, with the express purpose of serving as an analytical tool to validate, crosscheck, or analyze cost estimates developed by the project proponents. The key attribute of independent cost estimates is that they are prepared independently of the project proponent estimate.

4. Independent Cost Review.
A project management tool used to analyze and validate an estimate of project costs by individuals having no direct responsibility for project performance.

5. Independent Project Review. A project management tool that serves to verify the project's mission, organization, development, processes, technical requirements, baselines, progress, and/or readiness to proceed to the next successive phase in DOE's Acquisition Management System.

6. Project Definition Rating Index (PDRI). A project management tool used for assessing how well the project scope is defined. The tool uses a numeric assessment that rates a wide range of project elements to determine how well the project is defined.

7. Technical Independent Project Review. An independent project review conducted prior to obtaining Critical Decision-1, Alternative Selection and Cost Range, for high-risk, high-hazard, and Hazard Category 1, 2, and 3 nuclear facilities. At a minimum, the focus of this review is to determine whether the safety documentation is sufficiently conservative and bounding to be relied upon for the next phase of the project.

8. Technology Maturation Plan (TMP). A TMP details the steps necessary for developing technologies that are less mature than desired to the point where they are ready for project insertion.

9. Technology Readiness Assessment (TRA) Review. A TRA is an assessment of how far technology development has proceeded. It provides a snapshot in time of the maturity of technologies and their readiness for insertion into the project design and execution schedule.

Appendix B: Acronyms

AE Acquisition Executive
ANSI American National Standards Institute
AP Acquisition Plan
AS Acquisition Strategy
CBB Contract Budget Baseline
CD Critical Decision
CDR Conceptual Design Report
CFR Code of Federal Regulations
CO Contracting Officer
CTE Critical Technology Element
CY Calendar Year
DEAR Department of Energy Acquisition Regulation
DoD U.S. Department of Defense
DOE U.S. Department of Energy
EIR External Independent Review
EIS Environmental Impact Statement
EM Environmental Management
EPA U.S. Environmental Protection Agency
ESAAB Energy Systems Acquisition Advisory Board
EVMS Earned Value Management System
FAR Federal Acquisition Regulation
FONSI Finding of No Significant Impact
FY Fiscal Year
GPRA Government Performance and Results Act
HA Hazard Assessment
ICE Independent Cost Estimate
ICR Independent Cost Review
IMS Integrated Master Schedule
IOC Initial Operating Capability
IPL Integrated Priority List
IPR Independent Project Review
IPS Integrated Project Schedule
IPT Integrated Project Team
ISM Integrated Safety Management
ISMS Integrated Safety Management System
ISO International Organization for Standardization
IT Information Technology
KPP Key Performance Parameter
LEED Leadership in Energy and Environmental Design
LOI Lines of Inquiry (EIR and IPR related)
MNS Mission Need Statement
MR Management Reserve
MS Major System Project
NEPA National Environmental Policy Act
NNSA National Nuclear Security Administration
NQA-1 Nuclear Quality Assurance Standard – 1 (ANSI/ASME standard)
NRC National Research Council
OBS Organizational Breakdown Structure
OECM Office of Engineering and Construction Management
OMB Office of Management and Budget
OMBE Office of Management, Budget and Evaluation (DOE)
OPC Other Project Costs
ORR Operational Readiness Review
OSHA Occupational Safety and Health Administration
PARS Project Assessment and Reporting System
PB Performance Baseline
PBC Performance-Based Contract
PBS Performance Baseline Summary
PDS Project Data Sheet
PED Project Engineering and Design
PEP Project Execution Plan
PMB Performance Measurement Baseline
PMSO Project Management Support Office
PPBES Planning, Programming, Budgeting and Execution System
PSO Program Secretarial Office
QA Quality Assurance
QAP Quality Assurance Plan
QAPP Quality Assurance Program Plan
QC Quality Control
RCRA Resource Conservation and Recovery Act
RD Requirements Document
RFP Request for Proposal
RLS Resource Loaded Schedule
ROM Rough Order of Magnitude
SAE Secretarial Acquisition Executive
SOW Scope of Work
TEC Total Estimated Cost (Capital)
TPC Total Project Cost
VM Value Management
WA Work Authorization

Appendix C: References

10 CFR 830, Subpart A, Quality Assurance Requirements.
10 CFR 830, Subpart B, Safety Basis Requirements.
10 CFR 830.206, Preliminary Documented Safety Analysis.
10 CFR 851, Worker Safety and Health Program.
29 CFR 1910.119, Process Safety Management of Highly Hazardous Chemicals.
40 CFR 68, Chemical Accident Prevention Provisions.
ANSI-EIA-649, National Consensus Standard for Configuration Management.
ANSI-EIA-748-B-2007, Earned Value Management Systems.
DEAR 48 CFR 970.5204-2, Integration of Environment, Safety, and Health into Work Planning and Execution.
DOE P 226.1A, Department of Energy Oversight Policy, dated 5-25-07.
DOE P 413.1, Program and Project Management Policy for the Planning, Programming, Budgeting, and Acquisition of Capital Assets, dated 6-10-00.
DOE P 450.4, Safety Management System Policy, dated 10-15-96.
DOE P 470.1, Integrated Safeguards and Security Management, dated 5-8-01.
DOE O 205.1A, Department of Energy Cyber Security Management, dated 12-4-06.
DOE O 413.3A, Program and Project Management for the Acquisition of Capital Assets, dated 7-28-06.
DOE O 414.1C, Quality Assurance, dated 6-17-05.
DOE O 420.1B, Facility Safety, dated 12-22-05.
DOE O 425.1C, Startup and Restart of Nuclear Facilities, dated 3-13-03.
DOE O 450.1A, Environmental Protection Program, dated 6-4-08.
DOE O 451.1B, Chg 1, National Environmental Policy Act Compliance Program, dated 9-28-01.
DOE M 470.4-1, Chg 1, Safeguards and Security Program Planning and Management, dated 3-7-06.
DOE-STD-1189-2008, Integration of Safety into the Design Process, dated April 2008.
DOE-STD-3006-2000, Planning and Conduct of Operational Readiness Reviews (ORR), dated June 2000.
DOE Office of Engineering and Construction Management (OECM), Project Management Practices, Integrated Safety, Revision E, dated June 2003.
DOE Office of Engineering and Construction Management (OECM), External Independent Review (EIR), Standard Operating Procedure (SOP), dated July 2008.
DOE Office of Environmental Management, Technology Readiness Assessment / Technology Maturation Plan Process Guide, dated March 2008.
DOE Office of Environmental Management, Project Definition Rating Index (EM-PDRI) Manual, Revision 1, dated February 2001.
DOE Office of Management, Budget and Evaluation, Reviews, Evaluations, and Lessons Learned, Rev E, dated June 2003.
DOE Office of Science, Independent Review Handbook, dated May 2007.
House Report 109-86, Energy and Water Development Appropriations Bill, 2006.
National Research Council, Improving Project Management in the Department of Energy, 1999.
National Research Council, Progress in Improving Project Management in the Department of Energy, 2001.
National Research Council, Progress in Improving Project Management in the Department of Energy, 2003.
National Research Council, Assessment of the Results of External Independent Reviews for U.S. Department of Energy Projects, 2007.
NNSA Policy Letter BOP-50.003 (DOE O 413.3A), Establishment of NNSA Independent Project Review Policy, dated June 2007.
NNSA, Office of Project Management and System Support, Project Definition Rating Index (PDRI) Manual, Revision 0, dated June 2008.
Appendix D: Sample of a Tailored External Independent Review Plan

REVIEW PLAN
EXTERNAL INDEPENDENT REVIEW OF
PROJECT TITLE:
PROJECT NUMBER:
APPROVE PERFORMANCE BASELINE
AT THE LOS ALAMOS NATIONAL LABORATORY
LOS ALAMOS, NM
NOVEMBER 28 – DECEMBER 6, 2007 (SITE VISIT)

TABLE OF CONTENTS
ACRONYMS
Section 1 – Review Overview
Section 2 – Background
Section 3 – Review Logistics
Section 4 – Team Members and Assignments
Appendix A – Information Requested

ACRONYMS
A/E Architect/Engineer
AS Acquisition Strategy
BCP Baseline Change Proposal
BPD Basis of Preliminary Design
CD Critical Decision
CDR Conceptual Design Report
CM Construction Management
CMO Construction Management Organization
CP Critical Path
CPDS Construction Project Data Sheet
DOE Department of Energy
DR Design Review
EIR External Independent Review
EVMS Earned Value Management System
F Finding
F&OR Functional and Operational Requirements
FY Fiscal Year
G&A General and Administrative
GFE Government Furnished Equipment
HA Hazard Analysis
IPT Integrated Project Team
LANL Los Alamos National Laboratory
LANS Los Alamos National Security, LLC
LASO Los Alamos Site Office
LOI Line of Inquiry
M Million
MF Major Finding
NFPA National Fire Protection Association
NMSSUP II Nuclear Materials Safeguards and Security Upgrade Project Phase II
NNSA National Nuclear Security Administration
NRC Nuclear Regulatory Commission
O Observation
OECM Office of Engineering and Construction Management
OPC Other Project Costs
OR Operations Readiness
P3e Primavera Planning and Scheduling Program
PDS Project Data Sheet
PED Project Engineering and Design
PEP Project Execution Plan
PHA Preliminary Hazard Analysis
PIDADS Perimeter Intrusion Detection, Assessment, and Delay System
PIDAS Perimeter Intrusion Detection and Assessment System
PM Project Management
PMP Project Management Professional
PRD Program Requirements Document
QA Quality Assurance
QC Quality Control
R&D Research and Development
RLS Resource-Loaded Schedule
RM Risk Management
RMP Risk Management Plan
SER Safety Evaluation Report
SF Square Feet
SFR Systems Functions and Requirements
SME Subject Matter Expert
SNM Special Nuclear Material
SOP Standard Operating Procedure
SSC Structures, Systems, and Components
TA Technical Area
TEC Total Estimated Cost
TIAZ Technical Area Isolation Zone
TPC Total Project Cost
TPRA Technical and Programmatic Risk Analysis
VE Value Engineering
VM Value Management
VM/E Value Management/Engineering
WBS Work Breakdown Structure

Section 1 – Review Overview

The following sections identify the type of review, define the scope and purpose of the review to be performed, and establish the objectives of the review.

1.1 Type of Review

In accordance with DOE O 413.3A, an External Independent Review (EIR) must be conducted for all capital asset projects greater than $100 million prior to Critical Decision (CD)-2, Approve Performance Baseline. An EIR will be conducted for the Nuclear Materials Safeguards and Security Upgrade Project Phase II at the Los Alamos National Laboratory, Los Alamos, NM. The site visit is scheduled for October 9-15, 2007. The EIR Team will review Project documentation prior to and during the site visit, interview Project participants, and prepare the draft EIR Report (including the Corrective Action Plan shell) for factual accuracy review by the Project. The EIR report format focuses on the review elements described in Section 1.3 below. The EIR Team will identify major findings, findings, and observations in the body of the EIR Report. The EIR Team will insert recommendations corresponding to all major findings, findings, and observations in a Corrective Action Plan (CAP) shell that will be an appendix to the report. Major findings and findings will be discussed during the site visit.
Key definitions are:
• Major Finding - Any finding that has a significant scope, cost, or schedule impact and, in the professional judgment of the EIR Team, needs to be satisfactorily addressed prior to an EIR Team recommendation to validate the baseline. Major Findings also include findings that significantly impact safety or the ability of the project team to successfully execute the baseline.
• Finding - Any deficiency that can impact the estimated project cost or schedule. In general, findings include deficiencies in the hazards analysis, design, risk assessment, scope definition, system requirements, and start-up. Findings also include concerns about safety or about the ability of the project team to successfully execute the baseline.
• Observation - A comment that is not of sufficient gravity to question the validity of the project baseline but that identifies an opportunity for project improvement, or an item of exceptionally good practice for which the project should be commended.

1.2 Objectives of Review

The primary objective of this review is to support the Department of Energy (DOE), Office of Engineering and Construction Management (OECM) validation of the performance baseline. To accomplish this objective, the EIR Team will review available technical and project management documents to confirm the completeness and accuracy of the performance (scope, cost, and schedule) baselines for the NMSSUP II. The review will be conducted in accordance with the OECM Statement of Work and consistent with the requirements provided in DOE O 413.3A [Section 5.h.(3), Performance Baseline Validation Review, and Table 2, Critical Decision Requirements] and subsequent OECM direction and guidance. The review areas and associated lines of inquiry shown below will be addressed for the NMSSUP II review using a tailored approach. Information requested is identified in Appendix A.

1.3.1.
Basis of Scope (As defined in the Work Breakdown Structure and System Functions and Requirements)

The EIR Team will:
• Assess whether the WBS incorporates all Project work, represents a reasonable breakdown of the Project work scope, and is product oriented.
• Assess whether a WBS dictionary adequately describes the Project work scope.
• Review the Program Requirements Document (PRD) and assess whether project planning is consistent with the PRD and the Mission Need.
• Assess whether the overall integrated Project Resource-Loaded Schedule (RLS) is consistent with the WBS for the Project work scope.

1.3.2. Basis of Cost (As defined in the Resource-Loaded Schedule (RLS))

For all WBS elements, the EIR Team will:
• For Work Breakdown Structure (WBS) elements constituting significant cost and/or risk, assess the estimating method(s), quality, and confidence level, and summarize the detailed basis for the cost estimate, including escalation and equipment procurements.
• For escalation:
- Determine the cost escalation rate for the out years;
- Determine whether the risk of higher cost escalation is addressed in risk management;
- Determine how much management reserve/contingency is provided to cover cost escalation above what was planned.
• Identify and assess key cost assumptions, evaluate the reasonableness of these assumptions as related to the quality of estimates for each WBS element, and determine whether these assumptions are factored into the Risk Management Plan and resulting contingency requirements.
• Determine the cost to get to CD-3.

1.3.3. Basis of Schedule (As defined in the RLS)

The EIR Team will:
• For Work Breakdown Structure (WBS) elements constituting significant cost and/or risk, assess the schedule development method(s), quality, and confidence level, and summarize the detailed basis of the schedule estimate.
• Identify and assess schedule assumptions, evaluate the reasonableness of these assumptions as related to the quality of estimates for each WBS element, and determine whether these assumptions are factored into the Risk Management Plan and resulting contingency requirements.
• Assess the reasonableness of the schedule relative to the critical path.
• Assess whether work packages are organized, based on dependencies, interdependencies, constraints, and other factors, into a time-phased sequence that will fit within the boundaries established by mission dates and available budget.

1.3.4. Funding Profile and Budget

The EIR Team will:
• Review/confirm the basis for the Funding Profile (e.g., the latest Project Data Sheet).
• Compare the annual budget with the estimated costs and obligation requirements, and provide an assessment of whether the projected costs and budget are reasonably linked.
• Identify any significant disconnects between the Performance Baseline requirements and budget/out-year funding.
• Develop a comparison of the performance baseline (across the life of the project) to the latest (budget) funding profile.

1.3.5. Critical Path

The EIR Team will:
• Assess the method(s) employed to develop the Critical Path and whether the Critical Path is reasonably defined, reflects a fully integrated schedule, and has reasonable schedule durations.
• Assess whether the schedule contingency (float) is reasonable for this type of project.

1.3.6. Risk and Contingency Management

The EIR Team will:
• Assess the adequacy of the risk management plan and of the method(s) used to identify risks, including evaluation of assumptions, and whether a reasonably complete list of potential risks was developed for analysis.
• Ensure that all programmatic, technical, cost, and schedule assumptions were properly addressed in the risk assessment, and assess the reasonableness of the assumptions.
• Assess the adequacy of the qualitative analysis and rating (high, medium, or low) of current risks (including site-specific factors such as availability of contractors) for probability of occurrence and for consequence of occurrence.
• Evaluate the extent and adequacy of the quantitative risk analysis.
• Evaluate whether the risk watch list and risk assessment sheets appear to be complete.
• Evaluate the adequacy of the management control process for risk status/updating.
• Identify cost and schedule contingency/management reserve and provide an assessment of whether the bases of the NNSA contingency and the National Security Technologies (NSTec) management reserve are reasonable for this type of project and its associated risks.
• Assess whether all appropriate risk handling actions, including accepted risks and residual risks, and cost and schedule contingency/management reserve have been incorporated into the performance baseline.

1.3.7. Hazards Analysis/Safety

The EIR Team will:
• Identify the functional make-up of the Preliminary Hazards Analysis (PHA) team, and provide an assessment of the overall staffing mix and expertise of the team.
• Assess the PHA analysis process and results, including the use of internal LASO and external NNSA Service Center safety subject matter experts (SMEs).
• Assess whether the PHA and the NNSA Safety Evaluation Report (SER) identify the accident scenarios and represent a reasonably comprehensive list.
• Assess whether all structures, systems, and components (SSCs) needed for worker and public safety have been incorporated into the Performance Baseline.
• Review the SER and the NNSA designation of the NMSSUP Phase II as a non-major modification to TA-55, and assess whether safety has been (1) appropriately integrated into the design efforts, (2) appropriately addressed throughout the lifecycle of the project, and (3) continuously refined through analysis based on the evolving design and safety integration activities during the design process and planned construction.
• Review any Defense Nuclear Facilities Safety Board (DNFSB) interfaces. Discuss the status of DNFSB involvement and determine whether DNFSB issues were reasonably considered and addressed.

1.3.8. Systems Functions and Requirements

The EIR Team will:
• Assess whether system requirements are derived from and consistent with the Mission Need, the Program Requirements Document (PRD), CD-1 Acquisition Executive approval and direction, and NNSA direction for the 3-Corner and Inner-Dog-Leg reconfiguration design and construction planning.
• Assess whether systems functions and requirements have been defined well enough to establish a firm performance baseline and are properly documented, e.g., in a requirements document, system design descriptions, facility design description, or equivalent documentation.
• Assess whether "design-to" functions are complete and have a sound technical basis, including safety and external requirements such as permits, licenses, and regulatory approvals.
• Identify all underlying technical assumptions and assess whether they are sound and/or appropriately addressed within the Risk Management Plan and adequately supported with funded contingency.
• Assess whether the CD-4 (project completion) requirements/key performance parameters (KPPs) are clearly defined in the Project Execution Plan, and whether these requirements are locked down (not subject to change), quantified and measurable, or can otherwise be reasonably determined to be satisfied.

1.3.9.
Basis of Design

The EIR Team will:
• Review the basis of design and assess the reasonableness of the design requirements and output for each function/operation. Summarize the assessment by providing a description of the unit operation, the design parameters, the basis of the design parameters, and an assessment of whether the design basis is reasonable.

1.3.10. Preliminary Design, Design Review and Comment Disposition

The EIR Team will:
• Review the adequacy of the preliminary design, including the adequacy of the drawings and specifications, and assess whether they are consistent with system functions and requirements.
• Assess whether all designated safety structures, systems, and components (SSCs) are incorporated into the preliminary design.
• Assess whether the design documents support the proposed performance baseline.
• Assess whether the NSTec Project Design Review team had appropriate experience and technical disciplines.
• Assess whether the design review process is adequate.
• Assess, based on a reasonable sample, whether the 30%, 60%, and 90% complete LANL design review comments were incorporated into the design and whether the cost and schedule changes associated with design changes were incorporated into the performance baseline.

1.3.11. Value Management/Engineering

The EIR Team will:
• Assess the applicability of Value Management/Engineering, and whether a Value Management Assessment and a Value Engineering Study were performed with the results incorporated into the baseline.
• Assess the Value Management/Engineering process, including whether the VM team has a reasonable skill mix and experience background.
• Assess whether life cycle cost analysis was reasonably performed for the trade-off studies and various alternatives reviewed.

1.3.12. Project Controls/Earned Value Management System

The EIR Team will:
• Determine the status of obtaining EVMS certification and the extent to which the EVMS is compliant with the ANSI/EIA-748-A-1998 standard.
• Assess whether the project is reporting and analyzing earned value management information and whether management action is taking place as an outcome of the analysis function.
• Assess the project EVMS deliverables to NNSA for usefulness, content, and quality.
• Assess the system/methodology for analyzing and managing the critical path schedule.
• Assess the methodology for determining the Estimate at Completion for the Project activities.
• Evaluate the Federal and contractor control processes whereby projects incorporate formal changes, conduct internal replanning, and adjust information to accommodate changes, including control and use of management reserve and contingency.

1.3.13. Acquisition Strategy

The EIR Team will:
• Determine whether the project is being executed consistent with the Acquisition Strategy (AS).
• Assess whether there are adequate contractor incentives (and disincentives) to enhance project execution.
• Evaluate changes from the previously approved AS and whether the current AS represents best value to the government.

1.3.14. Project Execution Plan (PEP)

The EIR Team will determine whether the PEP:
• Is complete and current and signed by at least the FPD.
• Reflects and supports the way the Project is being and will be managed, establishes a plan for successful execution of the Project, and is consistent with the other Project documents.

1.3.15. Integrated Project Team (IPT)

The EIR Team will:
• Assess the Federal IPT Charter, staffing (number and skill mix of full- and part-time members), organizational structure, division of roles/responsibilities, and processes for assigning work and measuring performance, to determine whether the IPT is properly constituted to successfully execute the project within the proposed performance baseline.
• Assess whether the contractor project management staffing level is appropriate, determine whether appropriate disciplines are included on the contractor project management team, and identify any deficiencies that could hinder successful Project execution.
• Assess the span of control (in terms not only of supervisory responsibility but also of management of dollars and project issues) of key project management personnel, including the FPD, to determine whether they can successfully perform their duties.
• Identify any deficiencies in the IPT or overall program/project management structure (Federal and contractor) that could hinder successful execution of the project.

1.3.16. Start-Up Planning and Operations Readiness

The EIR Team will:
• Assess whether the startup test requirements represent the acceptance and operational system tests required to demonstrate that the system meets design performance specifications and safety requirements.
• Ensure that the startup test plan identifies how tests will be determined to be successful and that associated equipment and instrumentation were included in the design and construction documents.
• Review key tests to ensure that sufficient description is provided to estimate the cost and schedule durations associated with these tests, including sufficient cost management reserve and schedule contingency for test and equipment failure during startup testing.
• Assess whether the cost and schedule included in the performance baseline are defensible to accomplish the required startup activities.
• Assess whether the start-up plan has been fully integrated with existing LANL functional organizations.

1.3.17.
Quality Control/Assurance

The EIR Team will:
• Assess whether the Quality Assurance (QA) Plan includes all the processes required to ensure that the project meets DOE QA criteria and produces the following outcomes:
  - Project development and execution has occurred, or will occur, in a controlled manner.
  - Components, systems, and processes have been, or will be, designed, developed, constructed, tested, operated, and maintained according to engineering standards and technical specifications.
  - Resulting technical data are valid and retrievable.
• Assess the adequacy of the roles and responsibilities for quality management defined in the QA Plan.

1.3.18. Safeguards and Security

The EIR Team will:
• Assess whether a Preliminary Security Vulnerability Assessment Report, as defined in DOE M 470.4-1, has been updated as required by DOE O 413.3A.
• Assess the completeness and accuracy of the applicable safeguards and security requirements, the methods selected to satisfy those requirements, and any potential risk acceptance issues applied to the project and their incorporation into the project.
• Assess the adequacy of the incorporation of Design Basis Threat requirements into the baseline.
• Review the Performance Baseline to ensure that the cost, schedule, and integration aspects of safeguards and security are appropriately addressed.
• Assess whether all feasible risk mitigation has been identified and whether the safeguards and security concerns for which explicit line management risk acceptance will be required are appropriately supported.

Section 2 – Background

This section includes a description of the project, cost data, and the status of the project. A detailed description of the project is located in the PEP.
2.1 Project Description and Status

The NMSSUP Phase II will provide a modern, state-of-the-art, exterior physical security protection system designed to meet NNSA and DOE protection requirements for current and future Category I (CAT-I) special nuclear material (SNM) facilities sited within LANL Technical Area (TA) 55. The NMSSUP Phase II scope of work consists of:
• an exterior Perimeter Intrusion Detection, Assessment, and Delay System (PIDADS) that replaces the existing Perimeter Intrusion Detection and Assessment System (PIDAS) with a re-configured 3-Corner PIDADS having an Inner Dog-Leg PIDADS to support future Chemistry and Metallurgy Research Replacement (CMRR) Facility construction;
• an Airborne Mitigation System;
• a Technical Area Isolation Zone (TAIZ);
• two Entry Control Facilities (ECF): the East ECF for pedestrian and vehicle entry and the West ECF for vehicle traffic only; and
• Limited Area fencing for three existing Nuclear Material Technology Division support facilities located adjacent to the PIDADS, including the CMRR Radiological Laboratory Utility Office Building.

NMSSUP Phase II is presently under design using multiple time-and-materials subcontracts and will use multiple fixed-price construction subcontracts for delivery of the project.

Table 2-1: Total Project Cost ($K)/Schedule Summary*

NMSSUP II                          Proposed Baseline
Performance Measurement Baseline
Management Reserve
Fee
Other Project Costs
Contingency
Performance Baseline/TPC
CD-4 Schedule
Life Cycle Cost
Life Cycle Schedule

* Estimates from ___________.
Table 2-2: Major Milestones for LANL NMSSUP II

Milestone               Description                                                    Planned Date    Actual Date
CD-0                    Approved mission need
CD-1                    Approved preliminary baseline range
OECM CD-2 EIR visit     External Independent Review site visit to support
                        establishment of the performance measurement baseline          11/28-12/6/2007
CD-2                    Establish performance measurement baseline; authorize some
                        early site work and some long lead procurements
CD-3¹                   Authorize construction
CD-4¹                   Approve project completion/start of operations

¹ Dates are from ________.

Section 3 – Review Logistics

3.1 Dates of Site Visit

The EIR Team plans to visit the project organization in Los Alamos during the period November 28 – December 6, 2007, to review project documentation, interview project participants, and present the results of the review at an out-brief. The draft EIR Report, including a Corrective Action Plan shell, will be prepared immediately after the visit for factual accuracy review.

3.2 Site Visit Schedule

The schedule for the NMSSUP II EIR Team site visit is shown in Table 3-1.

Table 3-1: EIR Site Visit Schedule (Preliminary)
See the table in the PDF file

The OECM Representative, EIR Team Lead, and FPD will meet at the end of each day to discuss potential high-level issues regarding the project and to identify what additional documents or interviews may be required for the EIR Team to complete their task. Because the EIR will still be in progress and the EIR Team will not have had enough time to synthesize all the information gathered, these discussions are understood to be preliminary indicators that do not necessarily reflect the final conclusions of the EIR Team or the preliminary findings to be presented at the closeout briefing.

Table 3-2: Focus Area Interview Schedule
See the table in the PDF file

3.3 Review Process

The EIR Team will analyze project documentation and interview information and compile a draft EIR report that documents major findings, findings, observations, and associated recommendations.
The EIR Team will structure the EIR report around the review areas contained in Section 1.3 of this document. The report will contain appendices as required, including a listing of documents reviewed, summary resumes of the EIR Team, and a corrective action plan (CAP) shell containing a complete listing of the recommendations associated with major findings and findings.

Prior to the site visit, the EIR Team will determine whether the project baseline documentation is sufficiently complete to conduct a meaningful EIR. The Team Leader will provide this determination in writing to the Contracting Officer's Representative (COR) and the Office of Engineering and Construction Management (OECM). The recommendation may be to proceed as planned, conduct a partial review, or suspend the review. If the determination is to proceed with the EIR, the EIR Team will conduct desktop reviews of documents until the start of the site visit. At the end of the site visit, a copy of the out-brief will be left with project officials.

After issuance of the draft EIR Report (including the CAP shell) and completion of the factual accuracy review by the Project Team, the EIR Team will submit the final draft report to OECM for acceptance. OECM, with the assistance of the EIR Team, will review responses to the corrective action plan until the Major Findings and Findings have been satisfactorily addressed to support validation of the performance baseline. These milestones are summarized below.
• Draft EIR Review Plan: October 19, 2007
• Receipt of Project Materials for Sufficiency Review: November 9, 2007
• Final EIR Review Plan: November 14, 2007
• EIR Team Desktop Review of Documents: November 9-27, 2007
• On-Site Review: November 28 – December 6, 2007
• Draft EIR Report including CAP shell: December 6, 2007
• Factual Accuracy Review by Project Team: December 7-10, 2007
• Final EIR Report (early finish): December 14, 2007

3.4 Report Distribution (after OECM approval)

OECM (12)
NETL, COR (2)

Section 4 – Team Members and Assignments

4.1 Review Team Assignments

The EIR Team members and their principal areas of focus for the review of the NMSSUP II are shown in the table below. Team member biographies are also provided.

Table 4-1: Review Team Assignments
See the table in the PDF file

4.2 Biographies of EIR Team

List names with biographies and specialty areas as applicable to the functional areas assigned in the Review Plan.

Appendix A – Information Requested
See the table in the PDF file

Appendix E: Shell Independent Project Review Plan

REVIEW PLAN
INDEPENDENT PROJECT REVIEW
In Preparation for Critical Decision-«CDX» (CD-«CDX»)
«ReviewType»
«Title» («TitleAcronym»)
Project No. «ProjectNum»
at «Site» («SiteAcronym»)
«ReviewDate»
«DateofDocument»

Approved by: Office of Project Management and Systems Support, NA-54

INDEPENDENT PROJECT REVIEW
In Preparation for Critical Decision-«CDX» (CD-«CDX»)
«ReviewType»
«Title» («TitleAcronym»)
Project No. «ProjectNum»
at «Site» («SiteAcronym»)

TABLE OF CONTENTS
(Note: Table of Contents left blank intentionally; this is a sample template document.)

INDEPENDENT PROJECT REVIEW
In Preparation For Critical Decision-«CDX» (CD-«CDX»)
«ReviewType»
«Title» («TitleAcronym»)
Project No.
«ProjectNum» At «Site» («SiteAcronym»)

1.0 BASIS FOR REVIEW

CD-0
The «Office» «Office1» has requested that a Mission Validation «TailoredFull» Independent Project Review (IPR) be performed in preparation for CD-0, «ReviewType», of «Title», «ProjectNum» at «SiteAcronym». Approval of CD-0 constitutes approval to proceed with Conceptual Design and a request for Project Engineering and Design (PE&D) funds for the Preliminary and Final designs.

CD-1
The «Office» «Office1» has requested a «TailoredFull» Independent Project Review (IPR) of documents produced during the Conceptual Design prior to approving proceeding to Preliminary Design and approving CD-1, «ReviewType». Approval of CD-1 allows for the expenditure of Project Engineering and Design (PE&D) funds for design.

CD-2
The «Office» «Office1» has requested a «TailoredFull» Independent Project Review (IPR) of documents produced during the Preliminary Design prior to approving proceeding to Final Design and approving CD-2, «ReviewType». CD-2 establishes a Performance and Budget Baseline for the project. It allows the design to continue and is required for the request of construction funds.

CD-3
The «Office» «Office1» has requested an Execution Readiness Independent Project Review (IPR) prior to CD-3, «TailoredFull» «ReviewType». This is a general review of the project prior to CD-3 that verifies the readiness of the project to proceed into construction. The Project Execution Plan (PEP) and performance baseline will be updated, if required, and the final design and procurement packages are to be completed.

FIRP
In accordance with ongoing management initiatives of the Office of Infrastructure and Facilities Management (NA-52), the Office of Engineering and Systems Support (NA-54) has been requested to conduct a «TailoredFull» Independent Project Review (IPR) to determine whether FIRP projects at «SiteAcronym» are sufficiently baselined and whether «SiteAcronym» has the capability to execute them successfully.
To meet this request, the Office of Project Management and Systems Support, NA-54, will conduct a «TailoredFull» IPR utilizing NNSA's IPR procedures and infrastructure support. The draft Independent Review Process for Construction Programs, National Nuclear Security Administration (July 7, 2003, Draft), describes this review as follows.

Mission Validation IPR (Pre-CD-0 Review)
A Mission Validation IPR is a review of the project prior to CD-0, Approve Mission Need. It assures that the project has clear objectives strongly linked to mission; identifies major risks; evaluates the acquisition and conceptual planning relative to those risks; and validates the funding request.

Alternative Selection and/or Cost Range Review (Pre-CD-1 Review)
The purpose of an Alternative Selection and/or Cost Range Review (previously a Readiness to Proceed Into Preliminary (Title I) Design Review) is to examine in depth the readiness of the project to proceed with Title I Design and to evaluate the planning for the design phase.

Performance Baseline Independent Project Review (IPR) Prior to Critical Decision-2
IPRs are conducted for Capital Asset Projects under $20M prior to Critical Decision 2. This requirement applies regardless of whether the project is capital or expense funded. An IPR may also be required to validate a new baseline resulting from a Baseline Change Proposal due to a Performance Baseline Deviation. The purpose of the Performance Baseline IPR is to support validation of the Performance Baseline and to provide reasonable assurance that the project can be successfully executed. The Performance Baseline validation provides confirmation to the Deputy Secretary, the Chief Financial Officer, OMB, and Congress that the project scope and key performance parameters are well defined and that the project can be completed for the Total Project Cost and schedule associated with the Performance Baseline.
Execution Readiness IPR (Pre-CD-3 Review)
An Execution Readiness IPR is a general review of the project that may range from an abridged review of specific areas to a comprehensive review of the entire project. At a minimum, it must verify the readiness of the project to proceed into construction or remedial action, and evaluate prospective procurement packages. NA-54 may elect to delegate IPR responsibility for projects in which the ESAAB authority for CD-3 resides in the field, if the capability and processes exist for the proper execution of IPRs in the field.

Long Lead Procurement Review (Pre-CD-3A Review)
A Long Lead Procurement (LLP) Review (CD-3A) may be conducted to determine whether procurement of a long lead item is justified and whether the project is ready to proceed with the requested procurement. The bases for implementing LLP Reviews are found in DOE Order 413.3A (7/26/06):
• DOE O 413.3A Chap. III.3.b: "Where long lead procurement is required, a tailored [approach] may be used, subject to prior budget approval and funding availability."
• DOE O 413.3A Chap. III.3.c: "For long lead procurement, a separate budget request [BR] for capital funds (non-PED) may be submitted prior to CD-2 for a partial CD-3 determination."
• DOE O 413.3A Attach. 6: "If long lead procurement (LLP) is required, a BR for LLP funding should be approved as a partial CD-3 during the conceptual design phase and submitted into the budget cycle to ensure timely receipt of LLP funds."

Procurement Package Review (A-E, Design-Build, or Construction)
A Procurement Package Review is conducted when the project is ready to proceed with a major procurement of engineering, construction, or design-build services.
Corrective Action Plan Closure Review
The purpose of a CAP Closure Review is to ensure that issues raised during prior reviews have been adequately resolved, including: conditions that do not satisfy applicable Federal regulations, DOE Orders, or agreements with regulatory agencies; and actions that must be taken before the project can have a reasonable expectation of achieving its documented objectives. This review generally addresses all previous reviews (EIR and IPR). When possible, the CAP Closure Review should involve the Review Team Leaders of as many of the past independent reviews as possible.

Cost and/or Schedule Review
Cost and/or Schedule Reviews focus on the process used by the project in preparing the cost estimates and schedule. Cost and/or Schedule Reviews are generally conducted as a portion of an overall Project Review; however, cost/schedule may also be the sole focus of a Review. The Review Team will determine whether the project has applied sound and accepted cost estimating processes and whether the resulting estimates are likely to represent the actual cost/schedule. The Cost and/or Schedule Review will also evaluate the schedule and scope to ensure consistency. Cost Reviews should look at the Critical Decision being considered and the items that must be addressed in that time frame, pursuant to the DOE Cost Estimating Guide, DOE G 430.1, Chapter 6, Table 6-1.

Ad Hoc (For Cause) Reviews
Ad Hoc (For Cause) Reviews may be requested by the Administrator, Deputy Administrator(s), other NNSA AEs, Site Office Manager(s), Program Managers, or the FPD with the concurrence of the Program Office. The Review will be developed by the requesting Program Office together with the Review Team Lead to meet the specific needs of the requestor.

Value Engineering Reviews
Value Engineering reviews and/or exercises are generally conducted as part of an overall Project Review; however, value engineering may also be the sole focus of a Review.
Value Engineering Reviews evaluate the project to identify ways of improving performance, reliability, quality, safety, and life cycle costs of products, systems, or procedures to achieve "best value." A Value Engineering exercise may be conducted as part of an IPR or separately; however, such an exercise is not to be confused with a review of Value Engineering plans/implementations conducted as part of a normal IPR. A full Value Engineering Review will include a Certified Value Engineering Specialist (CVS).

System Reviews
The Administrator, Deputy Administrators, other NNSA Acquisition Executives, Program Offices, or FPDs may request System Reviews. A System Review evaluates a project or projects to determine the status of a system or systems at a site or within a particular organization.

Reviews Pursuant to a Charge Memorandum
The Administrator, Deputy Administrators, other NNSA Acquisition Executives, Program Offices, or FPDs may request reviews pursuant to a Charge Memorandum. The Charge Memorandum will identify the scope and focus of the Review, and the Review will be tailored accordingly.

Capability Reviews
Capability Reviews assess the systems, procedures, organizations, personnel qualifications, and other institutional elements to determine the degree of organizational readiness to plan, manage, and execute projects.

Performance Reviews
Performance Reviews of projects are conducted during the execution phase (i.e., post-CD-3) to analyze variances, forecasts, potential problems, and similar matters.

2.0 GENERAL REVIEW INFORMATION

2.1 Points of Contact and Logistical Information

This review will be conducted at «SiteAcronym» and «Walkdown_Willwont» require a walkdown of the site or facility. The Review Team and FPD are requested to review and adhere to all Attachments for facilitation and coordination of this review.

See the graphic in the PDF file

Review Team Members are requested to stay at the «Hotel» in «Hotel_City», «Hotel_State».
The telephone number is «Hotel_phone». The Integrator has reserved a block of rooms for this review. When you check in, please change the reservation to your name. The confirmation number is _______. If you prefer to make your own reservations, you can find a hotel that offers government rates at www.government-traveler.com.

2.2 Documentation for Review

Documentation requested by the Review Team is listed in Attachment 2, Documentation Required for Review. The FPD is required to return this form to NA-54 (Attachment 2), together with the requested documents, in accordance with the Review Plan schedule. The FPD also is required to comply with Attachment 5, Federal Project Director Responsibilities. The project is requested to email the review information directly to the NA-54 Review Interface and the Review Integrator. The Review Integrator will coordinate reviewer access to the information. Advance documentation that cannot be emailed should be sent by the project directly to each Review Team Member identified in Section 4.0, subject to classification considerations. If documentation cannot be sent to the Review Team Members for classification reasons, send a sufficient number of copies for all Review Team Members to the Review Integrator. Send all documents by overnight express.

2.3 Preliminary Review Schedule

See the table in the PDF file

Not more than five days following receipt of factual accuracy comments, NA-54 will issue the Final Report to the FPD and Program Officer with a Corrective Action Plan (CAP) shell.

3.0 REVIEW FOCUS AREAS AND LINES OF INQUIRY

The review will cover the following focus areas, utilizing a graded approach tailored to the project. The project is requested to address all of these focus areas in its Kick-off Overview Presentation and to assign Project Team POCs. The numbers below are to be used by the Reviewers in their write-ups (see Attachment 6, Exhibit 1).
(Delete or add as many as required)

3.1 Mission Need and Project Goals
3.2 Management Systems, Controls, and Planning
3.3 Acquisition Strategy
3.4 Safeguards and Security
3.5 Scope and Technical Considerations
3.6 Cost Estimates and Funding
3.7 Schedule
3.8 Risk and Contingency Management
3.9 Environment, Waste Management and Energy Conservation
3.10 Occupational Safety and Health
3.11 Nuclear Safety
3.12 Waste Minimization, Energy Conservation, and Pollution Prevention
3.13 Value Engineering
3.14 Quality Assurance/Quality Control
3.15 Evaluation of Corrective Actions from Previous Reviews

The Review Focus Areas are listed above; the specific Lines of Inquiry for these focus areas are listed in Attachment 1. The Reviewers shall, by reading documentation and/or through discussions with project personnel, evaluate the Focus Areas using the Lines of Inquiry as a guide. The evaluations shall be recorded in the IPR Report as findings: Significant Concerns, Other Concerns, and Observations, in accordance with Section 5.1.

4.0 REVIEW TEAM

4.1 Team Members and Focus Areas

See the table in the PDF file

4.2 Review Team Responsibilities

The Review Team is requested to follow the guidelines for Review Team Responsibilities and Review Team Leader Responsibilities in Attachments 3 and 4, respectively.

5.0 REVIEW REPORT

5.1 Definitions

Independence
During the formulation of Independent Project Review Teams, the independence of the Review Team members must be maintained. Two independence standards must be utilized: one for the Review Team Leader and a slightly less rigorous standard for the remaining Team members.

Team Leader - To qualify as an NNSA Independent Review Team Leader or Chair of a Technical Independent Project Review, an individual can have no present or prior participation in the project to be reviewed.
Participation means direct responsibility for or assignment to a given project team, including NNSA, other Federal and state agencies, contractors, and sub-tier contractors. Team Leaders may not review projects from their current line program [Deputy and Associate Administrators] or field site.

Review Team Members - To qualify as a member of a Review Team or Committee, an individual can have no current involvement in the project to be reviewed. In addition to this personal standard, the organization to which an individual is currently assigned cannot be a participant in the project. Organization is defined to be a corporation, non-profit entity, state or Federal agency, laboratory, or NNSA program/project office (at the Deputy and Associate Administrator level). Headquarters program personnel may participate as observers or in limited non-critical roles.

Other exclusion criteria may apply:
• An individual, or the organization to which he/she belongs, has a relationship to the project being reviewed that would cause, or could be perceived to cause, them to be biased in their assessment of the project.
• An individual is a member of a corporation that is an active competitor to the corporation performing the project being reviewed.

In matters of judgment, the bias shall be toward erring on the side of conservatism in order to ensure the integrity of the review process; that is, toward excluding individuals and entities that may have a potential conflict of interest. In cases of dispute, the Review Team Leader, in consultation with the Director, Office of Project Management and Systems Support, shall make a determination. In extreme cases of disagreement, the decision may be appealed to the Associate Administrator for Infrastructure and Environment, whose decision shall be final. In some cases, it may be necessary to document the basis for inclusion/exclusion of members of the Review Team.
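The two independence standards above can be read as a simple screening rule: the Team Member exclusions apply to everyone, and the Team Leader standard additionally excludes anyone with prior participation in the project. The following sketch illustrates that rule only; it is not part of the Guide, and all class, field, and function names are hypothetical.

```python
# Illustrative sketch of the independence screening standards described
# above. Names and fields are hypothetical, not from the Guide.
from dataclasses import dataclass

@dataclass
class Candidate:
    name: str
    current_project_involvement: bool  # currently assigned to the project
    prior_project_involvement: bool    # any past assignment to the project
    org_participates_in_project: bool  # candidate's organization is a participant
    org_competes_with_performer: bool  # organization actively competes with the performer
    perceived_bias: bool               # relationship that could be perceived as bias

def eligible_as_member(c: Candidate) -> bool:
    """Team Member standard: no current involvement, and neither the
    individual nor the organization may trip an exclusion criterion."""
    return not (c.current_project_involvement
                or c.org_participates_in_project
                or c.org_competes_with_performer
                or c.perceived_bias)

def eligible_as_leader(c: Candidate) -> bool:
    """Team Leader standard is stricter: prior participation in the
    project also disqualifies."""
    return eligible_as_member(c) and not c.prior_project_involvement

# A reviewer with past (but not current) project work may serve as a
# Team Member but not as the Team Leader.
past_only = Candidate("reviewer", False, True, False, False, False)
assert eligible_as_member(past_only)
assert not eligible_as_leader(past_only)
```

Consistent with the conservatism direction above, any doubtful case would be resolved toward exclusion rather than encoded as an automatic pass.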
Findings
The IPR results will be classified into four categories: Significant Findings (also known as Significant Concerns), Other Concerns, Opportunities for Improvement, and Positive Observations.

Significant Findings or Significant Concerns (SC): A Significant Concern (SC) represents a finding that, in the opinion of the Review Team, needs to be addressed by the Acquisition Executive prior to approval of the pertinent Critical Decision. In cases where NA-54 is responsible for validation of the Performance Baseline (i.e., CD-2 for projects less than $100M), the actions taken to address SCs will require concurrence by NA-54. The actions to be taken to address SCs must be documented in the Corrective Action Plan. In general, SCs have the potential to significantly impact the cost, schedule, or scope of the project, or to have a significant impact on safety or security. There should be a minimal number of SCs (never more than 10, and normally 3-6). Resolution of the SCs requires the direct involvement of the AE as part of the ESAAB/ESAAB-equivalent approval process. Where the Review Team has numerous such findings, the project is likely not ready for an IPR, and there should be an overarching SC that so states and provides examples of why additional preparation is needed. SCs can be rolled up to reflect a common finding of a similar deficiency by the Review Team (e.g., an IPT may require additional expertise; each missing skill area would not be a separate SC). The following criteria are consistent with DOE O 413.3A and will be applied to determine whether or not a finding is a Significant Concern.
In the past, Lessons Learned from historic projects show the SC has resulted from one of the following criteria: • An increase in excess of the lesser of $25M or 25% (cumulative) of the original CD-2 ost baseline, • A delay of six-months or greater (cumulative) from the original project completion ate, or • A change in scope that affects the ability to satisfy the mission need, an inability to eet a Key Performance Parameter, or non-conformance with the current approved roject Execution Plan. Other than DOE O 413.3 a Significant Concern could describe a condition that: • Will potentially result in a significant release to the environment, human exposure to a adiological or toxic substance, or a significant accident/injury • Is a significant deviation from the DOE Order 413.3A or other DOE Order equirements or is not in compliance with Federal, state or local regulations • Could result in a breach of security that could put the asset at risk The SCs shall be satisfactorily resolved prior to conducting an ESAAB/ESAAB Equivalent. For projects with TPC of $20-100M a Performance Baseline Validation IPR and an ICE or an ICR by NA-54 is required. According to DOE O 413.3A, Program and Project Management for the Acquisition of Capital Assets, July 28, 2006, the Project Management Support Office (NA-54) must issue a Performance Baseline Validation Letter to the Program Secretarial Officer that describes the cost, schedule, and scope being validated thus, requiring concurrence of corrective actions prior to the ESAAB. Significant findings from all IPRs/T-IPRs must be addressed at the ESAAB/ESAB-Equivalent meeting. If there is a disagreement between the Project Team and the Review Team on the resolution of a Significant Finding, the finding can be taken to the Acquisition Executive for final disposition. 
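The cost and schedule criteria above are simple threshold tests: a cumulative cost increase breaches the criterion when it exceeds the lesser of $25M or 25% of the original CD-2 cost baseline, and a cumulative slip breaches it at six months or more. A minimal sketch of that arithmetic (not part of the Guide; function and variable names are illustrative only):

```python
# Illustrative check of the quantitative Significant Concern criteria
# described above. Values are in $K to match Table 2-1; names are
# hypothetical, not from DOE O 413.3A.
def breaches_cost_criterion(original_cd2_cost_k: float,
                            cumulative_increase_k: float) -> bool:
    """True if the cumulative increase exceeds the lesser of $25M
    or 25% of the original CD-2 cost baseline."""
    threshold_k = min(25_000, 0.25 * original_cd2_cost_k)
    return cumulative_increase_k > threshold_k

def breaches_schedule_criterion(cumulative_delay_months: float) -> bool:
    """True if the cumulative delay from the original project
    completion date is six months or greater."""
    return cumulative_delay_months >= 6

# For a $60M CD-2 baseline the 25% test ($15M) governs, so a $16M
# cumulative increase breaches the criterion; a 5-month slip does not.
assert breaches_cost_criterion(60_000, 16_000)
assert not breaches_schedule_criterion(5)
```

Note that for baselines above $100M the fixed $25M figure governs, while below that the 25% test is the binding one; the scope-change criterion is qualitative and does not reduce to a numeric test.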
Other Concerns (OC): Any finding that, in the opinion of the Review Team, could, if unresolved, pose a risk to the successful completion of the project within the established baseline or reported cost and schedule range estimate (i.e., CPDS, cost, scope, and schedule). The resolution of OCs is not necessarily required prior to the next Critical Decision but should be addressed in the Corrective Action Plan.

Opportunities for Improvement (OFI): Recommendations by the Review Team in areas where project enhancements could yield improvements in overall project performance and execution. OFIs typically yield the most effective results if considered prior to the next Critical Decision.

Positive Observations (PO): Observations of project activities that, in the judgment of the Review Team, demonstrate a best practice, proactive implementation of requirements, or instances where the basic expectations and requirements have been exceeded. The POs will be included in the NNSA NA-54 Bulletin and the NA-54 Lessons Learned Database for distribution to all NNSA Site Offices and Programs.

As soon as a Team Member identifies a deficiency that he or she perceives should be elevated to a Significant Concern, he or she will identify that concern to the Focus Area Lead, who, in turn, will provide the input to the Review Team Leader. The Team Member will prepare a brief summary of the concern and be prepared to discuss it at the daily Review Team status meetings. If Team agreement is reached on a Significant Concern, the Team Member will prepare a complete discussion and Recommendation for the Significant Concern for inclusion in the final report and in the Exit Briefing. The proposed Recommendation and written discussion also require review and consensus by the entire Team.

Team Members will generate Observations reflecting their particular area of expertise within the Focus Area to which they have been assigned.
Team Members will document these Observations and any Recommendations, as appropriate, as soon as they have developed them and will then submit them to the Focus Area Lead. The Focus Area Lead will review these Observations and share them with other reviewers for information and possible applicability to particular activities that they may be reviewing.

The Exit Briefing, delivered on the last day of the review, has a slightly different format from the final report. Only those Observations deemed Notable Achievements will be included in the Exit Briefing; all other Observations will be contained only in the report. The Exit Briefing also will contain Significant Concerns and Other Concerns, but will not contain Recommendations.

5.2 Reports and Corrective Action Plan (CAP)

In accordance with the schedule listed in Section 2.3, the Review Team will provide a Factual Accuracy Report to the FPD and Program Officer. After receipt of factual accuracy comments from the FPD, NA-54 will issue a final report that includes a draft CAP shell. NA-54 will coordinate the issuance of all reports. The CAP shell can be used by the project to organize and document its responses to the report Recommendations. It is the FPD's responsibility to ensure that all Recommendations in the CAP shell are addressed and that NA-54 receives a copy of the completed CAP. It also is the FPD's responsibility to update the PARS database once the Final Report has been issued.

6.0 GENERAL LINES OF INQUIRY

There are many reasons that a project might fail to meet its objectives or fulfill its mission. Some are beyond the project's control: market forces, funding cuts, extreme complexity, policy changes, and the like. Other reasons are common to a wide variety of projects, regardless of their stage of design or completion. These potential problems may be project-specific, site-specific, or systemic to NNSA.
There are certain indicators that can be used to assess the overall likelihood that a particular project might fall victim to these common shortcomings. Attachment 1 contains the Lines of Inquiry that are pursued at each Critical Decision, organized by Focus Area. Figure 1, however, is a Project Measures of Success Checklist that each Reviewer should complete regardless of the Critical Decision point at which the IPR is performed. The checklist in Figure 1 can help to identify overall weaknesses in the project or supporting management and assist project or program managers in averting problems before they grow to become threats to a project's success. This checklist documents the general impressions of each Reviewer as to the overall health of a project. Thus, Figure 1 is a checklist that must be completed by each Reviewer, with a copy provided to the Review Team Leader prior to the Exit Briefing. Because the answers are based on opinion, no supporting documentation is required.

7.0 PROJECT DEFINITION RATING INDEX (PDRI)

The Project Team should perform a self-assessment using the PDRI scoring sheets and send the completed form to the Integrator prior to the on-site review. Before leaving the site, each Reviewer will evaluate the project on the PDRI scoring sheets, which will be distributed to the Review Team prior to the Exit Briefing. Instructions on the proper way to complete the forms will be provided by the Integrator when the forms are distributed. The completed forms will be handed in to the Integrator before the Reviewer leaves the site. The Integrator will analyze the data and include it in the final report. The Independent Reviewer scores will be used to verify the self-assessment and determine whether or not the project score meets or exceeds the recommended threshold for the appropriate level of maturity.

Note to Reviewers: Review project documents as soon as possible after you receive them.
Alert the Team Leader or the Integrator to any critical information that is missing or incomplete as soon as you discover it.

See the table in the PDF file

LIST OF SUPPLEMENTAL MATERIALS
Note: Attachments and Exhibit 1 are not included herein.

ATTACHMENT 1 - SPECIFIC LINES OF INQUIRY
  CRITICAL DECISION 0 - APPROVE MISSION NEED
  CRITICAL DECISION 1 - APPROVE ALTERNATIVE SELECTION AND COST RANGE
  CRITICAL DECISION 2 - APPROVE PERFORMANCE BASELINE
  CRITICAL DECISION 3 - APPROVE START OF CONSTRUCTION
ATTACHMENT 2 - DOCUMENTATION REQUIRED FOR REVIEW
ATTACHMENT 3 - REVIEW TEAM RESPONSIBILITIES
ATTACHMENT 4 - REVIEW TEAM LEADER RESPONSIBILITIES
ATTACHMENT 5 - FEDERAL PROJECT DIRECTOR RESPONSIBILITIES
ATTACHMENT 6 - IPR STYLE GUIDE FOR REVIEWERS
EXHIBIT 1 - REVIEWER DELIVERABLE FORM