This is the accessible text file for GAO report number GAO-09-403T 
entitled 'Defense Acquisitions: Charting a Course for Improved Missile 
Defense Testing' which was released on February 25, 2009.

This text file was formatted by the U.S. Government Accountability 
Office (GAO) to be accessible to users with visual impairments, as part 
of a longer term project to improve GAO products' accessibility. Every 
attempt has been made to maintain the structural and data integrity of 
the original printed product. Accessibility features, such as text 
descriptions of tables, consecutively numbered footnotes placed at the 
end of the file, and the text of agency comment letters, are provided 
but may not exactly duplicate the presentation or format of the printed 
version. The portable document format (PDF) file is an exact electronic 
replica of the printed version. We welcome your feedback. Please E-mail 
your comments regarding the contents or accessibility features of this 
document to Webmaster@gao.gov. 

This is a work of the U.S. government and is not subject to copyright 
protection in the United States. It may be reproduced and distributed 
in its entirety without further permission from GAO. Because this work 
may contain copyrighted images or other material, permission from the 
copyright holder may be necessary if you wish to reproduce this 
material separately. 

Testimony: 

Before the Subcommittee on Strategic Forces, Committee on Armed 
Services, House of Representatives: 

United States Government Accountability Office: 
GAO: 

For Release on Delivery: 
Expected at 1:00 p.m. EST:
Wednesday, February 25, 2009: 

Defense Acquisitions: 

Charting a Course for Improved Missile Defense Testing: 

Statement of Paul Francis: 
Director, Acquisition and Sourcing Management: 

GAO-09-403T: 

GAO Highlights: 

Highlights of GAO-09-403T, a testimony before the Subcommittee on 
Strategic Forces, Committee on Armed Services, House of 
Representatives. 

Why GAO Did This Study: 

The Missile Defense Agency (MDA) has spent about $56 billion and will 
spend about $50 billion more through 2013 to develop a Ballistic 
Missile Defense System (BMDS). This testimony is based on two reviews 
GAO was directed to conduct in 2008. In addition to our annual review 
assessing the cost, testing, schedule, and performance progress MDA 
made in developing the BMDS, we also reported on MDA’s targets 
program. In this testimony we discuss (1) the productivity of MDA’s 
recent test program, (2) the consequences of the testing shortfalls, 
and (3) key factors that should be considered as MDA revises its 
approach to testing. 

GAO assessed contractor cost, schedule, and performance; tests 
completed; and the assets fielded during 2008. GAO also reviewed 
pertinent sections of the U.S. Code, acquisition policy, and the 
activities of a new missile defense board. 

What GAO Found: 

The scale, complexity, cost, and safety constraints associated with 
testing the missile defense system constitute a unique challenge for 
MDA, test agencies, and other oversight organizations. This challenge is 
heightened by the fact that missile defense assets are developed, 
produced, and fielded concurrently. Overall, during fiscal year 2008, 
testing has been less productive than planned. While MDA completed 
several key tests that demonstrated enhanced performance of BMDS, all 
elements of the system had test delays and shortfalls, in part due to 
problems with the availability and performance of target missiles. The 
Ground-based Midcourse Defense (GMD) element in particular was unable 
to conduct either of its two planned intercept attempts in fiscal year 
2008. While it did subsequently conduct one in 
December 2008, it was not able to achieve all primary objectives 
because the target failed to release its countermeasures. As a result, 
aspects of the fielded ground-launched kill vehicles may not be 
demonstrated since no more flight tests have been approved. Target 
missiles remained a persistent problem in fiscal year 2008, as poor 
target performance caused several tests to fail in part or in whole. 

Testing shortfalls have had several consequences. First, they have 
delayed the validation of models and simulations, which are needed to 
assess the system’s overall performance. As a result, the performance 
of the fielded BMDS as a whole cannot yet be determined. Second, the 
production and fielding of assets has continued and in some cases has 
gotten ahead of testing. For example, enhanced Exoatmospheric Kill 
Vehicles will now be produced and delivered before they are flight 
tested. Third, MDA has relied on a reduced basis--fewer test, model, and 
simulation results--to declare capabilities as operational in the field. 

MDA has undertaken a three-phase review of the entire BMDS test program 
that involves identifying critical variables that have not been proven 
to date, determining what test scenarios are needed to collect the 
data, and developing an affordable, prioritized schedule of flight and 
ground tests. This review, as long as it continues to involve test and 
evaluation organizations, appears to offer a sound approach for closing 
the gaps that exist between testing, modeling, and simulation. Critical 
to being able to implement the approach will be addressing the factors 
that have limited the productivity of the current test approach, such 
as the availability and performance of targets. An additional 
consideration in a new testing approach must be to ensure that assets 
are sufficiently tested before they are produced and fielded. An 
important consideration in this regard is for modeling, simulation, and 
testing events to be re-synchronized so that they properly inform 
decisions on producing, fielding, and declaring assets operational. 
Contingency plans could then be formed for adjusting the pace of these 
decisions should shortfalls occur in modeling, simulation, or testing. 
Because MDA has indicated implementation will take time, managing the 
transition may need to include reassessing the ambitious fiscal year 
2009 test plan. In the meantime, MDA will have to be prudent in making 
decisions to produce and field assets. 

What GAO Recommends: 

We have previously made recommendations to improve MDA’s testing 
and targets programs that include establishing a revised business case 
for providing targets for a robust flight test program as well as 
adding sufficient scope to tests to enable an assessment of the BMDS’ 
suitability and effectiveness, but MDA only partially agreed. We also 
have a draft report that is currently with DOD for comment that 
includes additional recommendations regarding testing. 

View [hyperlink, http://www.gao.gov/products/GAO-09-403T] or key 
components. For more information, contact Paul Francis, 202-512-4841, 
Francisp@gao.gov. 

[End of section] 

Madam Chairman and Members of the Subcommittee: 

I am pleased to be here today to discuss the future of the Missile 
Defense Agency's (MDA's) testing program. 

MDA has been charged with developing and fielding the Ballistic Missile 
Defense System (BMDS), a system expected to be capable of defending the 
United States, deployed troops, friends, and allies against ballistic 
missiles of all ranges in all phases of flight. In fulfilling this 
charge, MDA placed an initial set of missile defense components in the 
field in December 2005. 

The National Defense Authorization Acts for fiscal years 2002, 2007, and 
2008 mandated that we prepare annual assessments of MDA's ongoing cost, 
schedule, testing, and performance progress. In March 2009, we plan to 
issue our report covering MDA's progress toward achieving its goals 
during fiscal year 2008 as well as its efforts to improve transparency, 
accountability, and oversight. Additionally, in September 2008, we 
issued a report on MDA's Target Program. My statement today will focus 
on the testing-related issues covered in both reports. We conducted 
these performance audits from February 2008 to February 2009 in 
accordance with generally accepted government auditing standards. Those 
standards require that we plan and perform the audit to obtain 
sufficient, appropriate evidence to provide a reasonable basis for our 
findings and conclusions based on our audit objectives. We believe that 
the evidence obtained provides a reasonable basis for our findings and 
conclusions based on our audit objectives. 

Background: 

The Missile Defense Agency's mission is to develop an integrated and 
layered BMDS to defend the United States, its deployed forces, allies, 
and friends. To meet this mission, MDA is developing a highly complex 
system of systems--land-, sea-, and space-based sensors, interceptors, 
and battle management. Since its initiation in 2002, MDA 
has been given a significant amount of flexibility in executing the 
development and fielding of the BMDS. To enable MDA to field and 
enhance a missile defense system quickly, the Secretary of Defense in 
2002 delayed the entry of the BMDS program into the Department of 
Defense's traditional acquisition process until a mature capability was 
ready to be handed over to a military service for production and 
operation. Therefore, the program concurrently develops, tests and 
fields assets. This approach helped MDA rapidly deploy an initial 
capability. On the other hand, because MDA can field assets before all 
testing is completed, it has fielded some assets whose capability is 
uncertain. 

Because MDA develops and fields assets continuously, it combines 
developmental testing with operational testing. In general, 
developmental testing is aimed at determining whether the system design 
will satisfy the desired capabilities; operational testing determines 
whether the system is effective, survivable, and suitable in the hands 
of the user. MDA conducts testing both on the ground and in flight. The 
most complex of these is an end-to-end flight test that involves a test 
of all phases of an engagement including detecting, tracking and 
destroying a target with an interceptor missile. An end-to-end 
intercept involves more than one MDA element. For example, a recent 
intercept test involved a target flown out of Kodiak, Alaska, tracked 
by the AN/TPY-2 radar located in Alaska; the Beale upgraded early 
warning radar located in California; and the Sea-based X-band radar and 
an Aegis radar located at different points in the Pacific. All of the 
radars communicated with fire control centers in Alaska to guide an 
interceptor launched from California to hit the target over the Pacific 
Ocean. 

Due to the complexity, scale, safety constraints, and cost involved, 
MDA is unable to conduct a sufficient number of flight tests to fully 
understand the performance of the system. Therefore, MDA utilizes 
models and simulations, anchored by flight tests, to understand both 
the developmental and operational performance of the system. To ensure 
confidence in the accuracy of modeling and simulation, the program goes 
through a process called accreditation. The models are validated 
individually using flight and other test data and accredited for their 
intended use. Models and simulations are used before a flight test to 
predict performance; the flight test is then run to gather data and 
verify the models; and the post-flight data are analyzed and 
reconstructed using the models and simulations to confirm their 
accuracy. 

MDA intends to group these models into system-level representations 
according to user needs. One such grouping is the annual performance 
assessment, a system-level end-to-end simulation that assesses the 
performance of the BMDS configuration as it exists in the field. The 
performance assessment integrates element-specific models into a 
coherent representation of the BMDS. Fundamentally, performance 
assessments anchored by flight tests are a comprehensive means to fully 
understand the performance capabilities and limitations of the BMDS. 

In addition to testing, modeling and simulation, and performance 
assessments, MDA also has a formal process for determining when a newly 
fielded asset or group of assets can be declared operational--that is, 
cleared for use by the warfighter in operational situations. MDA uses a 
variety of information as a basis to assess a new capability for 
declaration. For example, MDA will define in advance the tests, models, 
and simulations on which it will base a specific decision on whether an 
asset or capability can be declared ready for fielding. Each capability 
so designated represents upgraded capacity to support the overall 
function of the BMDS in its mission as well as the level of MDA 
confidence in the system's performance. 

To assess testing related progress in fiscal year 2008, we examined the 
accomplishments of ten BMDS elements that MDA is developing and 
fielding. Our work included examining documents such as Program 
Execution Reviews, test plans and reports, and production plans. We 
also interviewed officials within each element program office and 
within MDA functional directorates. In addition, we discussed each 
element's test program and its results with DOD's Office of the 
Director, Operational Test and Evaluation. We also interviewed 
officials from the Office of the Under Secretary of Defense for 
Acquisition, Technology, and Logistics. 

Test, Targets and Performance Challenges Continue During Fiscal Year 
2008: 

MDA continues to experience difficulties achieving its goals for 
testing. During fiscal year 2008, while several tests showed progress 
in individual elements and some system level capabilities, all BMDS 
elements experienced test delays or shortfalls. Most were unable to 
accomplish all objectives, and performance challenges continued for 
many. Table 1 summarizes test results and target performance for the 
BMDS elements during the year. 

Table 1: Fiscal Year 2008 Test and Targets Issues: 

Element: Airborne Laser; 
Tests/Activities Conducted as Scheduled: No; 
All Objectives Achieved: Yes; 
Target Issues: N/A. 

Element: Aegis Ballistic Missile Defense (BMD); 
Tests/Activities Conducted as Scheduled: No; 
All Objectives Achieved: No; 
Target Issues: Target availability delayed key test from 2008 until at 
least third quarter fiscal year 2009. 

Element: Command, Control, Battle Management and Communications; 
Tests/Activities Conducted as Scheduled: No; 
All Objectives Achieved: No; 
Target Issues: N/A. 

Element: Ground-based Midcourse Defense (GMD); 
Tests/Activities Conducted as Scheduled: No; 
All Objectives Achieved: No; 
Target Issues: Target failed to release countermeasures during December 
2008 flight test--FTG-05.[A] 

Element: Kinetic Energy Interceptor; 
Tests/Activities Conducted as Scheduled: No; 
All Objectives Achieved: No; 
Target Issues: N/A. 

Element: Multiple Kill Vehicle (MKV); 
Tests/Activities Conducted as Scheduled: No; 
All Objectives Achieved: No[B]; 
Target Issues: N/A. 

Element: Sensors; 
Tests/Activities Conducted as Scheduled: No; 
All Objectives Achieved: No; 
Target Issues: Target failed to release countermeasures during July 
2008 testing (FTX-03). 

Element: Space Tracking and Surveillance System; 
Tests/Activities Conducted as Scheduled: No; 
All Objectives Achieved: No; 
Target Issues: N/A. 

Element: Targets and Countermeasures; 
Tests/Activities Conducted as Scheduled: No; 
All Objectives Achieved: No; 
Target Issues: Flexible Target Family delivery delayed and experienced 
cost growth. 

Element: Terminal High Altitude Area Defense (THAAD); 
Tests/Activities Conducted as Scheduled: No; 
All Objectives Achieved: No; 
Target Issues: Target experienced anomaly during a September flight 
test resulting in a no-test. 

Source: GAO (presentation); MDA (data). 

[A] This flight test was originally scheduled for fiscal year 2008, but 
was later executed in fiscal year 2009. 

[B] The MKV program was able to achieve this test objective in the 
first quarter of fiscal year 2009. 

[End of table] 

Because of delays in flight tests and a key ground test, MDA was unable 
to achieve any of the six knowledge points the MDA Director had 
scheduled for fiscal year 2008. In May 2007, the MDA Director 
established key system-level and element-level knowledge points, each 
based on an event that was to provide critical information--or 
knowledge--for a decision requiring his approval. For example, two 
knowledge points that MDA had to defer because of testing problems were 
confirmation of a new target's performance and assessment of the SM-3 
Block 1A missile's ability to engage and intercept a long-range target. 

GMD in particular continues to experience testing problems and delays. 
Based on its September 2006 plan, MDA had expected to conduct 7 GMD 
interceptor flight tests from the start of fiscal year 2007 through the 
first quarter of fiscal year 2009. MDA, however, was able to conduct 
only two, as shown in figure 1. 

Figure 1: GMD Reduction in Flight Test from January 2006 to March 2010: 

[Refer to PDF for image] 

This figure is a chart depicting the following data: 

GMD Reduction in Flight Test from January 2006 to March 2010: 

As of September 2006: 

Integrated flight tests planned: 

Date: FY 2006, Q1; 
Flight: FT-1, CE-I EKV (achieved); 

Date: FY 2006, Q4; 
Flight: FTG-2, CE-I EKV (achieved); 

Date: FY 2007, Q1; 
Flight: FTG-3, CE-I EKV; 

Date: FY 2007, Q3; 
Flight: FTG-4, CE-I EKV; 

Date: FY 2007, Q4; 
Flight: FTG-5, CE-I EKV; 

Date: FY 2008, Q1; 
Flight: FTG-6, CE-I EKV; 

Date: FY 2008, Q2; 
Flight: FTG-7, CE-I EKV; 

Date: FY 2008, Q4; 
Flight: FTG-8, CE-I EKV; 

Date: FY 2009, Q1; 
Flight: FTG-9, CE-I EKV; 

Date: FY 2009, Q1; 
Flight: FTG-9, CE-II EKV New processor. 

As of January 2009: 

Date: FY 2006, Q1; 
Flight: FT-1, CE-I EKV (achieved); 

Date: FY 2006, Q4; 
Flight: FTG-2, CE-I EKV (achieved); 

Date: FY 2007, Q4; 
Flight: FTG-3a, CE-I EKV (achieved); 

Date: FY 2009, Q2; 
Flight: FTG-5, CE-I EKV (achieved); 

Date: FY 2009, Q4; 
Flight: FTG-6, CE-II EKV New processor. 

Source: GAO analysis of GMD's flight test and interceptor fielding 
schedule as of 2/25/07 and updated as of 1/31/09. 

[End of figure] 

GMD was unable to conduct either of its planned intercept attempts 
during fiscal year 2008--FTG-04 and FTG-05. MDA first delayed and then 
canceled the FTG-04 test in May 2008 due to a problem with a 
telemetry component in the interceptor's Exoatmospheric Kill Vehicle. 
The cancellation of FTG-04 removed an important opportunity to obtain 
end-game performance data needed to develop GMD models and to verify 
the capability of the fielded Capability Enhancement I (CE-I) EKV. 
Moreover, MDA planned to test the CE-I EKV against a dynamic target 
scene with countermeasures in both the FTG-04 and FTG-05 flight tests. 
However, since FTG-04 was canceled and the target failed to release the 
countermeasure in FTG-05, the fielded CE-I's ability against 
countermeasures still has not been verified. According to MDA no more 
CE-I EKV flight tests have been approved. 

The test delays led MDA to restructure its flight test plan for fiscal 
year 2009, increasing the number of tests, compressing the amount of 
time to analyze and prepare for subsequent tests, and increasing the 
scope of individual tests. For example, MDA plans to conduct 14 of 18 
flight tests in the third and fourth quarters of fiscal year 2009. Past 
testing performance raises questions about whether this is realistic. 
In fiscal year 2008, MDA had planned to conduct 18 flight tests, but it 
accomplished only 10 and delayed several flight tests into fiscal year 
2009. In 
the next GMD end-to-end flight test--FTG-06, in fourth quarter fiscal 
year 2009 to first quarter fiscal year 2010--MDA is accepting a higher 
level of risk than it previously expected in conducting this first test 
of an enhanced configuration of the kill vehicle, called the Capability 
Enhancement II (CE-II),[Footnote 1] because the test will include 
several objectives that had been planned for earlier tests but have not 
been demonstrated. For example, the FTG-06 flight test will be the 
first GMD test assessing both a CE-II EKV and a complex target scene. 
Adding to the risk, it will be only the second test using the newly 
developed Flexible Target Family (FTF) LV-2 target. Moreover, in 
January 2008 MDA merged FTG-06 and FTG-07, 
thereby eliminating an additional opportunity to gather important 
information from an intercept. FTG-07 will instead be an intercept test 
of the two-stage interceptor intended for the European site. 

Poor Target Missile Performance Continues to Hamper BMDS Testing: 

Problems with the reliability and availability of targets (which are 
themselves ballistic missiles) have increasingly affected BMDS 
development and testing since 2006. As MDA recently acknowledged, 
target availability became, in some cases, a pacing item for the 
overall test program. As was noted in Table 1, problems with targets 
have reduced testing of GMD, Sensors, and THAAD during 2008. 

Repeated target problems and test cancellations have particularly 
reduced opportunities to demonstrate the ability of sensors to 
discriminate the real target from countermeasures. In the mid-course of 
flight, a more sophisticated threat missile could use countermeasures 
in an attempt to deceive BMDS radars and interceptor sensors as to 
which is the actual reentry vehicle. In order to improve the 
effectiveness of the BMDS against evolving threats, MDA elements are 
developing advanced discrimination software in their component's 
sensors to distinguish the threat reentry vehicle from countermeasures 
and debris. The cancellation of FTG-04 and subsequent target problems 
during FTX-03 and FTG-05 prevented opportunities to gather data to test 
how well discrimination software performs in an operational 
environment. The current fielded configuration of the GMD kill vehicle 
has not been tested against countermeasures. 

To address the growing need for more sophisticated and reliable targets 
for the future BMDS test program, MDA has been developing a new set of 
targets called the Flexible Target Family (FTF), which was intended to 
provide new short, medium, and long-range targets with ground, air, and 
sea launch capabilities. It was viewed as a family in the sense that 
the different target sizes and the variants within those sizes would 
use common components. MDA embarked on this major development without 
estimating the cost to develop the family of target missiles. It 
proceeded to develop, and even to produce, some FTF targets without a 
sound business case; consequently, their acquisition has not gone as 
planned. The funds required for the FTF were spent sooner than expected 
and were insufficient for the development. 

Development of all FTF sizes and variants has been discontinued except 
for the 72-inch diameter ground-launched target, referred to as the 
LV-2. With guidance from the Missile Defense Executive Board, MDA is 
currently conducting a comprehensive review of the targets program to 
determine the best acquisition strategy for future BMDS targets. It is 
expected to be completed in mid-2009. Whether MDA restarts the 
acquisition of the 52-inch diameter targets or other FTF variants 
depends on the results of this review. 

The process of qualifying FTF target components for the LV-2 was more 
difficult than expected. While many of the LV-2's components are found 
on existing systems, their form, fit, function, and the environment 
they must fly in are different. Consequently, many critical components 
initially failed shock and vibration testing and other qualification 
tests and had to be redesigned. MDA has acknowledged that the component 
qualification effort ran in parallel with design completion and initial 
manufacturing. So far, the resultant delays in the LV-2 target have had 
two consequences. First, a planned test flight of the LV-2 itself for 
the Space Tracking and Surveillance System program was delayed, and 
instead its first flight will be as an actual target for an Aegis BMD 
intercept. Second, because the LV-2 was not ready, that Aegis intercept 
test was deferred from fiscal year 2008 to third quarter fiscal year 
2009. 

Other Consequences of Less Productive Testing: 

In addition to delaying progress on individual elements, testing 
problems have had other consequences for BMDS. Specifically, the 
reduced productivity of testing has delayed understanding the overall 
performance of BMDS, production and fielding have in some cases gotten 
ahead of testing, and declarations of capabilities ready for fielding 
have been made based on fewer tests and less modeling and simulation 
than planned. 

Overall Performance of BMDS Cannot Yet Be Assessed: 

The overall performance of the BMDS cannot yet be assessed because MDA 
lacks a fully accredited end-to-end model and simulation capability 
and, according to the BMDS Operational Test Agency, it will not have 
that capability until 2011 at the earliest. The lack of sufficient 
flight test data has inhibited the validation of the models and 
simulations needed for ground tests and system-level simulations. 
MDA's modeling and simulation program enables it to assess the 
capabilities and limitations of the BMDS under a wider variety of 
conditions than can be covered by the limited number of flight tests 
conducted. Flight tests alone are insufficient because each provides 
only a single data point on element and system performance. Flight 
tests are, however, an essential tool used 
to both validate performance of the BMDS and to anchor the models and 
simulations to ensure they accurately reflect real performance. 
Computer models of individual elements replicate how those elements 
function. These models are then aggregated into various combinations 
that simulate the BMDS engagement of enemy ballistic missiles. 

Developing an end-to-end system-level model and simulation has been 
difficult. MDA's first effort to bring together different element 
models and simulations to produce a fully accredited, end-to-end model 
and simulation was for the first annual performance assessment of the 
fielded BMDS configuration in 2007. Performance Assessment 2007 was 
unsuccessful primarily because of inadequate data, particularly flight 
test data, for verification and validation to support accreditation. 
Instead, Performance Assessment 2007 used several models and 
simulations that represented different aspects of the BMD system and 
were not fully integrated. Consequently, acting on a joint 
recommendation by MDA and the Operational Test Agency, MDA 
officials canceled the 2008 performance assessment in April 2008 
because of developmental risks associated with modeling and 
simulations, focusing instead on testing and models for Performance 
Assessment 2009. 

According to the BMDS Operational Test Agency's January 2009 Modeling 
and Simulation accreditation report, confidence in MDA's Modeling and 
Simulation efforts remains low although progress was made during the 
year. Out of 40 models, the BMDS Operational Test Agency recommended in 
January 2009 full accreditation for only 6 models, partial 
accreditation for 9 models, and no accreditation for 25 models. MDA is 
now exercising stronger central leadership to provide guidance and 
resources as it coordinates the development of verified and validated 
models and simulations. 

MDA intends to verify and validate models and simulations by December 
2009 for Performance Assessment 2009. However, BMDS Operational Test 
Agency officials stated that there is a high risk that the Performance 
Assessment 2009 analysis will be delayed because of remaining 
challenges and MDA's slow progress in accreditation. MDA does not 
expect to have a single end-to-end simulation for use in performance 
assessments until 2010. 

Production and Fielding Proceed Despite Delays in Testing and 
Assessments: 

Testing problems have contributed to a concurrent development, 
manufacturing, and fielding strategy in which assets are produced and 
fielded before they are fully demonstrated through testing and 
modeling. For example, although a test of the ability of the SM-3 Block 
1A missile to engage and intercept a long-range ballistic target was 
delayed until the third quarter of fiscal year 2009, MDA purchased 20 
of the missiles in fiscal year 2008 ahead of schedule. 

While the GMD program has only been able to conduct two intercepts 
since 2006 for assessing the fielded configuration, the production of 
interceptors has continued. From the beginning of fiscal year 2007 
through the first quarter of fiscal year 2009, MDA planned to conduct 7 
flight tests and field 16 new ground-based interceptors. The plan 
included a test that would utilize two ground-based interceptors 
against a single target, known as a salvo test. By January 2009, GMD 
had conducted only 2 flight tests and dropped the salvo test; yet it 
fielded 13 ground-based interceptors. 

Moreover, the GMD program had planned to conduct an intercept test to 
assess the enhanced version of the EKV called the Capability 
Enhancement II (CE-II) in the first quarter of fiscal year 2008, months 
before emplacing any interceptors with this configuration. However, 
developmental problems with the new configuration's inertial 
measurement unit and the target delayed the first flight test with the 
CE-II configuration--FTG-06--until at least fourth quarter fiscal year 
2009. Despite these delays, emplacements will proceed; MDA expects to 
have emplaced five CE-II interceptors before this flight test. More 
importantly, GMD projects that the contractor will have manufactured 
and delivered 10 CE-II EKVs before that first flight test demonstrates 
the CE-II capability. This amounts to over half of the CE-II EKV 
deliveries that are currently under contract. 

Declarations of Capabilities Proceed with Reduced Levels of Information: 

When MDA determines that a capability can be considered for operational 
use it does so through a formal declaration. MDA bases its declarations 
on, among other things, a combination of models and simulations--such 
as end-to-end performance assessments (from missile launch to attempted 
intercept)--and ground tests all anchored to flight test data. 

In fiscal year 2008, MDA declared it had fielded 7 of 17 BMDS 
capabilities planned for 2008 (postponing 10). In doing so, MDA 
significantly reduced the basis for the declarations, due in part to 
test problems and delays. Specifically, MDA had intended to rely on a 
GMD flight test that was canceled, a key ground test that was delayed, 
and a performance assessment that was canceled. MDA had to shift the 
basis of the 7 declarations to previous flight and ground tests. 

Review of BMDS Modeling and Testing Holds Promise, but Must Anticipate 
Contingencies: 

MDA has undertaken a three-phase review of the entire BMDS modeling, 
simulation, and test program. According to MDA, the three phases 
involve identifying critical variables that have not been proven to 
date, determining what test scenarios are needed to collect the data, 
and developing an affordable and prioritized schedule of flight and 
ground tests. MDA intends to complete all three phases of the review by 
May 2009. At this point, our knowledge of the review is limited, as we 
have only had an introductory briefing on it. Nonetheless, the review 
appears to offer a sound approach for closing the gaps that exist 
between testing, modeling, and simulation. Further, the involvement of 
test and evaluation organizations is encouraging. 

While the approach is sound, its success hinges on providing sufficient 
resources, ensuring robustness, and anticipating contingencies. In 
addition to linking the critical modeling and simulation variables with 
test events, the review will have to address the factors that have 
limited the productivity of the current test approach, such as the 
availability and performance of targets. MDA's current approach to 
testing could be characterized as just-in-time readiness of test 
assets, such as targets, which leaves little margin to resolve issues 
that arise in the lead-up to tests. 
Accordingly, the third phase of MDA's new approach--properly resourcing 
the tests with sufficient time, funding and reliable targets--will be 
key. MDA has indicated that its revision will result in a more robust 
test plan, providing more margin to conduct the tests through, for 
example, having spare interceptors and targets available. 

Other contingencies that a new approach to modeling, simulation, and 
testing should anticipate include unexpected or incomplete test 
results, and problems in accrediting the models that are needed for 
aggregated simulations, such as performance assessments. An important 
consideration in this regard is for modeling, simulation, and testing 
events to be re-synchronized so that they properly inform decisions on 
producing, fielding, and declaring assets operational. Contingency 
plans could then be formed for adjusting the pace of these decisions 
should shortfalls occur in modeling, simulation, or testing. 

MDA has indicated that this new approach to testing will take time to 
implement, with partial implementation in fiscal year 2010 and full 
implementation not occurring until fiscal year 2011. Therefore, MDA 
must manage the transition to the new testing approach. In particular, 
the ambitious fiscal year 2009 flight test plan may need to be 
reassessed with the goal of establishing a robust series of tests that 
can withstand some delays without causing wholesale changes to the test 
plan during the transition. In the meantime, MDA will have to be 
prudent in making decisions to produce and field additional assets. 

Our annual report on missile defense is in draft and with DOD for 
comment. It will be issued in final form by March 13, 2009. In that 
report, 
we are recommending additional steps to further improve the 
transparency, accountability, and oversight of the missile defense 
program. Our recommendations include actions to improve cost reporting 
as well as testing and evaluation. DOD is in the process of preparing a 
formal response to the report and its recommendations. 

Madam Chairman, this concludes my statement. I would be pleased to 
respond to any questions you or members of the subcommittee may have. 

Contacts and Staff Acknowledgments: 

For questions about this statement, please contact me at (202) 512-4841 
or Francisp@gao.gov. Individuals making key contributions to this 
statement include David B. Best, Assistant Director; Steven B. Stern; 
LaTonya D. Miller; Thomas Mahalek; Ivy Hübler; Meredith Allen Kimmett; 
Kenneth E. Patton; and Alyssa Weir. 

[End of section] 

Footnote: 

[1] The CE-II was intended to replace obsolescent parts, but it has 
demonstrated improved performance. 

[End of section] 

GAO's Mission: 

The Government Accountability Office, the audit, evaluation and 
investigative arm of Congress, exists to support Congress in meeting 
its constitutional responsibilities and to help improve the performance 
and accountability of the federal government for the American people. 
GAO examines the use of public funds; evaluates federal programs and 
policies; and provides analyses, recommendations, and other assistance 
to help Congress make informed oversight, policy, and funding 
decisions. GAO's commitment to good government is reflected in its core 
values of accountability, integrity, and reliability. 

Obtaining Copies of GAO Reports and Testimony: 

The fastest and easiest way to obtain copies of GAO documents at no 
cost is through GAO's Web site [hyperlink, http://www.gao.gov]. Each 
weekday, GAO posts newly released reports, testimony, and 
correspondence on its Web site. To have GAO e-mail you a list of newly 
posted products every afternoon, go to [hyperlink, http://www.gao.gov] 
and select "E-mail Updates." 

Order by Phone: 

The price of each GAO publication reflects GAO's actual cost of 
production and distribution and depends on the number of pages in the 
publication and whether the publication is printed in color or black 
and white. Pricing and ordering information is posted on GAO's Web 
site, 
[hyperlink, http://www.gao.gov/ordering.htm]. 

Place orders by calling (202) 512-6000, toll free (866) 801-7077, or
TDD (202) 512-2537. 

Orders may be paid for using American Express, Discover Card,
MasterCard, Visa, check, or money order. Call for additional 
information. 

To Report Fraud, Waste, and Abuse in Federal Programs: 

Contact: 

Web site: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm]: 
E-mail: fraudnet@gao.gov: 
Automated answering system: (800) 424-5454 or (202) 512-7470: 

Congressional Relations: 

Ralph Dawn, Managing Director, dawnr@gao.gov: 
(202) 512-4400: 
U.S. Government Accountability Office: 
441 G Street NW, Room 7125: 
Washington, D.C. 20548: 

Public Affairs: 

Chuck Young, Managing Director, youngc1@gao.gov: 
(202) 512-4800: 
U.S. Government Accountability Office: 
441 G Street NW, Room 7149: 
Washington, D.C. 20548: