This is the accessible text file for GAO report number GAO-08-674T 
entitled 'Defense Acquisitions: Results of Annual Assessment of DOD 
Weapon Programs' which was released on April 29, 2008.

This text file was formatted by the U.S. Government Accountability 
Office (GAO) to be accessible to users with visual impairments, as part 
of a longer term project to improve GAO products' accessibility. Every 
attempt has been made to maintain the structural and data integrity of 
the original printed product. Accessibility features, such as text 
descriptions of tables, consecutively numbered footnotes placed at the 
end of the file, and the text of agency comment letters, are provided 
but may not exactly duplicate the presentation or format of the printed 
version. The portable document format (PDF) file is an exact electronic 
replica of the printed version. We welcome your feedback. Please E-mail 
your comments regarding the contents or accessibility features of this 
document to Webmaster@gao.gov. 

This is a work of the U.S. government and is not subject to copyright 
protection in the United States. It may be reproduced and distributed 
in its entirety without further permission from GAO. Because this work 
may contain copyrighted images or other material, permission from the 
copyright holder may be necessary if you wish to reproduce this 
material separately. 

Testimony: 

Before the Committee on Oversight and Government Reform and the 
Subcommittee on National Security and Foreign Affairs, House of 
Representatives: 

United States Government Accountability Office: 
GAO: 

For Release on Delivery: 
Expected at 10:00 a.m. EDT: 
Tuesday, April 29, 2008: 

Defense Acquisitions: 

Results of Annual Assessment of DOD Weapon Programs: 

Statement of Michael J. Sullivan: 
Director: 
Acquisition and Sourcing Management: 

GAO-08-674T: 

GAO Highlights: 

Highlights of GAO-08-674T, a testimony before the Committee on 
Oversight and Government Reform and the Subcommittee on National 
Security and Foreign Affairs, House of Representatives. 

Why GAO Did This Study: 

DOD’s investment in weapon systems represents one of the largest 
discretionary items in the budget. The department expects to invest 
about $900 billion (fiscal year 2008 dollars) over the next 5 years on 
development and procurement with more than $335 billion invested 
specifically in major defense acquisition programs. Every dollar spent 
inefficiently in acquiring weapon systems is less money available for 
other budget priorities—such as the global war on terror and growing 
entitlement programs. 

This testimony focuses on (1) the overall performance of DOD’s weapon 
system investment portfolio; (2) our assessment of 72 weapon programs 
against best practices standards for successful product developments; 
and (3) potential solutions and recent DOD actions to improve weapon 
program outcomes. It is based on GAO-08-467SP, which included our 
analysis of broad trends in the performance of the programs in DOD’s 
weapon acquisition portfolio and our assessment of 72 defense programs, 
and recommendations made in past GAO reports. 

DOD was provided a draft of GAO-08-467SP and had no comments on the 
overall report, but did provide technical comments on the individual 
program assessments, which were incorporated as appropriate. 

What GAO Found: 

We recently released our sixth annual assessment of selected DOD weapon 
programs. The assessment indicates that cost and schedule outcomes for 
major weapon programs are not improving. Although well-conceived 
acquisition policy changes occurred in 2003 that reflect many best 
practices we have reported on in the past, these policy changes have 
not yet translated into practice at the program level. 

Table: Analysis of DOD Major Defense Acquisition Program Portfolios 
(fiscal year [FY] 2008 dollars): 

Portfolio size: Number of programs; 
FY 2000 Portfolio: 75; 
FY 2005 Portfolio: 91; 
FY 2007 Portfolio: 95. 

Portfolio size: Total planned commitments; 
FY 2000 Portfolio: $790 Billion; 
FY 2005 Portfolio: $1.5 Trillion; 
FY 2007 Portfolio: $1.6 Trillion. 

Portfolio size: Commitments outstanding; 
FY 2000 Portfolio: $380 Billion; 
FY 2005 Portfolio: $887 Billion; 
FY 2007 Portfolio: $858 Billion. 
 
Portfolio performance: Change in total acquisition cost from first 
estimate; 
FY 2000 Portfolio: 6 percent; 
FY 2005 Portfolio: 18 percent; 
FY 2007 Portfolio: 26 percent. 

Portfolio performance: Estimated total acquisition cost growth; 
FY 2000 Portfolio: $42 Billion; 
FY 2005 Portfolio: $202 Billion; 
FY 2007 Portfolio: $295 Billion. 

Portfolio performance: Share of programs with 25 percent or more 
increase in program acquisition unit cost; 
FY 2000 Portfolio: 37 percent; 
FY 2005 Portfolio: 44 percent; 
FY 2007 Portfolio: 44 percent. 

Portfolio performance: Average schedule delay in delivering initial 
capabilities; 
FY 2000 Portfolio: 16 months; 
FY 2005 Portfolio: 17 months; 
FY 2007 Portfolio: 21 months. 

Source: GAO analysis of DOD data. 

[End of table] 

None of the weapon programs we assessed this year had proceeded through 
system development meeting the best practices standards for mature 
technologies, stable design, and mature production processes—all 
prerequisites for achieving planned cost, schedule, and performance 
outcomes. In addition, only a small percentage of programs used two key 
systems engineering tools—preliminary design reviews and prototypes—to 
demonstrate the maturity of the product’s design by critical junctures. 
This lack of disciplined systems engineering affects DOD’s ability to 
develop sound, executable business cases for programs. 

Our work shows that acquisition problems will likely persist until DOD 
provides a better foundation for buying the right things, the right 
way. This involves making tough decisions as to which programs should 
be pursued, and more importantly, not pursued; making sure programs are 
executable; locking in requirements before programs are ever started; 
and making it clear who is responsible for what and holding people 
accountable when responsibilities are not fulfilled. Moreover, the 
environment and incentives that lead DOD and the military services to 
overpromise on capability and underestimate costs in order to sell new 
programs and capture funding will need to change. Based in part on GAO 
recommendations and congressional direction, DOD has begun several 
initiatives that, if adopted and implemented properly, could provide a 
foundation for establishing sound, knowledge-based business cases for 
individual acquisition programs and improving outcomes. 

To view the full product, including the scope and methodology, click on 
[hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-08-674T]. For more 
information, contact Michael J. Sullivan at (202) 512-4841 or 
sullivanm@gao.gov. 

[End of section] 

Mr. Chairmen and Members of the Committee and Subcommittee: 

I am pleased to be here today to discuss the Department of Defense's 
(DOD) management of its weapon system acquisitions--an area that has 
been part of GAO's high risk list since 1990. We have recently released 
our sixth annual assessment of selected DOD weapon programs. The 
assessment indicates that cost and schedule outcomes for DOD's major 
weapon system programs are not improving. 

Continuing poor acquisition outcomes have implications for DOD and the 
government as a whole. DOD's investment in weapon systems represents 
one of the largest discretionary items in the budget. While overall 
discretionary funding is declining, DOD's budget continues to demand a 
larger portion of what is available, thereby leaving a smaller 
percentage for other activities. Investment in weapon acquisition 
programs is now at its highest level in two decades. The department 
expects to invest about $900 billion (fiscal year 2008 dollars) over 
the next 5 years on development and procurement with more than $335 
billion invested specifically in major defense acquisition programs. 
Every dollar spent inefficiently in acquiring weapon systems is less 
money available for other budget priorities--such as the global war on 
terror and growing entitlement programs. 

My statement today focuses on (1) the overall performance of DOD's 
weapon system investment portfolio; (2) our assessment of 72 weapon 
programs against best practices standards for successful product 
developments; and (3) potential solutions and recent DOD actions to 
improve weapon program outcomes. It is drawn mostly from our annual 
assessment of selected DOD weapon programs, as well as recommendations 
made in past GAO reports. Our assessment provided information on 72 
individual weapon programs and analyzed overall trends in DOD 
acquisition outcomes. The programs assessed--most of which are 
considered major acquisitions by DOD--were selected using several 
factors: high dollar value, acquisition stage, and congressional 
interest.[Footnote 1] We conducted this performance audit from June 
2007 to March 2008 in accordance with generally accepted government 
auditing standards. Those standards require that we plan and perform 
the audit to obtain sufficient, appropriate evidence to provide a 
reasonable basis for our findings and conclusions based on our audit 
objectives. We believe that the evidence obtained provides a reasonable 
basis for our findings and conclusions based on our audit objectives. 

Summary: 

Since fiscal year 2000, DOD significantly increased the number of major 
defense acquisition programs and its overall investment in them. During 
this same time period, acquisition outcomes have not improved. Based on 
our analysis, total acquisition costs for the fiscal year 2007 
portfolio of major defense acquisition programs increased 26 percent 
and development costs increased by 40 percent from first estimates-- 
both of which are higher than the corresponding increases in DOD's 
fiscal year 2000 portfolio. In most cases, programs also failed to 
deliver capabilities when promised--often forcing warfighters to spend 
additional funds on maintaining legacy systems. Our analysis shows that 
current programs are experiencing, on average, a 21-month delay in 
delivering initial capabilities to the warfighter, a 5-month increase 
over fiscal year 2000 programs. 

At the program level, none of the weapon programs we assessed had 
proceeded through system development meeting the best practices 
standards for mature technologies, stable design, and mature production 
processes--all prerequisites for achieving planned cost, schedule, and 
performance outcomes.[Footnote 2] In addition, only a small percentage 
of programs used two key systems engineering tools--preliminary design 
reviews and prototypes--to demonstrate the maturity of the product's 
design by critical junctures. This lack of disciplined systems 
engineering, especially prior to starting system development, affects 
DOD's ability to develop sound business cases for programs and can 
contribute to contract cost increases and long development cycle times. 
In addition, we found four factors that have the potential to impact 
acquisition outcomes on individual programs: (1) unsettled requirements 
in acquisition programs can create significant turbulence including 
increased cost growth; (2) frequent program manager turnover during 
system development challenges continuity and accountability; (3) 
extensive reliance on contractors to perform roles that have in the 
past been performed by government employees raises questions about 
whether DOD has the appropriate mix of staff and capabilities within 
its workforce to effectively manage programs; and (4) difficulty 
managing software, as evidenced by changes to the amount of software 
that needs to be developed, indicates the potential for cost and 
schedule problems. 

There is some reason for optimism. Based in part on GAO recommendations 
and congressional direction, DOD has begun to develop several 
initiatives that, if adopted and implemented properly, could provide a 
foundation for establishing sound, knowledge-based business cases for 
individual acquisition programs and improving program outcomes. For 
example, a new concept decision review initiative, acquisition 
approaches based on capability need dates, a move to require more 
prototyping early in programs, and the establishment of review boards 
to monitor weapon system configuration changes are all designed to 
enable key department leaders to make informed decisions well ahead of 
a program's start. If implemented properly, these initiatives can help 
establish a more balanced mix of programs in which to invest, establish 
manageable business cases for individual programs, and empower and hold 
accountable program managers to deliver weapons less expensively and on 
time. However, improving acquisition outcomes will also require a 
change in the environment and incentives that lead DOD and the military 
services to overpromise capabilities and underestimate costs in order 
to sell new programs and capture the funding needed to start and 
sustain them. 

DOD's Major Acquisition Programs Continue to Experience Significant 
Cost Growth and Schedule Delays: 

DOD is not receiving expected returns on its large investment in weapon 
systems. While it is committing substantially more investment dollars 
to develop and procure new weapon systems, our analysis shows that the 
2007 portfolio of major defense acquisition programs is experiencing 
greater cost growth and schedule delays than programs in fiscal years 
2000 and 2005.[Footnote 3] For example, as shown in table 1, total 
acquisition costs for 2007 programs have increased 26 percent from 
first estimates, whereas programs in fiscal year 2000 had increased by 
6 percent. Total research, development, test, and evaluation (RDT&E) 
costs for programs in 2007 have increased by 40 percent from first 
estimates, compared to 27 percent for programs in 
2000. The story is no better when expressed in unit costs. Based on our 
analysis for the 2007 portfolio, 44 percent of DOD's major defense 
acquisition programs are paying at least 25 percent more per unit than 
originally expected. The percentage of programs experiencing a 25 
percent or more increase in program acquisition unit costs in fiscal 
year 2000 was 37 percent. 

Table 1: Analysis of DOD Major Defense Acquisition Program Portfolios 
(fiscal year [FY] 2008 dollars): 

Portfolio size: Number of programs; 
FY 2000 Portfolio: 75; 
FY 2005 Portfolio: 91; 
FY 2007 Portfolio: 95. 

Portfolio size: Total planned commitments; 
FY 2000 Portfolio: $790 Billion; 
FY 2005 Portfolio: $1.5 Trillion; 
FY 2007 Portfolio: $1.6 Trillion. 

Portfolio size: Commitments outstanding; 
FY 2000 Portfolio: $380 Billion; 
FY 2005 Portfolio: $887 Billion; 
FY 2007 Portfolio: $858 Billion. 
 
Portfolio performance: Change in total acquisition cost from first 
estimate; 
FY 2000 Portfolio: 6 percent; 
FY 2005 Portfolio: 18 percent; 
FY 2007 Portfolio: 26 percent. 

Portfolio performance: Estimated total acquisition cost growth; 
FY 2000 Portfolio: $42 Billion; 
FY 2005 Portfolio: $202 Billion; 
FY 2007 Portfolio: $295 Billion. 

Portfolio performance: Share of programs with 25 percent or more 
increase in program acquisition unit cost; 
FY 2000 Portfolio: 37 percent; 
FY 2005 Portfolio: 44 percent; 
FY 2007 Portfolio: 44 percent. 

Portfolio performance: Average schedule delay in delivering initial 
capabilities; 
FY 2000 Portfolio: 16 months; 
FY 2005 Portfolio: 17 months; 
FY 2007 Portfolio: 21 months. 

Source: GAO analysis of DOD data. 

Note: Data were obtained from DOD's Selected Acquisition Reports (dated 
December 1999, 2004, and 2006) or, in a few cases, data were obtained 
directly from program offices. Number of programs reflects the programs 
with Selected Acquisition Reports. In our analysis we have broken a few 
Selected Acquisition Report programs (such as Missile Defense Agency 
systems) into smaller elements or programs. Not all programs had 
comparative cost and schedule data, and these programs were excluded 
from the analysis where appropriate. Also, data do not include full 
costs of developing Missile Defense Agency systems. 

[End of table] 

The consequence of cost growth is reduced buying power, which can 
represent significant opportunity costs for DOD. In other words, every 
dollar spent on inefficiencies in acquiring one weapon system is less 
money available for other priorities and programs. Total acquisition 
cost for the current portfolio of major programs under development or 
in production has grown by nearly $300 billion over initial estimates. 
As program costs increase, DOD must request more funding to cover the 
overruns, make trade-offs with existing programs, delay the start of 
new programs, or take funds from other accounts. 

Just as importantly, DOD has already missed fielding dates for many 
programs, and many others are behind schedule. Because of program 
delays, warfighters often have to operate costly legacy systems longer 
than expected, find alternatives to fill capability gaps, or go without 
the capability. The warfighter's urgent need for the new weapon system 
is often cited when the case is first made for developing and producing 
the system. However, on average, the current portfolio of programs has 
experienced a 21-month delay in delivering initial operational 
capability to the warfighter and, in fact, 14 percent are more than 4 
years late. 

DOD Weapon System Programs Continue to Move Forward Without Proper 
Knowledge about Requirements, Technology, Design, and Manufacturing 
Processes: 

In assessing the 72 weapon programs, we found no evidence of widespread 
adoption of a knowledge-based acquisition process within DOD, despite 
policies that call for one. Reconciling this discrepancy between policy 
and practice is essential for getting better outcomes for DOD programs. 
The majority of programs in our assessment this year proceeded with 
lower levels of knowledge at critical junctures and attained key 
elements of product knowledge later in development than expected under 
best practices (see fig. 1). This exposes programs to significant and 
unnecessary technology, design, and production risks, and ultimately 
leads to cost growth and schedule delays. The building of knowledge 
over a product's development is cumulative, as one knowledge point 
builds on the next, and failure to capture key product knowledge can 
lead to problems that eventually cascade and become magnified 
throughout product development and production. 

Figure 1: Knowledge Achievement for Weapon System Programs in 2008 
Assessment at Key Junctures: 

[See PDF for image] 

This figure is a chart depicting the Knowledge Achievement for Weapon 
System Programs in 2008 Assessment at Key Junctures, as follows: 

Key junctures: Development start: 
Best practices: Knowledge Point 1: 
* Mature all critical technologies. 
DOD outcomes[A]: 12 percent of programs. 

Key junctures: Design review: 
Best practices: Knowledge Point 2: 
* Achieve knowledge point 1 on time and complete 90 percent of 
engineering drawings. 
DOD outcomes[A]: 4 percent of programs. 

Key junctures: Production start: 
Best practices: Knowledge Point 3: 
* Achieve knowledge points 1 and 2 on time, and have all critical 
processes under statistical control. 
DOD outcomes[A]: 0 percent of programs[B]. 

Source: GAO analysis of DOD data. 

[A] Not all programs provided information for each knowledge point or 
had passed through all three key junctures. 

[B] Two programs in our assessment, the Light Utility Helicopter and 
the Joint Cargo Aircraft, are depicted as meeting all three knowledge 
points because they began at production start. We excluded these two 
programs from our analysis because they were based on commercially 
available products and we did not assess their knowledge attainment 
with our best practices metrics. 

[End of figure] 

Programs Begin without Matching Product Requirements with Available 
Resources: 

Very few of the programs we assessed started system development with 
evidence that the proposed solution was based on mature technologies 
and proven design features. As a result, programs are still working to 
mature technologies during system development and production, which 
causes significantly higher cost growth than programs that start 
development with mature technologies. Only 12 percent of the programs 
in our assessment demonstrated all of their critical technologies as 
fully mature at the start of system development, and they have had much 
better outcomes than the others. For those programs in our assessment 
with immature technologies at development start, total RDT&E costs grew 
by 44 percent more than for programs that began with mature 
technologies. More often than not, programs were still maturing 
technologies late into development and even into production. 

In addition to ensuring that technologies are mature, best practices 
for product development suggest that the developer should have 
delivered a preliminary design of the proposed weapon system based on a 
robust systems engineering process before committing to system 
development. This process should allow the developer--the contractor 
responsible for designing the weapon system--to analyze the customer's 
expectations for the product and identify gaps between resources and 
those expectations, which then can be addressed through additional 
investments, alternate designs, and ultimately trade-offs. Only 10 
percent of the programs in our assessment had completed their 
preliminary design review prior to committing to system development. 
The other 90 percent averaged about 2 1/2 years into system development 
before the review was completed or planned to be completed. Programs 
like the Aerial Common Sensor and Joint Strike Fighter did not deliver 
a sound preliminary design at system development start and discovered 
problems early in their design activities that required the addition of 
substantial resources or, in the case of the Aerial Common Sensor, led 
to termination of the system development contract. 

Programs Continue to Move into System Demonstration and Production 
without Achieving Design Stability: 

Knowing that a product's design is stable before system demonstration 
reduces the risk of costly design changes occurring during the 
manufacturing of production representative prototypes--when investments 
in acquisitions become much more significant. Only a small portion of 
the programs in our assessment that have held a design review captured 
the necessary knowledge to ensure that they had mature technologies at 
system development start and a stable system design before entering the 
more costly system demonstration phase of development. Over half of the 
programs in our assessment did not even have mature technologies at the 
design review (knowledge that actually should have been achieved before 
system development start). Also, less than one-quarter of the programs 
that provided data on drawings released at the design review reached 
the best practices standard of 90 percent. We have found that programs 
moving forward into system demonstration with low levels of design 
stability are more likely than other programs to encounter costly 
design changes and parts shortages that, in turn, cause labor 
inefficiencies, schedule delays, and quality problems. Even by the 
beginning of production, more than a third of the programs that had 
entered this phase still had not released 90 percent of their 
engineering drawings. 

In addition, we found that over 80 percent of the programs providing 
data had not demonstrated, and did not plan to demonstrate, by the 
design review the successful integration of the product's key 
subsystems and components through an integration laboratory or, better 
yet, through testing of an early system prototype. For example, the 
Navy's E-2D Advanced 
Hawkeye moved past the design review and entered systems demonstration 
without fully proving--through the use of an integration lab or 
prototype--that the design could be successfully integrated. The 
program did not have all the components operational in a systems 
integration lab until almost 2 years after the design review. While the 
program estimated it had released 90 percent of the drawings needed for 
the system by the design review, as it was conducting system 
integration activities, it discovered that it needed substantially more 
drawings. This increase means that the program really had completed 
only 53 percent of the drawings prior to the review, making it 
difficult to ensure the design was stable. 

Programs Enter Production without Demonstrating Acceptable 
Manufacturing Processes and Weapon System Performance: 

In addition to lacking mature technologies and design stability, most 
programs have not captured, and do not plan to capture, critical 
manufacturing and testing knowledge before entering production. This 
knowledge ensures 
that the product will work as intended and can be manufactured 
efficiently to meet cost, schedule, and quality targets. Of the 26 
programs in our assessment that have had production decisions, none 
provided data showing that they had all their critical manufacturing 
processes in statistical control by the time they entered into the 
production phase.[Footnote 4] In fact, only 3 of these programs 
indicated that they had even identified the key product characteristics 
or associated critical manufacturing processes--key initial steps 
toward ensuring that critical production elements are stable and in 
control. 
Failing to capture key manufacturing knowledge before producing the 
product can lead to inefficiencies and quality problems. For example, 
the Wideband Global SATCOM program encountered cost increases and 
schedule delays 
because contractor personnel installed fasteners incorrectly. Discovery 
of the problem resulted in extensive inspection and rework to correct 
the deficiencies, contributing to a 15-month schedule delay. 

In addition to demonstrating that the product can be built efficiently, 
our work has shown that production and post-production costs are 
minimized when a fully integrated, capable prototype is demonstrated to 
show it will work as intended and in a reliable manner. We found that 
many programs are susceptible to discovering costly problems late in 
development, when the more complex software and advanced capabilities 
are tested. Of the 33 programs that provided us data about the overlap 
between system development and production, almost three-quarters still 
had or planned to have system demonstration activities left to complete 
after production had begun. For 9 programs, the amount of system 
development work remaining was estimated to be over 4 years. This 
practice of beginning production before successfully demonstrating that 
the weapon system will work as intended increases the potential for 
discovering costly design changes that ripple through production into 
products already fielded. 

Forty programs we assessed provided us information on when they had 
tested, or planned to test, a fully configured, integrated production 
representative article (i.e., prototype) in the intended environment. 
Of these, 62 percent reported that they did not conduct or do not plan 
to conduct that test before a production decision. We also found 
examples where product reliability is not being demonstrated in a 
timely fashion. Making design changes to achieve reliability 
requirements after production begins is inefficient and costly. For 
example, despite being more than 5 years past the production decision, 
the Air Force's Joint Air-to-Surface Standoff Missile experienced four 
failures during four flight tests in 2007, resulting in an overall 
missile reliability rate of less than 60 percent. The failures halted 
procurement of new missiles by the Air Force until the problems could 
be resolved. 

Absence of Disciplined Systems Engineering Practices Leads to 
Unexecutable Business Cases: 

DOD's poor acquisition outcomes stem from the absence of knowledge that 
disciplined systems engineering practices can bring to decision makers 
prior to beginning a program. Systems engineering is a process that 
translates customer needs into specific product requirements for which 
requisite technological, software, engineering, and production 
capabilities can be identified. These activities include requirements 
analysis, design, and testing in order to ensure that the product's 
requirements are achievable given available resources. Early systems 
engineering provides knowledge that enables a developer to identify and 
resolve gaps before product development begins. Consequently, 
establishing a sound acquisition program with an executable business 
case depends on determining, through systems engineering, achievable 
requirements that are agreed to by both the acquirer and the developer 
before a program's initiation. We have recently reported on the impact 
that poor systems engineering practices have had on several programs 
such as the Global Hawk Unmanned Aircraft System, F-22A, Expeditionary 
Fighting Vehicle, Joint Air-to-Surface Standoff Missile, and others. 
[Footnote 5] 

When early systems engineering, specifically requirements analysis, is 
not performed, the result can be increased cost risk to the government 
and long development cycle times. DOD awards cost 
reimbursement type contracts for the development of major weapon 
systems because of the risk and uncertainty involved with its 
programs.[Footnote 6] Because the government often does not perform the 
necessary systems engineering analysis before a contract is signed to 
determine whether a match exists between requirements and available 
resources, significant contract cost increases can occur as the scope 
of the requirements changes or becomes better understood by the 
government and contractor. Another potential consequence of the lack of 
requirements analysis is unpredictable cycle times. Requirements that 
are limited and well-understood contribute to shorter, more predictable 
cycle times. Long cycle times promote instability, especially 
considering DOD's tendency to have changing requirements and program 
manager turnover. On the other hand, time-defined developments can 
allow for more frequent assimilation of new technologies into weapon 
systems and speed new capabilities to the warfighter. In fact, DOD 
itself suggests that system development should be limited to about 5 
years. 

Additional Factors Can Contribute to Poor Weapon Program Outcomes: 

This year, we gathered new data focused on other factors we believe 
could have a significant influence on DOD's ability to improve cost and 
schedule outcomes. These factors were changes to requirements after 
development began, the length of program managers' tenure, reliance on 
contractors for program support, and difficulty managing software 
development. 

Foremost, several DOD programs in our assessment incurred requirements 
changes after the start of system development and experienced cost 
increases. Among the 46 programs we surveyed, RDT&E costs increased by 
11 percent over initial estimates for programs that had no requirements 
changes, while they increased 72 percent for those that had 
requirements changes (see fig. 2).[Footnote 7] 

Figure 2: Average RDT&E Cost Growth for Programs since Initial 
Estimates: 

[See PDF for image] 

This figure is a vertical bar graph depicting the following data: 

Average RDT&E Cost Growth for Programs since Initial Estimates: 
Programs without requirements changes: 11%; 
Programs with requirements changes: 72%. 

Source: GAO analysis of DOD data. 

[End of figure] 

At the same time, DOD's practice of frequently changing program 
managers during a program's development makes it difficult to hold them 
accountable for the business cases that they are entrusted to manage 
and deliver. Our analysis indicates that for 39 major acquisition 
programs started since March 2001, the average time in system 
development was about 37 months. The average tenure for program 
managers on those programs during that time was about 17 months--less 
than half of what is required by DOD policy. 

We also found that DOD is relying more on contractors to support the 
management and oversight of weapon system acquisitions and contracts. 
For 52 DOD programs that provided information, about 48 percent of the 
program office staff was composed of individuals outside of DOD (see 
table 2). In a prior review of space acquisition programs, we found 
that 8 of 13 cost-estimating organizations and program offices believed 
they had too few cost estimators, and that 10 of those offices had more 
contractor personnel than government personnel preparing cost 
estimates. We also found examples during this year's assessment where 
program offices expressed concern about having inadequate personnel to 
carry out their program office roles. 

Table 2: Program Office Staffing Composition for 52 DOD Programs 
(Percentage of staff): 

Government; 
Program management: 70; 
Administrative support: 39; 
Business functions: 64; 
Engineering and technical: 48; 
Other: 45; 
Total: 52. 

Support contractors; 
Program management: 22; 
Administrative support: 60; 
Business functions: 35; 
Engineering and technical: 34; 
Other: 55; 
Total: 36. 

Other non-government[A]; 
Program management: 8; 
Administrative support: 1; 
Business functions: 1; 
Engineering and technical: 18; 
Other: 1; 
Total: 12. 

Total non-government; 
Program management: 30; 
Administrative support: 61; 
Business functions: 36; 
Engineering and technical: 52; 
Other: 56; 
Total: 48. 

Source: GAO analysis of DOD data. 

Note: Percentages may not sum to 100 due to rounding. 

[A] Other includes federally funded research and development centers, 
universities, and affiliates. 

[End of table] 

Finally, as programs rely more heavily on software to perform critical 
functions for weapon systems, we found that a large number of programs 
are encountering difficulties in managing their software development. 
Roughly half of the programs that provided us software data 
experienced at least 25 percent growth in their expected lines of 
code--a key metric used by leading software developers--since system 
development started. 
For example, software requirements were not well understood on the 
Future Combat Systems when the program began; and as the program moves 
toward preliminary design activities, the number of lines of software 
code has nearly tripled. Changes to the lines of code needed can 
indicate potential cost and schedule problems. 
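The code-growth metric described above amounts to a simple ratio check against the initial estimate. The sketch below illustrates the calculation; the program names and line counts are hypothetical, not data from actual DOD programs.

```python
# Flag programs whose expected source lines of code (SLOC) have grown by
# 25 percent or more since system development started. All figures are
# illustrative examples.

def sloc_growth(initial_sloc, current_sloc):
    """Fractional growth in expected lines of code since development start."""
    return (current_sloc - initial_sloc) / initial_sloc

estimates = {
    "Program A": (1_000_000, 1_200_000),  # modest growth
    "Program B": (2_000_000, 5_800_000),  # nearly tripled, as with FCS in the text
}

for name, (initial, current) in estimates.items():
    growth = sloc_growth(initial, current)
    flagged = growth >= 0.25  # the 25 percent threshold noted above
    print(f"{name}: {growth:.0%} growth, flagged: {flagged}")
# Program A: 20% growth, flagged: False
# Program B: 190% growth, flagged: True
```

A flagged program would warrant a closer look at its cost estimate and schedule, since code growth of this magnitude often signals requirements that were not well understood at development start.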

The Way Forward: Potential Solutions: 

Our work shows that acquisition problems will likely persist until DOD 
provides a better foundation for buying the right things, the right 
way. This involves (1) maintaining the right mix of programs to invest 
in by making better decisions as to which programs should be pursued 
given existing and expected funding and, more importantly, deciding 
which programs should not be pursued; (2) ensuring that programs that 
are started are executable by matching requirements with resources and 
locking in those requirements; and (3) making it clear that programs 
will then be executed based on knowledge and holding program managers 
responsible for that execution. We have made similar recommendations in 
past GAO reports. 

These changes will not be easy to make. They will require DOD to 
reexamine not only its acquisition process, but its requirement setting 
and funding processes as well. They will also require DOD to change how 
it views program success, and what is necessary to achieve success. 
This includes changing the environment and incentives that lead DOD and 
the military services to overpromise on capability and underestimate 
costs in order to sell new programs and capture the funding needed to 
start and sustain them. Finally, none of this will be achieved without 
a true partnership among the department, the military services, the 
Congress, and the defense industry. All of us must embrace the idea of 
change and work diligently to implement it. 

Buy the Right Things: Develop and Implement an Investment Strategy: 

The first, and most important, step toward improving acquisition 
outcomes is implementing a new DOD-wide investment strategy for weapon 
systems. We have reported that DOD should develop an overarching 
strategy and decision-making processes that prioritize programs based 
on a balanced match between customer needs and available department 
resources--that is, the dollars, technologies, time, and people needed 
to achieve these capabilities. We also recommended that capabilities 
not designated as priorities be set out separately as desirable but 
left unfunded unless resources are both available and sustainable. 
This means that the decision makers responsible for weapon system 
requirements, funding, and acquisition execution must establish an 
investment strategy in concert. 

DOD's Under Secretary of Defense for Acquisition, Technology and 
Logistics--DOD's corporate leader for acquisition--should develop this 
strategy in concert with other senior leaders, for example, combatant 
commanders who would provide input on user needs; DOD's comptroller and 
science and technology leaders, who would provide input on available 
resources; and acquisition executives from the military services, who 
could propose solutions. Finally, once priority decisions are made, 
Congress will need to enforce discipline through its legislative and 
oversight mechanisms. 

Table 3: Key Actions for Developing an Investment Strategy for 
Acquiring New Systems: 

Who: Under Secretary of Defense for Acquisition, Technology and 
Logistics in concert with other senior officials; 
Actions: 
* Analyze customer needs vs. wants based on available technology and 
available resources; 
* Compare analysis to DOD's long-term vision; 
* Determine priorities for acquisitions based on this comparison; 
* Separate other programs as "desirable," resources permitting; 
* Enforce funding for priorities annually; measure success against the 
plan. 

Source: GAO. 

[End of table] 

Buy the Right Way: Ensure Individual Programs Are Executable: 

Once DOD has prioritized capabilities, it should work vigorously to 
make sure each new program is executable before the acquisition begins. 
More specifically, this means assuring requirements for specific weapon 
systems are clearly defined and achievable given available resources 
and that all alternatives have been considered. System requirements 
should be agreed to by service acquisition executives as well as 
combatant commanders. Once programs begin, requirements should not 
change without assessing their potential disruption to the program and 
assuring that they can be accommodated within time and funding 
constraints. In addition, DOD should prove that technologies can work 
as intended before including them in acquisition programs. More 
ambitious technology development efforts should be assigned to the 
science and technology community until they are ready to be added to 
future generations of the product. DOD should also require the use of 
independent cost estimates as a basis for budgeting funds. Our work 
over the past 10 years has consistently shown that when these basic 
steps are taken, programs are better positioned to be executed within 
cost and schedule. 

To keep programs executable, DOD should demand that all go/no-go 
decisions be based on quantifiable data and demonstrated knowledge. 
These data should cover critical program facets such as cost, schedule, 
technology readiness, design readiness, production readiness, and 
relationships with suppliers. Development should not be allowed to 
proceed until certain knowledge thresholds are met--for example, a high 
percentage of engineering drawings completed at critical design review. 
DOD's current policies encourage these sorts of metrics to be used as a 
basis for decision making, but they do not demand it. DOD should also 
place boundaries on the time allowed for system development. 

Table 4: Key Actions for Making Sure Programs Are Executable: 

Who: Military services and joint developers with support from USD AT&L; 
Actions: 
* Keep technology discovery/invention out of acquisition programs; 
* Follow an incremental path toward meeting user needs; assure all 
alternatives are considered; 
* Ensure system requirements are agreed to by service acquisition 
executives and warfighters and that no additional requirements are 
added during execution; 
* Use systems engineering to close gaps between requirements and 
resources prior to launching the development process; 
* Require the use of independent cost estimates as a basis for 
budgeting funds; update cost estimates annually and track against the 
original baseline estimate; 
* Encourage the use of earned value data at each systems engineering 
technical review in order to track program progress against original 
baseline estimates; 
* Use quantifiable data and demonstrable knowledge to make decisions to 
move to next phases; 
* Employ additional management reviews when deviations of cost or 
schedule exceed 10 percent against baseline estimates; 
* Place boundaries on time allowed for specific phases of development. 

Source: GAO. 

[End of table] 
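The earned-value and deviation-threshold actions in Table 4 can be sketched as a simple calculation. The dollar figures below are hypothetical and the function name is ours; the 10 percent trigger mirrors the review threshold suggested in the table.

```python
# Illustrative earned-value check of a program against its original
# baseline estimate. All values are hypothetical examples.

def evm_status(planned_value, earned_value, actual_cost, threshold=0.10):
    """Return earned-value indices and whether deviations exceed the threshold."""
    cpi = earned_value / actual_cost      # cost performance index (>1 is under cost)
    spi = earned_value / planned_value    # schedule performance index (>1 is ahead)
    cost_deviation = (actual_cost - earned_value) / earned_value
    schedule_deviation = (planned_value - earned_value) / planned_value
    # Table 4 suggests additional management reviews past a 10 percent deviation.
    needs_review = cost_deviation > threshold or schedule_deviation > threshold
    return cpi, spi, needs_review

cpi, spi, review = evm_status(planned_value=100.0, earned_value=80.0, actual_cost=95.0)
print(f"CPI={cpi:.2f} SPI={spi:.2f} additional review: {review}")
# prints: CPI=0.84 SPI=0.80 additional review: True
```

Running this check at each systems engineering technical review, as the table suggests, would surface cost and schedule drift against the original baseline well before a formal breach.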

To further ensure that programs are executable, DOD should pursue an 
evolutionary path toward meeting user needs rather than attempting to 
satisfy all needs in a single step. This approach has been consistently 
used by successful commercial companies we have visited over the past 
decade because it provides program managers with more achievable 
requirements, which, in turn, facilitate shorter cycle times. With 
shorter cycle times, the companies we have studied have also been able 
to keep program managers and senior leaders with a program for its 
entire duration. 

DOD has policies that encourage evolutionary development, but programs 
often favor pursuing more revolutionary, exotic solutions that will 
attract funds and support. The department and, more importantly, the 
military services tend to view success as capturing the funding needed 
to start and sustain a development program. In order to do this, they 
must overpromise capability and underestimate cost. In order for DOD to 
move forward, this view of success must change. World-class commercial 
firms identify success as developing products within cost estimates and 
delivering them on time in order to survive in the marketplace. This 
forces incremental, knowledge-based product development programs that 
improve capability as new technologies are matured. 

Hold People Accountable: 

To strengthen accountability, DOD must also clearly delineate 
responsibilities among those who have a role in deciding what to buy as 
well as those who have role in executing, revising, and terminating 
programs. Within this context, rewards and incentives must be altered 
so that success can be viewed as delivering needed capability at the 
right price and the right time, rather than attracting and retaining 
support for numerous new and ongoing programs. 

To enable accountability to be exercised at the program level once a 
program begins, DOD will need to (1) match program manager tenure with 
development or the delivery of a product; (2) tailor career paths and 
performance management systems to incentivize longer tenures; (3) 
strengthen training and career paths as needed to ensure program 
managers have the right qualifications to run the programs they are 
assigned to; (4) empower program managers to execute their programs, 
including an examination of whether and how much additional authority 
can be provided over funding, staffing, and approving requirements 
proposed after the start of a program; and (5) develop and provide 
automated tools to enhance management and oversight as well as to 
reduce the time required to prepare status information. 

DOD also should hold contractors accountable for results. As we have 
recommended, this means structuring contracts so that incentives 
actually motivate contractors to achieve desired acquisition outcomes 
and withholding fees when those goals are not met. 

Table 5: Key Actions for Accountability: 

Who: The Secretary of Defense and military service secretaries; 
Actions: 
* Make it clear who is accountable on a program for what, including 
program managers, their leaders, stakeholders, and contractors; 
* Hold people accountable when these responsibilities are not met; 
* Require program managers and others, as appropriate, to stay with 
programs until a product is delivered or for system design and 
demonstration; 
* Empower program managers to execute their programs so that they can 
be accountable; strengthen training and career paths as needed to 
ensure that qualified program managers are being assigned; 
* Improve the use of fees in order to hold contractors accountable. 

Source: GAO. 

[End of table] 

Recent DOD Actions Provide Opportunities for Improvement: 

DOD has taken actions related to some of these steps. Based in part on 
GAO recommendations and congressional direction, DOD has recently begun 
to develop several initiatives that, if adopted and implemented 
properly, could provide a foundation for establishing sound, knowledge- 
based business cases for individual acquisition programs and improving 
program outcomes. For example, DOD is experimenting with a new concept 
decision review, different acquisition approaches according to expected 
fielding times, and panels to review weapon system configuration 
changes that could adversely affect program cost and schedule. In 
addition, in September 2007 the Office of the Under Secretary of 
Defense for Acquisition, Technology and Logistics issued a policy 
memorandum to ensure weapon acquisition programs were able to 
demonstrate key knowledge elements that could inform future development 
and budget decisions. This policy directed pending and future programs 
to include acquisition strategies and funding that provide for two or 
more competing contractors to develop technically mature prototypes 
through system development start (knowledge point 1), with the hope of 
reducing technical risk, validating designs and cost estimates, 
evaluating manufacturing processes, and refining requirements. Each of 
the initiatives is designed to enable more informed decisions by key 
department leaders well ahead of a program's start, decisions that 
provide a closer match between each program's requirements and the 
department's resources. 

DOD also plans to implement new practices similar to past GAO 
recommendations that are intended to provide program managers more 
incentives, support, and stability. The department acknowledges that 
any actions taken to improve accountability must be based on a 
foundation whereby program managers can launch and manage programs 
toward greater performance, rather than focusing on maintaining support 
and funding for individual programs. DOD acquisition leaders have told 
us that any improvements to program managers' performance hinge on the 
success of the department's initiatives. 

In addition, DOD has taken actions to strengthen the link between award 
and incentive fees with desired program outcomes, which has the 
potential to increase the accountability of DOD programs for fees paid 
and of contractors for results achieved. 

Concluding Observations: 

In closing, the past year has seen several new proposed approaches to 
improve the way DOD buys weapons. These approaches have come from 
within the department, from highly credible commissions established by 
the department, and from GAO. They are based on solid principles. If 
they are to produce better results, however, they must heed the lessons 
taught--but perhaps not learned--by various past studies and by DOD's 
acquisition history itself. Specifically, DOD must do a better job of 
prioritizing its needs in the context of the nation's greater fiscal 
challenges. It must become more disciplined in managing the mix of 
programs to meet available funds. If everything is a priority, nothing 
is a priority. 

Policy must also be manifested in decisions on individual programs or 
reform will be blunted. DOD's current acquisition policy is a case in 
point. The policy supports a knowledge-based, evolutionary approach to 
acquiring new weapons. However, the practice--decisions made on 
individual programs--sacrifices knowledge and realism about what can 
be done within the available time and funding in favor of revolutionary 
solutions. 

Reform will not be real unless each weapon system is shown to be both a 
worthwhile investment and a realistic, executable program based on the 
technology, time, and money available. This cannot be done until the 
acquisition environment is changed along with the incentives associated 
with it. DOD and the military services cannot continue to view success 
through the prism of securing the funding needed to start and sustain 
new programs. Success must be defined in terms of delivering 
capabilities to the warfighter when needed and as promised, and 
incentives must be aligned to encourage a disciplined, knowledge-based 
approach to achieve this end. 

The upcoming change in administration presents challenges as well as 
opportunities to improve the process and its outcomes through sustained 
implementation of best practices, as well as addressing new issues that 
may emerge. Significant changes will only be possible with greater, and 
continued, department level support, including strong and consistent 
vision, direction, and advocacy from DOD leadership, as well as 
sustained oversight and cooperation from the Congress. In addition, all 
of the players involved with acquisitions--the requirements community; 
the Joint Chiefs of Staff; the comptroller; the Under Secretary of 
Defense for Acquisition, Technology and Logistics; and perhaps most 
importantly, the military services--must be unified in implementing 
reforms from top to bottom. 

Mr. Chairmen and Members of the Committee and Subcommittee, this 
concludes my statement. I will be happy to take any questions that you 
may have at this time. 

Contacts and Staff Acknowledgements: 

For further questions about this statement, please contact Michael J. 
Sullivan at (202) 512-4841. Individuals making key contributions to 
this statement include Ron Schwenn, Assistant Director; Ridge C. 
Bowman; Quindi C. Franco; Matthew B. Lea; Brian Mullins; Kenneth E. 
Patton; and Alyssa B. Weir. 

[End of section] 

Related GAO Products: 

Best Practices: Increased Focus on Requirements and Oversight Needed to 
Improve DOD's Acquisition Environment and Weapon System Quality. 
[hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-08-294]. Washington, 
D.C.: February 1, 2008. 

Defense Acquisitions: Assessments of Selected Weapon Programs. 
[hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-07-406SP]. 
Washington, D.C.: March 30, 2007. 

Best Practices: Stronger Practices Needed to Improve DOD Technology 
Transition Processes. [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-
06-883]. Washington, D.C.: September 14, 2006. 

Best Practices: Better Support of Weapon System Program Managers Needed 
to Improve Outcomes. [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-
06-110]. Washington, D.C.: November 1, 2005. 

Defense Acquisitions: Major Weapon Systems Continue to Experience Cost 
and Schedule Problems under DOD's Revised Policy. [hyperlink, 
http://www.gao.gov/cgi-bin/getrpt?GAO-06-368]. Washington, D.C.: April 
13, 2006. 

DOD Acquisition Outcomes: A Case for Change. [hyperlink, 
http://www.gao.gov/cgi-bin/getrpt?GAO-06-257T]. Washington, D.C.: 
November 15, 2005. 

Defense Acquisitions: Stronger Management Practices Are Needed to 
Improve DOD's Software-Intensive Weapon Acquisitions. [hyperlink, 
http://www.gao.gov/cgi-bin/getrpt?GAO-04-393]. Washington, D.C.: March 
1, 2004. 

Best Practices: Setting Requirements Differently Could Reduce Weapon 
Systems' Total Ownership Costs. [hyperlink, http://www.gao.gov/cgi-
bin/getrpt?GAO-03-57]. Washington, D.C.: February 11, 2003. 

Defense Acquisitions: Factors Affecting Outcomes of Advanced Concept 
Technology Demonstration. [hyperlink, http://www.gao.gov/cgi-
bin/getrpt?GAO-03-52]. Washington, D.C.: December 2, 2002. 

Best Practices: Capturing Design and Manufacturing Knowledge Early 
Improves Acquisition Outcomes. [hyperlink, http://www.gao.gov/cgi-
bin/getrpt?GAO-02-701]. Washington, D.C.: July 15, 2002. 

Defense Acquisitions: DOD Faces Challenges in Implementing Best 
Practices. [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-02-469T]. 
Washington, D.C.: February 27, 2002. 

Best Practices: Better Matching of Needs and Resources Will Lead to 
Better Weapon System Outcomes. [hyperlink, http://www.gao.gov/cgi-
bin/getrpt?GAO-01-288]. Washington, D.C.: March 8, 2001. 

Best Practices: A More Constructive Test Approach Is Key to Better 
Weapon System Outcomes. [hyperlink, http://www.gao.gov/cgi-
bin/getrpt?GAO/NSIAD-00-199]. Washington, D.C.: July 31, 2000. 

Defense Acquisition: Employing Best Practices Can Shape Better Weapon 
System Decisions. [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO/T-
NSIAD-00-137]. Washington, D.C.: April 26, 2000. 

Best Practices: DOD Training Can Do More to Help Weapon System Programs 
Implement Best Practices. [hyperlink, http://www.gao.gov/cgi-
bin/getrpt?GAO/NSIAD-99-206]. Washington, D.C.: August 16, 1999. 

Best Practices: Better Management of Technology Development Can Improve 
Weapon System Outcomes. [hyperlink, http://www.gao.gov/cgi-
bin/getrpt?GAO/NSIAD-99-162]. Washington, D.C.: July 30, 1999. 

Defense Acquisitions: Best Commercial Practices Can Improve Program 
Outcomes. [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO/T-NSIAD-99-
116]. Washington, D.C.: March 17, 1999. 

Defense Acquisitions: Improved Program Outcomes Are Possible. 
[hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO/T-NSIAD-98-123]. 
Washington, D.C.: March 17, 1998. 

Best Practices: Successful Application to Weapon Acquisition Requires 
Changes in DOD's Environment. [hyperlink, http://www.gao.gov/cgi-
bin/getrpt?GAO/NSIAD-98-56]. Washington, D.C.: February 24, 1998. 

Best Practices: Commercial Quality Assurance Practices Offer 
Improvements for DOD. [hyperlink, http://www.gao.gov/cgi-
bin/getrpt?GAO/NSIAD-96-162]. Washington, D.C.: August 26, 1996. 

[End of section] 

Footnotes: 

[1] Major defense acquisition programs (MDAP) are those identified by 
DOD that require eventual total research, development, test, and 
evaluation (RDT&E) expenditures of more than $365 million or $2.19 
billion for procurement in fiscal year 2000 constant dollars. 

[2] Not all 72 programs in this year's assessment provided information 
for every knowledge point or had proceeded through system development. 
Details of our scope and methodology can be found in GAO-08-467SP. 

[3] Our analysis in this area reflects comparisons of performance for 
programs meeting DOD's criteria for being a major defense acquisition 
program in fiscal year 2007 and programs meeting the same criteria in 
fiscal years 2005 and 2000. The analysis does not include all the same 
systems in all 3 years. 

[4] We have excluded two programs from this calculation, Light Utility 
Helicopter and Joint Cargo Aircraft. While we have assessed these 
programs as having mature manufacturing processes, this is because they 
are commercial acquisitions, not because processes were demonstrated to 
be in statistical control. Also, the Multifunctional Information 
Distribution System (MIDS) program indicates that its two critical 
processes are in statistical control but it has not formally entered 
the production phase. 

[5] GAO, Best Practices: Increased Focus on Requirements and Oversight 
Needed to Improve DOD's Acquisition Environment and Weapon System 
Quality, [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-08-294] 
(Washington, D.C.: February 1, 2008). 

[6] In contrast, a firm-fixed-price contract provides for a 
pre-established price; it places more of the risk and responsibility 
for costs, and the resulting profit or loss, on the contractor, which 
provides more incentive for efficient and economical performance. With 
either a cost-reimbursement or firm-fixed-price contract, if the 
government changes the requirements after performance has begun, 
causing a price or cost increase to the contractor, the government must 
pay for these changes. 

[7] This average does not include the C-130 J program because of its 
extreme RDT&E cost growth. The average including C-130 J is 210 
percent. 

[End of section] 

GAO's Mission: 

The Government Accountability Office, the audit, evaluation and 
investigative arm of Congress, exists to support Congress in meeting 
its constitutional responsibilities and to help improve the performance 
and accountability of the federal government for the American people. 
GAO examines the use of public funds; evaluates federal programs and 
policies; and provides analyses, recommendations, and other assistance 
to help Congress make informed oversight, policy, and funding 
decisions. GAO's commitment to good government is reflected in its core 
values of accountability, integrity, and reliability. 

Obtaining Copies of GAO Reports and Testimony: 

The fastest and easiest way to obtain copies of GAO documents at no 
cost is through GAO's Web site [hyperlink, http://www.gao.gov]. Each 
weekday, GAO posts newly released reports, testimony, and 
correspondence on its Web site. To have GAO e-mail you a list of newly 
posted products every afternoon, go to [hyperlink, http://www.gao.gov] 
and select "E-mail Updates." 

Order by Mail or Phone: 

The first copy of each printed report is free. Additional copies are $2 
each. A check or money order should be made out to the Superintendent 
of Documents. GAO also accepts VISA and Mastercard. Orders for 100 or 
more copies mailed to a single address are discounted 25 percent. 
Orders should be sent to: 

U.S. Government Accountability Office: 
441 G Street NW, Room LM: 
Washington, D.C. 20548: 

To order by Phone: 
Voice: (202) 512-6000: 
TDD: (202) 512-2537: 
Fax: (202) 512-6061: 

To Report Fraud, Waste, and Abuse in Federal Programs: 

Contact: 

Web site: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm]: 
E-mail: fraudnet@gao.gov: 
Automated answering system: (800) 424-5454 or (202) 512-7470: 

Congressional Relations: 

Ralph Dawn, Managing Director, dawnr@gao.gov: 
(202) 512-4400: 
U.S. Government Accountability Office: 
441 G Street NW, Room 7125: 
Washington, D.C. 20548: 

Public Affairs: 

Chuck Young, Managing Director, youngc1@gao.gov: 
(202) 512-4800: 
U.S. Government Accountability Office: 
441 G Street NW, Room 7149: 
Washington, D.C. 20548: