This is the accessible text file for GAO report number GAO-10-2 
entitled 'Information Technology: Agencies Need to Improve the 
Implementation and Use of Earned Value Techniques to Help Manage Major 
System Acquisitions' which was released on November 9, 2009. 

This text file was formatted by the U.S. Government Accountability 
Office (GAO) to be accessible to users with visual impairments, as part 
of a longer term project to improve GAO products' accessibility. Every 
attempt has been made to maintain the structural and data integrity of 
the original printed product. Accessibility features, such as text 
descriptions of tables, consecutively numbered footnotes placed at the 
end of the file, and the text of agency comment letters, are provided 
but may not exactly duplicate the presentation or format of the printed 
version. The portable document format (PDF) file is an exact electronic 
replica of the printed version. We welcome your feedback. Please E-mail 
your comments regarding the contents or accessibility features of this 
document to Webmaster@gao.gov. 

This is a work of the U.S. government and is not subject to copyright 
protection in the United States. It may be reproduced and distributed 
in its entirety without further permission from GAO. Because this work 
may contain copyrighted images or other material, permission from the 
copyright holder may be necessary if you wish to reproduce this 
material separately. 

Report to the Chairman, Subcommittee on Federal Financial Management, 
Government Information, Federal Services, and International Security, 
Committee on Homeland Security and Governmental Affairs, U.S. Senate: 

United States Government Accountability Office: GAO: 

October 2009: 

Information Technology: 

Agencies Need to Improve the Implementation and Use of Earned Value 
Techniques to Help Manage Major System Acquisitions: 

GAO-10-2: 

GAO Highlights: 

Highlights of GAO-10-2, a report to the Chairman, Subcommittee on 
Federal Financial Management, Government Information, Federal Services, 
and International Security, Committee on Homeland Security and 
Governmental Affairs, U.S. Senate. 

Why GAO Did This Study: 

In fiscal year 2009, the federal government planned to spend about $71 
billion on information technology (IT) investments. To more effectively 
manage such investments, in 2005 the Office of Management and Budget 
(OMB) directed agencies to implement earned value management (EVM). EVM 
is a project management approach that, if implemented appropriately, 
provides objective reports of project status, produces early warning 
signs of impending schedule delays and cost overruns, and provides 
unbiased estimates of anticipated costs at completion. 

GAO was asked to assess selected agencies’ EVM policies, determine 
whether they are adequately using earned value techniques to manage key 
system acquisitions, and evaluate selected investments’ earned value 
data to determine their cost and schedule performances. To do so, GAO 
compared agency policies with best practices, performed case studies, 
and reviewed documentation from eight agencies and 16 major investments 
with the highest levels of IT development-related spending in fiscal 
year 2009. 

What GAO Found: 

While all eight agencies have established policies requiring the use of 
EVM on major IT investments, these policies are not fully consistent 
with best practices. In particular, most lack training requirements for 
all relevant personnel responsible for investment oversight. Most 
policies also do not have adequately defined criteria for revising 
program cost and schedule baselines. Until agencies expand and enforce 
their EVM policies, it will be difficult for them to gain the full 
benefits of EVM. 

GAO’s analysis of 16 investments shows that agencies are using EVM to 
manage their system acquisitions; however, the extent of implementation 
varies. Specifically, for 13 of the 16 investments, key practices 
necessary for sound EVM execution had not been implemented. For 
example, the project schedules for these investments contained issues—
such as the improper sequencing of key activities—that undermine the 
quality of their performance baselines. This inconsistent application 
of EVM exists in part because of the weaknesses contained in agencies’ 
policies, combined with a lack of enforcement of policies already in 
place. Until key EVM practices are fully implemented, these investments 
face an increased risk that managers will not be able to use EVM 
effectively as a management tool. 

Furthermore, earned value data trends of these investments indicate 
that most are currently experiencing shortfalls against cost and 
schedule targets. The total life-cycle costs of these programs have 
increased by about $2 billion. Based on GAO’s analysis of current 
performance trends, 11 programs will likely incur cost overruns that 
will total about $1 billion at contract completion—in particular, 2 of 
these programs account for about 80 percent of this projection. As 
such, GAO estimates the total cost overrun to be about $3 billion at 
program completion (see figure). However, with timely and effective 
management action, it is possible to reverse negative trends so that 
the projected cost overruns may be reduced. 

Figure: Cost Overruns Incurred and Projected Overruns of 16 Programs: 

$2.0 billion: Life-cycle cost overruns already incurred; $1.0 billion: 
GAO-estimated most likely cost overruns; $3.0 billion: GAO-estimated 
total cost overrun at completion. 

Source: GAO analysis of program data. 

[End of figure] 

What GAO Recommends: 

GAO is recommending that the selected agencies modify EVM policies to 
be consistent with best practices, implement EVM practices that address 
identified weaknesses, and manage negative earned value trends. Seven 
agencies that commented on a draft of this report generally agreed with 
GAO’s results and recommendations. 

View [hyperlink, http://www.gao.gov/products/GAO-10-2] or key 
components. For more information, contact David A. Powner at (202) 512-
9286 or pownerd@gao.gov. 

[End of section] 

Contents: 

Letter: 

Background: 

Agencies' EVM Policies Are Not Comprehensive: 

Agencies' Key Acquisition Programs Are Using EVM, but Are Not 
Consistently Implementing Key Practices: 

Earned Value Data Show Trends of Cost Overruns and Schedule Slippages 
on Most Programs: 

Conclusions: 

Recommendations for Executive Action: 

Agency Comments and Our Evaluation: 

Appendix I: Objectives, Scope, and Methodology: 

Appendix II: Case Studies of Selected Programs' Implementation of 
Earned Value Management: 

Appendix III: Comments from the Department of Commerce: 

Appendix IV: Comments from the Department of Defense: 

Appendix V: Comments from the Department of Justice: 

Appendix VI: Comments from the National Aeronautics and Space 
Administration: 

Appendix VII: Comments from the Department of Veterans Affairs: 

Appendix VIII: GAO Contact and Staff Acknowledgments: 

Related GAO Products: 

Tables: 

Table 1: Key Components of an Effective EVM Policy: 

Table 2: Assessment of Key Agencies' EVM Policies: 

Table 3: Eleven Key EVM Practices for System Acquisition Programs: 

Table 4: Assessment of EVM Practices for Case Study Programs: 

Table 5: Program Life-cycle Cost Estimate Changes: 

Table 6: Contractor Cumulative Cost and Schedule Performances: 

Table 7: Sixteen Case Study Programs: 

Table 8: GAO EVM Practice Assessment of Agriculture's MIDAS Program: 

Table 9: GAO EVM Practice Assessment of Commerce's DRIS Program: 

Table 10: GAO EVM Practice Assessment of Commerce's FDCA Program: 

Table 11: GAO EVM Practice Assessment of Defense's AOC Program: 

Table 12: GAO EVM Practice Assessment of Defense's JTRS-HMS Program: 

Table 13: GAO EVM Practice Assessment of Defense's WIN-T Program: 

Table 14: GAO EVM Practice Assessment of Homeland Security's ACE 
Program: 

Table 15: GAO EVM Practice Assessment of Homeland Security's Deepwater 
COP Program: 

Table 16: GAO EVM Practice Assessment of Homeland Security's WHTI 
Program: 

Table 17: GAO EVM Practice Assessment of Justice's NGI Program: 

Table 18: GAO EVM Practice Assessment of NASA's JWST Project: 

Table 19: GAO EVM Practice Assessment of NASA's Juno Project: 

Table 20: GAO EVM Practice Assessment of NASA's MSL Project: 

Table 21: GAO EVM Practice Assessment of Transportation's ERAM Program: 

Table 22: GAO EVM Practice Assessment of Transportation's SBS Program: 

Table 23: GAO EVM Practice Assessment of Veterans Affairs' VistA-FM 
Program: 

Figures: 

Figure 1: GAO EV Data Analysis of Agriculture's MIDAS Program: 

Figure 2: GAO EV Data Analysis of Commerce's DRIS Program: 

Figure 3: GAO EV Data Analysis of Commerce's FDCA Program: 

Figure 4: GAO EV Data Analysis of Defense's AOC Program: 

Figure 5: GAO EV Data Analysis of Defense's JTRS-HMS Program: 

Figure 6: GAO EV Data Analysis of Defense's WIN-T Program: 

Figure 7: GAO EV Data Analysis of Homeland Security's ACE Program: 

Figure 8: GAO EV Data Analysis of Homeland Security's Deepwater COP 
Program: 

Figure 9: GAO EV Data Analysis of Homeland Security's WHTI Program: 

Figure 10: GAO EV Data Analysis of Justice's NGI Program: 

Figure 11: GAO EV Data Analysis of NASA's JWST Project: 

Figure 12: GAO EV Data Analysis of NASA's Juno Project: 

Figure 13: GAO EV Data Analysis of NASA's MSL Project: 

Figure 14: GAO EV Data Analysis of Transportation's ERAM Program: 

Figure 15: GAO EV Data Analysis of Transportation's SBS Program: 

Figure 16: GAO EV Data Analysis of Veterans Affairs' VistA-FM Program: 

Abbreviations: 

ACE: Automated Commercial Environment: 

ANSI: American National Standards Institute: 

AOC: Air and Space Operations Center--Weapon System: 

COP: Integrated Deepwater System--Common Operational Picture: 

DOD: Department of Defense: 

DRIS: Decennial Response Integration System: 

EIA: Electronic Industries Alliance: 

ERAM: En Route Automation Modernization: 

EV: earned value: 

EVM: earned value management: 

FDCA: Field Data Collection Automation: 

IT: information technology: 

JTRS-HMS: Joint Tactical Radio System--Handheld, Manpack, Small Form 
Fit: 

JWST: James Webb Space Telescope: 

MIDAS: Farm Program Modernization: 

MSL: Mars Science Laboratory: 

NASA: National Aeronautics and Space Administration: 

NGI: Next Generation Identification: 

OMB: Office of Management and Budget: 

SBS: Surveillance and Broadcast System: 

VistA-FM: Veterans Health Information Systems and Technology 
Architecture--Foundations Modernization: 

WHTI: Western Hemisphere Travel Initiative: 

WIN-T: Warfighter Information Network--Tactical: 

[End of section] 

United States Government Accountability Office: Washington, DC 20548: 

October 8, 2009: 

The Honorable Thomas R. Carper: 
Chairman: 
Subcommittee on Federal Financial Management, Government Information, 
Federal Services, and International Security: Committee on Homeland 
Security and Governmental Affairs: United States Senate: 

Dear Mr. Chairman: 

In fiscal year 2009, the federal government planned to spend over $70 
billion on information technology (IT) investments, many of which 
involve systems and technologies to modernize legacy systems, increase 
communication and networking capabilities, and transition to new 
systems designed to significantly improve the government's ability to 
carry out critical mission functions into the 21st century. To more 
effectively manage such investments, the Office of Management and 
Budget (OMB) has a number of key initiatives under way--one of which 
was established in 2005 and directs agencies to implement earned value 
management (EVM).[Footnote 1] EVM is a project management approach 
that, if implemented appropriately, provides objective reports of 
project status, produces early warning signs of impending schedule 
slippages and cost overruns, and provides unbiased estimates of 
anticipated costs at completion. 

This report responds to your request that we review the federal 
government's use of EVM. Specifically, our objectives were to (1) 
assess whether key departments and agencies have appropriately 
established EVM policies, (2) determine whether these agencies are 
adequately using earned value techniques to manage key system 
acquisitions, and (3) evaluate the earned value data of these selected 
investments to determine their cost and schedule performances. 

To address our objectives, we reviewed agency EVM policies and 
individual programs' EVM-related documentation, including cost 
performance reports and project schedules, from eight agencies and 16 
major investments from those agencies, respectively.[Footnote 2] The 
eight agencies account for about 75 percent of the planned IT spending 
for fiscal year 2009. The 16 programs selected for case study represent 
investments with about $3.5 billion in total planned spending for 
system development work in fiscal year 2009. We compared the agencies' 
policies and practices with federal standards and best practices of 
leading organizations to determine the effectiveness of their use of 
earned value data in managing IT investments. We also analyzed the 
earned value data from the programs to determine whether they are 
projected to finish within planned cost and schedule targets. In 
addition, we interviewed relevant agency officials, including key 
personnel on programs that we selected for case study and officials 
responsible for implementing EVM. 

We conducted this performance audit from February to October 2009, in 
accordance with generally accepted government auditing standards. Those 
standards require that we plan and perform the audit to obtain 
sufficient, appropriate evidence to provide a reasonable basis for our 
findings and conclusions based on our audit objectives. We believe that 
the evidence obtained provides a reasonable basis for our findings and 
conclusions based on our audit objectives. Appendix I contains further 
details about our objectives, scope, and methodology. See also the page 
of related products at the end of this report for previous work that we 
have done on certain programs in our case studies. 

Background: 

Each year, OMB and federal agencies work together to determine how much 
the government plans to spend on IT projects and how these funds are to 
be allocated. Planned federal IT spending in fiscal year 2009 totaled 
about $71 billion--of which $22 billion was planned for IT system 
development work, and the remainder was planned for operations and 
maintenance of existing systems. OMB plays a key role in overseeing 
federal agencies' IT investments and how they are managed, stemming 
from its functions of assisting the President in overseeing the 
preparation of the federal budget and supervising budget preparation in 
executive branch agencies. In helping to formulate the President's 
spending plans, OMB is responsible for evaluating the effectiveness of 
agency programs, policies, and procedures; assessing competing funding 
demands among agencies; and setting funding priorities. To carry out 
these responsibilities, OMB depends on agencies to collect and report 
accurate and complete information; these activities depend, in turn, on 
agencies having effective IT management practices. 

To drive improvement in the implementation and management of IT 
projects, Congress enacted the Clinger-Cohen Act in 1996, expanding the 
responsibilities delegated to OMB and agencies under the Paperwork 
Reduction Act.[Footnote 3] The Clinger-Cohen Act requires agencies to 
engage in performance-and results-based management, and to implement 
and enforce IT management policies and guidelines. The act also 
requires OMB to establish processes to analyze, track, and evaluate the 
risks and results of major capital investments in information systems 
made by executive agencies. 

Over the past several years, we have reported and testified on OMB's 
initiatives to highlight troubled projects,[Footnote 4] justify IT 
investments,[Footnote 5] and use project management tools.[Footnote 6] 
We have made multiple recommendations to OMB and federal agencies to 
improve these initiatives to further enhance the oversight and 
transparency of federal IT projects. As a result, OMB recently used 
this body of work to develop and implement improved processes to 
oversee and increase transparency of IT investments. Specifically, in 
June 2009, OMB publicly deployed a Web site that displays dashboards of 
all major federal IT investments to provide OMB and others with the 
ability to track the progress of these investments over time. 

EVM Provides Insight on Program Cost and Schedule: 

Given the size and significance of the government's investment in IT, 
it is important that projects be managed effectively to ensure that 
public resources are wisely invested. Effectively managing projects 
entails, among other things, pulling together essential cost, schedule, 
and technical information in a meaningful, coherent fashion so that 
managers have an accurate view of the program's development status. 
Without meaningful and coherent cost and schedule information, program 
managers can have a distorted view of a program's status and risks. To 
address this issue, in the 1960s, the Department of Defense (DOD) 
developed the EVM technique, which goes beyond simply comparing 
budgeted costs with actual costs. This technique measures the value of 
work accomplished in a given period and compares it with the planned 
value of work scheduled for that period and with the actual cost of 
work accomplished. 

Differences in these values are measured in both cost and schedule 
variances. Cost variances compare the value of the completed work 
(i.e., the earned value) with the actual cost of the work performed. 
For example, if a contractor completed $5 million worth of work and the 
work actually cost $6.7 million, there would be a negative $1.7 million 
cost variance. Schedule variances are also measured in dollars, but 
they compare the earned value of the completed work with the value of 
the work that was expected to be completed. For example, if a 
contractor completed $5 million worth of work at the end of the month 
but was budgeted to complete $10 million worth of work, there would be 
a negative $5 million schedule variance. Positive variances indicate 
that activities are costing less or are completed ahead of schedule. 
Negative variances indicate activities are costing more or are falling 
behind schedule. These cost and schedule variances can then be used in 
estimating the cost and time needed to complete the program. 
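
These variance calculations are simple arithmetic. The following brief 
sketch is illustrative only and is not part of GAO's analysis; it 
reproduces the two examples above using the standard earned value 
formulas (cost variance equals earned value minus actual cost; schedule 
variance equals earned value minus planned value). The figures are in 
millions of dollars, and the function names are hypothetical. 

def cost_variance(earned_value, actual_cost):
    # A negative result means the completed work cost more than planned.
    return earned_value - actual_cost

def schedule_variance(earned_value, planned_value):
    # A negative result means less work was completed than was scheduled.
    return earned_value - planned_value

# $5 million of completed work at an actual cost of $6.7 million yields
# a negative $1.7 million cost variance.
print(round(cost_variance(5.0, 6.7), 1))       # -1.7

# $5 million of completed work against a plan of $10 million worth of
# work yields a negative $5 million schedule variance.
print(round(schedule_variance(5.0, 10.0), 1))  # -5.0

[End of example] 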

Without knowing the planned cost of completed work and work in progress 
(i.e., the earned value), it is difficult to determine a program's true 
status. Earned value supplies this key information, providing an 
objective view of program status that is necessary for understanding the 
health of a program. As a result, EVM can alert program managers to 
potential problems sooner than using expenditures alone, thereby 
reducing the chance and magnitude of cost overruns and schedule 
slippages. Moreover, EVM directly supports the institutionalization of 
key processes for acquiring and developing systems and the ability to 
effectively manage investments--areas that are often found to be 
inadequate on the basis of our assessments of major IT investments. 
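
Because the same three data points (planned value, earned value, and 
actual cost) also yield performance indexes and an independent estimate 
of cost at completion, they provide the early warning described above. 
The following sketch is illustrative only: the program figures are 
hypothetical, while the calculations shown (cost performance index, 
schedule performance index, and an index-based estimate at completion) 
are standard earned value formulas. Figures are in millions of dollars. 

def performance_indexes(planned_value, earned_value, actual_cost):
    # Cost performance index (CPI): value earned per dollar actually spent.
    cpi = earned_value / actual_cost
    # Schedule performance index (SPI): work earned per dollar of work planned.
    spi = earned_value / planned_value
    return cpi, spi

def estimate_at_completion(budget_at_completion, earned_value, actual_cost):
    # Independent estimate at completion, assuming the cost efficiency
    # achieved to date (CPI) continues for the remaining work.
    cpi = earned_value / actual_cost
    return actual_cost + (budget_at_completion - earned_value) / cpi

# Hypothetical program: $100 million budget, $40 million of work planned
# to date, $30 million of work earned, and $36 million actually spent.
cpi, spi = performance_indexes(planned_value=40.0, earned_value=30.0, actual_cost=36.0)
print(round(cpi, 2), round(spi, 2))  # 0.83 0.75 -- both below 1.0, an early warning
print(round(estimate_at_completion(100.0, 30.0, 36.0), 1))  # 120.0 -- a projected $20 million overrun

[End of example] 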

Federal Guidance Calls for Using EVM to Improve IT Management: 

In August 2005, OMB issued guidance outlining steps that agencies must 
take for all major and high-risk development projects to better ensure 
improved execution and performance and to promote more effective 
oversight through the implementation of EVM.[Footnote 7] Specifically, 
this guidance directs agencies to (1) develop comprehensive policies to 
ensure that their major IT investments are using EVM to plan and manage 
development; (2) include a provision and clause in major acquisition 
contracts or agency in-house project charters directing the use of an 
EVM system that is compliant with the American National Standards 
Institute (ANSI) standard;[Footnote 8] (3) provide documentation 
demonstrating that the contractor's or agency's in-house EVM system 
complies with the national standard; (4) conduct periodic surveillance 
reviews; and (5) conduct integrated baseline reviews[Footnote 9] on 
individual programs to finalize their cost, schedule, and performance 
goals. 

Building on OMB's requirements, in March 2009, we issued a guide on 
best practices for estimating and managing program costs.[Footnote 10] 
This guide highlights the policies and practices adopted by leading 
organizations to implement an effective EVM program. Specifically, in 
the guide, we identify the need for organizational policies that 
establish clear criteria for which programs are required to use EVM, 
specify compliance with the ANSI standard, require a standard product-
oriented structure for defining work products, require integrated 
baseline reviews, provide for specialized training, establish criteria 
and conditions for rebaselining programs, and require an ongoing 
surveillance function. In addition, we identify key practices that 
individual programs can use to ensure that they establish a sound EVM 
system, that the earned value data are reliable, and that the data are 
used to support decision making. 

Prior Reviews on Agency Use of EVM to Acquire and Manage IT Systems 
Have Identified Weaknesses: 

We have previously reported on the weaknesses associated with the 
implementation of sound EVM programs at various agencies, as well as on 
the lack of aggressive management action to correct poor cost and 
schedule performance trends based on earned value data for major system 
acquisition programs: 

* In July 2008, we reported that the Federal Aviation Administration's 
EVM policy was not fully consistent with best practices.[Footnote 11] 
For example, the agency required its program managers to obtain EVM 
training, but did not enforce completion of this training or require 
other relevant personnel to obtain this training. In addition, although 
the agency was using EVM to manage IT acquisitions, not all programs 
were ensuring that their earned value data were reliable. Specifically, 
of the three programs collecting EVM data, only one program adequately 
ensured that its earned value data were reliable. As a result, the 
agency faced an increased risk that managers were not getting the 
information they needed to effectively manage the programs. In response 
to our findings and recommendations, the Federal Aviation 
Administration reported that it had initiatives under way to improve 
its EVM oversight processes. 

* In September 2008, we reported that the Department of the Treasury's 
EVM policy was not fully consistent with best practices.[Footnote 12] 
For example, while the department's policy addressed some practices, 
such as establishing clear criteria for which programs are to use EVM, 
it did not address others, such as requiring and enforcing EVM 
training. In addition, six programs at Treasury and its bureaus were 
not consistently implementing practices needed for establishing a 
comprehensive EVM system. For example, when executing work plans and 
recording actual costs, a key practice for ensuring that the data 
resulting from the EVM system are reliable, only two of the six 
investments that we reviewed incorporated government costs with 
contractor costs. As a result, we reported that Treasury may not be 
able to effectively manage its critical programs. In response to our 
findings and recommendations, Treasury reported that it would release a 
revised EVM policy and further noted that initiatives to improve EVM- 
related training were under way. 

* In a series of reports and testimonies from September 2004 to June 
2009, we reported that the National Oceanic and Atmospheric 
Administration's National Polar-orbiting Operational Environmental 
Satellite System program was likely to overrun its contract at 
completion on the basis of our analysis of contractor EVM data. 
[Footnote 13] Specifically, the program had delayed key milestones and 
experienced technical issues in the development of key sensors, which 
we stated would affect cost and schedule estimates. As predicted, in 
June 2006 the program was restructured, decreasing its complexity, 
delaying the availability of the first satellite by 3 to 5 years, and 
increasing its cost estimate from $6.9 billion to $12.5 billion. 
However, the program has continued to face significant technical and 
management issues. As of June 2009, launch of the first satellite was 
delayed by 14 months, and our current projected total cost estimate is 
approximately $15 billion. We made multiple recommendations to improve 
this program, including establishing a realistic time frame for 
revising the cost and schedule baselines, developing plans to mitigate 
the risk of gaps in satellite continuity, and tracking the program 
executive committee's action items from inception to closure. 

Agencies' EVM Policies Are Not Comprehensive: 

While the eight agencies we reviewed have established policies 
requiring the use of EVM on their major IT investments, none of these 
policies are fully consistent with best practices, such as 
standardizing the way work products are defined. We recently reported 
[Footnote 14] that leading organizations establish EVM policies that: 

* establish clear criteria for which programs are to use EVM; 

* require programs to comply with the ANSI standard; 

* require programs to use a product-oriented structure for defining 
work products; 

* require programs to conduct detailed reviews of expected costs, 
schedules, and deliverables (called an integrated baseline review); 

* require and enforce EVM training; 

* define when programs may revise cost and schedule baselines (called 
rebaselining); and: 

* require system surveillance--that is, routine validation checks to 
ensure that major acquisitions are continuing to comply with agency 
policies and standards. 

Table 1 describes the key components of an effective EVM policy. 

Table 1: Key Components of an Effective EVM Policy: 

Component: Clear criteria for implementing EVM on all major IT 
investments; Description: OMB requires agencies to implement EVM on all 
major IT investments and ensure that the corresponding contracts 
include provisions for using EVM systems. However, each agency is 
responsible for establishing its own definition of a "major" IT 
investment. As a result, agencies should clearly define the conditions 
under which a new or ongoing acquisition program is required to 
implement EVM. 

Component: Compliance with the ANSI standard; Description: OMB requires 
agencies to use EVM systems that are compliant with a national standard 
developed by ANSI and EIA (ANSI/EIA-748-B). This standard consists of 
32 guidelines that an organization can use to establish a sound EVM 
system, ensure that the data resulting from the EVM system are 
reliable, and use earned value data for decision-making purposes. 

Component: Standard structure for defining the work products; 
Description: The work breakdown structure defines the work necessary to 
accomplish a program's objectives. It is the first criterion stated in 
the ANSI standard and the basis for planning the program baseline and 
assigning responsibility for the work. It is a best practice to 
establish a product-oriented work breakdown structure because it allows 
a program to track cost and schedule by defined deliverables, such as a 
hardware or software component. This allows a program manager to more 
precisely identify which components are causing cost or schedule 
overruns and to more effectively mitigate the root cause of the 
overruns. Standardizing the work breakdown structure is also considered 
a best practice because it enables an organization to collect and share 
data among programs. 

Component: Integrated baseline review; Description: An integrated 
baseline review is an evaluation of the performance measurement 
baseline--the foundation for an EVM system--to determine whether all 
program requirements have been addressed, risks have been identified, 
mitigation plans are in place, and available and planned resources are 
sufficient to complete the work. The main goal of an integrated 
baseline review is to identify potential program risks, including risks 
associated with costs, management processes, resources, schedules, and 
technical issues. 

Component: Training requirements; Description: EVM training should be 
provided and enforced for all personnel with investment oversight and 
program management responsibilities. Executive personnel with oversight 
responsibilities need to understand EVM terms and analysis products to 
make sound investment decisions. Program managers and staff need to be 
able to interpret and validate earned value data to effectively manage 
deliverables, costs, and schedules. 

Component: Rebaselining criteria; Description: At times, management may 
conclude that the remaining budget and schedule targets for completing 
a program (including the contract) are significantly insufficient, and 
that the current baseline is no longer valid for realistic performance 
measurement. Management may decide that a revised baseline for the 
program is needed to restore its control of the remaining work effort. 
An agency's rebaselining criteria should define acceptable reasons for 
rebaselining and require programs to (1) explain why the current plan 
is no longer feasible and what measures will be implemented to prevent 
recurrence and (2) develop a realistic cost and schedule estimate for 
remaining work that has been validated and spread over time to the new 
plan. 

Component: System surveillance; 
Description: Surveillance is the process of reviewing a program's 
(including contractors') EVM system as it is applied to one or more 
programs. The purpose of surveillance is to focus on how well a program 
is using its EVM system to manage cost, schedule, and technical 
performances. The following two goals are associated with EVM system 
surveillance: (1) ensure that the program is following corporate 
processes and procedures and (2) confirm that the program's processes 
and procedures continue to satisfy ANSI guidelines. 

Source: GAO-09-3SP. 

[End of table] 

The eight agencies we reviewed do not have comprehensive EVM policies. 
Specifically, none of the agencies' policies are fully consistent with 
all seven key components of an effective EVM policy. Table 2 provides a 
detailed assessment, by agency, and a discussion of the agencies' 
policies follows the table. 

Table 2: Assessment of Key Agencies' EVM Policies: 

Agency: Agriculture; 
Clear criteria for implementing EVM on all major IT investments: The 
agency addressed all EVM practices in this policy area; Compliance with 
the ANSI standard: The agency addressed all EVM practices in this 
policy area; Standard structure for defining the work products: The 
agency did not address any EVM practices in this policy area; 
Integrated baseline review: The agency addressed all EVM practices in 
this policy area; Training requirements: The agency addressed some EVM 
practices in this policy area; Rebaselining criteria: The agency 
addressed some EVM practices in this policy area; System surveillance: 
The agency addressed all EVM practices in this policy area. 

Agency: Commerce; 
Clear criteria for implementing EVM on all major IT investments: The 
agency addressed all EVM practices in this policy area; Compliance with 
the ANSI standard: The agency addressed all EVM practices in this 
policy area; Standard structure for defining the work products: The 
agency did not address any EVM practices in this policy area; 
Integrated baseline review: The agency addressed all EVM practices in 
this policy area; Training requirements: The agency addressed all EVM 
practices in this policy area; Rebaselining criteria: The agency 
addressed all EVM practices in this policy area; System surveillance: 
The agency addressed all EVM practices in this policy area. 

Agency: Defense; 
Clear criteria for implementing EVM on all major IT investments: The 
agency addressed all EVM practices in this policy area; Compliance with 
the ANSI standard: The agency addressed all EVM practices in this 
policy area; Standard structure for defining the work products: The 
agency addressed all EVM practices in this policy area; Integrated 
baseline review: The agency addressed all EVM practices in this policy 
area; Training requirements: The agency addressed some EVM practices in 
this policy area; Rebaselining criteria: The agency addressed all EVM 
practices in this policy area; System surveillance: The agency 
addressed all EVM practices in this policy area. 

Agency: Homeland Security; 
Clear criteria for implementing EVM on all major IT investments: The 
agency addressed all EVM practices in this policy area; Compliance with 
the ANSI standard: The agency addressed all EVM practices in this 
policy area; Standard structure for defining the work products: The 
agency addressed some EVM practices in this policy area; Integrated 
baseline review: The agency addressed all EVM practices in this policy 
area; Training requirements: The agency addressed some EVM practices in 
this policy area; Rebaselining criteria: The agency addressed some EVM 
practices in this policy area; System surveillance: The agency 
addressed all EVM practices in this policy area. 

Agency: Justice; 
Clear criteria for implementing EVM on all major IT investments: The 
agency addressed all EVM practices in this policy area; Compliance with 
the ANSI standard: The agency addressed all EVM practices in this 
policy area; Standard structure for defining the work products: The 
agency addressed some EVM practices in this policy area; Integrated 
baseline review: The agency addressed all EVM practices in this policy 
area; Training requirements: The agency addressed some EVM practices in 
this policy area; Rebaselining criteria: The agency addressed all EVM 
practices in this policy area; System surveillance: The agency 
addressed all EVM practices in this policy area. 

Agency: National Aeronautics and Space Administration; Clear criteria 
for implementing EVM on all major IT investments: The agency addressed 
all EVM practices in this policy area; Compliance with the ANSI 
standard: The agency addressed all EVM practices in this policy area; 
Standard structure for defining the work products: The agency addressed 
some EVM practices in this policy area; Integrated baseline review: The 
agency addressed all EVM practices in this policy area; Training 
requirements: The agency addressed some EVM practices in this policy 
area; Rebaselining criteria: The agency addressed some EVM practices in 
this policy area; System surveillance: The agency addressed all EVM 
practices in this policy area. 

Agency: Transportation; 
Clear criteria for implementing EVM on all major IT investments: The 
agency addressed all EVM practices in this policy area; Compliance with 
the ANSI standard: The agency addressed some EVM practices in this 
policy area; Standard structure for defining the work products: The 
agency did not address any EVM practices in this policy area; 
Integrated baseline review: The agency addressed all EVM practices in 
this policy area; Training requirements: The agency addressed some EVM 
practices in this policy area; Rebaselining criteria: The agency 
addressed some EVM practices in this policy area; System surveillance: 
The agency addressed all EVM practices in this policy area. 

Agency: Veterans Affairs; 
Clear criteria for implementing EVM on all major IT investments: The 
agency addressed some EVM practices in this policy area; Compliance 
with the ANSI standard: The agency addressed all EVM practices in this 
policy area; Standard structure for defining the work products: The 
agency did not address any EVM practices in this policy area; 
Integrated baseline review: The agency addressed all EVM practices in 
this policy area; Training requirements: The agency addressed some EVM 
practices in this policy area; Rebaselining criteria: The agency 
addressed some EVM practices in this policy area; System surveillance: 
The agency addressed all EVM practices in this policy area. 

Source: GAO analysis of agency data. 

[End of table] 

* Criteria for implementing EVM on all major IT investments: Seven of 
the eight agencies fully defined criteria for implementing EVM on major 
IT investments. The agencies with sound policies typically defined 
"major" investments as those exceeding a certain cost threshold, and, 
in some cases, agencies defined lower tiers of investments requiring 
reduced levels of EVM compliance. Veterans Affairs only partially met 
this key practice because its policy did not clearly state whether 
programs or major subcomponents of programs (projects and subprojects) 
had to comply with EVM requirements. According to agency officials, 
this lack of clarity may cause EVM to be inconsistently applied across 
the investments. Without an established policy that clearly defines the 
conditions under which new or ongoing acquisition programs are required 
to implement EVM, these agencies cannot ensure that EVM is being 
appropriately applied on their major investments. 

* Compliance with the ANSI standard: Seven of the eight agencies 
required that all work activities performed on major investments be 
managed by an EVM system that complies with industry standards. One 
agency, Transportation, partially met this key practice because its 
policy contained inconsistent criteria for when investments must comply 
with standards. Specifically, in one section, the policy requires a 
certain class of investments to adhere to a subset of the ANSI 
standard; however, in another section, the policy merely states that 
the investments must comply with general EVM principles. This latter 
section is vague and could be interpreted in multiple ways, either more 
broadly or narrowly than the specified subset of the ANSI standard. 
Without consistent criteria on investment compliance, Transportation 
may be unable to ensure that the work activities for some of its major 
investments are establishing sound EVM systems that produce reliable 
earned value data and provide the basis for informed decision making. 

* Standard structure for defining the work products: DOD was the only 
agency to fully meet this key practice by developing and requiring the 
use of standard product-oriented work breakdown structures. Four 
agencies did not meet this key practice, while the other three only 
partially complied. Of those agencies that partially complied, National 
Aeronautics and Space Administration (NASA) policy requires mission (or 
space flight) projects to use a standardized product-oriented work 
breakdown structure; however, IT projects do not have such a 
requirement. NASA officials reported that they are working to develop a 
standard structure for their IT projects; however, they were unable to 
provide a time frame for completion. Homeland Security and Justice have 
yet to standardize their product structures. 

Among the agencies that did not implement this key practice, reasons 
included, among other things, the difficulty in establishing a standard 
structure for component agencies that conduct different types of work 
with varying complexity. While this presents a challenge, agencies 
could adopt an approach similar to DOD's and develop various standard 
work structures based on the kinds of work being performed by the 
various component agencies (e.g., automated information system, IT 
infrastructure, and IT services). Without fully implementing a standard 
product-oriented structure (or structures), agencies will be unable to 
collect and share data among programs and may not have the information 
they need to make decisions on specific program components. 

* Integrated baseline review: All eight agencies required major IT 
investments to conduct an integrated baseline review to ensure that 
program baselines fully reflect the scope of work to be performed, key 
risks, and available resources. For example, DOD required that these 
reviews occur within 6 months of contract award and after major 
modifications have taken place, among other things. 

* Training requirements: Commerce was the only agency to fully meet 
this key practice by requiring and enforcing EVM training for all 
personnel with investment oversight and program management 
responsibilities. Several of the partially compliant agencies required 
EVM training for project managers--but did not extend this requirement 
to other program management personnel or executives with investment 
oversight responsibilities. Many agencies told us that it would be a 
significant challenge to require and enforce EVM training for all 
relevant personnel, especially at the executive level. Instead, most 
agencies have made voluntary EVM training courses available agencywide. 
However, without comprehensive EVM training requirements and 
enforcement, agencies cannot effectively ensure that programs have the 
appropriate skills to validate and interpret EVM data, and that their 
executives will be able to make fully informed decisions based on the 
EVM analysis. 

* Rebaselining criteria: Three of the eight agencies fully met this key 
practice. For example, the Justice policy outlines acceptable reasons 
for rebaselining, such as when the baseline no longer reflects the 
current scope of work being performed, and requires investments to 
explain why their current plans are no longer feasible and to develop 
realistic cost and schedule estimates for remaining work. Among the 
five partially compliant agencies, Agriculture and Veterans Affairs 
provided policies, but in draft form; NASA was in the process of 
updating its policy to include more detailed criteria for rebaselining; 
and Homeland Security did not define acceptable reasons but did require 
an explanation of the root causes for cost and schedule variances and 
the development of new cost and schedule estimates. In several cases, 
agencies were unaware of the detailed rebaselining criteria to be 
included in their EVM policies. Until their policies fully meet this 
key practice, agencies face an increased risk that their executive 
managers will make decisions about programs with incomplete 
information, and that these programs will continue to overrun costs and 
schedules because their underlying problems have not been identified or 
addressed. 

* System surveillance: All eight agencies required ongoing EVM system 
surveillance of all programs (and contracts with EVM requirements) to 
ensure their continued compliance with industry standards. For example, 
Agriculture required its surveillance teams to submit reports--to the 
programs and the Chief Information Officer--with documented findings 
and recommendations regarding compliance. Furthermore, the agency also 
established a schedule to show when EVM surveillance is expected to 
take place on each of its programs. 

Agencies' Key Acquisition Programs Are Using EVM, but Are Not 
Consistently Implementing Key Practices: 

Our studies of 16 major system acquisition programs showed that all 
agencies are using EVM; however, the extent of that implementation 
varies among the programs. Our work on best practices in EVM identified 
11 key practices that are implemented on acquisition programs of 
leading organizations. These practices can be organized into three 
management areas: establishing a sound EVM system, ensuring reliable 
data, and using earned value data to make decisions. Table 3 lists 
these 11 key EVM practices by management area. 

Table 3: Eleven Key EVM Practices for System Acquisition Programs: 

Program management area of responsibility: Establish a comprehensive 
EVM system; EVM practice: 
* Define the scope of effort using a work breakdown structure.
* Identify who in the organization will perform the work.
* Schedule the work.
* Estimate the labor and material required to perform the work and 
authorize the budgets, including management reserve.
* Determine objective measure of earned value.
* Develop the performance measurement baseline. 

Program management area of responsibility: Ensure that the data 
resulting from the EVM system are reliable; EVM practice: 
* Execute the work plan and record all costs.
* Analyze EVM performance data and record variances from the 
performance measurement baseline plan.
* Forecast estimates at completion. 

Program management area of responsibility: Ensure that the program 
management team is using earned value data for decision-making 
purposes; EVM practice: 
* Take management action to mitigate risks.
* Update the performance measurement baseline as changes occur. 

Source: GAO-09-3SP. 

[End of table] 

Of the 16 case study programs, 3 demonstrated a full level of maturity 
in all three management areas; 3 had full maturity in two areas; and 4 
had reached full maturity in one area. The remaining 6 programs did not 
demonstrate full levels of maturity in any of the management areas; 
however, in all but 1 case, they were able to demonstrate partial 
capabilities in each of the three areas. Table 4 identifies the 16 case 
study programs and summarizes our results for these programs. Following 
the table is a summary of the programs' implementation of each key area 
of EVM program management responsibility. Additional details on the 16 
case studies are provided in appendix II. 

Table 4: Assessment of EVM Practices for Case Study Programs: 

Agency: Agriculture; 
Program: Farm Program Modernization; Establishing a comprehensive EVM 
system: The program partially implemented the EVM practices in this 
program management area; Ensuring that data resulting from the EVM 
system are reliable: The program fully implemented all EVM practices in 
this program management area; Ensuring that the program management team 
is using earned value data for decision-making purposes: The program 
fully implemented all EVM practices in this program management area. 

Agency: Commerce; 
Program: Decennial Response Integration System; Establishing a 
comprehensive EVM system: The program fully implemented all EVM 
practices in this program management area; Ensuring that data resulting 
from the EVM system are reliable: The program fully implemented all EVM 
practices in this program management area; Ensuring that the program 
management team is using earned value data for decision-making 
purposes: The program fully implemented all EVM practices in this 
program management area. 

Agency: Commerce; 
Program: Field Data Collection Automation; Establishing a comprehensive 
EVM system: The program partially implemented the EVM practices in this 
program management area; Ensuring that data resulting from the EVM 
system are reliable: The program partially implemented the EVM 
practices in this program management area; Ensuring that the program 
management team is using earned value data for decision-making 
purposes: The program partially implemented the EVM practices in this 
program management area. 

Agency: Defense; 
Program: Air and Space Operations Center--Weapon System; Establishing a 
comprehensive EVM system: The program partially implemented the EVM 
practices in this program management area; Ensuring that data resulting 
from the EVM system are reliable: The program partially implemented the 
EVM practices in this program management area; Ensuring that the 
program management team is using earned value data for decision-making 
purposes: The program fully implemented all EVM practices in this 
program management area. 

Agency: Defense; 
Program: Joint Tactical Radio System--Handheld, Manpack, Small Form 
Fit; Establishing a comprehensive EVM system: The program partially 
implemented the EVM practices in this program management area; Ensuring 
that data resulting from the EVM system are reliable: The program fully 
implemented all EVM practices in this program management area; Ensuring 
that the program management team is using earned value data for 
decision-making purposes: The program fully implemented all EVM 
practices in this program management area. 

Agency: Defense; 
Program: Warfighter Information Network--Tactical; Establishing a 
comprehensive EVM system: The program partially implemented the EVM 
practices in this program management area; Ensuring that data resulting 
from the EVM system are reliable: The program fully implemented all EVM 
practices in this program management area; Ensuring that the program 
management team is using earned value data for decision-making 
purposes: The program partially implemented the EVM practices in this 
program management area. 

Agency: Homeland Security; 
Program: Automated Commercial Environment; Establishing a comprehensive 
EVM system: The program partially implemented the EVM practices in this 
program management area; Ensuring that data resulting from the EVM 
system are reliable: The program partially implemented the EVM 
practices in this program management area; Ensuring that the program 
management team is using earned value data for decision-making 
purposes: The program fully implemented all EVM practices in this 
program management area. 

Agency: Homeland Security;
Program: Integrated Deepwater System--Common Operational 
Picture; Establishing a comprehensive EVM system: The program partially 
implemented the EVM practices in this program management area; Ensuring 
that data resulting from the EVM system are reliable: The program 
partially implemented the EVM practices in this program management 
area; Ensuring that the program management team is using earned value 
data for decision-making purposes: The program partially implemented 
the EVM practices in this program management area. 

Agency: Homeland Security;
Program: Western Hemisphere Travel Initiative; Establishing a 
comprehensive EVM system: The program partially implemented the EVM 
practices in this program management area; Ensuring that data resulting 
from the EVM system are reliable: The program partially implemented the 
EVM practices in this program management area; Ensuring that the 
program management team is using earned value data for decision-making 
purposes: The program partially implemented the EVM practices in this 
program management area. 

Agency: Justice; 
Program: Next Generation Identification; Establishing a comprehensive 
EVM system: The program fully implemented all EVM practices in this 
program management area; Ensuring that data resulting from the EVM 
system are reliable: The program fully implemented all EVM practices in 
this program management area; Ensuring that the program management team 
is using earned value data for decision-making purposes: The program 
fully implemented all EVM practices in this program management area. 

Agency: National Aeronautics and Space Administration; Program: James 
Webb Space Telescope; Establishing a comprehensive EVM system: The 
program partially implemented the EVM practices in this program 
management area; Ensuring that data resulting from the EVM system are 
reliable: The program partially implemented the EVM practices in this 
program management area; Ensuring that the program management team is 
using earned value data for decision-making purposes: The program 
partially implemented the EVM practices in this program management 
area. 

Agency: National Aeronautics and Space Administration; Program: Juno; 
Establishing a comprehensive EVM system: The program partially 
implemented the EVM practices in this program management area; Ensuring 
that data resulting from the EVM system are reliable: The program fully 
implemented all EVM practices in this program management area; Ensuring 
that the program management team is using earned value data for 
decision-making purposes: The program fully implemented all EVM 
practices in this program management area. 

Agency: National Aeronautics and Space Administration; Program: Mars 
Science Laboratory; Establishing a comprehensive EVM system: The 
program partially implemented the EVM practices in this program 
management area; Ensuring that data resulting from the EVM system are 
reliable: The program partially implemented the EVM practices in this 
program management area; Ensuring that the program management team is 
using earned value data for decision-making purposes: The program 
partially implemented the EVM practices in this program management 
area. 

Agency: Transportation; 
Program: En Route Automation Modernization; Establishing a 
comprehensive EVM system: The program partially implemented the EVM 
practices in this program management area; Ensuring that data resulting 
from the EVM system are reliable: The program partially implemented the 
EVM practices in this program management area; Ensuring that the 
program management team is using earned value data for decision-making 
purposes: The program fully implemented all EVM practices in this 
program management area. 

Agency: Transportation; 
Program: Surveillance and Broadcast System; Establishing a 
comprehensive EVM system: The program fully implemented all EVM 
practices in this program management area; Ensuring that data resulting 
from the EVM system are reliable: The program fully implemented all EVM 
practices in this program management area; Ensuring that the program 
management team is using earned value data for decision-making 
purposes: The program fully implemented all EVM practices in this 
program management area. 

Agency: Veterans Affairs; 
Program: Veterans Health Information Systems and Technology 
Architecture--Foundations Modernization; Establishing a comprehensive 
EVM system: The program partially implemented the EVM practices in this 
program management area; Ensuring that data resulting from the EVM 
system are reliable: The program partially implemented the EVM 
practices in this program management area; Ensuring that the program 
management team is using earned value data for decision-making 
purposes: The program did not implement the EVM practices in this 
program management area. 

Source: GAO analysis of program data. 

[End of table] 

Most Programs Did Not Fully Establish Comprehensive EVM Systems: 

Most programs did not fully implement the key practices needed to 
establish comprehensive EVM systems. Of the 16 programs, 3 fully 
implemented the practices in this program management area, and 13 
partially implemented the practices. The Decennial Response Integration 
System, Next Generation Identification, and Surveillance and Broadcast 
System programs demonstrated that they had fully implemented the six 
practices in this area. For example, our analysis of the Decennial 
Response Integration System program schedule showed that activities 
were properly sequenced, realistic durations were established, and 
labor and material resources were assigned. The Surveillance and 
Broadcast System program conducted a detailed integrated baseline 
review to validate its performance baseline. It was also the only 
program to fully institutionalize EVM at the program level--meaning 
that it collects performance data on the contractor and government work 
efforts--in order to get a complete view into program status. 

Thirteen programs demonstrated that they partially implemented the six 
key practices in this area. In most cases, programs had work breakdown 
structures that defined work products to an appropriate level of detail 
and had identified the personnel responsible for delivering these work 
products. However, for all 13 programs, the project schedules contained 
issues that undermined the quality of their performance baselines. 
Weaknesses in these schedules included the improper sequencing of 
activities, such as incomplete or missing linkages between tasks; a 
lack of resources assigned to all activities; invalid critical paths 
(the sequence of activities that, if delayed, will impact the planned 
completion date of the project); and the excessive or unjustified use 
of constraints, which impairs the program's ability to forecast the 
impact of ongoing delays on future planned work activities. These 
weaknesses are of concern because the schedule serves as the 
performance baseline against which earned value is measured. As such, 
poor schedules undermine the overall quality of a program's EVM system. 
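(A brief illustration of how a single missing task linkage can distort 
a schedule's forecast appears after the examples below.) 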
Other key weaknesses included the following examples: 

* Nine programs did not adequately determine an objective measure of 
earned value and develop the performance baseline--two key practices 
that are most appropriately addressed through a comprehensive 
integrated baseline review, which none of these programs fully performed. For 
example, the Air and Space Operations Center--Weapon System program 
conducted an integrated baseline review in May 2007 to validate one 
segment of work contained in the baseline; however, the program had not 
conducted subsequent reviews for the remaining work because doing so 
would preclude staff from completing their normal work activities. 
Other reasons cited by the programs for not performing these reviews 
included the lack of a fully defined scope of work or management's 
decision to use ongoing EVM surveillance to satisfy these practices. 
Without having performed a comprehensive integrated baseline review, 
these programs have not sufficiently evaluated the validity of their 
baseline plans to determine whether all significant risks contained in 
the plan have been identified and mitigated and whether the metrics used 
to measure progress on planned work elements are appropriate. 

* Four programs did not define the scope of effort using a work 
breakdown structure. For example, the Veterans Health Information 
Systems and Technology Architecture--Foundations Modernization program 
provided a list of its subprograms; however, it did not define the 
scope of the detailed work elements that make up each subprogram. 
Without a work breakdown structure, programs lack a basis for planning 
the performance baseline and assigning responsibility for that work, 
both of which are necessary to accomplish a program's objectives. 

Many Programs Did Not Fully Implement Practices to Ensure Data 
Reliability: 

Many programs did not fully ensure that their EVM data were reliable. 
Of the 16 programs, 7 fully implemented the practices for ensuring the 
reliability of the prime contractor and government performance data, 
and 9 partially implemented the practices. All 7 programs that 
demonstrated full implementation conduct monthly reviews of earned 
value data with technical engineering staff and other key personnel to 
ensure that the data are consistent with actual performance; perform 
detailed performance trend analyses to track program progress, cost, 
and schedule drivers; and make estimates of cost at completion. Four 
programs that we had previously identified as having schedule 
weaknesses (Farm Program Modernization; Joint Tactical Radio System-- 
Handheld, Manpack, Small Form Fit; Juno; and Warfighter Information 
Network--Tactical) were aware of these issues and had sufficient 
controls in place to mitigate them in order to ensure that the earned 
value data are reliable. 

Nine programs partially implemented the three practices for ensuring 
that earned value data are reliable. In all cases, the program had 
processes in place to review earned value data (from monthly contractor 
EVM reports in all but one case), identify and record cost and schedule 
variances, and forecast estimates at completion. However, 5 of these 
programs did not adequately analyze EVM performance data and properly 
record variances from the performance baseline. For example, 2 programs 
did not adequately document justifications for cost and schedule 
variances, including root causes, potential impacts, and corrective 
actions. Other weaknesses in this area included anomalies in monthly 
performance reports, such as negative dollars reported as spent for work 
performed, which undermine the validity of the performance data. In addition, 
7 of these programs did not demonstrate that they could adequately 
execute the work plan and record costs because, among other things, 
they were unaware of the schedule weaknesses we identified and did not 
have sufficient internal controls in place to deal with these issues to 
improve the reliability of the earned value data. Lastly, 2 of these 
programs could not adequately forecast estimates at completion due, in 
part, to anomalies in the prime contractor's EVM reports, in 
combination with the weaknesses contained in the project schedule. 

Most Programs Used Earned Value Data for Decision-making Purposes: 

Programs were uneven in their use of earned value data to make 
decisions. Of the 16 programs, 9 fully implemented the practices for 
using earned value data for decision making, 6 partially implemented 
them, and 1 did not implement them. Among the 9 programs that fully 
implemented these practices, both the Automated Commercial Environment 
and Juno programs integrated their EVM and risk management processes to support the 
program manager in making better decisions. The Automated Commercial 
Environment program actively recorded risks associated with major 
variances from the EVM reports in the program's risk register. Juno 
further used the earned value data to analyze threats against remaining 
management reserve and to estimate the cost impact of these threats. 

Six programs demonstrated limited capabilities in using earned value 
data for making decisions. In most cases, these programs included 
earned value performance trend data in monthly program management 
review briefings. However, for most of these programs, the processes for 
taking management action to address the cost and schedule drivers behind 
poor trends were ad hoc and separate from the programs' risk management 
processes--and, in most cases, the risks and issues found in the EVM 
reports did not correspond to the risks contained in the program risk 
registers. In addition, 4 of these programs were not able to adequately 
update the performance baseline as changes occurred because, in many 
cases, the original baseline was not appropriately validated. For 
example, the Mars Science Laboratory program recently updated its 
performance baseline as part of a replan effort. However, 
without validating the original and current baselines with a project- 
level integrated baseline review, it is unclear whether the changes to 
the baseline were reasonable, and whether the risks assumed in the 
baseline have been identified and appropriately mitigated. 

One program (Veterans Health Information Systems and Technology 
Architecture--Foundations Modernization) was not using earned value 
data for decision making. Specifically, the program did not actively 
manage earned value performance trends, nor were these data 
incorporated into programwide management reviews. 

Inconsistent Implementation Is Due in Part to Weaknesses in Policy and 
Lack of Enforcement: 

The inconsistent application of EVM across the investments exists in 
part because of the weaknesses we previously identified in the eight 
agencies' policies, as well as a lack of enforcement of the EVM policy 
components already in place. For example, deficiencies in all three 
management areas can be attributed, in part, to a lack of comprehensive 
EVM training requirements--which was a policy component that most 
agencies did not fully address. The only 3 programs that fully 
implemented all key EVM practices either had comprehensive training 
requirements in their agency EVM policy or enforced rigorous training 
requirements beyond those called for in the policy. Most of the 
remaining programs met only the minimum requirements of their agencies' 
policies. Nevertheless, all programs that attained full maturity in two 
of the three management areas had also implemented training requirements 
more stringent than those minimums, although none matched the efforts 
made on the 3 fully mature programs. Without making this training a 
comprehensive requirement, 
these agencies are at risk that their major system acquisition programs 
will continue to have management and technical staff who lack the 
skills to fully implement key EVM practices. 

Our case study analysis also highlighted multiple areas in which 
programs were not in compliance with their agencies' established EVM 
policies. This is an indication that agencies are not adequately 
enforcing program compliance. These policy areas include requiring EVM 
compliance at the start of the program, validating the baseline with an 
integrated baseline review, and conducting ongoing EVM surveillance. 

Until key EVM practices are fully implemented, the selected programs 
face an increased risk that program managers will be unable to use EVM 
effectively as a management tool to mitigate and reverse poor cost and 
schedule performance trends. 

Earned Value Data Show Trends of Cost Overruns and Schedule Slippages 
on Most Programs: 

Earned value data trends of the 16 case study programs indicate that 
most are currently experiencing cost overruns and schedule slippages, 
and, based on our analysis, it is likely that when these programs are 
completed, the total cost overrun will be about $3 billion. To date, 
these programs, collectively, have already overrun their original life- 
cycle cost estimates by almost $2 billion (see table 5). 

Table 5: Program Life-cycle Cost Estimate Changes (Dollars in 
millions): 

Agency: Agriculture; 
Program: Farm Program Modernization; Original life-cycle cost estimate: 
$451.0; Current life-cycle cost estimate: $451.0; Cost overruns in 
excess of original cost estimate: $0.0. 

Agency: Commerce; 
Program: Decennial Response Integration System; Original life-cycle 
cost estimate: $574.0[A]; Current life-cycle cost estimate: $946.0[A]; 
Cost overruns in excess of original cost estimate: $372.0. 

Agency: Commerce; 
Program: Field Data Collection Automation; Original life-cycle cost 
estimate: $595.7; Current life-cycle cost estimate: $801.1; Cost 
overruns in excess of original cost estimate: $205.4. 

Agency: Defense; 
Program: Air and Space Operations Center--Weapon System; Original life-
cycle cost estimate: $4,425.0; Current life-cycle cost estimate: 
$4,425.0; Cost overruns in excess of original cost estimate: 0.0. 

Agency: Defense; 
Program: Joint Tactical Radio System--Handheld, Manpack, Small Form 
Fit; Original life-cycle cost estimate: $19,214.0; Current life-cycle 
cost estimate: $11,599.0; Cost overruns in excess of original cost 
estimate: n/a[B]. 

Agency: Defense; 
Program: Warfighter Information Network--Tactical; Original life-cycle 
cost estimate: $38,157.1; Current life-cycle cost estimate: $38,157.1; 
Cost overruns in excess of original cost estimate: 0.0. 

Agency: Homeland Security; 
Program: Automated Commercial Environment; Original life-cycle cost 
estimate: $1,500.0[C]; Current life-cycle cost estimate: $2,241.0[C]; 
Cost overruns in excess of original cost estimate: $741.0. 

Agency: Homeland Security; 
Program: Integrated Deepwater System--Common Operational Picture; 
Original life-cycle cost estimate: $1,353.0[C]; Current life-cycle cost 
estimate: $1,353.0[C]; Cost overruns in excess of original cost 
estimate: 0.0. 

Agency: Homeland Security; 
Program: Western Hemisphere Travel Initiative; Original life-cycle cost 
estimate: $1,228.0; Current life-cycle cost estimate: $1,228.0; Cost 
overruns in excess of original cost estimate: 0.0. 

Agency: Justice; 
Program: Next Generation Identification; Original life-cycle cost 
estimate: $1,075.9; Current life-cycle cost estimate: $1,075.9; Cost 
overruns in excess of original cost estimate: 0.0. 

Agency: National Aeronautics and Space Administration; Program: James 
Webb Space Telescope; Original life-cycle cost estimate: $4,964.0; 
Current life-cycle cost estimate: $4,964.0; Cost overruns in excess of 
original cost estimate: 0.0. 

Agency: National Aeronautics and Space Administration; Program: Juno; 
Original life-cycle cost estimate: $1,050.0; Current life-cycle cost 
estimate: $1,050.0; Cost overruns in excess of original cost estimate: 
0.0. 

Agency: National Aeronautics and Space Administration; Program: Mars 
Science Laboratory; Original life-cycle cost estimate: $1,634.0; 
Current life-cycle cost estimate: $2,286.0; Cost overruns in excess of 
original cost estimate: $652.0. 

Agency: Transportation; 
Program: En Route Automation Modernization; Original life-cycle cost 
estimate:$3,649.4; Current life-cycle cost estimate: $3,649.4; Cost 
overruns in excess of original cost estimate: 0.0. 

Agency: Transportation; 
Program: Surveillance and Broadcast System; Original life-cycle cost 
estimate: $4,313.0; Current life-cycle cost estimate: $4,328.9; Cost 
overruns in excess of original cost estimate: $15.9. 

Agency: Veterans Affairs; 
Program: Veterans Health Information Systems and Technology 
Architecture--Foundations Modernization; Original life-cycle cost 
estimate: $1,897.4; Current life-cycle cost estimate: $1,897.4; Cost 
overruns in excess of original cost estimate: 0.0. 

Agency: Total; 
Cost overruns in excess of original cost estimate: $1,986.3. 

Source: GAO analysis of program and contractor data. 

[A] We removed $37 million from the original estimate, which 
represented costs associated with the closeout of the program. We did 
this because the current estimate does not include costs for these 
activities. An estimate for these activities is currently being 
revised. In addition, the cost increase associated with the current 
estimate is due, in part, to an agency-directed expansion of program 
scope (related to the system's ability to process a higher volume of 
paper forms) in April 2008. 

[B] It is not appropriate to compare the original and current life-
cycle cost estimates for this program because the scope has 
significantly changed since inception (such as newly imposed security 
requirements). In addition, due to a change in the agency's migration 
strategy for replacing legacy radios with new tactical radios, the 
planned quantity of radios procured was decreased from 328,514 to 
95,551. As a result, the life-cycle cost estimate was reduced and no 
longer represents the original scope of the program. 

[C] The original and current life-cycle costs do not include operations 
and maintenance costs. 

[End of table] 

Taking the current earned value performance[Footnote 15] into account, 
our analysis of the 16 case study programs indicated that most are 
experiencing shortfalls against their currently planned cost and 
schedule targets. Specifically, earned value performance data over a 12-
month period showed that the 16 programs combined have exceeded their 
cost targets by $275 million. During that period, they also experienced 
schedule variances and were unable to accomplish almost $93 million 
worth of planned work. In most cases, the negative cost and schedule 
performance trends were attributed to ongoing technical issues in the 
development or testing of system components. 

Furthermore, our projections of future estimated costs at completion 
based on our analysis of current contractor performance trends indicate 
that these programs will most likely continue to experience cost 
overruns to completion, totaling almost $1 billion. In contrast, the 
programs' contractors estimate the cost overruns at completion will be 
approximately $469.7 million. These estimates are based on the 
contractors' assumption that their efficiency in completing the 
remaining work will improve significantly over their performance to 
date. Furthermore, in 4 cases, the contractor-estimated overrun at 
completion is smaller than the cost variance the contractor has already 
accumulated--an indication that these estimates are aggressively 
optimistic.[Footnote 16] 
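
To illustrate the arithmetic behind this observation, the following minimal sketch (in Python) uses invented values and the commonly used cost performance index (CPI)-based formula for an independent estimate at completion; it is not the specific method or data applied in this review. 

    # Hypothetical illustration; all dollar values (in millions) are invented.
    bcwp = 80.0    # budgeted cost of work performed (earned value)
    acwp = 100.0   # actual cost of work performed
    bac = 200.0    # budget at completion

    cumulative_cost_variance = bcwp - acwp   # -20.0: $20 million overrun to date
    cpi = bcwp / acwp                        # 0.80: cost efficiency to date
    eac = acwp + (bac - bcwp) / cpi          # 250.0: assumes past efficiency continues
    projected_overrun = eac - bac            # 50.0

    # A contractor estimate at completion of, say, 210.0 implies only a 10.0
    # overrun--smaller than the 20.0 overrun already incurred--which is only
    # possible if the remaining work is completed under its budgeted cost,
    # that is, if efficiency improves markedly over performance to date.
    print(cumulative_cost_variance, cpi, eac, projected_overrun)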

With the inclusion of the overruns already incurred to date, the total 
increase in life-cycle costs will be about $3 billion. Our analysis is 
presented in table 6. Additional details on the 16 case studies are 
provided in appendix II. 

Table 6: Contractor Cumulative Cost and Schedule Performances (Dollars 
in millions): 

Agency: Agriculture; 
Program: Farm Program Modernization[A,B]; Contractor budget at 
completion: $7.0; Percentage complete: 94%; 
Cumulative cost variance: $<0.1; 
Cumulative schedule variance: ($0.2); Contractor-estimated cost 
overrun/underrun at completion: $<0.1; GAO most likely cost 
overrun/underrun at completion: $<0.1. 

Agency: Commerce; 
Program: Decennial Response Integration System; Contractor budget at 
completion: $468.6; Percentage complete: 50%; 
Cumulative cost variance: $13.6; 
Cumulative schedule variance: $2.3; Contractor-estimated cost 
overrun/underrun at completion: $7.0 underrun; GAO most likely cost 
overrun/underrun at completion: $7.0 underrun. 

Agency: Commerce; 
Program: Field Data Collection Automation; Contractor budget at 
completion: $555.6; Percentage complete: 75%; 
Cumulative cost variance: ($3.5); Cumulative schedule variance: $0.4; 
Contractor-estimated cost overrun/underrun at completion: $2.9 overrun; 
GAO most likely cost overrun/underrun at completion: $4.6 overrun. 

Agency: Defense; 
Program: Air and Space Operations Center--Weapon System; Contractor 
budget at completion: $171.3; Percentage complete: 86%; 
Cumulative cost variance: ($0.1); Cumulative schedule variance: $0.4; 
Contractor-estimated cost overrun/underrun at completion: $0.8 overrun; 
GAO most likely cost overrun/underrun at completion: $0.8 overrun. 

Agency: Defense; 
Program: Joint Tactical Radio System--Handheld, Manpack, Small Form 
Fit; Contractor budget at completion: $30.8; Percentage complete: 74%; 
Cumulative cost variance: ($62.4); Cumulative schedule variance: 
($8.8); Contractor-estimated cost overrun/underrun at completion: $70.1 
overrun; GAO most likely cost overrun/underrun at completion: $89.1 
overrun. 

Agency: Defense; 
Program: Warfighter Information Network--Tactical; Contractor budget at 
completion: $747.0; Percentage complete: 34%; 
Cumulative cost variance: $0.8; 
Cumulative schedule variance: ($12.0); Contractor-estimated cost 
overrun/underrun at completion: $3.7 underrun; GAO most likely cost 
overrun/underrun at completion: $15.1 overrun. 

Agency: Homeland Security; 
Program: Automated Commercial Environment; Contractor budget at 
completion: $382.3; Percentage complete: 83%; 
Cumulative cost variance: ($18.8); Cumulative schedule variance: 
($13.2); Contractor-estimated cost overrun/underrun at completion: $0.5 
underrun; GAO most likely cost overrun/underrun at completion: $24.1 
overrun. 

Agency: Homeland Security; 
Program: Integrated Deepwater System--Common Operational Picture; 
Contractor budget at completion: $130.2; Percentage complete: 99%; 
Cumulative cost variance: ($4.2); Cumulative schedule variance: $0.0; 
Contractor-estimated cost overrun/underrun at completion: $4.2 overrun; 
GAO most likely cost overrun/underrun at completion: $4.2 overrun. 

Agency: Homeland Security; 
Program: Western Hemisphere Travel Initiative[C]; Contractor budget at 
completion: $45.3; Percentage complete: 100%; 
Cumulative cost variance: n/a; 
Cumulative schedule variance: n/a; Contractor-estimated cost 
overrun/underrun at completion: n/a; GAO most likely cost 
overrun/underrun at completion: n/a. 

Agency: Justice; 
Program: Next Generation Identification; Contractor budget at 
completion: $37.5; Percentage complete: 91%; 
Cumulative cost variance: ($1.4); Cumulative schedule variance: ($0.5); 
Contractor-estimated cost overrun/underrun at completion: $1.5 overrun; 
GAO most likely cost overrun/underrun at completion: $1.6 overrun. 

Agency: National Aeronautics and Space Administration; Program: James 
Webb Space Telescope; Contractor budget at completion: $1,271.6; 
Percentage complete: 64%; 
Cumulative cost variance: ($224.7); Cumulative schedule variance: 
($9.4); Contractor-estimated cost overrun/underrun at completion: 
$448.5 overrun[D]; GAO most likely cost overrun/underrun at completion: 
$448.5 overrun. 

Agency: National Aeronautics and Space Administration; Program: Juno; 
Contractor budget at completion: $369.0; Percentage complete: 32%; 
Cumulative cost variance: ($13.2); Cumulative schedule variance: 
($12.3); Contractor-estimated cost overrun/underrun at completion: $6.4 
overrun; GAO most likely cost overrun/underrun at completion: $49.8 
overrun. 

Agency: National Aeronautics and Space Administration; Program: Mars 
Science Laboratory[E]; Contractor budget at completion: $1,223.0; 
Percentage complete: 77%; 
Cumulative cost variance: $2.2; 
Cumulative schedule variance: ($6.2); Contractor-estimated cost 
overrun/underrun at completion: $4.1 overrun; GAO most likely cost 
overrun/underrun at completion: n/a. 

Agency: Transportation; 
Program: En Route Automation Modernization; Contractor budget at 
completion: $1,480.2; Percentage complete: 89%; 
Cumulative cost variance: $36.9; 
Cumulative schedule variance: $15.9; Contractor-estimated cost 
overrun/underrun at completion: $15.3 underrun; GAO most likely cost 
overrun/underrun at completion: $15.3 underrun. 

Agency: Transportation; 
Program: Surveillance and Broadcast System[A]; Contractor budget at 
completion: $1,007.9; Percentage complete: 27%; 
Cumulative cost variance: $14.7; 
Cumulative schedule variance: ($24.0); Contractor-estimated cost 
overrun/underrun at completion: $41.6 underrun; GAO most likely cost 
overrun/underrun at completion: $21.7 overrun. 

Agency: Veterans Affairs; 
Program: Veterans Health Information Systems and Technology 
Architecture--Foundations Modernization[A]; Contractor budget at 
completion: $1,897.4; Percentage complete: 10%; 
Cumulative cost variance: ($14.9); Cumulative schedule variance: 
($24.9); Contractor-estimated cost overrun/underrun at completion: $0.7 
underrun; GAO most likely cost overrun/underrun at completion: $350.2 
overrun. 

Agency: Total; 
Contractor budget at completion: $10,324.7; Cumulative cost variance: 
$275.0 overrun; Cumulative schedule variance: $92.5 overrun; Contractor-
estimated cost overrun/underrun at completion: $469.7 overrun; GAO most 
likely cost overrun/underrun at completion: $987.4 overrun. 

Source: GAO analysis of program and contractor data. 

[A] Earned value data reflect performance for the full scope of the 
program. 

[B] This program is currently in the initiation phase of its life 
cycle, and the budget at completion reflects only work planned to be 
completed in this phase. 

[C] The program's contractor completed development work in June 2009. 

[D] Project officials stated that they have adequate contingency 
reserves built into their life-cycle cost estimate to cover this 
estimated overrun and any additional overruns (should performance 
continue to degrade) through contract completion. 

[E] EVM reporting was suspended between November 2008 and February 2009 
while the project was being replanned; therefore, we did not have 
sufficient data to make a reliable independent estimate at completion. 

[End of table] 

Eleven programs are expected to incur a cost overrun at contract 
completion. In particular, two programs (i.e., the James Webb Space 
Telescope and Veterans Health Information Systems and Technology 
Architecture--Foundations Modernization programs) will likely 
experience a combined overrun of $798.7 million, which accounts for 
about 80 percent of our total projection. 

With timely and effective action taken by program and executive 
management, it is possible to reverse negative performance trends so 
that the projected cost overruns at completion may be reduced. To get 
such results, management at all levels could be strengthened, including 
contractor management, program office management, and executive-level 
management. For example, programs could strengthen program office 
controls and contractor oversight by obtaining earned value data weekly 
(instead of monthly) so that they can make decisions with immediate and 
greater impact. Additionally, key risks could be elevated to the 
program level and, if necessary, to the executive level to ensure that 
appropriate mitigation plans are in place and that they are tracked to 
closure. 

Conclusions: 

Key agencies have taken a number of important steps to improve the 
management of major acquisitions through the implementation of EVM. 
Specifically, the agencies have established EVM policies and require 
their major system acquisition programs to use EVM. However, none of 
the eight agencies that we reviewed have comprehensive EVM policies. 
Most of these policies omit or lack sufficient guidance on the type of 
work structure needed to effectively use EVM data and on the training 
requirements for all relevant personnel. Without comprehensive 
policies, it will be difficult for the agencies to gain the full 
benefits of EVM. 

Few of our 16 case study programs had fully implemented EVM 
capabilities, raising concerns that programs cannot efficiently produce 
reliable estimates of cost at completion. Many of the weaknesses found 
in these programs can be traced back to inadequate agency EVM 
policies and raise questions concerning the agencies' enforcement of 
the policies already established, including the completion of the 
integrated baseline reviews and system surveillance. Until agencies 
expand and enforce their EVM policies, it will be difficult for them to 
optimize the effectiveness of this management tool, and they will face 
an increased risk that managers are not getting the information they 
need to effectively manage the programs. 

In addition to concerns about their implementation of EVM, the 
programs' earned value data show trends toward cost overruns that are 
likely to collectively total about $3 billion. Without timely and 
aggressive management action, this projected overrun will be realized, 
resulting in the expenditure of almost $1 billion more than currently 
planned. 

Recommendations for Executive Action: 

To address the weaknesses identified in agencies' policies and 
practices in using EVM, we are making recommendations to the eight 
major agencies included in this review. Specifically, we recommend that 
the following three actions be taken by the Secretaries of the 
Departments of Agriculture, Commerce, Defense, Homeland Security, 
Justice, Transportation, and Veterans Affairs and the Administrator of 
the National Aeronautics and Space Administration: 

* modify policies governing EVM to ensure that they address the 
weaknesses that we identified, taking into consideration the criteria 
used in this report; 

* direct key system acquisition programs to implement the EVM practices 
that address the detailed weaknesses that we identified in appendix II, 
taking into consideration the criteria used in this report; and: 

* direct key system acquisition programs to take action to reverse 
current negative performance trends, as shown in the earned value data, 
to mitigate the potential cost and schedule overruns. 

Agency Comments and Our Evaluation: 

We provided the selected eight agencies with a draft of our report for 
review and comment. The Department of Homeland Security responded that 
it had no comments. The remaining seven agencies generally agreed with 
our results and recommendations. Agencies also provided technical 
comments, which we incorporated in the report as appropriate. 

The comments of the agencies are summarized in the following text: 

* In e-mail comments on a draft of the report, officials from the U.S. 
Department of Agriculture's Office of the Chief Information Officer 
stated that the department has begun to address the weaknesses in its 
EVM policy identified in the report. 

* In written comments on a draft of the report, the Secretary of 
Commerce stated that, regarding the second and third recommendations, 
the Department of Commerce was pleased that the Decennial Response 
Integration System was found to have fully implemented all 11 key EVM 
practices, and that the Field Data Collection Automation program fully 
implemented six key practices. The department added that its recent 
actions on the Field Data Collection Automation program should move 
this program to full compliance with the key EVM practices. 
Furthermore, regarding the first recommendation, the Secretary stated 
that while the department understands and appreciates the value of 
standardized work breakdown structures, it maintained that the 
development of these work structures should take place at the 
department's operating units (e.g., Census Bureau), given the wide 
diversity of missions and project complexity among these units. As 
noted in our report, we agree that agencies could develop standard work 
structures based on the kinds of work being performed by the various 
component agencies. Therefore, we support these efforts described by 
the department because they are generally consistent with the intent of 
our recommendation. Commerce's comments are printed in appendix III. 

* In written comments on a draft of the report, the Department of 
Defense's Director of Defense Procurement and Acquisition Policy stated 
that the department concurred with our recommendations. Among other 
things, DOD stated that it is essential to maintain the appropriate 
oversight of acquisition programs, including the use of EVM data to 
understand program status and anticipate potential problems. DOD's 
comments are printed in appendix IV. 

* In written comments on a draft of the report, the Department of 
Justice's Assistant Attorney General for Administration stated that, 
after discussion with our office, it was agreed that the second 
recommendation, related to implementing EVM practices that address 
identified weaknesses, was inadvertently directed to the department, and 
that no response was necessary. We agreed because the case study 
program reviewed fully met all key EVM practices. The department 
concurred with the two remaining recommendations related to modifying 
EVM policies and reversing negative performance trends. Furthermore, 
the Assistant Attorney General noted that Justice had begun to take 
steps to improve its use of EVM, such as modifying its policy to 
require EVM training for all personnel with investment oversight and 
program management responsibilities. Justice's comments are printed in 
appendix V. 

* In written comments on a draft of the report, the National 
Aeronautics and Space Administration's Deputy Administrator stated that 
the agency concurred with two recommendations and partially concurred 
with one recommendation. In particular, the Deputy Administrator agreed 
that opportunities exist for improving the implementation of EVM, but 
stated that NASA classifies the projects included in the scope of the 
audit as space flight projects (not as IT-specific projects), which 
affects the applicability of the agency's EVM policies and guidance 
that were reviewed. We recognize that different classifications of IT 
exist; however, consistent with other programs included in the audit, 
the selected NASA projects integrate and rely on various elements of 
IT. As such, we reviewed both the agency's space flight and IT-specific 
guidance. Furthermore, the agency partially concurred with one 
recommendation because it stated that efforts were either under way or 
planned that will address the weaknesses we identified. We support the 
efforts that NASA described in its comments because they are generally 
consistent with the intent of our recommendation. NASA's comments are 
printed in appendix VI. 

* In e-mail comments on a draft of the report, the Department of 
Transportation's Director of Audit Relations stated that the department 
is taking immediate steps to modify its policies governing EVM, taking 
into consideration the criteria used in the draft report. 

* In written comments on a draft of the report, the Secretary of 
Veterans Affairs stated that the Department of Veterans Affairs 
generally agreed with our conclusions and concurred with our 
recommendations. Furthermore, the Secretary stated that Veterans 
Affairs has initiatives under way to address the weaknesses identified 
in the report. Veterans Affairs' comments are printed in appendix VII. 

As agreed with your office, unless you publicly announce the contents 
of this report earlier, we plan no further distribution until 30 days 
from the report date. At that time, we will send copies of this report 
to interested congressional committees; the Secretaries of the 
Departments of Agriculture, Commerce, Defense, Homeland Security, 
Justice, Transportation, and Veterans Affairs; the Administrator of the 
National Aeronautics and Space Administration; and other interested 
parties. In addition, the report will be available at no charge on our 
Web site at [hyperlink, http://www.gao.gov]. 

If you or your staff have any questions on the matters discussed in 
this report, please contact me at (202) 512-9286 or pownerd@gao.gov. 
Contact points for our Offices of Congressional Relations and Public 
Affairs may be found on the last page of this report. GAO staff who 
made major contributions to this report are listed in appendix VIII. 

Sincerely yours, 

Signed by: 

David A. Powner: 
Director, Information Technology Management Issues: 

[End of section] 

Appendix I: Objectives, Scope, and Methodology: 

Our objectives were to (1) assess whether key departments and agencies 
have appropriately established earned value management (EVM) policies, 
(2) determine whether these agencies are adequately using earned value 
techniques to manage key system acquisitions, and (3) evaluate the 
earned value data of these selected investments to determine their cost 
and schedule performances. 

For this governmentwide review, we assessed eight agencies and 16 
investments. We initially identified the 10 agencies with the highest 
amount of spending for information technology (IT) development, 
modernization, and enhancement work as reported in the Office of 
Management and Budget's (OMB) Fiscal Year 2009 Exhibit 53. These 
agencies were the Departments of Agriculture, Commerce, Defense, Health 
and Human Services, Homeland Security, Justice, Transportation, the 
Treasury, and Veterans Affairs and the National Aeronautics and Space 
Administration. We excluded Treasury from our selection because we 
recently performed an extensive review of EVM at that agency.[Footnote 
17] We also subsequently removed Health and Human Services from our 
selection because the agency did not have investments in system 
acquisition that met our dollar threshold (as defined in the following 
text). The resulting eight agencies also made up about 75 percent of 
the government's planned IT spending for fiscal year 2009. 

To ensure that we examined significant investments, we chose from 
investments (related to system acquisition) that were expected to 
receive development, modernization, and enhancement funding in fiscal 
year 2009 in excess of $90 million.[Footnote 18] We limited the number 
of selected investments to a maximum of 3 per agency. For agencies with 
more than 3 investments that met our threshold, we selected the top 3 
investments with the highest planned spending. For agencies with 3 or 
fewer such investments, we chose all of the investments meeting our 
dollar threshold. Lastly, we excluded investments with related EVM work 
already under way at GAO.[Footnote 19] 

To assess whether key agencies have appropriately established EVM 
policies, we analyzed agency policies and guidance for EVM. 
Specifically, we compared these policies and guidance documents with 
both OMB's requirements and key best practices recognized within the 
federal government and industry for the implementation of EVM. These 
best practices are contained in the GAO cost guide.[Footnote 20] We 
also interviewed key agency officials to obtain information on their 
ongoing and future EVM plans. 

To determine whether these agencies are adequately using earned value 
techniques to manage key system acquisitions, we analyzed program 
documentation, including project work breakdown structures, project 
schedules, integrated baseline review briefings, risk registers, and 
monthly management briefings for the 16 selected investments. 
Specifically, we compared program documentation with EVM and scheduling 
best practices as identified in the cost guide.[Footnote 21] We 
determined whether the program implemented, partially implemented, or 
did not implement each of the 11 practices. We also interviewed program 
officials (and observed key program status review meetings) to obtain 
clarification on how EVM practices are implemented and how the data are 
used for decision-making purposes. 

To evaluate the earned value data of the selected investments to 
determine their cost and schedule performances, we analyzed the earned 
value data contained in contractor EVM performance reports obtained 
from the programs. To perform this analysis, we compared the cost of 
work completed with budgeted costs for scheduled work for a 12-month 
period to show trends in cost and schedule performances. We also used 
data from these reports to estimate the likely costs at completion 
through established earned value formulas. This resulted in three 
different values, with the middle value being the most likely. To 
assess the reliability of the cost data, we compared them with other 
available supporting documents (including OMB and agency financial 
reports); electronically tested the data to identify obvious problems 
with completeness or accuracy; and interviewed agency and program 
officials about the data. For the purposes of this report, we 
determined that the cost data were sufficiently reliable. We did not 
test the adequacy of the agency or contractor cost-accounting systems. 
Our evaluation of these cost data was based on what we were told by the 
agencies and the information they could provide. 
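
As a rough illustration of how a range of estimates at completion can be produced from cumulative earned value data, the following sketch (in Python, with invented inputs) applies three commonly used index-based formulas and treats the middle value as the most likely; the specific formulas in the cost guide are not reproduced here. 

    # Illustrative only; the formulas shown are common EVM variants, and the
    # inputs (dollars in millions) are invented.
    def most_likely_eac(bcwp, bcws, acwp, bac):
        cpi = bcwp / acwp                         # cost performance index
        spi = bcwp / bcws                         # schedule performance index
        remaining_work = bac - bcwp
        estimates = sorted([
            acwp + remaining_work,                # remaining work at budgeted rates
            acwp + remaining_work / cpi,          # remaining work at cost efficiency to date
            acwp + remaining_work / (cpi * spi),  # cost and schedule pressure combined
        ])
        return estimates[1]                       # middle value treated as most likely

    print(most_likely_eac(bcwp=80.0, bcws=90.0, acwp=100.0, bac=200.0))   # 250.0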

We conducted this performance audit from February to October 2009 at 
the agencies' offices in the Washington, D.C., metropolitan area; Fort 
Monmouth, New Jersey; Jet Propulsion Lab, Pasadena, California; Hanscom 
Air Force Base, Massachusetts; and Naval Base San Diego, California. 
Our work was done in accordance with generally accepted government 
auditing standards. Those standards require that we plan and perform 
the audit to obtain sufficient, appropriate evidence to provide a 
reasonable basis for our findings and conclusions based on our audit 
objectives. We believe that the evidence obtained provides a reasonable 
basis for our findings and conclusions based on our audit objectives. 

[End of section] 

Appendix II: Case Studies of Selected Programs' Implementation of 
Earned Value Management: 

We conducted case studies of 16 major system acquisition programs (see 
table 7). For each of these programs, the remaining sections of this 
appendix provide the following: a brief description of the program, 
including a graphic illustration of the investment's life cycle; an 
assessment of the program's implementation of the 11 key EVM practices; 
and an analysis of the program's recent earned value (EV) data and 
trends. These data and trends are often described in terms of cost and 
schedule variances. Cost variances compare the earned value of the 
completed work with the actual cost of the work performed. Schedule 
variances are also measured in dollars, but they compare the earned 
value of the completed work with the value of the work that was 
expected to be completed. Positive variances are good--they indicate 
that activities are costing less than expected or are completed ahead 
of schedule. Negative variances are bad--they indicate activities are 
costing more than expected or are falling behind schedule. 
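
A minimal sketch of these two variance calculations (in Python), using invented monthly values in millions of dollars, is shown below; the terminology follows standard EVM usage and is not drawn from any of the case study programs. 

    bcws = 50.0   # budgeted cost of work scheduled (value of work planned)
    bcwp = 48.0   # budgeted cost of work performed (earned value of completed work)
    acwp = 51.5   # actual cost of work performed

    cost_variance = bcwp - acwp        # -3.5: completed work cost more than its earned value
    schedule_variance = bcwp - bcws    # -2.0: less work was completed than planned

    # Positive variances indicate under-budget or ahead-of-schedule performance;
    # negative variances indicate cost growth or schedule slippage.
    print(cost_variance, schedule_variance)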

Table 7: Sixteen Case Study Programs: 

Agency: Agriculture; 
Program: Farm Program Modernization. 

Agency: Commerce; 
Program: Decennial Response Integration System. 

Agency: Commerce; 
Program: Field Data Collection Automation. 

Agency: Defense; 
Program: Air and Space Operations Center--Weapon System. 

Agency: Defense; 
Program: Joint Tactical Radio System--Handheld, Manpack, Small Form 
Fit. 

Agency: Defense; 
Program: Warfighter Information Network--Tactical. 

Agency: Homeland Security; 
Program: Automated Commercial Environment. 

Agency: Homeland Security; 
Program: Integrated Deepwater System--Common Operational Picture. 

Agency: Homeland Security; 
Program: Western Hemisphere Travel Initiative. 

Agency: Justice; 
Program: Next Generation Identification. 

Agency: National Aeronautics and Space Administration; 
Program: James Webb Space Telescope. 

Agency: National Aeronautics and Space Administration;
Program: Juno. 

Agency: National Aeronautics and Space Administration;
Program: Mars Science Laboratory. 

Agency: Transportation; 
Program: En Route Automation Modernization. 

Agency: Transportation; 
Program: Surveillance and Broadcast System. 

Agency: Veterans Affairs; 
Program: Veterans Health Information Systems and Technology 
Architecture--Foundations Modernization. 

Source: GAO analysis of program data. 

[End of table] 

The following information describes the key that we used in tables 8 
through 23 to convey the results of our assessment of the 16 case study 
programs' implementation of the 11 EVM practices. 

Key description: 

The program fully implemented all EVM practices in this program 
management area; 

The program partially implemented the EVM practices in this program 
management area; 

The program did not implement the EVM practices in this program 
management area. 

Farm Program Modernization: 

[Sidebar: 
Investment Details: 
Department of Agriculture (Farm Service Agency); 
Program start date: 2004; 
Total life-cycle cost: 
* Current: $451 million; 
* Original: $451 million; 
Program end date: 
* Current: 2018; 
* Original: 2017; 
Rebaselines: 1 (September 2008); 
Major contractor: Prime contract to be awarded in the first quarter of 
FY 2010. End of sidebar] 

The Farm Program Modernization (MIDAS) program is intended to address 
the long-term needs in delivering farm benefit programs via business 
process reengineering and implementation of a commercial off-the-shelf 
enterprise resource planning solution. MIDAS is an initiative of the 
Farm Service Agency, which is responsible for administering 35 farm 
benefit programs. To support these programs, the agency uses two 
primary systems--a distributed network of legacy computers and a 
centralized Web farm (to store customer data and host Web-based 
applications)--both of which have shortcomings. While MIDAS is to 
replace these computers, it is also intended to provide new 
applications and redesigned business processes. The Web farm is 
expected to remain in operation in a supporting role for the program. 
Currently, MIDAS is in the initiation phase of its life cycle and plans 
to award the system integration contract in the first quarter of fiscal 
year 2010. 

[Refer to PDF for image: illustration of the investment's life-cycle phases: initiation; development; operations and maintenance] 

Source: GAO analysis of U.S. Department of Agriculture (Farm Service 
Agency) data. 

Table 8: GAO EVM Practice Assessment of Agriculture's MIDAS Program: 

Program management area of responsibility: Establish a comprehensive 
EVM system; 
Key practices: 
* Define the scope of effort using a work breakdown structure; 
GAO assessment: The program partially implemented the EVM practices in 
this program management area; 
* Identify who in the organization will perform the work; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Schedule the work; 
GAO assessment: The program partially implemented the EVM practices in 
this program management area; 
* Estimate the labor and material required to perform the work and 
authorize the budgets, including management reserve; 
GAO assessment: The program partially implemented the EVM practices in 
this program management area; 
* Determine objective measure of earned value; 
GAO assessment: The program partially implemented the EVM practices in 
this program management area; 
* Develop the performance measurement baseline; 
GAO assessment: The program partially implemented the EVM practices in 
this program management area. 

Program management area of responsibility: Ensure that the data 
resulting from the EVM system are reliable; 
Key practices: 
* Execute the work plan and record all costs; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Analyze EVM performance data and record variances from the 
performance measurement baseline plan; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Forecast estimates at completion; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area. 

Program management area of responsibility: Ensure that the program 
management team is using earned value data for decision-making 
purposes; 
Key practices: 
* Take management action to mitigate risks; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Update the performance measurement baseline as changes occur; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area. 

Source: GAO analysis of U.S. Department of Agriculture (Farm Service 
Agency) data. 

[End of table] 

MIDAS fully met 6 of the 11 key practices for implementing EVM and 
partially met 5 practices. Specifically, a key weakness in the EVM 
system is the lack of a comprehensive integrated baseline review. 
Instead, MIDAS focused solely on evaluating the program's compliance 
with industry standards and chose not to validate the quality of the 
baseline. Program officials stated that they plan to conduct a full 
review to address the risks and realism of the baseline after the prime 
contract has been awarded. Furthermore, while the MIDAS schedule is 
generally sound, resources were not assigned to all activities, and the 
critical path (the longest duration path through the sequenced list of 
key activities) could not be identified because the current schedule 
ends in September 2009. Finally, MIDAS met all key practices associated 
with data reliability, such as executing the work plan and recording 
costs, as well as all key practices for decision making. 
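
For readers unfamiliar with the critical-path concept referenced above, the following minimal sketch (in Python) identifies the longest-duration path through a small, hypothetical network of sequenced activities; the activity names and durations are invented for illustration. 

    # Hypothetical activity network; durations are in days.
    durations = {"A": 3, "B": 5, "C": 2, "D": 4}
    successors = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}

    def longest_path(activity):
        # Returns (total duration, path) of the longest path starting at activity.
        best = (durations[activity], [activity])
        for nxt in successors[activity]:
            dur, path = longest_path(nxt)
            if durations[activity] + dur > best[0]:
                best = (durations[activity] + dur, [activity] + path)
        return best

    print(longest_path("A"))   # (12, ['A', 'B', 'D']): the critical path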

[Sidebar: 
EV Performance Details: 
Based on performance data from June 2008 to May 2009, MIDAS generally 
met its planned cost targets. However, over the same period the program 
has consistently had negative schedule variances, indicating that work 
is slightly behind schedule. Reasons for this slippage include work 
being accomplished less efficiently than planned, with some activities, 
such as the acquisition of a project management information system, 
being delayed. We concur with the program’s estimate that it will meet 
its current budget at completion—worth approximately $7.0 million—for 
program initiation activities. 
Program percent complete: 94%; 
Estimates at completion: 
* Program: $6.9 million; 
* GAO: $6.9 million. 
End of sidebar] 

Figure 1: GAO EV Data Analysis of Agriculture's MIDAS Program (dollars 
in millions): 

[Refer to PDF for image: line graph] 

Date: June, 2008; 	
Cumulative cost variance: $0.2; 
Cumulative schedule variance: $0. 

Date: July, 2008; 	
Cumulative cost variance: $0.2; 
Cumulative schedule variance:$0. 

Date: August, 2008; 	
Cumulative cost variance: $0.2; 
Cumulative schedule variance: -$0.1. 

Date: September, 2008; 
Cumulative cost variance: $0.1; 
Cumulative schedule variance: -$0.2. 

Date: October, 2008; 	
Cumulative cost variance: -$0.2; 
Cumulative schedule variance: -$0.3. 

Date: November, 2008; 	
Cumulative cost variance: -$0.2; 
Cumulative schedule variance: -$0.4. 

Date: December, 2008; 	
Cumulative cost variance: $0.1; 
Cumulative schedule variance: -$0.3. 

Date: January, 2009; 	
Cumulative cost variance: $0; 
Cumulative schedule variance: -$0.1. 

Date: February, 2009; 	
Cumulative cost variance: $0; 
Cumulative schedule variance: -$0.1. 

Date: March, 2009; 	
Cumulative cost variance: $0; 
Cumulative schedule variance: -$0.1. 

Date: April, 2009; 	
Cumulative cost variance: $0; 
Cumulative schedule variance: -$0.2. 

Date: May, 2009; 	
Cumulative cost variance: $0; 
Cumulative schedule variance: -$0.2. 

Source: GAO analysis of U.S. Department of Agriculture (Farm Service 
Agency) data. 

[End of figure] 

Decennial Response Integration System: 

[Sidebar: 
Investment Details: 
Department of Commerce (Census Bureau): 
Program start date: March 2006; 
Total life-cycle cost:
* Current: $946 million; 
* Original: $574 million; 
Program end date: 
* Current: September 2013; 
* Original: September 2013; 
Rebaselines: 0; 
Major contractor: Lockheed Martin. 
End of sidebar] 

The Decennial Response Integration System (DRIS) is to be used during 
the 2010 Census for collecting and integrating census responses from 
all sources, including forms and telephone interviews. The system is to 
improve accuracy and timeliness by standardizing the response data and 
providing the data to other Census Bureau systems for analysis and 
processing. Among other things, DRIS is expected to process census data 
provided by respondents via census forms, telephone agents, and 
enumerators; assist the public via telephone; and monitor the quality 
and status of data capture operations. The DRIS program's estimated 
life-cycle costs have increased by $372 million, which is mostly due to 
increases in both paper and telephone workloads. For example, the paper 
workload increased due to an April 2008 redesign of the 2010 Census 
that reverted planned automated operations to paper-based processes and 
requires DRIS to process an additional estimated 40 million paper 
forms. 

[Refer to PDF for image: illustration of the investment's life-cycle phases: initiation; development; operations and maintenance] 

Source: GAO analysis of Department of Commerce (Census Bureau) data. 

Table 9: GAO EVM Practice Assessment of Commerce's DRIS Program: 

Program management area of responsibility: Establish a comprehensive 
EVM system; 
Key practices: 
* Define the scope of effort using a work breakdown structure; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Identify who in the organization will perform the work; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Schedule the work; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Estimate the labor and material required to perform the work and 
authorize the budgets, including management reserve; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Determine objective measure of earned value; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Develop the performance measurement baseline; 
GAO assessment: The program partially implemented the EVM practices in 
this program management area. 

Program management area of responsibility: Ensure that the data 
resulting from the EVM system are reliable; 
Key practices: 
* Execute the work plan and record all costs; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Analyze EVM performance data and record variances from the 
performance measurement baseline plan; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Forecast estimates at completion; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area. 

Program management area of responsibility: Ensure that the program 
management team is using earned value data for decision-making 
purposes; 
Key practices: 
* Take management action to mitigate risks; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Update the performance measurement baseline as changes occur; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area. 

Source: GAO analysis of Department of Commerce (Census Bureau) data. 

[End of table] 

DRIS fully implemented all 11 of the key EVM practices necessary to 
manage its system acquisition program. Specifically, the program 
implemented all practices for establishing a comprehensive EVM system, 
such as defining the scope of work and scheduling the work. The 
program's schedule appropriately captured and sequenced key activities 
and assigned realistic resources to all key activities. Furthermore, 
the DRIS team ensured that the resulting EVM data were appropriately 
verified and validated for reliability by analyzing performance data to 
identify the magnitude and effect of problems causing key variances, 
tracking related risks in the program's risk register, and performing 
quality checks of the schedule and critical path. Lastly, the DRIS 
program management team conducted rigorous reviews of EV performance on 
a monthly basis and took the appropriate management actions to mitigate 
risks. 

[Sidebar: 
EV Performance Details: 
Based on performance data from June 2008 to May 2009, the DRIS 
contractor has outperformed its planned cost targets by $13.6 million. 
For this same period, it has also outperformed its schedule targets by 
completing $2.3 million worth of work ahead of schedule. We concur with 
the contractor’s estimate that it will underrun its current budget—
worth approximately $468.6 million—by $7.0 million. 
Contract percent complete: 50%; 
Estimates at completion: 
* Contractor: $461.7 million; 
* GAO: $461.7 million. 
Note: The DRIS contractor did not report EV data in November 2008. 
End of sidebar] 

Figure 2: GAO EV Data Analysis of Commerce's DRIS Program (dollars in 
millions): 

[Refer to PDF for image: line graph] 

Date: June, 2008; 	
Cumulative cost variance: $4.7; 
Cumulative schedule variance: -$0.1. 

Date: July, 2008; 	
Cumulative cost variance: $5.1; 
Cumulative schedule variance: $0. 

Date: August, 2008; 	
Cumulative cost variance: $4.8; 
Cumulative schedule variance: -$0.2. 

Date: September, 2008; 	
Cumulative cost variance: $4.5; 
Cumulative schedule variance: -$0.6. 

Date: October, 2008; 	
Cumulative cost variance: $4.6; 
Cumulative schedule variance: -$0.4. 

Date: November, 2008; 
No data. 		 

Date: December, 2008; 	
Cumulative cost variance: $8.6; 
Cumulative schedule variance: $3.5. 

Date: January, 2009; 	
Cumulative cost variance: $9; 
Cumulative schedule variance: $5.9. 

Date: February, 2009; 	
Cumulative cost variance: $11.8; 
Cumulative schedule variance: $8.4. 

Date: March, 2009; 	
Cumulative cost variance: $12.3; 
Cumulative schedule variance: $0.2. 

Date: April, 2009; 	
Cumulative cost variance: $15; 
Cumulative schedule variance: $0.5. 

Date: May, 2009; 	
Cumulative cost variance: $13.6; 
Cumulative schedule variance: $2.3. 

Source: GAO analysis of Department of Commerce (Census Bureau) data. 

[End of figure] 

Field Data Collection Automation: 

[Sidebar: 
Investment Details: 
Department of Commerce (Census Bureau); 
Program start date: March 2006; 
Total life-cycle cost: 
* Current: $801.1 million; 
* Original: $595.7 million; 
Program end date: 
* Current: December 2011; 
* Original: December 2011; 
Rebaselines: 1 (October 2008); 
Major contractor: Harris Corporation. 
End of sidebar] 

The Field Data Collection Automation (FDCA) program is intended to 
provide automation support for the 2010 Census field data collection 
operations. The program includes the development of handheld computers 
for identifying and correcting addresses for all known living quarters 
in the United States (known as address canvassing) and the systems, 
equipment, and infrastructure that field staff will use to collect 
data. FDCA handheld computers were originally to be used for other 
census field operations, such as following up with nonrespondents 
through personal interviews. However, in April 2008, due to problems 
identified during testing and cost overruns and schedule slippages in 
the FDCA program, the Secretary of Commerce announced a redesign of the 
2010 Census, and rebaselined FDCA in October 2008. As a result, FDCA's 
life-cycle costs have increased from an estimated $596 million to $801 
million, a $205 million increase. Furthermore, the responsibility for 
the design, development, and testing of IT systems for other key field 
operations was moved from the FDCA contractor to the Census Bureau. 

[Refer to PDF for image: illustration of the investment's life-cycle phases: initiation; development; operations and maintenance] 

Source: GAO analysis of Department of Commerce (Census Bureau) data. 

Table 10: GAO EVM Practice Assessment of Commerce's FDCA Program: 

Program management area of responsibility: Establish a comprehensive 
EVM system; 
Key practices: 
* Define the scope of effort using a work breakdown structure; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Identify who in the organization will perform the work; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Schedule the work; 
GAO assessment: The program partially implemented all EVM practices in 
this program management area; 
* Estimate the labor and material required to perform the work and 
authorize the budgets, including management reserve; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Determine objective measure of earned value; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Develop the performance measurement baseline; 
GAO assessment: The program partially implemented the EVM practices in 
this program management area. 

Program management area of responsibility: Ensure that the data 
resulting from the EVM system are reliable; 
Key practices: 
* Execute the work plan and record all costs; 
GAO assessment: The program partially implemented all EVM practices in 
this program management area; 
* Analyze EVM performance data and record variances from the 
performance measurement baseline plan; 
GAO assessment: The program partially implemented all EVM practices in 
this program management area; 
* Forecast estimates at completion; 
GAO assessment: The program partially implemented all EVM practices in 
this program management area. 

Program management area of responsibility: Ensure that the program 
management team is using earned value data for decision-making 
purposes; 
Key practices: 
* Take management action to mitigate risks; 
GAO assessment: The program partially implemented all EVM practices in 
this program management area; 
* Update the performance measurement baseline as changes occur; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area. 

Source: GAO analysis of Department of Commerce (Census Bureau) data. 

[End of table] 

FDCA fully met 6 of the 11 key practices for implementing EVM and 
partially met 5 others. Specifically, the program fully met most 
practices for establishing a comprehensive EVM system, such as defining 
the scope of the work effort; however, it only partially met the 
practice for scheduling the work. In particular, the program schedule 
contained weaknesses, including key milestones with fixed completion 
dates--which hamper the program's ability to see the impact that delays 
on open tasks have on successor tasks. As such, the FDCA program 
cannot use the schedule as an active management tool. Furthermore, 
anomalies in the prime contractor's EVM reports, combined with 
weaknesses in the master schedule, affect FDCA's ability to execute the 
work plan, analyze variances, and make reliable estimates of cost at 
completion. Lastly, cost and schedule drivers identified in EVM reports 
were not fully consistent with the program's risk register, which 
prevents the program from taking the appropriate management action to 
mitigate risks and effectively using EV data for decisions. 
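
Fixed completion dates and missing schedule logic are recurring 
weaknesses in these case studies. For illustration, the brief Python 
sketch below flags these two weaknesses in a set of hypothetical task 
records; it is not the FDCA master schedule and not the assessment 
method used for this report. 

    # Illustrative sketch only; the task records are hypothetical. Two
    # weaknesses are flagged: activities constrained to a fixed finish date
    # (predecessor slips will not move the date out) and activities with no
    # successor links (the downstream impact of a delay cannot be seen).
    tasks = [
        # (task name, has fixed-finish constraint, successor tasks)
        ("Develop handheld software build", False, ["Integration test"]),
        ("Integration test", False, []),
        ("Address canvassing operation", True, ["Deliver final report"]),
        ("Deliver final report", False, []),
    ]
    final_task = tasks[-1][0]
    for name, fixed_finish, successors in tasks:
        if fixed_finish:
            print(f"'{name}': fixed completion date; predecessor slips will not move it")
        if not successors and name != final_task:
            print(f"'{name}': no successor links; delay impacts cannot propagate")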

[Sidebar: 
EV Performance Details: 
Due to contractor performance issues, the FDCA program established a 
new program baseline in October 2008. Based on performance data from 
October 2008 to May 2009, the contractor has currently exceeded its 
revised cost target by $3.5 million. We estimate that the FDCA contract 
will overrun its current budget—worth approximately $555.6 million—by 
$4.6 million. Our analysis indicates that the rebaselined contract is 
currently on schedule. 
Contract percent complete: 75%. 
Estimates at completion: 
* Contractor: $558.5 million; 
* GAO: $560.2 million. 
Note: EV data between June 2008 and September 2008 did not reflect 
actual program performance because the program was rebaselining; 
therefore, these data have been omitted. 
End of sidebar] 

Figure 3: GAO EV Data Analysis of Commerce's FDCA Program (dollars in 
millions): 

[Refer to PDF for image: line graph] 

Date: October, 2008; 	
Cumulative cost variance: $2.4; 
Cumulative schedule variance: -$1.5. 

Date: November, 2008; 
Cumulative cost variance: $4; 
Cumulative schedule variance: -$1. 		 

Date: December, 2008; 	
Cumulative cost variance: $6.5; 
Cumulative schedule variance: -$4.7. 

Date: January, 2009; 	
Cumulative cost variance: $4.4; 
Cumulative schedule variance: $1.3. 

Date: February, 2009; 	
Cumulative cost variance: -$0.9; 
Cumulative schedule variance: -$2.9. 

Date: March, 2009; 	
Cumulative cost variance: $0; 
Cumulative schedule variance: -$0.1. 

Date: April, 2009; 	
Cumulative cost variance: -$2.2; 
Cumulative schedule variance: $0.1. 

Date: May, 2009; 	
Cumulative cost variance: -$3.5; 
Cumulative schedule variance: $0.4. 

Source: GAO analysis of Department of Commerce (Census Bureau) data. 

[End of figure] 

Air and Space Operations Center--Weapon System: 

[Sidebar: 
Investment Details: 
Department of Defense (Department of the Air Force); 
Program start date: September 2000; 
Total life-cycle cost: 
* Current: $4.425 billion; 
* Original: $4.425 billion; 
Program end date: 
* Current: September 2023; 
* Original: September 2023; 
Rebaselines: 0; 
Major contractor: Lockheed Martin. 
End of sidebar] 

The Air and Space Operations Center--Weapon System (AOC) is the air and 
space operations planning, execution, and assessment system for the 
Joint Force Air Component Commander. According to the agency, there are 
currently 11 AOCs located around the world, each aligned to the 
Combatant Commands of the Unified Command Plan, with additional support 
units for training, help desk, testing, and contingency manpower 
augmentation. Each AOC is designed to enable commanders to exercise 
command and control of air, space, information operations, and combat 
support forces to achieve the objectives of the joint force commander 
and combatant commander in joint and coalition military operations. As 
such, the AOC system is intended as the planning and execution engine 
of any air campaign. 

[Refer to PDF for image: investment life-cycle phase timeline 
(Initiation; Development; Operations and maintenance). Source: GAO 
analysis of Department of Defense (Department of the Air Force) data.] 

Table 11: GAO EVM Practice Assessment of Defense's AOC Program: 

Program management area of responsibility: Establish a comprehensive 
EVM system; 
Key practices: 
* Define the scope of effort using a work breakdown structure; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Identify who in the organization will perform the work; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Schedule the work; 
GAO assessment: The program partially implemented all EVM practices in 
this program management area; 
* Estimate the labor and material required to perform the work and 
authorize the budgets, including management reserve; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Determine objective measure of earned value; 
GAO assessment: The program partially implemented all EVM practices in 
this program management area; 
* Develop the performance measurement baseline; 
GAO assessment: The program partially implemented the EVM practices in 
this program management area. 

Program management area of responsibility: Ensure that the data 
resulting from the EVM system are reliable; 
Key practices: 
* Execute the work plan and record all costs; 
GAO assessment: The program partially implemented all EVM practices in 
this program management area; 
* Analyze EVM performance data and record variances from the 
performance measurement baseline plan; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Forecast estimates at completion; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area. 

Program management area of responsibility: Ensure that the program 
management team is using earned value data for decision-making 
purposes; 
Key practices: 
* Take management action to mitigate risks; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Update the performance measurement baseline as changes occur; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area. 

Source: GAO analysis of Department of Defense (Department of the Air 
Force) data. 

[End of table] 

AOC fully met 7 of the 11 key practices and partially met 4 others. AOC 
applied EVM at the contract level and has a capable government team 
that has made it an integral part of project management. AOC performs 
detailed analyses of the EV data and reviews the data with engineering 
staff to ensure that the appropriate metrics have been applied for 
accurate reporting. AOC has also integrated EVM with its risk 
management processes to ensure that resources are applied to watch or 
mitigate risks associated with the cost and schedule drivers reported 
in the EVM reports. Weaknesses found in AOC's EVM processes relate to 
the development and validation of the contractor baseline. In 
particular, AOC has not performed an integrated baseline review for all 
work that is currently on contract. The master schedule also contained 
issues, such as a high number of converging tasks and out-of-sequence 
tasks, that hamper AOC's ability to determine the start dates of future 
tasks. Taken together, these issues undermine the reliability of the 
schedule as a baseline to measure EV performance. 

[Sidebar: 
EV Performance Details: 
As of April 2009, the AOC contractor has overrun its planned cost 
targets by $58,000. However, for this same period, it has completed 
$422,000 worth of work ahead of schedule. Based on the performance data 
from May 2008 to April 2009, we concur with the contractor’s estimate 
that it will overrun its current budget—worth approximately $171.3 
million—by $793,000. 
Contract percent complete: 86%; 
Estimates at completion: 
* Contractor: $172.1 million; 
* GAO: $172.1 million. 
End of sidebar] 

Figure 4: GAO EV Data Analysis of Defense's AOC Program (dollars in 
millions): 

[Refer to PDF for image: line graph] 

Date: May, 2008; 	
Cumulative cost variance: -$0.2; 
Cumulative schedule variance: -$1. 

Date: June, 2008; 	
Cumulative cost variance: $0; 
Cumulative schedule variance: -$0.6. 

Date: July, 2008; 	
Cumulative cost variance: -$0.8; 
Cumulative schedule variance: -$0.5. 

Date: August, 2008; 	
Cumulative cost variance: $5.1; 
Cumulative schedule variance: -$0.8. 

Date: September, 2008; 	
Cumulative cost variance: -$2.2; 
Cumulative schedule variance: -$1.4. 

Date: October, 2008; 	
Cumulative cost variance: -$2.4; 
Cumulative schedule variance: -$1.1. 

Date: November, 2008; 
Cumulative cost variance: -$1.7; 
Cumulative schedule variance: -$0.6. 		 

Date: December, 2008; 	
Cumulative cost variance: -$0.8; 
Cumulative schedule variance: -$0.5. 

Date: January, 2009; 	
Cumulative cost variance: -$0.5; 
Cumulative schedule variance: -$0.4. 

Date: February, 2009; 	
Cumulative cost variance:-$0.2; 
Cumulative schedule variance: -$1.3. 

Date: March, 2009; 	
Cumulative cost variance: -$0.2; 
Cumulative schedule variance: -$0.1. 

Date: April, 2009; 	
Cumulative cost variance: -$0.1; 
Cumulative schedule variance: $0.4. 

Source: GAO analysis of Department of Defense (Department of the Air 
Force) data. 

[End of figure] 

Joint Tactical Radio System--Handheld, Manpack, Small Form Fit: 

[Sidebar: 
Investment Details: 
Department of Defense (Joint—Department of the Navy Lead); 
Program start date: April 2004; 
Total life-cycle cost: 
* Current: $11.559 billion; 
* Original: $19.214 billion; 
Program end date: 
* Current: 2048; 
* Original: 2045; 
Rebaselines: 1 (June 2006); 
Major contractor: General Dynamics C4 Systems. 
End of sidebar] 

The Joint Tactical Radio System (JTRS) program is developing software- 
defined radios that are expected to interoperate with existing radios 
and increase communications and networking capabilities. The JTRS- 
Handheld, Manpack, Small Form Fit (HMS) product office, within the JTRS 
Ground Domain program office, is developing handheld, manpack, and 
small form fit radios. In 2006, the program was restructured to include 
two concurrent phases of development. Phase I includes select small 
form fit radios, while Phase II includes small form fit radios with 
enhanced security as well as handheld and manpack variants. Subsequent 
to the program's restructure, the department updated its migration 
strategy for replacing legacy radios with new tactical radios. As such, 
the total planned quantity of JTRS-HMS radios was reduced from an 
original baseline of 328,514--established in May 2004--to 95,551. As a 
result, the total life-cycle cost of the JTRS-HMS program was reduced 
from an estimated $19.2 billion to $11.6 billion, a $7.6 billion 
decrease. 

[Refer to PDF for image: investment life-cycle phase timeline 
(Initiation; Development; Operations and maintenance). Source: GAO 
analysis of Department of Defense (Joint--Department of the Navy Lead) 
data.] 

Table 12: GAO EVM Practice Assessment of Defense's JTRS-HMS Program: 

Program management area of responsibility: Establish a comprehensive 
EVM system; 
Key practices: 
* Define the scope of effort using a work breakdown structure; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Identify who in the organization will perform the work; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Schedule the work; 
GAO assessment: The program partially implemented all EVM practices in 
this program management area; 
* Estimate the labor and material required to perform the work and 
authorize the budgets, including management reserve; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Determine objective measure of earned value; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Develop the performance measurement baseline; 
GAO assessment: The program fully implemented the EVM practices in this 
program management area. 

Program management area of responsibility: Ensure that the data 
resulting from the EVM system are reliable; 
Key practices: 
* Execute the work plan and record all costs; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Analyze EVM performance data and record variances from the 
performance measurement baseline plan; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Forecast estimates at completion; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area. 

Program management area of responsibility: Ensure that the program 
management team is using earned value data for decision-making 
purposes; 
Key practices: 
* Take management action to mitigate risks; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Update the performance measurement baseline as changes occur; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area. 

Source: GAO analysis of Department of Defense (Joint--Department of the 
Navy Lead) data. 

[End of table] 

JTRS-HMS fully met 10 of the 11 key practices and partially met 1 
practice. Specifically, JTRS-HMS implemented most practices for 
establishing a comprehensive EVM system, such as performing rigorous 
reviews to validate the baseline; however, the current schedule 
contained some weaknesses, such as out-of-sequence logic and activities 
without resources assigned. Program officials were aware of these 
issues and attributed them to weaknesses in subcontractor schedules 
that are integrated on a monthly basis. The JTRS-HMS program fully met 
practices for ensuring that the resulting EV data were appropriately 
verified and validated for reliability and demonstrated that the 
program management team was using these data for decision-making 
purposes. 

[Sidebar: 
EV Performance Details: 
Based on performance data from June 2008 to May 2009, the JTRS-HMS 
contractor has experienced negative cost and schedule variances. 
Specifically, as of May 2009, the contractor has exceeded its planned 
cost target by $62.4 million. We estimate that the JTRS-HMS contract 
will overrun its current budget—worth approximately $530.8 million—by 
$89.1 million. Furthermore, as of May 2009, JTRS-HMS has not completed 
$8.8 million in planned work. Both cost and schedule variances are 
primarily due to radio hardware development, including design issues 
related to hardware miniaturization. 
Contract percent complete: 74%; 
Estimates at completion: 
* Contractor: $600.9 million; 
* GAO: $619.9 million. 
End of sidebar] 
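
GAO's independent estimates at completion in these case studies can 
differ from the contractors' own estimates. One common way to derive an 
independent estimate is to assume that the remaining work will be 
performed at the cost efficiency achieved to date (the cumulative cost 
performance index, or CPI). The Python sketch below applies that 
assumption to rounded values from the sidebar above; because the inputs 
are rounded and this report does not state the exact method used, the 
result only approximates the published estimates. 

    # Illustrative sketch only; inputs are rounded values from the JTRS-HMS
    # sidebar (dollars in millions), and the CPI-based formula shown here is
    # a common convention, not necessarily the method used for this report.
    bac = 530.8            # budget at completion (current contract budget)
    bcwp = 0.74 * bac      # earned value, using the reported 74% complete
    acwp = bcwp + 62.4     # actual cost, reflecting the -$62.4 million cost variance
    cpi = bcwp / acwp      # cumulative cost performance index
    eac = acwp + (bac - bcwp) / cpi  # CPI-based estimate at completion
    print(f"CPI = {cpi:.2f}; independent EAC = about ${eac:.0f} million")
    # Prints roughly $615 million, in the same range as (but not identical
    # to) the $600.9 million (contractor) and $619.9 million (GAO) estimates.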

Figure 5: GAO EV Data Analysis of Defense's JTRS-HMS Program (dollars 
in millions): 

[Refer to PDF for image: line graph] 

Date: June, 2008; 	
Cumulative cost variance: -$40.9; 
Cumulative schedule variance: -$7.9. 

Date: July, 2008; 	
Cumulative cost variance: -$42; 
Cumulative schedule variance: -$8.8. 

Date: August, 2008; 	
Cumulative cost variance: -$44.9; 
Cumulative schedule variance: -$10.4. 

Date: September, 2008; 	
Cumulative cost variance: -$48.3; 
Cumulative schedule variance: -$8.6. 

Date: October, 2008; 	
Cumulative cost variance: -$50.5; 
Cumulative schedule variance: -$6.4. 

Date: November, 2008; 
Cumulative cost variance: -$52.9; 
Cumulative schedule variance: -$6.8.		 

Date: December, 2008; 	
Cumulative cost variance: -$54.7; 
Cumulative schedule variance: -$5.7. 

Date: January, 2009; 	
Cumulative cost variance: -$56.8; 
Cumulative schedule variance: -$6.5. 

Date: February, 2009; 	
Cumulative cost variance: -$58.1; 
Cumulative schedule variance: -$8.3. 

Date: March, 2009; 	
Cumulative cost variance: -$59.4; 
Cumulative schedule variance: -$8.4. 

Date: April, 2009; 	
Cumulative cost variance: -$60.2; 
Cumulative schedule variance: -$8.8. 

Date: May, 2009; 	
Cumulative cost variance: -$62.4; 
Cumulative schedule variance: -$8.8. 

Source: GAO analysis of Department of Defense (Joint—Department of the 
Navy Lead) data. 

[End of figure] 

Warfighter Information Network--Tactical: 

[Sidebar: 
Investment Details: 
Department of Defense (Department of the Army); 
Program start date: July 2003; 
Total life-cycle cost: 
* Current: $38.157 billion; 
* Original: $38.157 billion; 
Program end date: 
* Current: 2025; 
* Original: 2025; 
Rebaselines: 1 (June 2007); 
Major contractor: General Dynamics C4 Systems. 
End of sidebar] 

The Warfighter Information Network--Tactical (WIN-T) program is 
designed to be the Army's high-speed and high-capacity backbone 
communications network. The program connects Department of the Army 
units with higher levels of command and provides the Army's tactical 
portion of the Global Information Grid--a Department of Defense 
initiative aimed at building a secure network and set of information 
capabilities modeled after the Internet. WIN-T was restructured in June 
2007 following a unit cost increase above the critical cost growth 
threshold (known as a Nunn-McCurdy breach). As a result of the 
restructuring, it was determined that WIN-T would be fielded in four 
increments. The third increment is expected to provide the Army with a 
full networking on-the-move capability and fully support the Army's 
Future Combat Systems. In May 2009, the Increment 3 program baseline 
was approved, and the life-cycle cost for the program was estimated at 
$38.2 billion. Our assessment of EVM practices and EV data was 
performed on WIN-T Increment 3. 

[Refer to PDF for image: investment life-cycle phase timeline 
(Initiation; Development; Operations and maintenance). Source: GAO 
analysis of Department of Defense (Department of the Army) data.] 

Table 13: GAO EVM Practice Assessment of Defense's WIN-T Program: 

Program management area of responsibility: Establish a comprehensive 
EVM system; 
Key practices: 
* Define the scope of effort using a work breakdown structure; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Identify who in the organization will perform the work; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Schedule the work; 
GAO assessment: The program partially implemented all EVM practices in 
this program management area; 
* Estimate the labor and material required to perform the work and 
authorize the budgets, including management reserve; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Determine objective measure of earned value; 
GAO assessment: The program did not implement all EVM practices in this 
program management area; 
* Develop the performance measurement baseline; 
GAO assessment: The program did not implement the EVM practices in this 
program management area. 

Program management area of responsibility: Ensure that the data 
resulting from the EVM system are reliable; 
Key practices: 
* Execute the work plan and record all costs; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Analyze EVM performance data and record variances from the 
performance measurement baseline plan; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Forecast estimates at completion; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area. 

Program management area of responsibility: Ensure that the program 
management team is using earned value data for decision-making 
purposes; 
Key practices: 
* Take management action to mitigate risks; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Update the performance measurement baseline as changes occur; 
GAO assessment: The program did not implement all EVM practices in this 
program management area. 

Source: GAO analysis of Department of Defense (Department of the Army) 
data. 

[End of table] 

WIN-T fully met 7 of the 11 key practices for implementing EVM, 
partially met 1 practice, and did not meet 3 practices. Specifically, 
WIN-T only partially met the practices for establishing a comprehensive 
EVM system. The schedule contained weaknesses, including fixed 
completion dates--which prevented the schedule from showing the impact 
of delays on open or successor tasks or on the expected completion 
dates of key activities. Furthermore, WIN-T has not 
conducted an integrated baseline review on the current scope of work 
since rebaselining the prime contract in December 2007. According to 
program officials, this review has not been conducted because they have 
not yet finalized the contract. However, as of August 2009, it has been 
20 months since work began, which increases the risk that the program 
has not been measuring progress against a reasonable baseline. Without 
conducting this review to validate the performance baseline, the 
baseline cannot be adequately updated as changes occur, and EV data 
cannot be used effectively for decision-making purposes. 

[Sidebar: 
EV Performance Details: 
Based on contractor performance data from June 2008 to May 2009, the 
WIN-T contract has outperformed its planned cost targets by $880,000. 
However, for the same period, it has not completed $12.0 million in 
planned work. These schedule variances are due, in part, to issues 
found during initial testing that were addressed in subsequent software 
releases, resulting in planned software development work being delayed 
to future releases. Based on these data, we estimate that the WIN-T 
contract will overrun its current budget—worth approximately $747.0 
million—by $15.1 million. 
Contract percent complete: 34%; 
Estimates at completion: 
* Contractor: $743.3 million; 
* GAO: $762.1 million. 
End of sidebar] 
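
A contract that is currently under its cost target can still be 
projected to overrun if it is significantly behind schedule, because 
the cost of completing the late work must still be incurred. One common 
way to reflect this is a schedule-adjusted estimate at completion that 
discounts the remaining work by both the cost performance index (CPI) 
and the schedule performance index (SPI). The Python sketch below 
applies that approach to rounded values from the sidebar above; it 
illustrates the general technique, not necessarily the method used for 
this report. 

    # Illustrative sketch only; inputs are rounded values from the WIN-T
    # sidebar (dollars in millions), and the schedule-adjusted formula is a
    # common convention, not necessarily the method used for this report.
    bac = 747.0             # budget at completion (current contract budget)
    bcwp = 0.34 * bac       # earned value, using the reported 34% complete
    acwp = bcwp - 0.88      # actual cost, reflecting the +$0.88 million cost variance
    bcws = bcwp + 12.0      # planned value, reflecting the -$12.0 million schedule variance
    cpi = bcwp / acwp       # cost performance index (slightly above 1.0)
    spi = bcwp / bcws       # schedule performance index (below 1.0)
    eac = acwp + (bac - bcwp) / (cpi * spi)  # schedule-adjusted estimate at completion
    print(f"CPI = {cpi:.3f}, SPI = {spi:.3f}, EAC = about ${eac:.0f} million")
    # Prints roughly $768 million, an overrun despite the favorable cost
    # variance and broadly consistent with the $762.1 million GAO estimate.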

Figure 6: GAO EV Data Analysis of Defense's WIN-T Program (dollars in 
millions): 

[Refer to PDF for image: line graph] 

Date: June, 2008; 	
Cumulative cost variance: $0.5; 
Cumulative schedule variance: $0. 

Date: July, 2008; 	
Cumulative cost variance: $0.5; 
Cumulative schedule variance: -$4.8. 

Date: August, 2008; 	
Cumulative cost variance: $2.1; 
Cumulative schedule variance: -$6.2. 

Date: September, 2008; 	
Cumulative cost variance: $2.7; 
Cumulative schedule variance: -$5.4. 

Date: October, 2008; 	
Cumulative cost variance: $2.2; 
Cumulative schedule variance: -$2. 

Date: November, 2008; 
Cumulative cost variance: $1.4; 
Cumulative schedule variance: -$7.4.		 

Date: December, 2008; 	
Cumulative cost variance: $2.3; 
Cumulative schedule variance: -$7.4. 

Date: January, 2009; 	
Cumulative cost variance: $2.4; 
Cumulative schedule variance: -$9.5. 

Date: February, 2009; 	
Cumulative cost variance: $0.9; 
Cumulative schedule variance: -$12.2. 

Date: March, 2009; 	
Cumulative cost variance: $2.1; 
Cumulative schedule variance: -$14.5. 

Date: April, 2009; 	
Cumulative cost variance: $2.1; 
Cumulative schedule variance: -$8.2. 

Date: May, 2009; 	
Cumulative cost variance: $0.8; 
Cumulative schedule variance: -$12. 

Source: GAO analysis of Department of Defense (Department of the Army) 
data. 

[End of figure] 

Automated Commercial Environment: 

[Sidebar: 
Investment Details: 
Department of Homeland Security (U.S. Customs and Border Protection); 
Program start date: 2001; 
Total life-cycle development cost: 
* Current: $2.2 billion; 
* Original: $1.5 billion; 
Program end date: 
* Current: 2016; 
* Original: 2016; 
Rebaselines: 0; 
Major contractor: IBM. 
End of sidebar] 

The Automated Commercial Environment (ACE) program is the commercial 
trade processing system being developed by the U.S. Customs and Border 
Protection to facilitate trade while strengthening border security. The 
program is to provide trade compliance and border security staff with 
the right information at the right time, while minimizing 
administrative burden. Deployed in phases, ACE is expected to be 
expanded to provide cargo processing capabilities across all modes of 
transportation and is intended to replace existing systems with a single, 
multimodal manifest system for land, air, rail, and sea cargo. 
Ultimately, ACE is expected to become the central data collection 
system for the federal agencies that, by law, require international 
trade data, and should deliver these capabilities in a secure, paper- 
free, Web-enabled environment. As a result of poorly managed 
requirements, the total life-cycle development cost of the ACE program 
increased from an estimated $1.5 billion to $2.2 billion--a $700 
million increase. 

Table 14: GAO EVM Practice Assessment of Homeland Security's ACE 
Program: 

Program management area of responsibility: Establish a comprehensive 
EVM system; 
Key practices: 
* Define the scope of effort using a work breakdown structure; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Identify who in the organization will perform the work; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Schedule the work; 
GAO assessment: The program partially implemented all EVM practices in 
this program management area; 
* Estimate the labor and material required to perform the work and 
authorize the budgets, including management reserve; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Determine objective measure of earned value; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Develop the performance measurement baseline; 
GAO assessment: The program fully implemented the EVM practices in this 
program management area. 

Program management area of responsibility: Ensure that the data 
resulting from the EVM system are reliable; 
Key practices: 
* Execute the work plan and record all costs; 
GAO assessment: The program partially implemented all EVM practices in 
this program management area; 
* Analyze EVM performance data and record variances from the 
performance measurement baseline plan; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Forecast estimates at completion; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area. 

Program management area of responsibility: Ensure that the program 
management team is using earned value data for decision-making 
purposes; 
Key practices: 
* Take management action to mitigate risks; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Update the performance measurement baseline as changes occur; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area. 

Source: GAO analysis of Department of Homeland Security (U.S. Customs 
and Border Protection) data. 

[End of table] 

ACE fully met 9 of the 11 key practices for implementing EVM and 
partially met the remaining 2 practices. Specifically, ACE fully met 5 
of 6 practices for establishing a comprehensive EVM system, such as 
defining the scope of the work effort and developing the performance 
baseline, but partially met the practice for scheduling the work, in 
part, because resources were not assigned to all activities in the 
master schedule. ACE fully met 2 practices for ensuring that the data 
resulting from the EVM system were reliable, such as adequately 
analyzing EV performance data, but could not fully execute the work 
plan because of the weaknesses found in the schedule. Lastly, ACE 
demonstrated that the program management team was basing decisions on 
EVM data. 

It should be noted that the ACE program is being defined incrementally--
whereby the performance baseline is continuously updated as task orders 
for new work are issued. As such, the use of EVM to determine the true 
progress made and to project reliable final costs at completion is 
limited. 

Figure 7: GAO EV Data Analysis of Homeland Security's ACE Program 
(dollars in millions): 

[Refer to PDF for image: line graph] 

Date: June, 2008; 	
Cumulative cost variance: -$9.4; 
Cumulative schedule variance: -$7.3. 

Date: July, 2008; 	
Cumulative cost variance: -$12.6; 
Cumulative schedule variance: -$9.1. 

Date: August, 2008; 	
Cumulative cost variance: -$13.4; 
Cumulative schedule variance: -$10.4. 

Date: September, 2008; 	
Cumulative cost variance: -$14; 
Cumulative schedule variance: -$12.6. 

Date: October, 2008; 	
Cumulative cost variance: -$16; 
Cumulative schedule variance: -$14.1. 

Date: November, 2008; 
Cumulative cost variance: -$14; 
Cumulative schedule variance: -$13.3.		 

Date: December, 2008; 	
Cumulative cost variance: -$11.5; 
Cumulative schedule variance: -$15.8. 

Date: January, 2009; 	
Cumulative cost variance: -$11.5; 
Cumulative schedule variance: -$15.7. 

Date: February, 2009; 	
Cumulative cost variance: -$17.4; 
Cumulative schedule variance: -$14.9. 

Date: March, 2009; 	
Cumulative cost variance: -$18.6; 
Cumulative schedule variance: -$13.8. 

Date: April, 2009; 	
Cumulative cost variance: -$15; 
Cumulative schedule variance: -$13.4. 

Date: May, 2009; 	
Cumulative cost variance: -$18.8; 
Cumulative schedule variance: -$13.2. 

Source: GAO analysis of Department of Homeland Security (U.S. Customs 
and Border Protection) data. 

[End of figure] 

Integrated Deepwater System--Common Operational Picture: 

[Sidebar: 
Investment Details: 
Department of Homeland Security (U.S. Coast Guard); 
Program start date: August 2002; 
Total life-cycle development cost: 
* Current: $1.4 billion; 
* Original: $1.4 billion; 
Program end date: 
* Current: 2014; 
* Original: 2014; 
Rebaselines: 1 (July 2007); 
Major contractors: Lockheed Martin and Northrop Grumman. 
End of sidebar] 

The Integrated Deepwater System is a 25-year, $24 billion major 
acquisition program to recapitalize the U.S. Coast Guard's aging fleet 
of boats, airplanes, and helicopters, ensuring that they all work together 
through a modern, capable communications system. This initiative is 
designed to enhance maritime domain awareness and enable the Coast 
Guard to meet its post-September 11 mission requirements. The program 
is composed of 15 major acquisition projects, including the Common 
Operational Picture (COP) program. 

Deepwater COP is to provide relevant, real-time operational 
intelligence and surveillance data to human capital managers, allowing 
them to direct and monitor all assigned forces and first responders. 
This is expected to allow commanders to distribute critical information 
to federal, state, and local agencies quickly; reduce duplication; 
enable earlier alerting; and enhance maritime awareness. 

[Refer to PDF for image: investment life-cycle phase timeline 
(Initiation; Development; Operations and maintenance). Source: GAO 
analysis of Department of Homeland Security (U.S. Coast Guard) data.] 

Table 15: GAO EVM Practice Assessment of Homeland Security's Deepwater 
COP Program: 

Program management area of responsibility: Establish a comprehensive 
EVM system; 
Key practices: 
* Define the scope of effort using a work breakdown structure; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Identify who in the organization will perform the work; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Schedule the work; 
GAO assessment: The program partially implemented all EVM practices in 
this program management area; 
* Estimate the labor and material required to perform the work and 
authorize the budgets, including management reserve; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Determine objective measure of earned value; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Develop the performance measurement baseline; 
GAO assessment: The program fully implemented the EVM practices in this 
program management area. 

Program management area of responsibility: Ensure that the data 
resulting from the EVM system are reliable; 
Key practices: 
* Execute the work plan and record all costs; 
GAO assessment: The program partially implemented all EVM practices in 
this program management area; 
* Analyze EVM performance data and record variances from the 
performance measurement baseline plan; 
GAO assessment: The program partially implemented all EVM practices in 
this program management area; 
* Forecast estimates at completion; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area. 

Program management area of responsibility: Ensure that the program 
management team is using earned value data for decision-making 
purposes; 
Key practices: 
* Take management action to mitigate risks; 
GAO assessment: The program partially implemented all EVM practices in 
this program management area; 
* Update the performance measurement baseline as changes occur; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area. 

Source: GAO analysis of Department of Homeland Security (U.S. Coast 
Guard) data. 

[End of table] 

Deepwater COP fully met 7 of the 11 key practices and partially met 4 
others. Specifically, COP fully met 5 of the 6 practices for 
establishing a comprehensive EVM system, such as adequately defining 
all major elements of the work breakdown structure and developing the 
performance baseline. However, the program's master schedule contained 
weaknesses, such as a large number of concurrent tasks and activities 
without resources assigned. Officials were aware of some, but not all, 
of the weaknesses in the schedule and had controls in place to mitigate 
the weaknesses they were aware of in order to improve the reliability of 
the resulting EV data. Lastly, COP was unable to fully meet 1 of the 
practices for using EV data for management decisions because it could 
not demonstrate that cost and schedule drivers impacting EV performance 
were linked to its risk management processes. 

[Sidebar: 
EV Performance Details: 
Based on performance data from June 2008 to May 2009, the Deepwater COP 
contractor has experienced negative cost and schedule variances. 
Specifically, as of May 2009, the contractor has exceeded its planned 
cost target by $4.2 million. These cost variances are due, in part, to 
design and development tasks requiring more work than originally 
planned. We estimate that the contract will overrun its current budget—
worth approximately $130.2 million—by $4.2 million. Our analysis 
indicates that the contract is currently on schedule. 
Contract percent complete: 99%; 
Estimates at completion: 
* Contractor: $134.3 million; 
* GAO: $134.3 million. 
End of sidebar] 

Figure 8: GAO EV Data Analysis of Homeland Security's Deepwater COP 
Program (dollars in millions): 

[Refer to PDF for image: line graph] 

Date: June, 2008; 	
Cumulative cost variance: -$3.1; 
Cumulative schedule variance: -$0.1. 

Date: July, 2008; 	
Cumulative cost variance: -$3.9; 
Cumulative schedule variance: -$0.1. 

Date: August, 2008; 	
Cumulative cost variance: -$4.6; 
Cumulative schedule variance: -$0.1. 

Date: September, 2008; 	
Cumulative cost variance: -$4.8; 
Cumulative schedule variance: $0. 

Date: October, 2008; 	
Cumulative cost variance: -$3.3; 
Cumulative schedule variance: $0. 

Date: November, 2008; 
Cumulative cost variance: -$3.3; 
Cumulative schedule variance: $0.		 

Date: December, 2008; 	
Cumulative cost variance: -$3.2; 
Cumulative schedule variance: -$0.1. 

Date: January, 2009; 	
Cumulative cost variance: -$3.4; 
Cumulative schedule variance: -$0.3. 

Date: February, 2009; 	
Cumulative cost variance: -$3.5; 
Cumulative schedule variance: -$0.2. 

Date: March, 2009; 	
Cumulative cost variance: -$3.5; 
Cumulative schedule variance: $0. 

Date: April, 2009; 	
Cumulative cost variance: -$3.9; 
Cumulative schedule variance: $0. 

Date: May, 2009; 	
Cumulative cost variance: -$4.2; 
Cumulative schedule variance: $0. 

Source: GAO analysis of Department of Homeland Security (U.S. Coast 
Guard) data. 

[End of figure] 

Western Hemisphere Travel Initiative: 

[Sidebar: 
Investment Details: 
Department of Homeland Security (U.S. Customs and Border Protection); 
Program start date: January 2007; 
Total life-cycle cost: 
* Current: $1.2 billion; 
* Original: $1.2 billion; 
Program end date: June 1, 2009; 
Rebaselines: 1 (March 2008); 
Major contractor: Unisys. 
End of sidebar] 

The Western Hemisphere Travel Initiative (WHTI) program made 
modifications to vehicle processing lanes at ports of entry on the 
nation's northern and southern borders. WHTI is designed to allow U.S. 
Customs and Border Protection to effectively address new requirements 
imposed by the Intelligence Reform and Terrorism Prevention Act of 2004 
(completing these requirements by June 1, 2009). WHTI development was 
completed and its implementation addressed the 39 highest volume ports 
of entry, which support 95 percent of land border traffic. The 
initiative requires travelers to present a passport or other authorized 
travel document that denotes identity and citizenship when entering the 
United States. 

[Refer to PDF for image: investment life-cycle phase timeline 
(Initiation; Development; Operations and maintenance). Source: GAO 
analysis of Department of Homeland Security (U.S. Customs and Border 
Protection) data.] 

Table 16: GAO EVM Practice Assessment of Homeland Security's WHTI 
Program: 

Program management area of responsibility: Establish a comprehensive 
EVM system; 
Key practices: 
* Define the scope of effort using a work breakdown structure; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Identify who in the organization will perform the work; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Schedule the work; 
GAO assessment: The program partially implemented all EVM practices in 
this program management area; 
* Estimate the labor and material required to perform the work and 
authorize the budgets, including management reserve; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Determine objective measure of earned value; 
GAO assessment: The program partially implemented all EVM practices in 
this program management area; 
* Develop the performance measurement baseline; 
GAO assessment: The program partially implemented the EVM practices in 
this program management area. 

Program management area of responsibility: Ensure that the data 
resulting from the EVM system are reliable; 
Key practices: 
* Execute the work plan and record all costs; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Analyze EVM performance data and record variances from the 
performance measurement baseline plan; 
GAO assessment: The program partially implemented all EVM practices in 
this program management area; 
* Forecast estimates at completion; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area. 

Program management area of responsibility: Ensure that the program 
management team is using earned value data for decision-making 
purposes; 
Key practices: 
* Take management action to mitigate risks; 
GAO assessment: The program partially implemented all EVM practices in 
this program management area; 
* Update the performance measurement baseline as changes occur; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area. 

Source: GAO analysis of Department of Homeland Security (U.S. Customs 
and Border Protection) data. 

[End of table] 

WHTI fully met 6 of the 11 key practices for implementing EVM and 
partially met the remaining 5 practices. Specifically, weaknesses 
identified in validating the performance baseline and scheduling the 
work limited the program's ability to establish a comprehensive EVM 
system. Although the program held an integrated baseline review to 
validate the baseline in March 2008, the review did not cover many key 
aspects, such as identifying corrective actions needed to mitigate 
program risks. Furthermore, the master schedule contained deficiencies, 
such as activities that were out of sequence or lacking dependencies. 
While program officials described their use of processes for ensuring 
the reliability of the EVM system's data, such as capturing significant 
cost and schedule drivers in the risk register, the provided 
documentation did not corroborate what we were told. When combined, 
these weaknesses preclude the program from effectively using EV data 
for decision-making purposes. 

[Sidebar: 
EV Performance Details: 
Based on performance data from June 2008 to May 2009, the WHTI 
contractor experienced schedule variances. However, as of June 2009, 
program officials stated that the WHTI contract was successfully 
completed on time. The contractor did not report any cost variances 
because it was a firm-fixed-price contract. Additionally, program 
officials stated that the contract was completed on budget. 
Contract percent complete: 100%. 
End of sidebar] 

Figure 9: GAO EV Data Analysis of Homeland Security's WHTI Program 
(dollars in millions): 

[Refer to PDF for image: line graph] 

Date: June, 2008; 	
Cumulative cost variance: -$14.3; 
Cumulative schedule variance: $0. 

Date: July, 2008; 	
Cumulative cost variance: -$2.8; 
Cumulative schedule variance: $0. 

Date: August, 2008; 	
Cumulative cost variance: -$2.5; 
Cumulative schedule variance: $0. 

Date: September, 2008; 	
Cumulative cost variance: -$2.5; 
Cumulative schedule variance: $0. 

Date: October, 2008; 	
Cumulative cost variance: -$1.9; 
Cumulative schedule variance: $0. 

Date: November, 2008; 
Cumulative cost variance: -$1.7; 
Cumulative schedule variance: $0.		 

Date: December, 2008; 	
Cumulative cost variance: -$0.9; 
Cumulative schedule variance: $0. 

Date: January, 2009; 	
Cumulative cost variance: -$0.3; 
Cumulative schedule variance: $0. 

Date: February, 2009; 	
Cumulative cost variance: $0.8; 
Cumulative schedule variance: $0. 

Date: March, 2009; 	
Cumulative cost variance: -$0.7; 
Cumulative schedule variance: $0. 

Date: April, 2009; 	
Cumulative cost variance: -$0.5; 
Cumulative schedule variance: $0. 

Date: May, 2009; 	
Cumulative cost variance: -$0.2; 
Cumulative schedule variance: $0. 

Source: GAO analysis of Department of Homeland Security (U.S. Customs 
and Border Protection) data. 

[End of figure] 

Next Generation Identification: 

[Sidebar: 
Investment Details: 
Department of Justice (Federal Bureau of Investigation); 
Program start date: February 2008; 
Total life-cycle cost: 
* Current: $1.076 billion; 
* Original: $1.076 billion; 
Program end date: 
* Current: June 2018; 
* Original: April 2018; 
Rebaselines: 0; 
Major contractor: Lockheed Martin. 
End of sidebar] 

The Next Generation Identification (NGI) program is designed to support 
the Federal Bureau of Investigation's mission to reduce terrorist and 
criminal activities by providing timely, relevant criminal justice 
information to the law enforcement community. Today, the bureau 
operates and maintains one of the largest repositories of biometric- 
supported criminal history records in the world. The electronic 
identification and criminal history services support more than 82,000 
criminal justice agencies, authorized civil agencies, and international 
organizations. NGI is intended to ensure that the bureau's biometric 
systems are able to seamlessly share data that are complete, accurate, 
current, and timely. To accomplish this, the current system will be 
replaced or upgraded with new functionalities and state-of-the-art 
equipment. NGI is expected to be scalable to accommodate five times 
the current workload volume with no increase in support manpower and to 
be flexible enough to respond to changing requirements. 

[Refer to PDF for image: investment life-cycle phase timeline 
(Initiation; Development; Operations and maintenance). Source: GAO 
analysis of Department of Justice (Federal Bureau of Investigation) 
data.] 

Table 17: GAO EVM Practice Assessment of Justice's NGI Program: 

Program management area of responsibility: Establish a comprehensive 
EVM system; 
Key practices: 
* Define the scope of effort using a work breakdown structure; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Identify who in the organization will perform the work; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Schedule the work; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Estimate the labor and material required to perform the work and 
authorize the budgets, including management reserve; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Determine objective measure of earned value; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Develop the performance measurement baseline; 
GAO assessment: The program fully implemented the EVM practices in this 
program management area. 

Program management area of responsibility: Ensure that the data 
resulting from the EVM system are reliable; 
Key practices: 
* Execute the work plan and record all costs; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Analyze EVM performance data and record variances from the 
performance measurement baseline plan; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Forecast estimates at completion; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area. 

Program management area of responsibility: Ensure that the program 
management team is using earned value data for decision-making 
purposes; 
Key practices: 
* Take management action to mitigate risks; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Update the performance measurement baseline as changes occur; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area. 

Source: GAO analysis of Department of Justice (Federal Bureau of 
Investigation) data. 

[End of table] 

NGI fully implemented all 11 key EVM practices. Specifically, the 
program implemented all practices for establishing a comprehensive EVM 
system, such as defining the scope of work and scheduling the work. For 
example, the schedule properly captured key activities, established 
reasonable durations, and established a sound critical path, all of 
which contribute to establishing a reliable baseline that performance 
can be measured against. Furthermore, the NGI team ensured that the 
resulting EV data were appropriately verified and validated for 
reliability by, for example, integrating the analysis of cost and 
schedule variances with the program's risk register to mitigate 
emerging and existing risks associated with key drivers causing major 
variances. In addition, the program's risk register includes cost and 
schedule impacts for every risk and links to the management reserve 
process. Lastly, NGI demonstrated that it is using EV data to make 
decisions by performing continuous quality checks of the schedule, 
reviewing open risks and opportunities, and reviewing EV data in weekly 
management reports. 

[Sidebar: 
EV Performance Details: 
Based on contractor performance data from October 2008 to April 2009, 
NGI experienced negative cost and schedule variances. Specifically, as 
of April 2009, the contractor has exceeded its planned cost targets by 
$1.4 million. Furthermore, as of April 2009, the contractor has not 
completed $0.5 million in planned work. These variances were due, in 
part, to the need for additional testing resources. We estimate that 
the NGI contract will overrun its current budget—worth approximately 
$37.5 million—by $1.6 million. 
Contract percent complete: 91%; 
Estimates at completion: 
* Contractor: $39.0 million; 
* GAO: $39.1 million. 
Note: NGI established its EV reporting baseline in October 2008. 
End of sidebar] 

Figure 10: GAO EV Data Analysis of Justice's NGI Program (dollars in 
millions): 

[Refer to PDF for image: line graph] 

Date: October, 2008; 	
Cumulative cost variance: -$0.7; 
Cumulative schedule variance: -$0.9. 

Date: November, 2008; 
Cumulative cost variance: -$1.4; 
Cumulative schedule variance: -$1.1.		 

Date: December, 2008; 	
Cumulative cost variance: $0; 
Cumulative schedule variance: -$0.7. 

Date: January, 2009; 	
Cumulative cost variance: $0.8; 
Cumulative schedule variance: -$0.9. 

Date: February, 2009; 	
Cumulative cost variance: $0.2; 
Cumulative schedule variance: -$1.4. 

Date: March, 2009; 	
Cumulative cost variance: -$0.3; 
Cumulative schedule variance: -$0.5. 

Date: April, 2009; 	
Cumulative cost variance: -$1.4; 
Cumulative schedule variance: -$0.5. 

Source: GAO analysis of Department of Justice (Federal Bureau of 
Investigation) data. 

[End of figure] 

James Webb Space Telescope: 

[Sidebar: 
Investment Details: 
National Aeronautics and Space Administration; 
Project start date: March 1999; 
Total life-cycle cost: 
* Current: $4.964 billion; 
* Original: $4.964 billion; 
Project end date: 
* Current: December 2021; 
* Original: December 2021; 
Rebaselines: 0; 
Major contractor: Northrop Grumman. 
End of sidebar] 

The James Webb Space Telescope (JWST) is designed to be the scientific 
successor to the Hubble Space Telescope and expected to be the premier 
observatory of the next decade. It is intended to study and answer 
fundamental astrophysical questions, ranging from the formation 
and structure of the Universe to the origin of planetary systems and 
the origins of life. The telescope is an international collaboration of 
the National Aeronautics and Space Administration (NASA), the Canadian 
Space Agency, and the European Space Agency. JWST required the 
development of several new technologies, including a folding segmented 
primary mirror that will unfold after launch and a cryocooler for 
cooling mid-infrared detectors to 7 kelvin. 

[Refer to PDF for image: investment life-cycle phase timeline 
(Initiation; Development; Operations and maintenance). Source: GAO 
analysis of National Aeronautics and Space Administration data.] 

Table 18: GAO EVM Practice Assessment of NASA's JWST Project: 

Program management area of responsibility: Establish a comprehensive 
EVM system; 
Key practices: 
* Define the scope of effort using a work breakdown structure; 
GAO assessment: The program partially implemented all EVM practices in 
this program management area; 
* Identify who in the organization will perform the work; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Schedule the work; 
GAO assessment: The program partially implemented all EVM practices in 
this program management area; 
* Estimate the labor and material required to perform the work and 
authorize the budgets, including management reserve; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Determine objective measure of earned value; 
GAO assessment: The program partially implemented all EVM practices in 
this program management area; 
* Develop the performance measurement baseline; 
GAO assessment: The program partially implemented the EVM practices in 
this program management area. 

Program management area of responsibility: Ensure that the data 
resulting from the EVM system are reliable; 
Key practices: 
* Execute the work plan and record all costs; 
GAO assessment: The program partially implemented all EVM practices in 
this program management area; 
* Analyze EVM performance data and record variances from the 
performance measurement baseline plan; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Forecast estimates at completion; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area. 

Program management area of responsibility: Ensure that the program 
management team is using earned value data for decision-making 
purposes; 
Key practices: 
* Take management action to mitigate risks; 
GAO assessment: The program partially implemented all EVM practices in 
this program management area; 
* Update the performance measurement baseline as changes occur; 
GAO assessment: The program partially implemented all EVM practices in 
this program management area. 

Source: GAO analysis of National Aeronautics and Space Administration 
data. 

[End of table] 

JWST fully met 4 of the 11 key practices and partially met 7 practices. 
The project only partially met practices for establishing a 
comprehensive EVM system because of weaknesses in the work breakdown 
structure, in which the prime contractor has not fully defined the 
scope of each work element. In addition, the project only partially met 
the practice for scheduling work because of weaknesses resulting from 
manual integration of approximately 30 schedules, although officials 
did explain some mitigations for this risk. We also found deficiencies 
in the lower-level schedules, such as missing linkages between tasks, 
resources not being assigned, and excessively long durations. 
Furthermore, JWST only partially implemented practices to ensure that 
the data resulting from the EVM system are reliable, due, in part, to 
variance analysis reports being done quarterly (instead of monthly), 
which limits the project's ability to analyze and respond to cost and 
schedule variances in a timely manner. When combined, these weaknesses 
preclude the project from effectively using EV data for decision-making 
purposes. 

[Sidebar: 
EV Performance Details: 
EVM for the JWST project is being performed by the prime contractor and 
its major subcontractors. The scope of this work includes designing and 
developing the telescope, the spacecraft, and the sunshield; 
integrating and testing the observatory; and supporting launch 
operations. 
Based on contractor performance data from June 2008 to May 2009, the 
JWST project has experienced negative cost and schedule variances. 
Specifically, as of May 2009, the contractor has exceeded its planned 
cost target by $224.7 million. A key driver in this cost overrun was 
greater-than-expected complexity in the work, which required additional 
resources. We concur with the contractor estimate that it will overrun 
its budget—worth approximately $1.3 billion—by $448.5 million. 
Furthermore, as of May 2009, the project has not completed $9.4 million 
in planned work. 
Contract percent complete: 64%; 
Estimates at completion: 
* Contractor: $1.7 billion; 
* GAO: $1.7 billion. 
Note: The project suspended earned value reporting during November 2008 
while undergoing a replan. 
End of sidebar] 
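The variances cited in the sidebar above follow the standard earned 
value relationships among the budgeted cost of work scheduled (BCWS, or 
planned value), the budgeted cost of work performed (BCWP, or earned 
value), and the actual cost of work performed (ACWP). A minimal sketch 
of those relationships follows; the dollar figures are hypothetical 
illustrations, not JWST contract data. 

# Illustrative EVM calculations (hypothetical figures, not JWST data).
bcws = 900.0   # budgeted cost of work scheduled (planned value), $ millions
bcwp = 890.6   # budgeted cost of work performed (earned value), $ millions
acwp = 1115.3  # actual cost of work performed, $ millions

cost_variance = bcwp - acwp      # negative values indicate a cost overrun
schedule_variance = bcwp - bcws  # negative values indicate work behind schedule
cpi = bcwp / acwp                # cost performance index (< 1.0 is unfavorable)
spi = bcwp / bcws                # schedule performance index (< 1.0 is unfavorable)

print(f"Cost variance: {cost_variance:.1f}  Schedule variance: {schedule_variance:.1f}")
print(f"CPI: {cpi:.2f}  SPI: {spi:.2f}")

Tracking the cumulative cost and schedule variances month by month, as 
in the figure that follows, is what allows trends such as a steadily 
worsening cost position to be seen early. 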

Figure 11: GAO EV Data Analysis of NASA's JWST Project (dollars in 
millions): 

[Refer to PDF for image: line graph] 

Date: June, 2008; 	
Cumulative cost variance: -$145.7; 
Cumulative schedule variance: -$14.2. 

Date: July, 2008; 	
Cumulative cost variance: -$155.7; 
Cumulative schedule variance: -$16.1. 

Date: August, 2008; 	
Cumulative cost variance: -$163.9; 
Cumulative schedule variance: -$17.5. 

Date: September, 2008; 	
Cumulative cost variance: -$171; 
Cumulative schedule variance: -$20.6. 

Date: October, 2008; 	
Cumulative cost variance: -$174.8; 
Cumulative schedule variance: -$16.2. 

Date: November, 2008; 
No data. 

Date: December, 2008; 	
Cumulative cost variance: -$187; 
Cumulative schedule variance: -$4.5. 

Date: January, 2009; 	
Cumulative cost variance: -$194.3; 
Cumulative schedule variance: -$4.3. 

Date: February, 2009; 	
Cumulative cost variance: -$200.8; 
Cumulative schedule variance: -$6.3. 

Date: March, 2009; 	
Cumulative cost variance: -$209.3; 
Cumulative schedule variance: -$7.5. 

Date: April, 2009; 	
Cumulative cost variance: -$217.4; 
Cumulative schedule variance: -$8.5. 

Date: May, 2009; 	
Cumulative cost variance: -$224.7; 
Cumulative schedule variance: -$9.4. 

Source: GAO analysis of National Aeronautics and Space Administration 
data. 

[End of figure] 

Juno: 

[Sidebar: 
Investment Details: 
National Aeronautics and Space Administration; 
Project start date: June 2005; 
Total life-cycle cost: 
* Current: $1.05 billion; 
* Original: $1.05 billion; 
Project end date: 
* Current: October 2018; 
* Original: October 2018; 
Rebaselines: 0; 
Major contractor: Lockheed Martin. 
End of sidebar] 

Juno is part of the New Frontiers Program. The overarching scientific 
goal of the Juno mission is to improve our understanding of the origin 
and evolution of Jupiter. As the archetype of giant planets, Jupiter 
may provide knowledge that will improve our understanding of both the 
origin of our solar system and the planetary systems being discovered 
around other stars. The Juno project is expected to use a solar-powered 
spacecraft to make global maps of the gravity, magnetic fields, and 
atmospheric composition of Jupiter. The spacecraft is to make 33 orbits 
of Jupiter to sample the planet's full range of latitudes and 
longitudes. 

Initiation:
Development: 
Operations and maintenance: 

Source: GAO analysis of National Aeronautics and Space Administration 
data. 

Table 19: GAO EVM Practice Assessment of NASA's Juno Project: 

Program management area of responsibility: Establish a comprehensive 
EVM system; 
Key practices: 
* Define the scope of effort using a work breakdown structure; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Identify who in the organization will perform the work; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Schedule the work; 
GAO assessment: The program partially implemented all EVM practices in 
this program management area; 
* Estimate the labor and material required to perform the work and 
authorize the budgets, including management reserve; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Determine objective measure of earned value; 
GAO assessment: The program partially implemented all EVM practices in 
this program management area; 
* Develop the performance measurement baseline; 
GAO assessment: The program partially implemented the EVM practices in 
this program management area. 

Program management area of responsibility: Ensure that the data 
resulting from the EVM system are reliable; 
Key practices: 
* Execute the work plan and record all costs; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Analyze EVM performance data and record variances from the 
performance measurement baseline plan; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Forecast estimates at completion; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area. 

Program management area of responsibility: Ensure that the program 
management team is using earned value data for decision-making 
purposes; 
Key practices: 
* Take management action to mitigate risks; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Update the performance measurement baseline as changes occur; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area. 

Source: GAO analysis of National Aeronautics and Space Administration 
data. 

[End of table] 

Juno fully met 8 of the 11 key practices for implementing EVM and 
partially met 3 practices. Specifically, the project fully met 3 
practices for establishing a comprehensive EVM system, but only 
partially met the practices for scheduling the work, determining the 
objective measure of earned value, and establishing the performance 
baseline. Juno was unable to fully meet these practices because the 
project's master schedule contained issues with the sequencing of work 
activities and because the project lacked a comprehensive integrated 
baseline review. 
Although an integrated baseline review was conducted for a major 
contract in February 2009, the program did not validate the baseline, 
scope of work to be performed, or key risks and mitigation plans for 
the Juno project as a whole, which increases the risk that the project 
is measuring performance against an unreasonable baseline. Juno fully 
implemented all 3 practices associated with data reliability and the 2 
practices associated with using EV data for decision-making purposes. 

[Sidebar: 
EV Performance Details: 
Based on performance data from December 2008 to May 2009, the Juno 
project has experienced negative cost and schedule variances. 
Specifically, as of May 2009, the project has exceeded its cost target 
by $13.2 million. Based on these data, we estimate that the Juno 
project will overrun its current budget—worth approximately $369.0 
million—by $49.8 million. Furthermore, as of May 2009, the project has 
not completed $12.3 million in planned work. 
Project percent complete: 32%; 
Estimates at completion: 
* Project: $375.5 million; 
* GAO: $418.8 million. 
Note: Juno established its EV reporting baseline in December 2008. 
End of sidebar] 
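An independent estimate at completion can differ from a project's own 
estimate because it extrapolates from demonstrated cost efficiency 
rather than from the remaining plan. One common formulation divides the 
budget at completion by the cumulative cost performance index. The 
sketch below illustrates that general approach with hypothetical 
figures; it is not a description of the specific estimating method GAO 
or the Juno project used. 

# Hypothetical illustration of a performance-based estimate at completion.
bac = 369.0   # budget at completion, $ millions
bcwp = 118.1  # cumulative earned value, $ millions
acwp = 131.3  # cumulative actual cost, $ millions

cpi = bcwp / acwp                      # cumulative cost performance index
eac_cpi = bac / cpi                    # assumes future work continues at the same CPI
eac_budget_rate = acwp + (bac - bcwp)  # assumes remaining work is done exactly to budget

print(f"CPI-based EAC: {eac_cpi:.1f}  Budget-rate EAC: {eac_budget_rate:.1f}")
print(f"Projected overrun (CPI basis): {eac_cpi - bac:.1f}")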

Figure 12: GAO EV Data Analysis of NASA's Juno Project (dollars in 
millions): 

[Refer to PDF for image: line graph] 

Date: December, 2008; 	
Cumulative cost variance: $0.8; 
Cumulative schedule variance: -$3.9. 

Date: January, 2009; 	
Cumulative cost variance: $1.2; 
Cumulative schedule variance: -$1.5. 

Date: February, 2009; 	
Cumulative cost variance: -$2.2; 
Cumulative schedule variance: -$6.3. 

Date: March, 2009; 	
Cumulative cost variance: -$5.1; 
Cumulative schedule variance: -$10.1. 

Date: April, 2009; 	
Cumulative cost variance: -$10.2; 
Cumulative schedule variance: -$10.3. 

Date: May, 2009; 	
Cumulative cost variance: -$13.2; 
Cumulative schedule variance: -$12.3. 

Source: GAO analysis of National Aeronautics and Space Administration 
data. 

[End of figure] 

Mars Science Laboratory: 

[Sidebar: 
Investment Details: 
National Aeronautics and Space Administration; 
Project start date: November 2003; 
Total life-cycle cost: 
* Current: $2.286 billion; 
* Original: $1.634 billion; 
Project end date: 
* Current: September 2015; 
* Original: September 2013; 
Rebaselines: 1 (March 2009); 
Major contractor: None: in-house development. 
End of sidebar] 

The Mars Science Laboratory (MSL) is part of the Mars Exploration 
Program. The program seeks to understand whether Mars was, is, or can 
be a habitable world. To answer this question, the MSL project is 
expected to investigate how geologic, climatic, and other processes 
have worked to shape Mars and its environment over time, as well as how 
they interact today. To accomplish this, the MSL project plans to place 
a mobile science laboratory on the surface of Mars to quantitatively 
assess a local site as a potential habitat for life, past or present. 
The project is considered one of NASA's flagship projects and is 
designed to be the most advanced rover ever sent to explore the surface 
of Mars. Due to technical issues identified during the development of 
key components, the MSL launch date has recently slipped 2 years--from 
September 2009 to October 2011--and the project's life-cycle cost 
estimate has increased from about $1.63 billion to $2.29 billion, an 
increase of $652 million. 

Initiation:
Development: 
Operations and maintenance: 

Source: GAO analysis of National Aeronautics and Space Administration 
data. 

Table 20: GAO EVM Practice Assessment of NASA's MSL Project: 

Program management area of responsibility: Establish a comprehensive 
EVM system; 
Key practices: 
* Define the scope of effort using a work breakdown structure; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Identify who in the organization will perform the work; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Schedule the work; 
GAO assessment: The program partially implemented all EVM practices in 
this program management area; 
* Estimate the labor and material required to perform the work and 
authorize the budgets, including management reserve; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Determine objective measure of earned value; 
GAO assessment: The program partially implemented all EVM practices in 
this program management area; 
* Develop the performance measurement baseline; 
GAO assessment: The program partially implemented the EVM practices in 
this program management area. 

Program management area of responsibility: Ensure that the data 
resulting from the EVM system are reliable; 
Key practices: 
* Execute the work plan and record all costs; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Analyze EVM performance data and record variances from the 
performance measurement baseline plan; 
GAO assessment: The program partially implemented all EVM practices in 
this program management area; 
* Forecast estimates at completion; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area. 

Program management area of responsibility: Ensure that the program 
management team is using earned value data for decision-making 
purposes; 
Key practices: 
* Take management action to mitigate risks; 
GAO assessment: The program partially implemented all EVM practices in 
this program management area; 
* Update the performance measurement baseline as changes occur; 
GAO assessment: The program partially implemented all EVM practices in 
this program management area. 

Source: GAO analysis of National Aeronautics and Space Administration 
data. 

[End of table] 

MSL fully met 5 of the 11 key practices and partially met 6 others. 
Specifically, MSL fully met 3 practices for establishing a 
comprehensive EVM system, but only partially met 3 others because of 
weaknesses in the sequencing of all activities in the schedule and the 
lack of an integrated baseline review to validate the baseline and 
assess the achievability of the plan. While the project has taken steps 
to mitigate the latter weakness by requiring work agreements that 
document, among other things, the objective value of work and related 
risks for planned work packages, this is not a comprehensive review of 
the project's baseline. Furthermore, MSL only partially implemented 
practices associated with data reliability because its analysis of cost 
and schedule variances did not include the root causes for variances 
and corrective actions, which prevents the project from tracking and 
mitigating related risks. Lastly, without an initial validation of the 
performance baseline, the baseline cannot be appropriately updated to 
reflect program changes, thereby limiting the use of EV data for 
management decisions. 
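Programs commonly operationalize the variance-analysis practice by 
screening control-account variances against reporting thresholds each 
month and requiring a root-cause narrative and corrective action plan 
for any account that breaches them. The following sketch illustrates 
such a screen; the threshold values and account data are hypothetical, 
not MSL figures. 

# Hypothetical variance screen: flag accounts needing root-cause analysis.
THRESHOLD_ABS = 0.5    # flag if |variance| exceeds $0.5 million...
THRESHOLD_PCT = 10.0   # ...and exceeds 10 percent of the account's planned value

accounts = [
    # (name, BCWS, BCWP, ACWP) in $ millions -- illustrative values only
    ("Mechanical subsystem", 42.0, 38.5, 45.2),
    ("Avionics",             30.0, 29.6, 29.9),
    ("Ground systems",       12.0, 11.9, 12.1),
]

for name, bcws, bcwp, acwp in accounts:
    for label, variance in (("cost", bcwp - acwp), ("schedule", bcwp - bcws)):
        if abs(variance) > THRESHOLD_ABS and abs(variance) > bcws * THRESHOLD_PCT / 100:
            print(f"{name}: {label} variance {variance:+.1f} exceeds threshold; "
                  "root cause and corrective action required")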

[Sidebar: 
EV Performance Details: 
Due to significant cost and schedule overruns, the MSL project recently 
completed a project replan between November 2008 and February 2009. 
Specifically, as of October 2008, MSL had exceeded its cost targets by 
$189.8 million and had not completed $24.1 million in planned work, due 
primarily to technical issues experienced in the development of the rover’s 
mechanical gears and avionics components. As a result of the replan, 
the project’s launch date was delayed 2 years, and the budget was 
increased from $768.7 million to $1.223 billion. Since the replan, the 
project is meeting cost targets but, as of May 2009, has not completed 
$6.2 million in planned work. 
Project percent complete: 77%; 
Estimates at completion: 
* Project: $1.227 billion; 
* GAO: N/A. 
Note: MSL suspended EVM reporting between November 2008 and February 
2009 while undergoing a project replan. Therefore, we did not have 
sufficient data to make a reliable independent estimate at completion. 
The project’s EV baseline does not include components being provided by 
external parties, such as other NASA centers and the Department of 
Energy. 
End of sidebar] 

Figure 13: GAO EV Data Analysis of NASA's MSL Project (dollars in 
millions): 

[Refer to PDF for image: line graph] 

Date: June, 2008; 	
Cumulative cost variance: -$152.1; 
Cumulative schedule variance: -$34.2. 

Date: July, 2008; 	
Cumulative cost variance: -$162.1; 
Cumulative schedule variance: -$30.8. 

Date: August, 2008; 	
Cumulative cost variance: -$179.8; 
Cumulative schedule variance: -$27.4. 

Date: September, 2008; 	
Cumulative cost variance: -$181.7; 
Cumulative schedule variance: -$21.5. 

Date: October, 2008; 	
Cumulative cost variance: -$189.8; 
Cumulative schedule variance: -$24.1. 

Date: November, 2008; 
No data. 

Date: December, 2008; 	
No data. 

Date: January, 2009; 	
No data. 

Date: February, 2009; 	
No data. 

Date: March, 2009; 	
Cumulative cost variance: $1.3; 
Cumulative schedule variance: -$2.9. 

Date: April, 2009; 	
Cumulative cost variance: $1.3; 
Cumulative schedule variance: -$6.5. 

Date: May, 2009; 	
Cumulative cost variance: $2.2; 
Cumulative schedule variance: -$6.2. 

Source: GAO analysis of National Aeronautics and Space Administration 
data. 

[End of figure] 

En Route Automation Modernization: 

[Sidebar: 
Investment Details: 
Department of Transportation (Federal Aviation Administration); 
Program start date: August 2002; 
Total life-cycle cost: 
* Current: $3.65 billion; 
* Original: $3.65 billion; 
Program end date: 
* Current: September 2020; 
* Original: September 2020; 
Rebaselines: 0; 
Major contractor: Lockheed Martin. 
End of sidebar] 

The En Route Automation Modernization (ERAM) program is to replace 
existing software and hardware in the air traffic control automation 
computer system and its backup system, the Direct Radar Channel, and 
other associated interfaces, communications, and support infrastructure 
at en route centers across the country. This is a critical effort 
because ERAM is expected to upgrade hardware and software for 
facilities that control high-altitude air traffic. ERAM consists of two 
major components. One component has been fully deployed and is 
currently in operation at facilities across the country. The other 
component is scheduled for deployment through fiscal year 2011. 

Initiation:
Development: 
Operations and maintenance: 

Source: GAO analysis of Department of Transportation (Federal Aviation 
Administration) data. 

Table 21: GAO EVM Practice Assessment of Transportation's ERAM Program: 

Program management area of responsibility: Establish a comprehensive 
EVM system; 
Key practices: 
* Define the scope of effort using a work breakdown structure; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Identify who in the organization will perform the work; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Schedule the work; 
GAO assessment: The program partially implemented all EVM practices in 
this program management area; 
* Estimate the labor and material required to perform the work and 
authorize the budgets, including management reserve; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Determine objective measure of earned value; 
GAO assessment: The program partially implemented all EVM practices in 
this program management area; 
* Develop the performance measurement baseline; 
GAO assessment: The program partially implemented the EVM practices in 
this program management area. 

Program management area of responsibility: Ensure that the data 
resulting from the EVM system are reliable; 
Key practices: 
* Execute the work plan and record all costs; 
GAO assessment: The program partially implemented all EVM practices in 
this program management area; 
* Analyze EVM performance data and record variances from the 
performance measurement baseline plan; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Forecast estimates at completion; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area. 

Program management area of responsibility: Ensure that the program 
management team is using earned value data for decision-making 
purposes; 
Key practices: 
* Take management action to mitigate risks; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Update the performance measurement baseline as changes occur; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area. 

Source: GAO analysis of Department of Transportation (Federal Aviation 
Administration) data. 

[End of table] 

ERAM fully met 7 of the 11 key practices and partially met 4 others. 
ERAM applies EVM at the contract level and incorporates EV data into 
its overall management of the program. However, ERAM did not perform a 
comprehensive review of the baseline when the contract was finalized, 
or take similar actions to validate the baseline and ensure that the 
appropriate EV metrics had been applied. While ERAM does perform 
limited checks of the contractor schedule, our analysis showed some 
issues with the sequencing of activities and the use of constraints 
that may undermine the reliability of the schedule as a baseline to 
measure performance. 

However, the EV data do not reflect the total ERAM program. The 
government is also responsible for acquisition work, to which EVM is 
not being applied. Our analysis of 
the master schedule showed that ERAM would be unable to meet four major 
upcoming initial operating capability milestones due to issues 
associated with government work activities. Program officials noted 
that these milestones have since been pushed out. Since EVM is not 
applied at the program level, it is unclear whether these delays will 
impact overall cost. 

[Sidebar: 
EV Performance Details: 
As of April 2009, the ERAM contractor has outperformed its planned cost 
targets by $36.9 million; for this same period, it has also 
outperformed its schedule targets by completing $15.9 million worth of 
work ahead of schedule. This strong performance is attributed to 
significant cost savings in hardware production and unplanned 
efficiencies in integration and testing at ERAM deployment sites. This 
has offset cost overruns associated with software development, such as 
code growth; an unexpectedly high number of defects delivered; and the 
resolution of defects at lower productivity rates than planned. 
Based on performance data from May 2008 to April 2009, we concur with 
the contractor estimate that it will underrun the current budget—worth 
$1.5 billion—by $15.0 million at completion. 
Contract percent complete: 89%; 
Estimates at completion: 
* Contractor: $1.465 billion; 
* GAO: $1.465 billion. 
End of sidebar] 
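The percent-complete and estimate-at-completion figures reported in the 
sidebar above are related by straightforward earned value arithmetic: 
percent complete is cumulative earned value divided by the budget at 
completion, and the variance at completion is the budget at completion 
minus the estimate at completion. A minimal sketch, using hypothetical 
values rather than ERAM contract data: 

# Hypothetical percent complete and variance at completion (VAC).
bac = 1480.0   # budget at completion, $ millions
bcwp = 1317.0  # cumulative earned value, $ millions
eac = 1465.0   # estimate at completion, $ millions

percent_complete = 100.0 * bcwp / bac
vac = bac - eac  # positive values indicate a projected underrun

print(f"Percent complete: {percent_complete:.0f}%  VAC: {vac:+.1f}")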

Figure 14: GAO EV Data Analysis of Transportation's ERAM Program 
(dollars in millions): 

[Refer to PDF for image: line graph] 

Date: May, 2008; 	
Cumulative cost variance: $10; 
Cumulative schedule variance: -$0.6. 

Date: June, 2008; 	
Cumulative cost variance: $12.2; 
Cumulative schedule variance: -$0.4. 

Date: July, 2008; 	
Cumulative cost variance: $17.4; 
Cumulative schedule variance: $1.9. 

Date: August, 2008; 	
Cumulative cost variance: $17; 
Cumulative schedule variance: $4.7. 

Date: September, 2008; 	
Cumulative cost variance: $17; 
Cumulative schedule variance: $4.7. 

Date: October, 2008; 	
Cumulative cost variance: $17.7; 
Cumulative schedule variance: $4.2. 

Date: November, 2008; 
Cumulative cost variance: $17.7; 
Cumulative schedule variance: $6.8.		 

Date: December, 2008; 	
Cumulative cost variance: $22.5; 
Cumulative schedule variance: $5.5. 

Date: January, 2009; 	
Cumulative cost variance: $31.9; 
Cumulative schedule variance: $14.1. 

Date: February, 2009; 	
Cumulative cost variance: $32.7; 
Cumulative schedule variance: $9.9. 

Date: March, 2009; 	
Cumulative cost variance: $37.1; 
Cumulative schedule variance: $16.4. 

Date: April, 2009; 	
Cumulative cost variance: $36.9; 
Cumulative schedule variance: $15.9. 

Source: GAO analysis of Department of Transportation (Federal Aviation 
Administration) data. 

[End of figure] 

Surveillance and Broadcast System: 

[Sidebar: 
Investment Details: 
Department of Transportation (Federal Aviation Administration); 
Program start date: August 2007; 
Total life-cycle cost: 
* Current: $4.33 billion; 
* Original: $4.31 billion; 
Program end date: 
* Current: September 2035; 
* Original: September 2035; 
Rebaselines: 0; 
Major contractor: ITT Corporation. 
End of sidebar] 

The Surveillance and Broadcast System (SBS) is to provide new 
surveillance solutions that employ technology using avionics and ground 
stations for improved accuracy and update rates and to provide shared 
situational awareness (including visual updates of traffic, weather, 
and flight notices) between pilots and air traffic control. These 
technologies are considered critical to achieving the Federal Aviation 
Administration's strategic goals of decreasing the rate of accidents 
and incursions, improving the efficiency of air traffic, and reducing 
congestion. 

Initiation:
Development: 
Operations and maintenance: 

Source: GAO analysis of Department of Transportation (Federal Aviation 
Administration) data. 

Table 22: GAO EVM Practice Assessment of Transportation's SBS Program: 

Program management area of responsibility: Establish a comprehensive 
EVM system; 
Key practices: 
* Define the scope of effort using a work breakdown structure; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Identify who in the organization will perform the work; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Schedule the work; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Estimate the labor and material required to perform the work and 
authorize the budgets, including management reserve; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Determine objective measure of earned value; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Develop the performance measurement baseline; 
GAO assessment: The program fully implemented the EVM practices in this 
program management area. 

Program management area of responsibility: Ensure that the data 
resulting from the EVM system are reliable; 
Key practices: 
* Execute the work plan and record all costs; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Analyze EVM performance data and record variances from the 
performance measurement baseline plan; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Forecast estimates at completion; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area. 

Program management area of responsibility: Ensure that the program 
management team is using earned value data for decision-making 
purposes; 
Key practices: 
* Take management action to mitigate risks; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area; 
* Update the performance measurement baseline as changes occur; 
GAO assessment: The program fully implemented all EVM practices in this 
program management area. 

Source: GAO analysis of Department of Transportation (Federal Aviation 
Administration) data. 

[End of table] 

SBS fully implemented all 11 key EVM practices. Specifically, SBS has 
institutionalized EVM at the program level--meaning that it collects 
and manages performance data on the contractor and government work 
efforts--in order to get a comprehensive view into program status. As 
part of this initiative, SBS performed detailed validation reviews of 
the contractor and program baselines; issued various process rules on 
resource planning, EV metrics, and data analysis; and collected 
government timecard data in order to ensure consistent EV application. 
In addition, the program management team conducted rigorous reviews of 
EV performance with the SBS program manager and the program's internal 
management review board on a monthly basis. Our analysis of the SBS 
master schedule showed that it was developed in accordance with 
scheduling best practices. For example, the schedule was properly 
sequenced, and the resources were assigned. Furthermore, SBS briefed 
the program manager monthly on the quality of the schedule to identify, 
for example, tasks without predecessors. 
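A schedule-quality check of the kind described above, identifying tasks 
without predecessors or successors, can be run against an exported task 
list. The sketch below illustrates the idea on a small, hypothetical 
network; it is not the tool the SBS program uses. 

# Hypothetical schedule health check: find tasks with missing logic links.
# In a well-formed network only the start milestone has no predecessor and
# only the finish milestone has no successor; anything else is flagged.
tasks = {
    # task id: list of predecessor task ids (illustrative data only)
    "START": [],
    "1010": ["START"],
    "1020": ["1010"],
    "FINISH": ["1020"],
    "1040": [],            # dangling task: no predecessor and no successor
}

referenced = {p for preds in tasks.values() for p in preds}
no_predecessors = [t for t, preds in tasks.items() if not preds]
no_successors = [t for t in tasks if t not in referenced]

print("Tasks without predecessors:", no_predecessors)  # ['START', '1040']
print("Tasks without successors:  ", no_successors)    # ['FINISH', '1040']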

[Sidebar: 
EV Performance Details: 
As of May 2009, SBS outperformed its planned cost targets by $14.7 
million. However, for this same period, it has been unable to complete 
$24.0 million worth of work. The strong cost performance is attributed 
to the ITT Corporation’s overestimation of systems engineering 
resources needed to complete work and better-than-expected performance 
for activities associated with system safety, among other things. The 
negative schedule variances are due in part to delays caused by the 
resolution of radio hardware issues found during testing. 
Based on performance data from June 2008 to May 2009, we estimate that 
SBS will most likely exceed the program’s current budget of about $1 
billion by about $21 million. 
Program percent complete: 27%; 
Estimates at completion: 
* Program: $966.3 million; 
* GAO: $1.015 billion. 
End of sidebar] 

Figure 15: GAO EV Data Analysis of Transportation's SBS Program 
(dollars in millions): 

[Refer to PDF for image: line graph] 

Date: June, 2008; 	
Cumulative cost variance: $3.2; 
Cumulative schedule variance: -$9.2. 

Date: July, 2008; 	
Cumulative cost variance: $1.5; 
Cumulative schedule variance: -$12.4. 

Date: August, 2008; 	
Cumulative cost variance: $3.4; 
Cumulative schedule variance: -$12.5. 

Date: September, 2008; 	
Cumulative cost variance: $3.7; 
Cumulative schedule variance: -$9. 

Date: October, 2008; 	
Cumulative cost variance: $5.9; 
Cumulative schedule variance: -$10.2. 

Date: November, 2008; 
Cumulative cost variance: $6.2; 
Cumulative schedule variance: -$12.		 

Date: December, 2008; 	
Cumulative cost variance: $4.5; 
Cumulative schedule variance: -$13.5. 

Date: January, 2009; 	
Cumulative cost variance: $8.1; 
Cumulative schedule variance: -$12.5. 

Date: February, 2009; 	
Cumulative cost variance: $9.3; 
Cumulative schedule variance: -$10.5. 

Date: March, 2009; 	
Cumulative cost variance: $10.1; 
Cumulative schedule variance: -$13.3. 

Date: April, 2009; 	
Cumulative cost variance: $11.7; 
Cumulative schedule variance: -$19.6. 

Date: May, 2009; 	
Cumulative cost variance: $14.7; 
Cumulative schedule variance: -$24. 

Source: GAO analysis of Department of Transportation (Federal Aviation 
Administration) data. 

[End of figure] 

Veterans Health Information Systems and Technology Architecture-- 
Foundations Modernization: 

[Sidebar: 
Investment Details: 
Department of Veterans Affairs; 
Program start date: 2006; 
Total life-cycle cost: 
* Current: $1.897 billion; 
* Original: $1.897 billion; 
Program end date: 
* Current: 2016; 
* Original: 2016; 
Rebaselines: 0; 
Major contractor: None: in-house development. 
End of sidebar] 

The Veterans Health Information Systems and Technology Architecture-- 
Foundations Modernization (VistA-FM) program addresses the need to 
transition the Veterans Affairs electronic medical record system to a 
new architecture. According to the department, the current system is 
costly and difficult to maintain and does not integrate well with newer 
software packages. VistA-FM is designed to provide a new architectural 
framework as well as additional standardization and common services 
components. This is intended to eliminate redundancies in coding and 
support interoperability among applications. Ultimately, the new 
architecture will lay the foundation for a new generation of computer 
systems in support of caring for America's veterans. During the course 
of our review, the department's Chief Information Officer suspended 
multiple components of the VistA-FM program until a new development 
plan can be put in place. This action was taken as part of a new 
departmentwide initiative to identify troubled IT projects and improve 
their execution. 

Initiation:
Development: 
Operations and maintenance: 

Source: GAO analysis of Department of Veterans Affairs data. 

Table 23: GAO EVM Practice Assessment of Veterans Affairs' VistA-FM 
Program: 

Program management area of responsibility: Establish a comprehensive 
EVM system; 
Key practices: 
* Define the scope of effort using a work breakdown structure; 
GAO assessment: The program did not implement all EVM practices in this 
program management area; 
* Identify who in the organization will perform the work; 
GAO assessment: The program did not implement all EVM practices in this 
program management area; 
* Schedule the work; 
GAO assessment: The program did not implement all EVM practices in this 
program management area; 
* Estimate the labor and material required to perform the work and 
authorize the budgets, including management reserve; 
GAO assessment: The program partially implemented all EVM practices in 
this program management area; 
* Determine objective measure of earned value; 
GAO assessment: The program did not implement all EVM practices in this 
program management area; 
* Develop the performance measurement baseline; 
GAO assessment: The program did not implement the EVM practices in this 
program management area. 

Program management area of responsibility: Ensure that the data 
resulting from the EVM system are reliable; 
Key practices: 
* Execute the work plan and record all costs; 
GAO assessment: The program partially implemented all EVM practices in 
this program management area; 
* Analyze EVM performance data and record variances from the 
performance measurement baseline plan; 
GAO assessment: The program partially implemented all EVM practices in 
this program management area; 
* Forecast estimates at completion; 
GAO assessment: The program partially implemented all EVM practices in 
this program management area. 

Program management area of responsibility: Ensure that the program 
management team is using earned value data for decision-making 
purposes; 
Key practices: 
* Take management action to mitigate risks; 
GAO assessment: The program did not implement all EVM practices in this 
program management area; 
* Update the performance measurement baseline as changes occur; 
GAO assessment: The program did not implement all EVM practices in this 
program management area. 

Source: GAO analysis of Department of Veterans Affairs data. 

[End of table] 

VistA-FM partially met 4 key practices and did not meet 7 others, 
despite reporting compliance with the American National Standards 
Institute (ANSI) standard in its 2010 business case submission. 
Specifically, the program is still working to establish a comprehensive 
EVM system to meet ANSI compliance, among other things. For example, 
the work breakdown structure is organized around key program milestones 
instead of product deliverables, and does not fully describe the scope 
of work to be performed. Although the program's subprojects maintain 
their own schedules, VistA-FM does not currently have an integrated 
master schedule at the program level. This is of concern because it is 
not possible to establish the program's critical path and the time- 
phased budget baseline, a key component of EVM. Data reliability is 
also a potential issue because the program's EVM reports do not offer 
adequate detail to assess it. 
Additionally, the performance baseline has not been appropriately 
updated; program officials stated this update is in progress, but they 
did not have a completion date. 
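The time-phased budget baseline referred to above is the cumulative 
budgeted cost of work scheduled, built up period by period from the 
budgets of the scheduled work packages, which is why it cannot be 
produced without an integrated schedule. A minimal sketch with 
hypothetical work packages, each spreading its budget evenly across its 
scheduled months: 

# Hypothetical time-phased performance measurement baseline (cumulative BCWS).
from itertools import accumulate

work_packages = [
    # budget in $ millions, start month index, duration in months (illustrative)
    {"budget": 6.0, "start": 0, "months": 3},
    {"budget": 4.0, "start": 1, "months": 2},
    {"budget": 9.0, "start": 2, "months": 3},
]

horizon = max(wp["start"] + wp["months"] for wp in work_packages)
monthly_bcws = [0.0] * horizon
for wp in work_packages:
    per_month = wp["budget"] / wp["months"]
    for m in range(wp["start"], wp["start"] + wp["months"]):
        monthly_bcws[m] += per_month

cumulative_bcws = list(accumulate(monthly_bcws))
print("Cumulative BCWS by month:", [round(v, 1) for v in cumulative_bcws])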

[Sidebar: 
EV Performance Details: 
Based on performance data from June 2008 to May 2009, VistA-FM has 
experienced continual negative cost and schedule variances. 
Specifically, as of May 2009, the program has exceeded its planned cost 
target by $14.9 million, and has not completed $24.9 million in planned 
work. Program officials cited resource availability and 
interdependencies among projects as key drivers of cost and schedule 
variances. We estimate that the program will overrun its current budget—
worth approximately $1.897 billion—by $350.2 million. 
Program percent complete: 10%; 
Estimates at completion: 
* Program: $1.897 billion; 
* GAO: $2.248 billion. 
End of sidebar] 

Figure 16: GAO EV Data Analysis of Veterans Affairs' VistA-FM Program 
(dollars in millions): 

[Refer to PDF for image: line graph] 

Date: June, 2008; 	
Cumulative cost variance: $0.9; 
Cumulative schedule variance: -$25.7. 

Date: July, 2008; 	
Cumulative cost variance: $0.7; 
Cumulative schedule variance: -$26.7. 

Date: August, 2008; 	
Cumulative cost variance: $0.6; 
Cumulative schedule variance: -$26.5. 

Date: September, 2008; 	
Cumulative cost variance: -$14; 
Cumulative schedule variance: -$27.4. 

Date: October, 2008; 	
Cumulative cost variance: -$11.2; 
Cumulative schedule variance: -$24.8. 

Date: November, 2008; 
Cumulative cost variance: -$11.1; 
Cumulative schedule variance: -$24.7.		 

Date: December, 2008; 	
Cumulative cost variance: -$11.2; 
Cumulative schedule variance: -$24.9. 

Date: January, 2009; 	
Cumulative cost variance: -$11.4; 
Cumulative schedule variance: -$25. 

Date: February, 2009; 	
Cumulative cost variance: -$15.1; 
Cumulative schedule variance: -$24.6. 

Date: March, 2009; 	
Cumulative cost variance: -$15.2; 
Cumulative schedule variance: -$25.3. 

Date: April, 2009; 	
Cumulative cost variance: -$15; 
Cumulative schedule variance: -$24.6. 

Date: May, 2009; 	
Cumulative cost variance: -$14.9; 
Cumulative schedule variance: -$24.9. 

Source: GAO analysis of Department of Veterans Affairs data. 

[End of figure] 

[End of section] 

Appendix III: Comments from the Department of Commerce: 

The Secretary Of Commerce: 
Washington, D.C. 20230
	
September 18, 2009: 

Mr. David A. Powner: 
Director, Information Technology Management Issues: 
Government Accountability Office: 
441 G Street, N.W. 
Washington, DC 20548: 

Dear Mr. Powner: 

Thank you for the opportunity to review the draft of the Government 
Accountability Office's (GAO) report, Information Technology: OMB and 
Agencies Need to Improve the Implementation and Use of Earned Value 
Techniques to Help Manage Major System Acquisitions (GAO-10-2). The 
report provides a comprehensive overview of the utilization of Earned 
Value Management (EVM) best practices and the degree to which they are 
implemented on major IT projects across the Federal Government. 

GAO looked at sixteen major IT acquisition projects in eight different 
agencies. Two of the sixteen were Department of Commerce projects, both 
of which fall under the U.S. Census Bureau and are vital to conducting 
the 2010 Decennial Census: The Decennial Response Integration System 
(DRIS) and the Field Data Collection Automation system (FDCA). The 
draft report provides three recommendations for the heads of the 
agencies, one of which addresses the development of Department-wide EVM 
policies in seven key areas, and the other two address the extent to 
which the sixteen individual IT investment programs are complying with 
11 key EVM practices. Specifically, the GAO recommends that the eight 
department heads: 

1) modify agency-wide EVM policies to ensure that they address the 
weaknesses identified in the report (e.g., for the Department of 
Commerce, the failure to establish a standard Department-wide structure 
for defining work products); 

2) direct key system acquisition programs to fully implement the eleven 
key EVM practices discussed in the report; and; 

3) take action to reverse the current negative performance trends, as 
shown in the earned value data, to mitigate potential cost and schedule 
overruns. 

Regarding the second and third recommendations, I am pleased that the 
Census Bureau's DRIS contract was one of only three that were found to 
have fully implemented all eleven key EVM practices among those 
reviewed by GAO. Thus, these recommendations apply to the FDCA project. 
While I am also pleased that the FDCA program was found to have fully 
implemented six of the key practices—especially considering the 
difficulties it experienced in prior years—and to have at least 
partially implemented the remaining five, your audit illustrates the 
work that still needs to be done. As you know, the Census Bureau re-
baselined the FDCA program and revised the scope of this contract in 
October 2009. Since that time, the Commerce Investment Review Board 
(IRB) has been meeting quarterly with the FDCA program management team 
and Senior Decennial Census Staff to analyze EVM data and to track cost 
and schedule performance. In order to strengthen the IRB's oversight of 
major IT acquisitions, we have recently elevated the IRB within the 
Department of Commerce (Department); it will now be chaired by the 
Deputy Secretary of Commerce. These actions should move the FDCA 
program, and all of the Department's major IT acquisition projects, 
toward our objective of full compliance with GAO's key EVM practices. 

Regarding the first recommendation, the Department of Commerce 
understands and appreciates the value of standardized work structures. 
However, we maintain that the development of standardized work 
structures should take place at the operating unit level, given the 
wide diversity of missions and project complexity among the 
Department's operating units. 

Finally, we suggest the following edits (in italics) on page 38 of the 
draft report to clarify that the cost changes to the DRIS contract are 
not due solely to the decision to revert to a paper-based Non Response 
Follow Up in the 2010 Decennial Census: 

"The DRIS program's estimated lifecycle costs have increased by $372 
million, most of which is due to increases in both paper and telephone 
workloads. The paper workload increased due to an April 2008 redesign 
of the 2010 Census that reverted planned automated operations to paper-
based processes and requires DRIS to process an additional estimated 40 
million paper forms. The telephone workload increased as a result of 
Decennial Census program decisions to: (1) conduct all coverage follow-
up (CFU) cases by telephone, which will allow us to resolve more 
potential coverage errors than a combination of telephone and field 
follow-up; and (2) conduct CFU on more types of cases. The CFU cases 
derive from situations where questionnaire response data indicate 
potential coverage problems for a household." 

Thank you again for the opportunity to participate in this GAO study. 
Should you or a member of your staff have any questions, please contact 
Izella Dornell, Director of the Department's Program Management Office, 
at (202) 482-1888 or idornell@doc.gov. 

Sincerely, 

Signed by: 

Gary Locke: 

[End of section] 

Appendix IV: Comments from the Department of Defense: 

Office Of The Under Secretary Of Defense: 
Acquisition, Technology	And Logistics: 
3000 Defense Pentagon: 
Washington, DC 20301-3000: 

September 15, 2009: 

Mr. David A. Powner: 
Director, Information Technology Management Issues: 
U.S. Government Accountability Office: 
441 G Street, N.W. 
Washington, DC 20548: 

Dear Mr. Powner: 

This is the Department of Defense (DoD) response to the GAO draft 
report GAO-10-02, "Information Technology: Agencies Need to Improve the 
Implementation and Use of Earned Value Techniques to Help Manage Major 
System Acquisitions," dated August 20, 2009 (GAO Code 310894). Detailed 
comments on the report recommendations are enclosed. 

We appreciate the opportunity to comment on the draft report. Should 
you have any additional questions, please contact Mr. Michael Pelkey, 
703-614-1253, michael.pelkey@osd.mil. 

Sincerely, 

Signed by: 

Susan Hildner, for: 

Shay D. Assad: 
Director, Defense Procurement and Acquisition Policy: 

Enclosure: As stated: 

[End of letter] 

GAO Draft Report Dated August 20, 2009: 
GAO-10-02 (GAO Code 310894): 

"Information Technology: Agencies Need To Improve The Implementation 
And Use Of Earned Value Techniques To Help Manage Major System 
Acquisitions" 

Department Of Defense Comments To The GAO Recommendations: 

Recommendation 1: The GAO recommends that the Secretary of Defense 
modify policies governing earned value management (EVM) to ensure that 
they address the weaknesses that we identified, taking into 
consideration the criteria used in this report. 

DOD Response: Concur. The Department agrees with the Key Components of 
an Effective EVM Policy identified in Table 1 of the draft report. The 
only weakness in the Department's implementation of these Key 
Components identified by the GAO is that some agencies do not have a 
formal EVM training program for all personnel with program management 
and investment oversight responsibilities. The DoD includes EVM 
training modules in the training required for certification in every 
acquisition competency under the Defense Acquisition Workforce 
Improvement Act. 

A Defense Support Team recently reviewed the Department's 
implementation of EVM and recommended a review of Defense Acquisition 
University training curricula to determine the quality and 
applicability of EVM training in acquisition career fields. Based on 
the results of this analysis, enhanced EVM training modules or new 
courses may be implemented. 

Recommendation 2: The GAO recommends that the Secretary of Defense 
direct key system acquisition programs to implement the EVM practices 
that address the detailed weaknesses that we identified in Appendix II, 
taking into consideration the criteria used in this report. 

DOD Response: Concur. The Department agrees with the Key EVM Practices 
for System Acquisition Programs identified in Table 3 of the draft 
report. The DoD Federal Acquisition Regulations Supplement and DoD 
Instruction 5000.02 establish requirements for use of EVM in all 
acquisition programs with cost- or incentive-type contracts over $20 
million. EVM data is required and routinely used at all levels of 
acquisition program management and oversight. These requirements, and 
related oversight procedures, are implemented via procedures, processes 
and guidance documents at both the Departmental and Component levels.
As the Department's Acquisition Executive, the USD(AT&L) requires that 
all levels of program management and oversight routinely use EVM data. 
At program reviews and milestone decisions, both the status of the 
contractor's EVM system and EVM metrics describing program status are 
considered. The Defense Contract Management Agency is the preeminent 
authority on validation of contractor EVM systems and contractor 
compliance with their approved systems. The Department will direct that 
Program Managers of the Air and Space Operations Center — Weapon System 
(AOC-WS), Joint Tactical Radio System — Handheld, Manpack, and Small 
Form Fit (JTRS HMS), and Warfighter Information Network — Tactical (WIN-
T) programs, at their next program review or milestone, describe the 
actions taken to address the weaknesses identified by the GAO. 

Recommendation 3: The GAO recommends that the Secretary of Defense 
direct key system acquisition programs to take action to reverse 
current negative performance trends, as shown in the earned value data, 
to mitigate the potential cost and schedule overruns. 

DOD Response: Concur. The Department concurs that it is essential to 
maintain appropriate oversight of acquisition programs, including the 
use of EVM data to understand program status and anticipate potential 
problems. The Department has a comprehensive acquisition program 
management oversight process that mandates the use of EVM data for 
these purposes, as detailed in DoD Directive 5000.01 and DoD 
Instruction 5000.02. Program Managers provide monthly reports on cost, 
schedule, and technical performance trends, and the planned steps to 
mitigate known problems and risks. The Department will issue direction 
to Program Managers to take action to reverse negative cost or schedule 
trends identified in EVM data. 

[End of section] 

Appendix V: Comments from the Department of Justice: 

U.S. Department of Justice: 
Washington, DC 20530: 

September 19, 2009: 

Mr. David A. Powner: 
Director, Information Technology Management Issues: 
United States Government Accountability Office: 
441 G Street, NW: 
Washington, DC 20548: 

Dear Mr. Powner: 

The Department of Justice has reviewed the Government Accountability 
Office's (GAO) draft report "Information Technology: OMB and Agencies 
Need to Improve the Implementation and Use of Earned Value Techniques 
to Help Manage Major System Acquisitions" (GAO-10-2). The GAO made three 
recommendations in its draft report. After discussion with your office, 
it was agreed that recommendation number two was inadvertently directed 
to the Department and we did not need to respond. Consequently, we are 
responding to recommendation numbers one and three only. 

GAO Recommendation #1: Modify policies governing EVM to ensure that 
they address weaknesses that we identified, taking into consideration 
the criteria used in this report. 

DOJ Response: The DOJ concurs with this recommendation. The Department 
received full credit in five out of the seven EVM policy assessment 
areas. The two areas in which the Department received partial credit 
due to weaknesses identified by GAO were in the areas of: 1) training 
requirements and 2) use of standard structures for defining the work 
products. 

In the area of training requirements, GAO recommended that our EVM 
policy require training for all personnel with investment oversight and 
program management responsibilities. In response, the Department has 
modified the DOJ Information Resources Management Policy Order 2880.1 B 
on EVM training. The policy order now requires that the DOJ Chief 
Information Officer enforce EVM training for all personnel with 
investment oversight and program management responsibilities. The DOJ 
EVM Implementation Guide, which implements the policy order, further 
describes the guidance on this requirement to include all executive 
personnel with oversight responsibilities that need to understand EVM 
concepts to make sound investment decisions and all program personnel 
who work with or would like to be more proficient in EVM. 

In the area of standard structures for defining work products, GAO 
stated the Department has yet to standardize its product structures. 
The Department recently implemented a new requirement and currently, 
the DOJ EVM Implementation Guide requires that the DOJ components use 
a standard, product-oriented Work Breakdown Structure (WBS) for their 
major developmental IT efforts. The Department will require the use of 
a standard, product-oriented WBS in future statements of work for major 
IT developmental efforts. 

GAO Recommendation #3: Direct key system acquisition programs to take 
action to reverse current negative performance trends, as shown in the 
earned value data, to mitigate potential cost and schedule overruns. 

DOJ Response: The DOJ concurs with this recommendation. The GAO 
reviewed the FBI's Next Generation Identification (NGI) program. The 
GAO estimated that the NGI major development contract will overrun its 
current budget by an estimated $1.6M. The NGI program office is 
committed to remaining within its overall program budget. To provide 
additional focus on cost, a formal risk has been opened to monitor and 
address EVM thresholds. The NGI program office has established an 
Executive Change Control Board to prevent scope increases. If any 
changes were proposed that would affect scope within NGI, they would be 
subject to review and approval by the oversight board which is chaired 
by the FBI Deputy Director. The major development contractor has 
established a Tiger Team to identify root causes for past variances; 
reviewed the work remaining; and applied corrective actions including 
implementing "system administrator day-in-the-life" testing. In 
addition, the major development contractor is utilizing a rolling wave 
planning approach, in which the contractor establishes periods of 
performance at the lowest-level tasking as the next consecutive segment 
of work is detailed. Effort taking place outside the planning window is 
identified in Planning Packages. As the planning window (90-day 
minimum) opens, the effort is then planned in detail into Work 
Packages. Finally, an Independent Verification and Validation vendor 
has been established that independently reports on a monthly basis to 
the Executive Assistant Director for Science and Technology on the 
status of the program. 

The Department appreciates this opportunity to comment on the draft 
report prepared by the GAO. 

Should you have any questions regarding this topic, please do not 
hesitate to contact Richard Theis, DOJ Audit Liaison on 202-514-0469. 

Sincerely, 

Signed by: 

Lee J. Lofthus: 
Assistant Attorney General for Administration: 

[End of section] 

Appendix VI: Comments from the National Aeronautics and Space 
Administration: 

National Aeronautics and Space Administration: 	
Office of the Administrator: 	
Washington, DC 20546-0001: 

September 18, 2009: 

Mr. David A. Powner: 
Director: 
Information Technology Management Issues: 
United States Government Accountability Office: 
Washington, DC 20548: 

Dear Mr. Powner: 

NASA appreciates the opportunity to comment on your draft report 
entitled, "Information Technology: Agencies Need to Improve the 
Implementation and Use of Earned Value Techniques to Help Manage Major 
System Acquisitions," (GAO-10-02). 

While we acknowledge that opportunities exist for improvement regarding 
the implementation of Earned Value Management (EVM), the NASA projects 
included in the scope of the audit integrate and rely on various 
elements of Information Technology (IT), but are not IT-specific 
projects. Governance of the NASA projects identified in the draft 
report is derived from NASA space flight guidance rather than NASA 
information technology guidance. 

In the draft report, GAO makes three recommendations intended to 
address weaknesses identified in federal agencies' (including NASA's) 
policies and practices using EVM. Specifically, GAO recommends the 
following: 

Recommendation 1: Modify policies governing EVM to ensure that they 
address the weaknesses that we identified, taking into consideration 
the criteria used in this report. 

Response: Partially concur. NASA is revising NASA Procedural 
Requirements (NPR) for programs and projects to include expanded and 
strengthened policies governing EVM application and processes. 
Specifically, policy revisions address the following areas raised by 
your review: 

1) Standard structure for defining IT work products: The three NASA 
projects included in the scope of the GAO audit were space flight 
projects governed by space flight project management policy. GAO did 
not audit any of NASA's IT projects which are governed by IT project 
management policy. As cited in the GAO report, NASA's IT project 
management policy does not currently require a standardized work 
breakdown structure for IT projects. It should be noted that the amount of 
FY 2010 spending (per the FY 2010 budget request) on IT projects is 
approximately $978M, or 5 percent of NASA's budget. Of that, only $170M 
is development, modernization and enhancement (DME), or 1 percent of 
NASA's budget. There are currently no DME IT investments governed by IT 
project management policy that meet the EVM requirement of $20M. 
However, NASA recognizes the importance of a standardized
WBS for future IT DME projects and is developing a standard WBS for 
those activities. Estimated completion date for approval of the IT 
project management policy interim directive is October 2010. 

2) EVM Training Requirements: NASA agrees with the GAO that training is 
an essential input to the proper application of EVM throughout the 
Agency. NASA also believes that ongoing enhancements to training 
courses and programs build continuous improvement into NASA's EVM. 
However, it has been NASA's view that mandating EVM training without 
having the context in which it can be immediately applied is not as 
effective as voluntary, timely, and relevant EVM training in terms of 
the student's ability to both retain and apply learning. NASA's Academy 
of Program, Project and Engineering Leadership (APPEL) EVM and Schedule 
Management training courses are updated with relevant policy, 
methodology, lessons learned, and analytical knowledge to benefit our 
program and project teams at all levels. These APPEL courses are 
augmented by NASA’s self-paced, online EVM training 
offerings and by tool-based and role-based training courses in place at 
NASA field centers. Additionally, over the past year, APPEL has 
increased the number of project-dedicated, tailored training 
engagements to focus on the unique challenges of specific projects. 

Over the past two years, NASA has offered 19 EVM classes and has 
trained 409 participants. Of those offerings, one was project specific 
and 15 were provided onsite at Centers, providing an opportunity for 
multiple project team members to attend and discuss EVM as it applies 
to their respective projects. Project-based EVM training has been very 
successful and has the potential to increase EVM effectiveness by 
taking the subject and lessons learned directly to the project 
practitioners "just-in-time." Training that is provided "just-in-time" 
and tailored to current project issues is immediately applicable on-the-
job, effective, and dynamic. This is in addition to the many program 
and project management personnel who have been trained in EVM prior to 
2007 and who have applied it on projects during their career. 

3) Rebaselining Criteria: NASA recognizes the need to enhance the re-
baselining policy. The policy governing EVM and baseline revision is 
contained in space flight project management policy. An interim 
directive including some revisions to the rebaselining policy will be 
issued in September 2009. The policy will continue to be refined taking 
into account the GAO recommendations as part of a formal revision which 
is due by October 2010. IT project management policy will be updated 
following the space flight project management policy to reflect 
applicable changes to the baseline policy. 

Recommendation 2: Direct key system acquisition programs to implement 
the EVM practices that address the detailed weaknesses that we 
identified in Appendix II, taking into consideration the criteria used 
in this report. 

Response: Concur. NASA acknowledges the identified weaknesses and will 
work toward closing the gaps. NASA acknowledges that EVM is a valuable 
performance assessment tool and utilizes it as such. NASA also employs 
a number of other performance assessment tools and a comprehensive 
governance and review process at the project, program, Mission 
Directorate, Center and Agency levels to assess project performance and 
reduce mission risk. NASA believes that mission success is not 
dependent on the use of a single tool or method, but rather is a 
complex proactive integrated management, governance, and review process 
that leverages a variety of tools to provide in-sight into project 
performance which allows the agency to mitigate risk and take 
corrective action when necessary. 

Recommendation 3: Direct key system acquisition programs to take action 
to reverse current negative performance trends, as shown in the earned 
value data, to mitigate the potential cost and schedule overruns. 

Response: Concur. The three NASA projects reviewed by GAO are high-risk, 
high-complexity projects. Each of these projects is reviewed on a 
monthly basis by the Program, the Science Mission Directorate, and the 
managing NASA Center, and ultimately at the Agency level. This 
hierarchical review process ensures awareness of project performance at 
all levels within the Agency, enabling leadership engagement, proactive 
and timely risk mitigation, and corrective actions. This review 
hierarchy provides 
comprehensive, integrated, and objective information that describes the 
"performance-to-plan" of the Agency's programs, projects, and 
institutional capabilities. These reviews are action oriented and 
ensure open, cross-functional communication among NASA's organizations 
to enhance Agency performance. As part of this monthly review 
structure, NASA will continue to address any negative performance 
trends, as shown in the earned value data, and continue to develop 
mitigation plans to minimize cost and schedule overruns for each 
project. NASA maintains flexibility above the level at which earned 
value is reported on contracts in order to manage variances. 

We will continue to work to mitigate the EVM weaknesses identified by 
the GAO as they pertain to NASA. If you have any questions or require 
additional information, please contact Sandra Smalley at 202-358-4731.
Thank you again for the opportunity to review this draft report; we 
look forward to your final report to Congress. 

Sincerely, 

Signed by: 

Lori B. Garver: 
Deputy Administrator: 

[End of section] 

Appendix VII: Comments from the Department of Veterans Affairs: 

The Secretary Of Veterans Affairs: 
Washington: 

September 21, 2009: 

Mr. David A. Powner: 
Director, Information Technology Management Issues: 
U.S. Government Accountability Office: 
441 G Street, NW: 
Washington, DC 20548: 

Dear Mr. Powner: 

The Department of Veterans Affairs (VA) has reviewed the Government 
Accountability Office's (GAO) draft report, Information Technology: 
Agencies Need to Improve the Implementation and Use of Earned Value 
Techniques to Help Manage Major System Acquisitions (GAO-10-2) and 
generally agrees with GAO's conclusions and concurs with GAO's 
recommendations. 

VA agrees with GAO's specific findings related to the VA VistA FM 
Program. GAO's assessment of the key Earned Value Management (EVM) 
practices is fair and accurately reflects the state of the program. 
While GAO was conducting this Governmentwide review of EVM, VA was 
conducting reviews of the state of all of its Office of Information and 
Technology (OIT) projects and reached similar findings. As a result, 9 
of the original 26 projects placed under the Program Management 
Accountability System (PMAS) are from the VistA FM Program. These 
projects are now being managed and monitored under PMAS. OIT is 
currently refining the processes related to PMAS; these processes do 
include the key practices of EVM as outlined in the draft report. VA 
will describe how it plans to implement the EVM techniques under PMAS 
in its comments on the final report 60 days after it is issued by GAO. 

VA appreciates the opportunity to comment on your draft report. 

Sincerely, 

Signed by: 

Eric K. Shinseki: 

Enclosure: 

[End of letter] 

Department of Veterans Affairs (VA) Comments to Government 
Accountability Office (GAO) Draft Report, Information Technology: 
Agencies Need to Improve the Implementation and Use of Earned Value 
Techniques to Help Manage Major System Acquisitions (GAO-10-2): 

Technical comments: 

Page 9: Background, 1st paragraph: 

Recommendation: The project information on the OMB IT Dashboard should 
be independently validated through required external audits of the 
project before schedule, cost, and evaluation status is posted. 
Currently, OMB relies on self-regulation on the part of program and 
project management. The fact that the sole reported VA VistA-FM project 
had "partially met 4 key [EVM] practices, and did not meet 7 others, 
despite reporting compliance with the ANSI standard in its 2010 
business case submission" is an example of what can occur when no 
effective oversight or review is in place. 

Page 16 to page 17, Criteria for implementing EVM on all IT major 
investments: 

This section notes, VA "only partially met this key practice because 
its policy did not clearly state whether programs or major sub-
components of programs (projects and sub-projects) had to comply with 
EVM". 

General Comment: This section aptly highlights the need for visibility 
at the project level versus the program level. Major investment 
progress reported at the program level does not provide OMB adequate 
detail at the project level for portfolio-type E-300 investments. 
Overall, the program could be doing well, but individual projects could 
have problems that might be missed. 

Page 22, 1st full paragraph: 

This section notes on performance baselines, "Weaknesses in these 
schedules included improper sequencing of activities, such as 
incomplete or missing linkages between tasks on future work 
activities". 

Recommendation(s): 

1) Work Breakdown Structure (WBS) dictionaries, which elaborate on the 
nature and substance of key scheduled activities, should be complete. 

2) This section should emphasize the inclusion of independent 
verification and validation (IV&V) and other review activities and the 
allocation of "realistic" project resources and scheduled time frames 
for those activities. 

[End of section] 

Appendix VIII: GAO Contact and Staff Acknowledgments: 

GAO Contact: 

David A. Powner, (202) 512-9286 or pownerd@gao.gov: 

Staff Acknowledgments: 

In addition to the contact name above, individuals making contributions 
to this report included Carol Cha (Assistant Director), Neil Doherty, 
Kaelin Kuhn, Jason Lee, Lee McCracken, Colleen Phillips, Karen Richey, 
Teresa Smith, Matthew Snyder, Jonathan Ticehurst, Kevin Walsh, and 
China Williams. 

[End of section] 

Related GAO Products: 

Defense Acquisitions: Assessments of Selected Weapon Programs. GAO-09-
326SP. Washington, D.C.: March 30, 2009. 

Discusses the Department of Defense's Joint Tactical Radio System-- 
Handheld, Manpack, Small Form Fit and Warfighter Information Network-- 
Tactical programs. 

Information Technology: Census Bureau Testing of 2010 Decennial Systems 
Can Be Strengthened. GAO-09-262. Washington, D.C.: March 5, 2009. 

Discusses the Department of Commerce's Decennial Response Integration 
System and Field Data Collection Automation programs. 

NASA: Assessments of Selected Large-Scale Projects. GAO-09-306SP. 
Washington, D.C.: March 2, 2009. 

Discusses the National Aeronautics and Space Administration's James 
Webb Space Telescope and Mars Science Laboratory programs. 

Air Traffic Control: FAA Uses Earned Value Techniques to Help Manage 
Information Technology Acquisitions, but Needs to Clarify Policy and 
Strengthen Oversight. GAO-08-756. Washington, D.C.: July 18, 2008. 

Discusses the Department of Transportation's En Route Automation 
Modernization and Surveillance and Broadcast System programs. 

Information Technology: Agriculture Needs to Strengthen Management 
Practices for Stabilizing and Modernizing Its Farm Program Delivery 
Systems. GAO-08-657. Washington, D.C.: May 16, 2008. 

Discusses the U.S. Department of Agriculture's Farm Program 
Modernization program. 

Information Technology: Improvements for Acquisition of Customs Trade 
Processing System Continue, but Further Efforts Needed to Avoid More 
Cost and Schedule Shortfalls. GAO-08-46. Washington, D.C.: October 25, 
2007. 

Discusses the Department of Homeland Security's Automated Commercial 
Environment program. 

Defense Acquisitions: The Global Information Grid and Challenges Facing 
Its Implementation. GAO-04-858. Washington, D.C.: July 28, 2004. 

Discusses the Department of Defense's Warfighter Information Network-- 
Tactical program. 

[End of section] 

Footnotes: 

[1] OMB Memorandum, M-05-23 (Aug. 4, 2005). 

[2] The eight agencies were the Departments of Agriculture, Commerce, 
Defense, Homeland Security, Justice, Transportation, and Veterans 
Affairs, and the National Aeronautics and Space Administration. 

[3] 44 U.S.C. §§ 3504(h), 3506(h). 

[4] GAO, Information Technology: Management and Oversight of Projects 
Totaling Billions of Dollars Need Attention, [hyperlink, 
http://www.gao.gov/products/GAO-09-624T] (Washington, D.C.: Apr. 28, 
2009); Information Technology: Treasury Needs to Better Define and 
Implement Its Earned Value Management Policy, [hyperlink, 
http://www.gao.gov/products/GAO-08-951] (Washington, D.C.: Sept. 22, 
2008); Information Technology: Further Improvements Needed to Identify 
and Oversee Poorly Planned and Performing Projects, [hyperlink, 
http://www.gao.gov/products/GAO-07-1211T] (Washington, D.C.: Sept. 20, 
2007); Information Technology: Improvements Needed to More Accurately 
Identify and Better Oversee Risky Projects Totaling Billions of 
Dollars, [hyperlink, http://www.gao.gov/products/GAO-06-1099T] 
(Washington, D.C.: Sept. 7, 2006); and Information Technology: Agencies 
and OMB Should Strengthen Processes for Identifying and Overseeing High 
Risk Projects, [hyperlink, http://www.gao.gov/products/GAO-06-647] 
(Washington, D.C.: June 15, 2006). 

[5] GAO, Information Technology: OMB Can Make More Effective Use of Its 
Investment Reviews, [hyperlink, http://www.gao.gov/products/GAO-05-276] 
(Washington, D.C.: Apr. 15, 2005). 

[6] [hyperlink, http://www.gao.gov/products/GAO-08-951] and GAO, Air 
Traffic Control: FAA Uses Earned Value Techniques to Help Manage 
Information Technology Acquisitions, but Needs to Clarify Policy and 
Strengthen Oversight, [hyperlink, 
http://www.gao.gov/products/GAO-08-756] (Washington, D.C.: July 18, 
2008). 

[7] OMB Memorandum, M-05-23 (Aug. 4, 2005). 

[8] Recognizing the importance of ensuring quality earned value data, 
ANSI and the Electronic Industries Alliance (EIA) jointly established a 
national standard for EVM systems in May 1998 (ANSI/EIA-748-A-1998). 
This standard, commonly called the ANSI standard, comprises guidelines 
that instruct programs on how to establish a sound EVM system. 
This document was updated in July 2007 and is referred to as ANSI/EIA-
748-B. 

[9] An integrated baseline review is an evaluation of a program’s 
baseline plan to determine whether all program requirements have been 
addressed, risks have been identified, mitigation plans are in place, 
and available and planned resources are sufficient to complete the 
work. 

[10] GAO, GAO Cost Estimating and Assessment Guide: Best Practices for 
Developing and Managing Capital Program Costs, [hyperlink, 
http://www.gao.gov/products/GAO-09-3SP] (Washington, D.C.: March 2009). 

[11] [hyperlink, http://www.gao.gov/products/GAO-08-756]. 

[12] [hyperlink, http://www.gao.gov/products/GAO-08-951]. 

[13] GAO, Polar-Orbiting Environmental Satellites: With Costs 
Increasing and Data Continuity at Risk, Improvements Needed in Tri-
agency Decision Making, [hyperlink, 
http://www.gao.gov/products/GAO-09-564] (Washington, D.C.: June 17, 
2009); Polar-Orbiting Operational Environmental Satellites: 
Restructuring Is Under Way, but Technical Challenges and Risks Remain, 
[hyperlink, http://www.gao.gov/products/GAO-07-498] (Washington, D.C.: 
Apr. 27, 2007); Polar-Orbiting Operational Environmental Satellites: 
Cost Increases Trigger Review and Place Program’s Direction on Hold, 
[hyperlink, http://www.gao.gov/products/GAO-06-573T] (Washington, D.C.: 
Mar. 30, 2006); Polar-Orbiting Operational Environmental Satellites: 
Technical Problems, Cost Increases, and Schedule Delays Trigger Need 
for Difficult Trade-off Decisions, [hyperlink, 
http://www.gao.gov/products/GAO-06-249T] (Washington, D.C.: Nov. 16, 
2005); and Polar-Orbiting Environmental Satellites: Information on 
Program Cost and Schedule Changes, [hyperlink, 
http://www.gao.gov/products/GAO-04-1054] (Washington, D.C.: Sept. 30, 
2004). 

[14] [hyperlink, http://www.gao.gov/products/GAO-09-3SP]. 

[15] In 13 cases, programs limited the use of EVM to system development 
work on contract. As such, earned value data will reflect contractor 
performance only. In the 3 other cases, the Farm Program Modernization, 
Surveillance and Broadcast System, and Veterans Health Information 
Systems and Technology Architecture—Foundations Modernization programs 
expanded the use of EVM to the entire program; therefore, the earned 
value data will reflect total program performance (contractor and 
government). 

[16] These programs include the Field Data Collection Automation, 
Automated Commercial Environment, Juno, and Veterans Health Information 
Systems and Technology Architecture—Foundations Modernization. 

[17] GAO, Information Technology: Treasury Needs to Better Define and 
Implement Its Earned Value Management Policy, [hyperlink, 
http://www.gao.gov/products/GAO-08-951] (Washington, D.C.: Sept. 22, 
2008). 

[18] There were 30 investments that met this criterion. 

[19] These investments include the Department of Defense's Navy 
Enterprise Resource Planning, and the Department of Homeland Security's 
Secure Border Initiative net and U.S. Visitor and Immigration Status 
Indicator Technology. 

[20] GAO, GAO Cost Estimating and Assessment Guide: Best Practices for 
Developing and Managing Capital Program Costs, [hyperlink, 
http://www.gao.gov/products/GAO-09-3SP] (Washington, D.C.: March 2009). 

[21] [hyperlink, http://www.gao.gov/products/GAO-09-3SP]. 

[End of section] 

GAO's Mission: 

The Government Accountability Office, the audit, evaluation and 
investigative arm of Congress, exists to support Congress in meeting 
its constitutional responsibilities and to help improve the performance 
and accountability of the federal government for the American people. 
GAO examines the use of public funds; evaluates federal programs and 
policies; and provides analyses, recommendations, and other assistance 
to help Congress make informed oversight, policy, and funding 
decisions. GAO's commitment to good government is reflected in its core 
values of accountability, integrity, and reliability. 

Obtaining Copies of GAO Reports and Testimony: 

The fastest and easiest way to obtain copies of GAO documents at no 
cost is through GAO's Web site [hyperlink, http://www.gao.gov]. Each 
weekday, GAO posts newly released reports, testimony, and 
correspondence on its Web site. To have GAO e-mail you a list of newly 
posted products every afternoon, go to [hyperlink, http://www.gao.gov] 
and select "E-mail Updates." 

Order by Phone: 

The price of each GAO publication reflects GAO’s actual cost of
production and distribution and depends on the number of pages in the
publication and whether the publication is printed in color or black and
white. Pricing and ordering information is posted on GAO’s Web site, 
[hyperlink, http://www.gao.gov/ordering.htm]. 

Place orders by calling (202) 512-6000, toll free (866) 801-7077, or
TDD (202) 512-2537. 

Orders may be paid for using American Express, Discover Card,
MasterCard, Visa, check, or money order. Call for additional 
information. 

To Report Fraud, Waste, and Abuse in Federal Programs: 

Contact: 

Web site: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm]: 
E-mail: fraudnet@gao.gov: 
Automated answering system: (800) 424-5454 or (202) 512-7470: 

Congressional Relations: 

Ralph Dawn, Managing Director, dawnr@gao.gov: 
(202) 512-4400: 
U.S. Government Accountability Office: 
441 G Street NW, Room 7125: 
Washington, D.C. 20548: 

Public Affairs: 

Chuck Young, Managing Director, youngc1@gao.gov: 
(202) 512-4800: 
U.S. Government Accountability Office: 
441 G Street NW, Room 7149: 
Washington, D.C. 20548: