This is the accessible text file for GAO report number GAO-12-629 
entitled 'Information Technology Cost Estimation: Agencies Need to 
Address Significant Weaknesses in Policies and Practices' which was 
released on July 27, 2012. 

This text file was formatted by the U.S. Government Accountability 
Office (GAO) to be accessible to users with visual impairments, as 
part of a longer term project to improve GAO products' accessibility. 
Every attempt has been made to maintain the structural and data 
integrity of the original printed product. Accessibility features, 
such as text descriptions of tables, consecutively numbered footnotes 
placed at the end of the file, and the text of agency comment letters, 
are provided but may not exactly duplicate the presentation or format 
of the printed version. The portable document format (PDF) file is an 
exact electronic replica of the printed version. We welcome your 
feedback. Please E-mail your comments regarding the contents or 
accessibility features of this document to Webmaster@gao.gov. 

This is a work of the U.S. government and is not subject to copyright 
protection in the United States. It may be reproduced and distributed 
in its entirety without further permission from GAO. Because this work 
may contain copyrighted images or other material, permission from the 
copyright holder may be necessary if you wish to reproduce this 
material separately. 

United States Government Accountability Office: 
GAO: 

Report to the Ranking Member, Committee on Homeland Security and 
Governmental Affairs, U.S. Senate: 

July 2012: 

Information Technology Cost Estimation: 

Agencies Need to Address Significant Weaknesses in Policies and 
Practices: 

GAO-12-629: 

GAO Highlights: 

Highlights of GAO-12-629, a report to the Ranking Member, Committee on 
Homeland Security and Governmental Affairs, U.S. Senate. 

Why GAO Did This Study: 

The federal government plans to spend at least $75 billion on 
information technology (IT) investments in fiscal year 2012. The size 
of this investment highlights the importance of reliably estimating 
the costs of IT acquisitions. A reliable cost estimate is critical to 
the success of any IT program, providing the basis for informed 
decision making and realistic budget formulation. Without the ability to 
generate such estimates, programs risk missing their cost, schedule, 
and performance targets. 

GAO was asked to (1) assess selected federal agencies’ implementation 
of cost-estimating policies and procedures, and (2) evaluate whether 
selected IT investments at these agencies have reliable cost estimates 
to support budget and program decisions. To do so, GAO compared 
policies and procedures to best practices at eight agencies. GAO also 
reviewed documentation supporting cost estimates for 16 major 
investments at these eight agencies—representing about $51.5 billion 
of the planned IT spending for fiscal year 2012. 

What GAO Found: 

While the eight agencies GAO reviewed—the Departments of Agriculture, 
Commerce, Defense, Homeland Security, Justice, Labor, and Veterans 
Affairs, and the Environmental Protection Agency—varied in the extent 
to which their cost-estimating policies and procedures addressed best 
practices, most had significant weaknesses. For example, six of the 
eight agencies had established a clear requirement for programs to 
develop life-cycle cost estimates. However, most of the eight agencies’
 policies lacked requirements for cost-estimating training, a standard 
structure for defining work products, and a central, independent cost-
estimating team, among other things. The weaknesses in agencies’ 
policies were due, in part, to the lack of a priority for establishing 
or enhancing department or agency-level cost-estimating functions. 
Until agencies address weaknesses in their policies, it will be 
difficult for them to make effective use of program cost estimates for 
informed decision making, realistic budget formulation, and meaningful 
progress measurement. 

The 16 major acquisition programs had developed cost estimates and 
were using them, in part, to support program and budget decisions. 
However, all but 1 of the estimates were not fully reliable—meaning 
that they did not fully reflect all four characteristics of a reliable 
cost estimate identified in the GAO cost-estimating guide: 
comprehensive, well-documented, accurate, and credible (see figure). 
For example, the estimates for many of these investments did not 
include all life-cycle costs, such as costs for operating and 
maintaining the system; did not adequately document the source data 
and methodologies used to develop the estimate; were not regularly 
updated so that they accurately reflected current status; and lacked 
credibility because they were not properly adjusted to account for 
risks and uncertainty. The inadequate implementation of cost-
estimating best practices was largely due to weaknesses in agencies’ 
policies. Until cost-estimating best practices are fully implemented, 
these programs face an increased risk that managers will not be able 
to effectively use their cost estimates as a sound basis for informed 
program and budget decision making. 

Figure: Assessment of Cost-Estimating Practices for Case Study 
Programs: 

[Refer to PDF for image: stacked horizontal bar graph] 

Number of investments: 

Comprehensive: 
Not met: 3; 
Partially met: 12; 
Fully met: 1. 

Well-documented: 
Not met: 5; 
Partially met: 10; 
Fully met: 1. 

Accurate: 
Not met: 6; 
Partially met: 8; 
Fully met: 2. 

Credible: 
Not met: 10; 
Partially met: 5; 
Fully met: 1. 

Source: GAO analysis of agency data. 

[End of figure] 

What GAO Recommends: 

GAO is recommending that the selected agencies modify cost-estimating 
policies to be consistent with best practices and update future cost 
estimates of the selected acquisition programs to address identified 
weaknesses. The seven agencies that commented on a draft of this 
report generally agreed with GAO’s results and recommendations, 
although the Environmental Protection Agency disagreed with the 
assessment of one of its investments. However, GAO stands by its 
assessment. 

View [hyperlink, http://www.gao.gov/products/GAO-12-629]. For more 
information, contact Valerie C. Melvin at (202) 512-6304 or 
melvinv@gao.gov. 

[End of section] 

Contents: 

Letter: 

Background: 

Selected Agencies' Cost-Estimating Policies and Procedures Have 
Significant Weaknesses: 

Selected Agencies' Program Cost Estimates Do Not Provide a Fully 
Reliable Basis for Program and Budget Decisions: 

Conclusions: 

Recommendations for Executive Action: 

Agency Comments and Our Evaluation: 

Appendix I: Objectives, Scope, and Methodology: 

Appendix II: Case Studies of Selected Programs' Cost-Estimating 
Practices: 

Appendix III: Original and Current Life-Cycle Cost Estimates for Case 
Study Programs: 

Appendix IV: Comments from the Department of Agriculture: 

Appendix V: Comments from the Department of Commerce: 

Appendix VI: Comments from the Department of Defense: 

Appendix VII: Comments from the Environmental Protection Agency: 

Appendix VIII: Comments from the Department of Homeland Security: 

Appendix IX: Comments from the Department of Labor: 

Appendix X: Comments from the Pension Benefit Guaranty Corporation: 

Appendix XI: Comments from the Department of Veterans Affairs: 

Appendix XII: GAO Contact and Staff Acknowledgments: 

Tables: 

Table 1: Key Components of an Effective Cost-Estimating Policy: 

Table 2: Assessment of Selected Agencies' Cost-Estimating Policies: 

Table 3: Four Characteristics of a Reliable Cost Estimate: 

Table 4: Assessment of Cost-Estimating Practices for Case Study 
Programs: 

Table 5: Case Study Programs: 

Table 6: Assessment of the PHIS Program's Cost Estimate: 

Table 7: Assessment of the WBSCM Program's Cost Estimate: 

Table 8: Assessment of the CLASS Program's Cost Estimate: 

Table 9: Assessment of the PE2E-SE Program's Cost Estimate: 

Table 10: Assessment of the CANES Program's Cost Estimate: 

Table 11: Assessment of the TMC Program's Cost Estimate: 

Table 12: Assessment of the FSMP Program's Cost Estimate: 

Table 13: Assessment of the SEMS Program's Cost Estimate: 

Table 14: Assessment of the IPAWS Program's Cost Estimate: 

Table 15: Assessment of the Rescue 21 Program's Cost Estimate: 

Table 16: Assessment of the NGCODIS Cost Estimate: 

Table 17: Assessment of the UFMS Program's Cost Estimate: 

Table 18: Assessment of the OIS Program's Cost Estimate: 

Table 19: Assessment of the BA Program's Cost Estimate: 

Table 20: Assessment of the HDR Program's Cost Estimate: 

Table 21: Assessment of the VBMS Program's Cost Estimate: 

Table 22: Original and Current Life-Cycle Cost Estimates for Case 
Study Programs (as of April 2012): 

Abbreviations: 

BA: Benefit Administration: 

CANES: Consolidated Afloat Networks and Enterprise Services: 

CLASS: Comprehensive Large Array-data Stewardship System: 

CODIS: Combined DNA Index System: 

DOD: Department of Defense: 

EPA: Environmental Protection Agency: 

FSMP: Financial System Modernization Project: 

HDR: Health Data Repository: 

IPAWS: Integrated Public Alert and Warning System: 

IT: information technology: 

NGCODIS: Next Generation Combined DNA Index System: 

OIS: OSHA Information System: 

OMB: Office of Management and Budget: 

OSHA: Occupational Safety and Health Administration: 

PBGC: Pension Benefit Guaranty Corporation: 

PE2E-SE: Patents End-to-End: Software Engineering: 

PHIS: Public Health Information System: 

SEMS: Superfund Enterprise Management System: 

TMC: Tactical Mission Command: 

UFMS: Unified Financial Management System: 

VBMS: Veterans Benefits Management System: 

WBSCM: Web-Based Supply Chain Management: 

[End of section] 

United States Government Accountability Office: 
Washington, DC 20548: 

July 11, 2012: 

The Honorable Susan M. Collins: 
Ranking Member: 
Committee on Homeland Security and Governmental Affairs: 
United States Senate: 

Dear Senator Collins: 

In fiscal year 2012, the federal government plans to spend at least 
$75 billion on information technology (IT) investments, many of which 
involve systems and technologies to modernize legacy systems, increase 
communication and networking capabilities, and transition to new 
systems designed to significantly improve the government's ability to 
carry out critical mission functions in the 21st century.[Footnote 1] 
Given the size of this investment, it is important that IT 
acquisitions are based on reliable estimates of costs over their full 
acquisition life cycles. The ability to generate a reliable cost 
estimate is critical to the success of any IT program, as it provides 
the basis for informed decision making, realistic budget formulation, 
and meaningful progress measurement. Without this ability, programs 
are at risk of experiencing cost overruns, missed deadlines, and 
performance shortfalls. 

This report responds to your request that we evaluate the 
implementation of cost-estimating processes at selected federal 
government departments and agencies. Specifically, our objectives were 
to (1) assess the extent to which selected departments and agencies 
have appropriately implemented cost-estimating policies and 
procedures, and (2) evaluate whether selected IT investments at these 
departments and agencies have reliable cost estimates to support 
budget and program decisions. 

To assess the extent to which selected departments and agencies have 
appropriately implemented cost-estimating policies and procedures, we 
reviewed cost-estimating policies and procedures from eight agencies. 
[Footnote 2] The eight agencies were selected from across different 
ranges of planned IT spending in fiscal year 2010.[Footnote 3] The 
number of agencies selected from each range was based on the relative 
number of IT investments within each range, and the specific agencies 
selected were those with the highest amount of planned IT spending in 
fiscal year 2010. Specifically, we chose one agency with greater than 
$10 billion in planned IT spending,[Footnote 4] five agencies with 
between $1 billion and $10 billion in planned spending, and two 
agencies with less than $1 billion in planned spending. We compared 
the agencies' policies and procedures with the best practices 
identified in GAO's cost-estimating guide[Footnote 5] to determine the 
comprehensiveness of each agency's established policies for cost 
estimating. We assessed each policy component as not met--the agency 
did not provide evidence that it addressed the 
policy component or provided evidence that it minimally addressed the 
policy component; partially met--the agency provided evidence that it 
addressed about half or a large portion of the policy component; or 
fully met--the agency provided evidence that it fully addressed the 
policy component. In addition, we interviewed relevant agency 
officials, including officials responsible for developing cost-
estimating policies. 

To evaluate whether selected IT investments at these departments and 
agencies have reliable cost estimates to support budget and program 
decisions, we reviewed individual programs' relevant cost-estimating 
documentation, including, for example, the current life-cycle cost 
estimate and schedule and technical baseline information, from 16 
major investments at the eight agencies.[Footnote 6] The 16 programs 
selected for case study (2 per agency) were among the largest in terms 
of planned spending; considered major IT investments[Footnote 7]; and 
had a higher percentage of development versus steady-state[Footnote 8] 
spending, among other things. We compared the programs' life-cycle 
cost estimates and underlying support with the best practices 
identified in GAO's cost-estimating guide[Footnote 9] to determine the 
extent to which the estimates are reliable and are being used to 
support budget and program decisions. Specifically, we assessed 
program practices against the four characteristics of a reliable 
estimate--comprehensive, well-documented, accurate, and credible. For 
each characteristic, we assessed multiple practices as being not met--
the program did not provide evidence that it implemented the practices 
or provided evidence that it only minimally implemented the practices; 
partially met--the program provided evidence that it implemented about 
half or a large portion of the practices; or fully met--the program 
provided evidence that it fully implemented the practices. We then 
summarized these assessments by characteristic. In addition, we 
interviewed relevant agency officials, including key personnel on the 
programs that we selected for case study. 

We conducted this performance audit from July 2011 through July 2012, 
in accordance with generally accepted government auditing standards. 
Those standards require that we plan and perform the audit to obtain 
sufficient, appropriate evidence to provide a reasonable basis for our 
findings and conclusions based on our audit objectives. We believe 
that the evidence obtained provides a reasonable basis for our 
findings and conclusions based on our audit objectives. Appendix I 
contains further details about our objectives, scope, and methodology. 

Background: 

Given the size and significance of the government's investment in IT, 
it is important that projects be managed effectively to ensure that 
public resources are wisely invested. Effectively managing projects 
entails, among other things, developing reliable and high-quality cost 
estimates that project realistic life-cycle costs. A life-cycle cost 
estimate provides an exhaustive and structured accounting of all 
resources and associated cost elements required to develop, produce, 
deploy, and sustain a particular program. In essence, life cycle can 
be thought of as a "cradle to grave" approach to managing a program 
throughout its useful life. Because a life-cycle cost estimate 
encompasses all past (or sunk), present, and future costs for every 
aspect of the program, regardless of funding source, it provides a 
wealth of information about how much programs are expected to cost 
over time. 

We have previously reported[Footnote 10] that a reliable cost estimate 
is critical to the success of any government acquisition program, as 
it provides the basis for informed investment decision making, 
realistic budget formulation and program resourcing, meaningful 
progress measurement, proactive course correction, and accountability 
for results. Having a realistic, up-to-date estimate of projected 
costs--one that is continually revised as the program matures--can be 
used to support key program decisions and milestone reviews. In 
addition, the estimate is often used to determine the program's budget 
spending plan, which outlines how and at what rate the program funding 
will be spent over time. Because a reasonable and supportable budget 
is essential to a program's efficient and timely execution, a reliable 
estimate is the foundation of a good budget. However, we have also 
found that developing reliable cost estimates has been difficult for 
agencies across the federal government.[Footnote 11] Too often, 
programs cost more than expected and deliver results that do not 
satisfy all requirements. 

In 2006, the Office of Management and Budget (OMB) updated its Capital 
Programming Guide, which requires agencies to develop a disciplined 
cost-estimating capability to provide greater information management 
support, more accurate and timely cost estimates, and improved risk 
assessments to help increase the credibility of program cost 
estimates.[Footnote 12] Further, according to OMB, programs must 
maintain current and well-documented estimates of costs, and these 
estimates must encompass the full life cycle of the program. Among 
other things, OMB states that generating reliable cost estimates is a 
critical function necessary to support OMB's capital programming 
process. Without this ability, programs are at risk of experiencing 
cost overruns, missed deadlines, and performance shortfalls. 

Building on OMB's requirements, in March 2009, we issued a guide on 
best practices for estimating and managing program costs that 
highlights the policies and practices adopted by leading organizations 
to implement an effective cost-estimating capability.[Footnote 13] 
Specifically, these best practices identify the need for 
organizational policies that define a clear requirement for cost 
estimating; require compliance with cost-estimating best practices; 
require management review and acceptance of program cost estimates; 
provide for specialized training; establish a central, independent 
cost-estimating team; require a standard structure for defining work 
products; and establish a process to collect and store cost-related 
data. In addition, the cost-estimating guide identifies four 
characteristics of a reliable cost estimate that management can use 
for making informed program and budget decisions: a reliable cost 
estimate is comprehensive, well-documented, accurate, and credible. 
Specifically, an estimate is: 

* comprehensive when it accounts for all possible costs associated 
with a program, is structured in sufficient detail to ensure that 
costs are neither omitted nor double counted, and documents all cost-
influencing assumptions; 

* well-documented when supporting documentation explains the process, 
sources, and methods used to create the estimate, contains the 
underlying data used to develop the estimate, and is adequately 
reviewed and approved by management; 

* accurate when it is not overly conservative or optimistic, is based 
on an assessment of the costs most likely to be incurred, and is 
regularly updated so that it always reflects the current status of the 
program; and: 

* credible when any limitations of the analysis because of uncertainty 
or sensitivity surrounding data or assumptions are discussed, the 
estimate's results are cross-checked, and an independent cost estimate 
is conducted by a group outside the acquiring organization to 
determine whether other estimating methods produce similar results. 
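
The risk and uncertainty analysis that underlies a credible estimate is 
commonly performed by simulation. The following minimal sketch 
illustrates one such approach; it is not drawn from the GAO 
cost-estimating guide or from any program in this report, and the cost 
elements and dollar figures are hypothetical. Each element is sampled 
from a low/most-likely/high range, and the confidence level of the 
point estimate is read from the resulting distribution of total costs. 

```python
import random

# Hypothetical cost-element ranges in millions of dollars:
# (low, most likely, high). Figures are illustrative only.
elements = {
    "hardware":    (40, 50, 70),
    "software":    (60, 80, 130),
    "integration": (20, 25, 45),
    "operations":  (90, 100, 140),
}

# Point estimate: the sum of the most likely costs for all elements.
point_estimate = sum(mode for _, mode, _ in elements.values())

# Simulate total program cost many times using triangular distributions.
trials = 10_000
totals = sorted(
    sum(random.triangular(low, high, mode) for low, mode, high in elements.values())
    for _ in range(trials)
)

# Confidence level of the point estimate: share of trials at or below it.
confidence = sum(t <= point_estimate for t in totals) / trials

# Cost at the 80 percent confidence level (80th percentile of the simulation).
cost_at_80 = totals[int(0.8 * trials) - 1]

print(f"Point estimate: ${point_estimate:.0f}M at about {confidence:.0%} confidence")
print(f"Cost at 80 percent confidence: ${cost_at_80:.0f}M")
```

Because the high ends of these hypothetical ranges sit farther from the 
most likely values than the low ends, the simulated point estimate 
carries well under a 50 percent confidence level, which is why a range 
of costs around the point estimate is more useful to decision makers 
than the point estimate alone. 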

We have previously reported on weaknesses associated with the 
implementation of sound cost-estimating practices at various agencies 
and the impact on budget and program decisions. For example, 

* In January 2012, we reported that the Internal Revenue Service did 
not have comprehensive guidance for cost estimating.[Footnote 14] 
Specifically, the agency's guidance did not clearly discuss the 
appropriate uses of different types of cost estimates. Further, our 
review of the agency's Information Reporting and Document Matching 
program's cost estimate found it was unreliable. Among other things, 
the program's projected budget of $115 million through fiscal year 
2016 was only partly supported by the cost estimate, which included 
costs only through fiscal year 2014. As a result, the agency did not 
have a reliable basis for the program's budget projection. We made 
multiple recommendations to improve the quality of the agency's cost 
and budget information, including ensuring that the Information 
Reporting and Document Matching program's cost estimate is reliable 
and that the agency's cost-estimating guidance is consistent and 
clearly requires the use of current and reliable cost estimates to 
inform budget requests. The agency partially agreed with these 
recommendations and stated that it has taken steps to ensure that its 
cost-estimating practices and procedures follow consistent, documented 
guidance. 

* In January 2010, we reported that the Department of Energy lacked a 
comprehensive policy for cost estimating, making it difficult for the 
agency to oversee development of high-quality cost estimates.[Footnote 
15] Specifically, the agency's policy did not describe how estimates 
should be developed and did not establish a central office for cost 
estimating. Further, we reviewed four programs at the department, each 
estimated to cost approximately $900 million or more, and reported 
that they did not have reliable cost estimates. For example, three of 
the cost estimates did not include costs for the full life cycles of 
the programs, omitting operations and maintenance costs or portions of 
program scope. Additionally, three of the cost estimates did not use 
adequate data, one of which relied instead on professional opinion. 
Further, the cost estimates did not fully incorporate risk--
specifically, they did not address correlated risks among project 
activities. As a result, these programs were more likely to exceed 
their estimates and require additional funding to be completed. We 
made multiple recommendations to improve cost estimating at the 
department, including updating its cost-estimating policy and guidance 
and ensuring cost estimates are developed in accordance with best 
practices. The Department of Energy generally agreed with our 
recommendations and stated that it had several initiatives underway to 
improve cost-estimating practices, including the development of a new 
cost-estimating policy and guidance, a historical cost database to 
support future estimates, and additional training courses. 

* Finally, we reported in December 2009 that the Department of 
Veterans Affairs had 18 construction projects that had experienced 
cost increases due, in part, to unreliable cost estimates.[Footnote 
16] For example, many estimates were completed quickly, one of which 
was a rough-order-of-magnitude estimate that was not intended to be 
relied on as a budget-quality estimate of full project costs. 
Additionally, we found that some projects had not conducted a risk 
analysis to quantify the impact of risk on the total estimated costs. 
As a result, in some cases, projects had to change scope to meet their 
initial estimate and, in others, additional funds had to be requested 
from Congress to allow the agency to complete the project. We 
recommended that the department improve cost estimating at major 
construction projects by conducting cost risk analyses and mitigating 
risks that may influence projects' costs. The Department of Veterans 
Affairs agreed with our recommendation and stated that it was taking 
steps, such as developing a multiyear construction plan to ensure that 
reliable projections of program costs are available for budgeting 
purposes, and planning to improve its risk analyses. 

Selected Agencies' Cost-Estimating Policies and Procedures Have 
Significant Weaknesses: 

According to OMB,[Footnote 17] agencies should develop a disciplined 
cost-estimating capability to provide greater information management 
support, more accurate and timely cost estimates, and improved risk 
assessments to help increase the credibility of program cost 
estimates. In addition, we have reported[Footnote 18] that leading 
organizations establish cost-estimating policies and procedures that: 

* define a clear requirement for cost estimating; 

* identify and require compliance with cost-estimating best practices, 
and validate their use; 

* require that estimates be reviewed and approved by management; 

* require and enforce training in cost estimating; 

* establish a central, independent cost-estimating team; 

* require, at a high level, a standard, product-oriented work 
breakdown structure; and: 

* establish a process for collecting and storing cost-related data to 
support future estimates. 

Table 1 describes the key components of an effective cost-estimating 
policy. 

Table 1: Key Components of an Effective Cost-Estimating Policy: 

Component: Clear requirement for cost estimating; 
Description: A clear requirement should be established for cost 
estimating, especially for all major investments. Specifically, 
agencies should clearly require every program to develop a cost 
estimate that accounts for the full program life cycle. Further, if 
agencies choose to specify more or less detail and review for 
different investments, they should clearly identify, document, and 
disseminate thresholds differentiating investments based on their size 
or strategic importance. In particular, major investments require 
special management attention. A program life-cycle cost estimate can 
increase the probability of a program's success by supporting 
effective program and budget decision making, as cost estimates are 
necessary to support decisions about funding one program over another, 
develop budget requests, develop performance measurement baselines, 
and support effective resource allocation. 

Component: Compliance with cost-estimating best practices; 
Description: The use of cost-estimating best practices, such as those 
outlined in the GAO cost guide, should be identified, required, and 
validated. These practices include, among other things, gathering cost 
data, conducting a risk and uncertainty analysis, and updating the 
estimate. Identifying and requiring the use of best practices when 
developing cost estimates should result in estimates that are 
defensible, consistent, and trustworthy. It is also important that 
cost estimators and organizations independent of the program office 
validate that program cost estimates are reliable, including assessing 
whether the estimates were developed in accordance with best practices. 

Component: Management review and approval; 
Description: The policy should require that cost estimates be reviewed 
and approved by management, and define certain aspects of this 
process. To facilitate presenting estimates to management, the entity 
to whom the estimates will be presented and the general format of the 
information provided should be identified. Examples of information 
typically provided with an estimate include, among other things, the 
quality and reliability of the technical baseline and data used, what 
level of confidence the estimate represents after all risks have been 
quantified, and the amount of contingency reserve needed to increase 
the estimate's confidence to an acceptable level. Lastly, management's 
approval of the estimate should be documented. 

Component: Training requirements; 
Description: Training in cost estimating should be required and 
enforced for personnel with program management and investment 
oversight responsibilities. Additionally, when government managers 
rely on contractors to develop cost estimates, special care must be 
taken to ensure that the government staff have enough training and 
experience to determine whether the cost estimate conforms to best 
practices. 

Component: Central, independent cost-estimating team; 
Description: The cost-estimating team and process should be 
centralized and independent of program offices. Regardless of agency size, 
this facilitates the use of standardized processes, the identification 
of resident experts, a better sharing of resources, and commonality 
and consistency of tools and training. 

Component: Standard structure for defining work products; 
Description: A standard, product-oriented work breakdown structure 
should be established, at a high level. Such a structure deconstructs 
a program's end product into successive levels with smaller specific 
elements until the work is subdivided to a level suitable for 
management control. This ensures the use of high-quality estimating 
structures that allow programs to plan and track cost and schedule by 
defined deliverables, and results in more consistent cost estimates. 
It also enables the agency to compare costs across programs, since a 
nonstandard work breakdown structure makes it extremely difficult to 
compare costs from one contractor or program to another. 
Additionally, standardizing the work breakdown structure enables an 
organization to collect and share data among programs. 

Component: Process to collect and store cost-related data; 
Description: A process should be established to collect and store 
complete actual cost-related data from past estimates. This ensures 
that data are available to support future estimates by making data 
available in retrievable cost databases, which is essential because 
cost estimating requires current and relevant cost data to remain 
credible. Additionally, to ensure the data can be used reliably for 
future estimates, the data collection effort needs to include schedule 
and technical data to allow future cost estimators to understand the 
history behind the data. 

Source: GAO-09-3SP. 

[End of table] 

While the eight agencies varied in the extent to which their cost-
estimating policies and procedures addressed best practices, most did 
not address several key components of an effective policy. 
Specifically, only the Department of Defense's (DOD) policy was fully 
consistent with all seven components. While the Department of Homeland 
Security addressed most components of an effective cost-estimating 
policy, other agencies' policies had significant weaknesses, 
particularly in cost-estimating training and in establishing a process 
to collect and store cost-related data. 

Table 2 provides a detailed assessment of each agency's policies 
against the components of an effective cost-estimating policy. In 
addition, a discussion of each policy component follows the table. 

Table 2: Assessment of Selected Agencies' Cost-Estimating Policies: 

Agency: Agriculture; 
Clear requirement for cost estimating: Fully met; 
Compliance with cost-estimating best practices: Partially met; 
Management review and approval: Partially met; 
Training requirements: Not met; 
Central, independent cost-estimating team: Not met; 
Standard structure for defining work products: Not met; 
Process to collect and store cost-related data: Not met. 

Agency: Commerce; 
Clear requirement for cost estimating: Partially met; 
Compliance with cost-estimating best practices: Not met; 
Management review and approval: Not met; 
Training requirements: Not met; 
Central, independent cost-estimating team: Not met; 
Standard structure for defining work products: Not met; 
Process to collect and store cost-related data: Not met. 

Agency: Defense; 
Clear requirement for cost estimating: Fully met; 
Compliance with cost-estimating best practices: Fully met; 
Management review and approval: Fully met; 
Training requirements: Fully met; 
Central, independent cost-estimating team: Fully met; 
Standard structure for defining work products: Fully met; 
Process to collect and store cost-related data: Fully met. 

Agency: Environmental Protection Agency; 
Clear requirement for cost estimating: Fully met; 
Compliance with cost-estimating best practices: Partially met; 
Management review and approval: Partially met; 
Training requirements: Not met; 
Central, independent cost-estimating team: Not met; 
Standard structure for defining work products: Not met; 
Process to collect and store cost-related data: Not met. 

Agency: Homeland Security; 
Clear requirement for cost estimating: Fully met; 
Compliance with cost-estimating best practices: Fully met; 
Management review and approval: Fully met; 
Training requirements: Partially met; 
Central, independent cost-estimating team: Fully met; 
Standard structure for defining work products: Partially met; 
Process to collect and store cost-related data: Not met. 

Agency: Justice; 
Clear requirement for cost estimating: Fully met; 
Compliance with cost-estimating best practices: Partially met; 
Management review and approval: Not met; 
Training requirements: Not met; 
Central, independent cost-estimating team: Not met; 
Standard structure for defining work products: Partially met; 
Process to collect and store cost-related data: Not met. 

Agency: Labor; 
Clear requirement for cost estimating: Fully met; 
Compliance with cost-estimating best practices: Fully met; 
Management review and approval: Fully met; 
Training requirements: Partially met; 
Central, independent cost-estimating team: Not met; 
Standard structure for defining work products: Not met; 
Process to collect and store cost-related data: Not met. 

Agency: Veterans Affairs; 
Clear requirement for cost estimating: Partially met; 
Compliance with cost-estimating best practices: Not met; 
Management review and approval: Partially met; 
Training requirements: Not met; 
Central, independent cost-estimating team: Fully met; 
Standard structure for defining work products: Partially met; 
Process to collect and store cost-related data: Not met. 

Key: 

Fully met--the agency provided evidence that it fully addressed the 
policy component. 

Partially met--the agency provided evidence that it addressed about 
half or a large portion of the policy component. 

Not met--the agency did not provide evidence that it addressed the 
policy component or provided evidence that it minimally addressed the 
policy component. 

Source: GAO analysis of agency data. 

[End of table] 

* Clear requirement for cost estimating: Six of the eight agencies 
fully addressed this policy component by establishing a clear 
requirement for all programs to perform life-cycle cost estimates, and 
in certain cases specified more stringent requirements for programs 
designated as major investments. Among these, four agencies--the 
Department of Agriculture, the Environmental Protection Agency (EPA), 
the Department of Labor, and the Department of Justice--established 
this requirement as part of their policies for programs to perform a 
cost-benefit analysis. For example, Labor required a life-cycle cost 
estimate as part of a cost-benefit analysis for both major and 
nonmajor investments, with less detail required for nonmajor 
investments. The other two agencies--DOD and Homeland Security--
defined a separate requirement for programs to develop life-cycle cost 
estimates. For the two agencies that did not fully establish a clear 
requirement for cost estimating, the Department of Veterans Affairs 
partially addressed this component because its policy only requires 
cost estimates to be prepared for project increments, rather than the 
full program life cycle. In addition, the Department of Commerce 
partially addressed this component because its policies only require 
cost estimates to be prepared for contracts, rather than for the full 
program life cycle (including government and contractor costs). 
Officials at both agencies stated that the responsibility for 
establishing requirements for cost estimating had been delegated to 
their component agencies. Further, officials at these two agencies 
described steps planned to address this and other weaknesses. For 
example, Veterans Affairs officials stated that the agency's recently 
established Office of Corporate Analysis and Evaluation (part of the 
Office of Planning and Policy) is planning to establish a centralized 
cost-estimating policy that includes clear criteria for cost 
estimating, which it expects to complete in fiscal year 2012. Further, 
Commerce officials stated that the agency is currently in the process 
of updating its policy and guidance to address this and other 
weaknesses, which it plans to complete by October 2012. If the updated 
policies and guidance address the weaknesses we identified, decision 
makers should have an improved view of their programs' life-cycle 
costs. 

* Compliance with cost-estimating best practices: Three of the eight 
agencies (DOD, Homeland Security, and Labor) fully addressed this 
policy component by identifying and requiring the use of cost-
estimating best practices by their programs, and defining a process to 
validate their use. For example, Homeland Security draws on the GAO 
cost guide[Footnote 19] to identify cost-estimating best practices, 
and also provides agency-specific cost-estimating requirements for 
implementing the practices, such as identifying the cost-estimate 
documentation required. The agency's policy also requires that 
estimates for key programs be validated. For the three agencies that 
partially addressed this policy component--Agriculture, EPA, and 
Justice--all provided guidance to their programs specific to 
conducting a cost-benefit analysis; however, this guidance did not 
fully address important cost-estimating practices, such as conducting 
a risk and uncertainty analysis, updating the estimate, or comparing 
the estimate to an independent estimate. Their guidance also did not 
identify a mechanism for validating estimates. Lastly, two agencies--
Commerce and Veterans Affairs--had not addressed this policy 
component, which corresponds to our finding that these agencies did 
not have requirements for programs to prepare full life-cycle cost 
estimates. Among 
the five agencies that did not fully address this policy component, 
officials commonly stated that the responsibility for requiring 
compliance with best practices had been delegated to their component 
agencies or that addressing cost-estimating shortcomings had not been 
a priority. Without fully complying with best practices for developing 
cost estimates, programs are less likely to prepare reliable cost 
estimates, hindering agency decision making. 

* Management review and approval: Three of the eight agencies (DOD, 
Homeland Security, and Labor) fully addressed this policy component by 
requiring that program cost estimates be reviewed and approved by 
management, including defining the information to be presented and 
requiring that approval be documented. For example, Labor's policy 
requires that senior management at both the component agency 
responsible for the program and the Office of the Chief Information 
Officer approve the estimate, based on a briefing that includes 
information about the estimate such as the largest cost drivers, major 
risks, and the findings of the integrated baseline review,[Footnote 
20] and that this approval is documented. For the three agencies that 
partially addressed this policy component (Agriculture, EPA, and 
Veterans Affairs), all required that estimated costs be presented to 
management, but none fully defined the information to be presented, 
such as the confidence level associated with the estimate. Lastly, 
neither Justice nor Commerce had departmental requirements for 
management review and approval of the cost estimate. Officials at both 
agencies stated that this responsibility had been delegated to their 
component agencies. However, without requiring management review and 
approval of program cost estimates at the department level, agencies 
have reduced ability to enforce cost-estimating policies and ensure 
that cost estimates meet management's needs for reliable information 
about programs' estimated costs. 

* Training requirements: Only one agency--DOD--fully addressed this 
policy component by requiring cost-estimating training and enforcing 
this requirement. For example, DOD requires training in cost 
estimating via its Defense Acquisition Workforce Improvement Act 
[Footnote 21] certifications, among other things, for at least one 
staff member for each major program, as well as for personnel with 
investment oversight responsibility. While the two agencies that 
partially addressed this policy component (Homeland Security and 
Labor) provided cost-estimating training and had a mechanism to track 
participation, their policies did not address providing training to 
personnel with investment oversight responsibility, such as officials 
from Homeland Security who are responsible for reviewing and approving 
programs at key milestones in their life cycles. Among the five 
agencies whose policies did not address requiring and enforcing 
training in cost estimating (Agriculture, Commerce, EPA, Justice, and 
Veterans Affairs), four of these agencies referred to OMB's Federal 
Acquisition Certification for Program and Project Managers[Footnote 
22] as providing for such training. However, this certification 
program does not require classes on cost estimating, and furthermore, 
is not intended for nor provided to individuals with investment 
oversight responsibility. Additionally, officials at two of the five 
agencies--Commerce and Veterans Affairs--stated that training in cost 
estimating had not been viewed as a priority. Without requiring and 
enforcing training in cost estimating, agencies cannot effectively 
ensure that staff have the skills and knowledge necessary to prepare 
and use cost estimates to make reliable budget and program decisions. 

* Central, independent cost-estimating team: Three of the eight 
agencies (DOD, Homeland Security, and Veterans Affairs) fully 
addressed this policy component by establishing central, independent 
cost-estimating teams, all of which have responsibility for, among 
other things, developing cost-estimating guidance and validating that 
program cost estimates are developed in accordance with best 
practices.[Footnote 23] In addition, among these three agencies, the 
teams established at DOD and Veterans Affairs are also charged with 
improving cost-estimating training. The remaining five agencies had 
not established a central, independent cost-estimating team. Among 
these, officials commonly cited the lack of a priority at the 
department or agency level for cost-estimating initiatives, although 
in one case a component agency at Agriculture--the Food Safety and 
Inspection Service--established its own centralized cost-estimating 
team. While this will likely enhance cost estimating at the component 
agency, not centralizing the cost-estimating function in the 
department could result in ad hoc processes and a lack of commonality 
in the estimating tools and training across the department. 
Additionally, officials from Labor stated they believe the 
department's IT budget is too small to cost-effectively centralize the 
cost-estimating function; however, doing so would likely, among other 
things, facilitate a better sharing of resources and could be 
accomplished in a manner commensurate with agency size. Agencies that 
do not establish a central and independent cost-estimating team may 
lack the ability to improve the implementation of cost-estimating 
policies, support cost-estimating training, and validate the 
reliability of program cost estimates at the department or agency 
level. 

* Standard structure for defining work products: DOD was the only 
agency to fully address this policy component by developing and 
requiring the use of standard, product-oriented work breakdown 
structures. Specifically, the agency provided multiple standard work 
breakdown structures, along with detailed guidance, for different 
types of programs (e.g., automated information systems, space systems, 
aircraft systems), and required their use. Three agencies--Homeland 
Security, Justice, and Veterans Affairs--partially addressed this 
policy component in that they provided one or more product-oriented 
work breakdown structures in their policies, but did not require 
programs to use them for cost estimating. Among these, Justice 
officials stated that a standard work breakdown structure was only 
required for their earned value management[Footnote 24] processes. 
Further, both Veterans Affairs and Homeland Security stated that they 
intend to require the use of a standard work breakdown structure in 
the future, but had not yet determined a time frame for establishing 
this requirement. Lastly, four of the selected agencies--Agriculture, 
Commerce, EPA, and Labor--had not established a standard structure. 
Among these, officials from Agriculture, EPA, and Labor stated that 
they believe it is difficult to standardize how programs define work 
products, in part, because their programs conduct different types of 
work and have different needs. While this presents a challenge, 
agencies could adopt an approach similar to DOD's and develop various 
standard work breakdown structures based on the kinds of work being 
performed. 
Commerce officials stated that they plan to establish a standard 
structure for defining work products in the future, but have not yet 
determined a time frame for completing this. Without establishing a 
standard structure for defining work products, agencies will not be 
positioned to ensure that they can effectively compare programs and 
collect and share data among programs. 

* Process to collect and store cost-related data: Only one agency--
DOD--fully addressed this policy component by establishing a process 
to collect and store cost-related data. Specifically, the agency has a 
central repository for collecting actual costs, software data, and 
related business data, which serves as a resource to support cost 
estimating across the agency. Among the seven agencies that have not 
established a process for collecting and storing cost-related data, 
Homeland Security's policy assigns responsibility for doing so to the 
central cost-estimating team; however, the team has not yet 
implemented the process. Additionally, Veterans Affairs officials 
stated that collecting such data would depend on the use of a standard 
structure for defining work products, which they have not yet put in 
place. Agriculture and Commerce officials stated that cost-estimating 
initiatives have not been a priority, although in one case a component 
agency at Commerce--the United States Patent and Trademark Office--
took the initiative to establish a process to collect and store cost-
related data from past estimates. While this should improve cost 
estimating at the component agency, without establishing an agencywide 
process to collect and store cost-related data, agencies will find it 
difficult to improve the data available to all programs and to 
increase the efficiency of developing cost estimates. 
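
To make the data-collection component above concrete, the following 
minimal sketch shows one way an agencywide process could capture actual 
cost, schedule, and technical data keyed to standard work breakdown 
structure elements so that the data remain retrievable for future 
estimates. The class and field names are hypothetical and are not taken 
from any agency system discussed in this report. 

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List

@dataclass
class ActualCostRecord:
    program: str               # program name, e.g., a major IT investment
    wbs_element: str           # standard work breakdown structure identifier, e.g., "1.2.3"
    fiscal_year: int
    planned_cost: float        # estimated cost for the element, for comparison
    actual_cost: float         # actual cost incurred, in then-year dollars
    start: date                # schedule data, so future estimators understand
    finish: date               #   the conditions behind the cost data
    technical_notes: str = ""  # technical data, e.g., size or complexity drivers

@dataclass
class CostDataRepository:
    """A retrievable store of actual cost data from past estimates."""
    records: List[ActualCostRecord] = field(default_factory=list)

    def add(self, record: ActualCostRecord) -> None:
        self.records.append(record)

    def history_for(self, wbs_element: str) -> List[ActualCostRecord]:
        # Past actuals for a given element, to support a new estimate.
        return [r for r in self.records if r.wbs_element == wbs_element]
```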

Until the selected agencies address the identified weaknesses in their 
cost-estimating policies, it will be difficult for them to make 
effective use of program cost estimates for informed decision making, 
realistic budget formulation, and meaningful progress measurement. 

Selected Agencies' Program Cost Estimates Do Not Provide a Fully 
Reliable Basis for Program and Budget Decisions: 

A reliable cost estimate is critical to the success of any government 
acquisition program, as it provides the basis for informed investment 
decision making, realistic budget formulation and program resourcing, 
and meaningful progress measurement. According to OMB,[Footnote 25] 
programs must maintain current and well-documented cost estimates, and 
these estimates must encompass the full life cycle of the programs. In 
addition, our research has identified a number of best practices that 
provide a basis for effective program cost estimating and should 
result in reliable cost estimates that management can use for making 
informed decisions. These practices can be organized into four 
characteristics--comprehensive, well-documented, accurate, and 
credible. These four characteristics of a reliable cost estimate are 
explained in table 3. 

Table 3: Four Characteristics of a Reliable Cost Estimate: 

Characteristic: Comprehensive; 
Explanation: The cost estimate should include both government and 
contractor costs of the program over its full life cycle, from 
inception of the program through design, development, deployment, and 
operation and maintenance, to retirement of the program. It should 
also completely define the program, reflect the current schedule, and 
be technically reasonable. Comprehensive cost estimates should be 
structured in sufficient detail (at least three levels of cost 
elements) to ensure that costs are neither omitted nor double 
counted.[A] Specifically, the cost estimate should be based on a 
product-oriented work breakdown structure that allows a program to 
track cost and schedule by defined deliverables, such as hardware or 
software components. Finally, where information is limited and 
judgments must be made, the cost estimate should document all cost-
influencing ground rules and assumptions. 

Characteristic: Well-documented; 
Explanation: A good cost estimate--while taking the form of a single 
number--is supported by detailed documentation that describes how it 
was derived and how the expected funding will be spent in order to 
achieve a given objective. Therefore, the documentation should capture 
in writing such things as the source data used, the calculations 
performed and their results, and the estimating methodology used to 
derive each work breakdown structure element's cost. Moreover, this 
information should be captured in such a way that the data used to 
derive the estimate can be traced back to and verified against their 
sources so that the estimate can be easily replicated and updated. The 
documentation should also discuss the technical baseline description 
and how the data were normalized. Finally, the final cost estimate 
should be reviewed and accepted by management on the basis of 
confidence in the estimating process and the estimate produced by the 
process. 

Characteristic: Accurate; 
Explanation: The cost estimate should provide for results that are 
unbiased, and it should not be overly conservative or optimistic. An 
estimate is accurate when it is based on an assessment of most likely 
costs, adjusted properly for inflation, and contains few, if any, 
minor mistakes. In addition, the estimate should be grounded in a 
historical record of cost estimating and actual experiences on other 
comparable programs. Finally, a cost estimate should be updated 
regularly to reflect material changes in the program, such as when 
schedules or other assumptions change, as well as actual costs, so that 
it always reflects the program's current status. 

Characteristic: Credible; 
Explanation: The cost estimates should discuss any limitations of the 
analysis because of uncertainty or biases surrounding data or 
assumptions. Major assumptions should be varied, and other outcomes 
recomputed to determine how sensitive they are to changes in the 
assumptions (i.e., sensitivity analysis). A risk and uncertainty 
analysis should be performed to determine the level of risk associated 
with the estimate. For management to make good decisions, the program 
estimate must reflect the degree of uncertainty, so that a level of 
confidence can be given about the estimate. Having a range of costs 
around a point estimate is more useful to decision makers because it 
conveys the level of confidence in achieving the most likely cost and 
also informs them on cost, schedule, and technical risks.[B] Further, 
the estimate's results should be cross-checked, and an independent 
cost estimate conducted by a group outside the acquiring organization 
should be developed to determine whether other estimating methods 
produce similar results. 

Source: GAO-09-3SP. 

[A] The appropriate number of levels for a work breakdown structure 
varies from program to program and depends on a program's complexity 
and risk. However, each work breakdown structure should, at the very 
least, include three levels. The first level represents the program as 
a whole and therefore contains only one element--the program's name. 
The second level contains the major program segments, and level three 
contains the lower-level components or subsystems for each segment. 

[B] A point estimate is the most likely value for the cost estimate, 
given the underlying data. The level of confidence for the point 
estimate is the probability that the point estimate will actually be 
met. For example, if the confidence level for a point estimate is 80 
percent, there is an 80 percent chance that the final cost will be at 
or below the point estimate and a 20 percent chance that costs will 
exceed the point estimate. 

[End of table] 
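
As an illustration of the minimum three-level, product-oriented work 
breakdown structure described in note [A] above, the following sketch 
decomposes a hypothetical program's end product so that costs can be 
mapped to defined deliverables without being omitted or double counted. 
The program and element names are hypothetical and are not drawn from 
any program in this report. 

```python
# Level 1 is the program itself, level 2 its major segments, and
# level 3 the components or subsystems of each segment.
wbs = {
    "1 Example IT Program": {
        "1.1 Application software": {
            "1.1.1 Case management module": {},
            "1.1.2 Reporting module": {},
        },
        "1.2 Infrastructure": {
            "1.2.1 Servers and storage": {},
            "1.2.2 Network": {},
        },
        "1.3 Program management": {
            "1.3.1 Systems engineering": {},
            "1.3.2 Test and evaluation": {},
        },
    }
}

def print_wbs(tree: dict, indent: int = 0) -> None:
    """Print each element indented by level, which makes omissions or
    double counting easier to spot when costs are assigned to elements."""
    for name, children in tree.items():
        print("  " * indent + name)
        print_wbs(children, indent + 1)

print_wbs(wbs)
```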

Nearly All Programs Did Not Fully Meet the Characteristics of a 
Reliable Cost Estimate: 

While all 16 major acquisition programs we reviewed had developed cost 
estimates and were using them, in part, to support program and budget 
decisions, all but one of the estimates were not fully reliable and 
thus did not provide a sound basis for informed decision making. For 
example, most programs used their cost estimate as the basis for key 
program decisions, such as approval to proceed to full production of a 
system. In addition, most programs were using their estimates as an 
input to their annual budget request process. 

However, nearly all of these programs had estimates that did not fully 
reflect important cost-estimating practices. Specifically, of the 16 
case study programs, only 1 fully met all four characteristics of a 
reliable cost estimate, while the remaining 15 programs varied in the 
extent to which they met the four characteristics. Table 4 identifies 
the 16 case study programs and summarizes our results for these 
programs. Following the table is a summary of the programs' 
implementation of cost-estimating practices. Additional details on the 
16 case studies are provided in appendix II. 

Table 4: Assessment of Cost-Estimating Practices for Case Study 
Programs: 

Agency: Agriculture; 
Program: Public Health Information System; 
Comprehensive: Partially met; 
Well-documented: Not met; 
Accurate: Not met; 
Credible: Not met. 

Agency: Agriculture; 
Program: Web-Based Supply Chain Management; 
Comprehensive: Not met; 
Well-documented: Not met; 
Accurate: Not met; 
Credible: Not met. 

Agency: Commerce; 
Program: Comprehensive Large Array-data Stewardship System; 
Comprehensive: Partially met; 
Well-documented: Not met; 
Accurate: Not met; 
Credible: Not met. 

Agency: Commerce; 
Program: Patents End-to-End: Software Engineering; 
Comprehensive: Partially met; 
Well-documented: Partially met; 
Accurate: Partially met; 
Credible: Not met. 

Agency: Defense; 
Program: Tactical Mission Command; 
Comprehensive: Partially met; 
Well-documented: Partially met; 
Accurate: Partially met; 
Credible: Not met. 

Agency: Defense; 
Program: Consolidated Afloat Networks and Enterprise Services; 
Comprehensive: Fully met; 
Well-documented: Fully met; 
Accurate: Fully met; 
Credible: Fully met. 

Agency: Environmental Protection Agency; 
Program: Financial System Modernization Project; 
Comprehensive: Partially met; 
Well-documented: Partially met; 
Accurate: Partially met; 
Credible: Not met. 

Agency: Environmental Protection Agency; 
Program: Superfund Enterprise Management System; 
Comprehensive: Not met; 
Well-documented: Not met; 
Accurate: Not met; 
Credible: Partially met. 

Agency: Homeland Security; 
Program: Integrated Public Alert and Warning System; 
Comprehensive: Partially met; 
Well-documented: Partially met; 
Accurate: Fully met; 
Credible: Partially met. 

Agency: Homeland Security; 
Program: Rescue 21; 
Comprehensive: Partially met; 
Well-documented: Partially met; 
Accurate: Partially met; 
Credible: Partially met. 

Agency: Justice; 
Program: Unified Financial Management System; 
Comprehensive: Partially met; 
Well-documented: Partially met; 
Accurate: Partially met; 
Credible: Not met. 

Agency: Justice; 
Program: Next Generation Combined DNA Index System; 
Comprehensive: Partially met; 
Well-documented: Partially met; 
Accurate: Partially met; 
Credible: Partially met. 

Agency: Labor; 
Program: OSHA[A] Information System; 
Comprehensive: Partially met; 
Well-documented: Partially met; 
Accurate: Partially met; 
Credible: Not met. 

Agency: Labor; 
Program: PBGC[B] Benefit Administration; 
Comprehensive: Partially met; 
Well-documented: Partially met; 
Accurate: Partially met; 
Credible: Partially met. 

Agency: Veterans Affairs; 
Program: Health Data Repository; 
Comprehensive: Not met; 
Well-documented: Not met; 
Accurate: Not met; 
Credible: Not met. 

Agency: Veterans Affairs; 
Program: Veterans Benefits Management System; 
Comprehensive: Partially met; 
Well-documented: Partially met; 
Accurate: Not met; 
Credible: Not met. 

Key: 

Fully met--the program provided evidence that it fully implemented the 
cost-estimating practices for this characteristic. 

Partially met--the program provided evidence that it implemented about 
half or a large portion of the cost-estimating practices for this 
characteristic. 

Not met--the program did not provide evidence that it implemented the 
practices or provided evidence that it only minimally implemented the 
cost-estimating practices for this characteristic. 

Source: GAO analysis of program data. 

[A] Occupational Safety and Health Administration. 

[B] Pension Benefit Guaranty Corporation. PBGC is a wholly owned 
government corporation administered by a presidentially appointed, 
Senate-confirmed Director and overseen by a Board of Directors 
consisting of the Secretaries of Labor, the Treasury, and Commerce. 
Although not a component of the Department of Labor, for 
administrative purposes, PBGC is included within the department's 
budget submission documentation. Therefore, PBGC's IT investments 
(including Benefit Administration) were included among the Department 
of Labor's IT investments in the OMB Fiscal Year 2010 Exhibit 53, 
which provided the basis for our selection of the 16 case study 
programs. 

[End of table] 

Most Programs' Cost Estimates Partially Reflected Key Practices for 
Developing a Comprehensive Estimate: 

Most programs partially implemented key practices needed to develop a 
comprehensive cost estimate. Specifically, of the 16 programs, 1 fully 
implemented the practices for establishing a comprehensive cost 
estimate, 12 partially implemented the practices, and 3 did not 
implement them. 

* DOD's Consolidated Afloat Networks and Enterprise Services program 
fully implemented key practices for developing a comprehensive cost 
estimate. Specifically, the program's cost estimate included both the 
government and contractor costs for the program over its full life 
cycle, from inception through design, development, deployment, 
operation and maintenance, and retirement of the program. Further, the 
cost estimate reflected the current program and technical parameters, 
such as the acquisition strategy and physical characteristics of the 
system. In addition, the estimate clearly described how the various 
cost subelements were summed to produce the amounts for each cost 
category, thereby ensuring that all pertinent costs were included, and 
no costs were double counted. Lastly, cost-influencing ground rules 
and assumptions, such as the program's schedule, labor rates, and 
inflation indexes, were documented. 

* Twelve programs partially implemented key practices for developing a 
comprehensive cost estimate. Most of these programs fully identified 
cost-influencing ground rules and assumptions and included government 
and contractor costs for portions of the program life cycle. However, 
10 of the 12 programs did not include the full costs for all life-
cycle phases and other important aspects of the program, such as costs 
expected to be incurred by organizations outside of the acquiring 
program (e.g., by other agency subcomponents), all costs for operating 
and maintaining the system, and costs for the retirement of the 
system. Without fully accounting for all past, present, and future 
costs for every aspect of the program, regardless of funding source, 
the programs' estimated costs are likely understated and thereby 
subject to underfunding and cost overruns. 

In addition, 10 of the 12 programs did not provide evidence that their 
cost estimates completely defined the program or reflected the current 
program schedule by documenting a technical baseline description to 
provide a common definition of the current program, including detailed 
technical, program, and schedule descriptions of the system. For 
example, in 2008, Homeland Security's Rescue 21 program documented the 
system's technical characteristics, along with a high-level schedule 
for the program. Since 2008, however, certain technical 
characteristics of the program had changed, such as additional 
deployment sites needed to address communication service gaps 
identified by local commanders at previously deployed locations. In 
addition, the planned deployment dates for several locations of the 
system had been delayed. As a result, the program's cost estimate did 
not fully reflect the current scope and schedule of the program. 
Understanding the program--including the acquisition strategy, 
technical definition, characteristics, system design features, and 
technologies to be included--is critical to developing a reliable cost 
estimate. Without these data, programs will not be able to identify 
the technical and program parameters that bind the estimate. 

* Three programs did not implement key practices for developing a 
comprehensive cost estimate in that their estimates did not adequately 
(1) include all costs over the program's full life cycle; (2) 
completely define the program or the current schedule; (3) include a 
detailed, product-oriented work breakdown structure; and (4) document 
cost-influencing ground rules and assumptions. For example, the cost 
estimate for Veterans Affairs' Health Data Repository program did not 
include sufficient detail to show that it accounted for all phases of 
the program's life cycle (e.g., design, development, and deployment). 
Further, the estimate did not include important technical baseline 
information, including the technical, program, and schedule aspects of 
the system being estimated. Lastly, the estimate only used high-level 
budget codes rather than a detailed, product-oriented cost element 
structure to decompose the work, and ground rules and assumptions 
(e.g., labor rates and base-year dollars) were not documented. Without 
implementing key practices for developing comprehensive cost 
estimates, management and oversight organizations cannot be assured 
that a program's estimate is complete and accounts for all possible 
costs, thus increasing the likelihood that the estimate is understated. 

Most Programs' Cost Estimates Partially Reflected Key Practices for 
Developing a Well-Documented Estimate: 

The majority of programs partially implemented key practices needed to 
develop a well-documented cost estimate. Specifically, of the 16 
programs, 1 fully implemented the practices for establishing a well-
documented cost estimate, 10 partially implemented the practices, and 
5 did not implement them. 

* DOD's Consolidated Afloat Networks and Enterprise Services program 
fully implemented key practices for developing a well-documented cost 
estimate. Specifically, the program's cost estimate captured in 
writing the source data used (e.g., historical data and program 
documentation), the calculations performed and their results, and the 
estimating methodology used to derive each cost element. In addition, 
the program documented a technical baseline description that included, 
among other things, the relationships with other systems and planned 
performance parameters. Lastly, the cost estimate was reviewed both by 
the Naval Center for Cost Analysis and the Assistant Secretary of the 
Navy for Research, Development, and Acquisition, which helped ensure a 
level of confidence in the estimating process and the estimate 
produced. 

* Ten programs partially implemented key practices for developing a 
well-documented cost estimate. Most of these programs included a 
limited description of source data and methodologies used for 
estimating costs, and documented management approval of the cost 
estimate. However, 9 of the 10 programs did not include complete 
documentation capturing source data used, the calculations performed 
and their results, and the estimating methodology used to derive each 
cost element. Among other things, the 9 programs had weaknesses in one 
or more of the following areas: relying on expert opinion but lacking 
historical data or other documentation to back up the opinions; not 
documenting their estimate in a way that a cost analyst unfamiliar 
with the program could understand what was done and replicate it; and 
lacking supporting data that could be easily updated to reflect actual 
costs or program changes. Without adequate documentation to support 
the cost estimate, questions about the approach or data used cannot be 
answered and the estimate may not be useful for updates or information 
sharing. 

In addition, 8 of the 10 programs did not provide management with 
sufficient information about how the estimate was developed in order 
to make an informed approval decision. For example, while the EPA's 
Financial System Modernization Project's cost estimate was approved, 
management was not provided information specific to how the estimate 
was developed, including enough detail to show whether it was 
accurate, complete, and high in quality. Because cost estimates should 
be reviewed and accepted by management on the basis of confidence in 
the estimating process and the estimate produced by the process, it is 
imperative that management understand how the estimate was developed, 
including the risks associated with the underlying data and methods, 
in making a decision to approve a cost estimate. 

* Five programs did not implement key practices for developing a well-
documented cost estimate in that their estimates did not adequately 
(1) include detailed documentation that described how the estimate was 
derived, (2) capture the estimating process in such a way that the 
estimate can be easily replicated and updated, (3) discuss the 
technical baseline description, and (4) provide evidence that the 
estimate was fully reviewed and accepted by management. In particular, 
three of the five programs relied on their budget submission 
documentation, known as the OMB Exhibit 300,[Footnote 26] as their 
life-cycle cost estimate. The cost estimate information included in 
these programs' Exhibit 300 budget submissions was limited to the 
final estimates in certain phases of the program's life cycle, such as 
planning, development, and operations and maintenance. Because a well-
documented estimate includes detailed documentation of the source 
data, calculations and results, and explanations of why particular 
methods and references were chosen, the programs that relied on their 
Exhibit 300 budget submissions as their cost estimates lacked the 
level of rigor and supporting documentation necessary for a well-
documented cost estimate. Without a well-documented estimate, a 
program's credibility may suffer because the documentation cannot 
explain the rationale of the methodology or the calculations, a 
convincing argument of the estimate's validity cannot be presented, 
and decision makers' questions cannot be effectively answered. 

Most Programs' Cost Estimates Partially Reflected or Did Not Reflect 
Key Practices for Developing an Accurate Estimate: 

Most programs partially implemented or did not implement key practices 
needed to develop an accurate cost estimate. Specifically, of the 16 
programs, 2 fully implemented the practices for establishing an 
accurate cost estimate, 8 partially implemented the practices, and 6 
did not implement them. 

* DOD's Consolidated Afloat Networks and Enterprise Services and 
Homeland Security's Integrated Public Alert and Warning System 
programs fully implemented key practices for developing an accurate 
cost estimate. Specifically, the programs' estimates were based on an 
assessment of most likely costs, in part because a risk and 
uncertainty analysis was conducted to determine where the programs' 
estimates fell against the range of all possible costs. In addition, 
the programs' estimates were grounded in a historical record of cost 
estimating and actual experiences from comparable programs. For 
example, the cost estimate for the Integrated Public Alert and Warning 
System program relied, in part, on actual costs already incurred by 
the program as well as data from three comparable programs, including 
a legacy disaster management system. Moreover, the programs' cost 
estimates were adjusted for inflation and updated regularly to reflect 
material changes in the programs, such as when the schedule changed. 

* Eight programs partially implemented key practices for developing an 
accurate cost estimate. Most of these programs accounted for inflation 
when projecting future costs. However, four of the eight programs did 
not rely, or could not provide evidence of relying, on historical 
costs and actual experiences from comparable programs. For example, 
officials from Pension Benefit Guaranty Corporation's Benefit 
Administration program stated that they relied on historical data 
along with expert opinion in projecting costs, but the officials did 
not provide evidence of the data sources or how the historical data 
were used. Historical data can provide estimators with insight into 
actual costs on similar programs, including any cost growth over the 
original estimates; without documenting these data, these programs 
lacked an effective means to challenge optimistic assumptions and bring 
more realism to their estimates. 

In addition, six of the eight programs did not provide evidence that 
they had regularly updated their estimates to reflect material changes 
in the programs so that they accurately reflected the current status. 
For example, Justice's Unified Financial Management System program 
developed a cost estimate in 2009; however, according to program 
documentation, program scope and projected costs have since changed 
and, as a result, the 2009 estimate no longer reflects the current 
program. Cost estimates that are not regularly updated with current 
information can make it more difficult to analyze changes in program 
costs, impede the collection of cost and technical data to support 
future estimates, and may not provide decision makers with accurate 
information for assessing alternative decisions. 

* Six programs did not implement key practices for developing an 
accurate cost estimate in that their estimates were not adequately (1) 
based on an assessment of most likely costs, (2) grounded in 
historical data and actual experiences from comparable programs, (3) 
adjusted for inflation, and (4) updated to ensure that they always 
reflect the current status of the program. For example, the cost 
estimate for Agriculture's Public Health Information System was not 
based on an assessment of most likely costs because a risk and 
uncertainty analysis was not conducted to determine where the estimate 
fell against the range of all possible costs. In addition, the 
estimate was based primarily on the program team's expertise, but was 
not grounded in historical costs or actual experiences from comparable 
programs. Lastly, the estimate was not adjusted for inflation and 
lacked adequate detail to determine whether the program's latest 
updates to the cost estimate, completed in 2011, accurately reflected 
the current status of the program. Without implementing key practices 
for developing an accurate cost estimate, a program's estimate is more 
likely to be biased by optimism and subject to cost overruns, and may 
not provide management and oversight organizations with accurate 
information for making well-informed decisions. 

Most Programs' Cost Estimates Did Not Reflect Key Practices for 
Developing a Credible Estimate: 

The majority of programs did not implement all key practices needed to 
develop a credible cost estimate. Specifically, of the 16 programs, 1 
fully implemented the practices for establishing a credible cost 
estimate, 5 partially implemented the practices, and 10 did not 
implement them. 

* DOD's Consolidated Afloat Networks and Enterprise Services program 
fully implemented key practices for developing a credible cost 
estimate. Specifically, the program performed a complete uncertainty 
analysis (i.e., both a sensitivity analysis and Monte Carlo simulation 
[Footnote 27]) on the estimate. For example, in performing the 
sensitivity analysis, the program identified a range of possible costs 
based on varying key parameters, such as the technology refresh cycle 
and procurement costs. In addition, the program performed cross checks 
(using different estimating methods) on key cost drivers, such as 
system installation costs. Lastly, an independent cost estimate was 
conducted by the Naval Center for Cost Analysis and the results were 
reconciled with the program's cost estimate, which increased the 
confidence in the credibility of the resulting estimate. 

* Five programs partially implemented key practices for developing a 
credible cost estimate. Specifically, three of the five programs 
performed aspects of a sensitivity analysis, such as varying one or 
two assumptions to assess the impact on the estimate; however, these 
programs did not perform other important components, such as 
documenting the rationale for the changes to the assumptions or 
assessing the full impact of the changes to the assumptions by 
determining a range of possible costs. For example, the Pension 
Benefit Guaranty Corporation's Benefit Administration program 
performed a sensitivity analysis by varying three program assumptions, 
one of which was the contractor's hourly rate, to assess the impact on 
the cost estimate. However, the program did not provide evidence to 
support why the adjusted hourly labor rate was used, nor did it apply a 
range of increases and decreases to the hourly labor rate to determine 
the sensitivity of the cost estimate to this assumption (a generic 
sketch of this type of analysis follows this list). A 
comprehensive sensitivity analysis that is well documented and 
traceable can provide programs with a better understanding of the 
variables that most affect the cost estimate and assist in identifying 
the cost elements that represent the highest risk. 

In addition, three of the five programs adjusted the cost estimate to 
account for risk and uncertainty, but did not provide evidence to 
support how costs were risk adjusted or determine the level of 
confidence associated with the cost estimate.[Footnote 28] For 
example, Homeland Security's Integrated Public Alert and Warning 
System program's cost estimate did not include information on the 
risks considered in its risk and uncertainty analysis or consider the 
relationship between multiple cost elements when accounting for risks. 
Without conducting an adequate risk and uncertainty analysis, the cost 
estimate may be unrealistic because it does not fully reflect the 
aggregate variability from such effects as schedule slippage, mission 
changes, and proposed solutions not meeting users' needs. 

* Ten programs did not implement key practices for developing a 
credible cost estimate in that the programs did not adequately (1) 
assess the uncertainty or bias surrounding data and assumptions by 
conducting a sensitivity analysis, (2) determine the level of risk 
associated with the estimate by performing a risk and uncertainty 
analysis, (3) cross-check the estimates for key cost drivers, and (4) 
commission an independent cost estimate to be conducted by a group 
outside the acquiring organization to determine whether other 
estimating methods would produce similar results. For example, 
Agriculture's Web-Based Supply Chain Management program did not 
conduct a sensitivity analysis to better understand which variables 
most affected the cost estimate, nor did the program conduct a risk 
and uncertainty analysis to quantify the impact of risks on the 
estimate. Further, cost drivers were not cross-checked to see if 
different estimating methodologies produced similar results, and an 
independent cost estimate was not conducted to independently validate 
the results of the program's estimate. Without implementing key 
practices for developing a credible cost estimate, a program may lack 
an understanding of the limitations associated with the cost estimate 
and be unprepared to deal with unexpected contingencies. 
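As referenced in the discussion of the Pension Benefit Guaranty 
Corporation's program above, a sensitivity analysis varies individual 
assumptions over documented ranges to determine how much the cost 
estimate changes. The following sketch is a generic, hypothetical 
illustration of that technique; the cost model, parameters, and dollar 
values are assumptions for illustration and are not taken from any 
program's estimate. 

[Illustrative example: 

def total_cost(refresh_years, unit_procurement, units=50,
               annual_support=2.0, life_years=10):
    # Hypothetical life-cycle cost model, in millions of dollars:
    # procurement recurs once per technology refresh cycle, and support
    # costs accrue annually over the assumed life of the program.
    refresh_buys = life_years / refresh_years
    return (units * unit_procurement * refresh_buys
            + annual_support * life_years)

baseline = total_cost(refresh_years=5.0, unit_procurement=1.5)

# Vary one assumption at a time over a documented range and record the
# resulting range of possible costs.
sensitivity = {
    "technology refresh cycle (4 to 6 years)": [
        total_cost(refresh_years=y, unit_procurement=1.5)
        for y in (4.0, 6.0)],
    "unit procurement cost ($1.2M to $1.8M)": [
        total_cost(refresh_years=5.0, unit_procurement=c)
        for c in (1.2, 1.8)],
}

print(f"Baseline estimate: ${baseline:.1f} million")
for assumption, results in sensitivity.items():
    print(f"{assumption}: ${min(results):.1f} million "
          f"to ${max(results):.1f} million")

End of illustrative example]

The value of such an analysis lies in documenting the rationale for 
each range and identifying which assumptions most affect the estimate, 
which is the element that several of the programs discussed above only 
partially performed. 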

Inadequate Implementation Was Largely Due to Weaknesses in Policy: 

The lack of reliable cost estimates across the investments exists in 
part because of the weaknesses previously identified in the eight 
agencies' cost-estimating policies. More specifically, program 
officials at five agencies--Agriculture, Commerce, EPA, Justice, and 
Veterans Affairs--attributed weaknesses in their programs' cost 
estimates, in part, to the fact that agency policies did not require 
cost-estimating best practices--deficiencies which we also identified 
in these agencies' policies. For example, officials at Commerce's 
Comprehensive Large Array-data Stewardship System program stated that, 
when the program developed its cost estimate, no agency guidance 
existed regarding the process to follow in developing the estimate. In 
addition, officials at Veterans Affairs' Veterans Benefits Management 
System program stated that they did not perform a risk analysis on 
their cost estimate because agency guidance on how such an analysis 
should be performed did not exist. In certain cases, officials stated 
that program cost estimates were initially developed prior to 2007, 
when a comprehensive federal resource for cost-estimating best 
practices, such as GAO's cost guide,[Footnote 29] did not exist. 
However, all 16 programs included in our review have either developed 
new estimates or updated previous estimates since 2007; nonetheless, 
as previously mentioned, most of the selected agencies' policies did 
not fully address compliance with cost-estimating best practices, 
including the five agencies mentioned above. If these agencies had 
updated their policies, programs would have been more likely to follow 
a standard, high-quality process in developing or updating their cost 
estimates. 

Until important cost-estimating practices are fully implemented, the 
likelihood that these programs will have to revise their current cost 
estimates upward is increased. Collectively, 13 of the 16 programs 
have already revised their original life-cycle cost estimates upward 
by almost $5 billion due, in part, to weaknesses in program cost-
estimating practices (see app. III for details on changes in the 
programs' cost estimates over time). For example, in many cases, cost 
estimates had to be revised upward to reflect the incorporation of 
full costs for all life-cycle phases (e.g., development or operations 
and maintenance), which had not originally been included. This 
resulted, in some cases, in significant increases to estimated life-
cycle costs. Other reasons that programs cited for revising their life-
cycle cost estimates upward included changes to program or system 
requirements, schedule delays, technology upgrades, and system 
defects, among other things. Further, as previously mentioned, 13 of 
the 16 case study programs still have cost estimates that do not 
include the full costs for all life-cycle phases, which significantly 
increases the risk that these programs' cost estimates will continue 
to be revised upward in the future. 

Without reliable cost estimates, the 15 programs that did not fully 
meet best practices will not have a sound basis for informed program 
decision making, realistic budget formulation and program resourcing, 
and meaningful progress measurement. Consequently, nearly all of these 
programs' cost estimates may continue to be understated and subject to 
underfunding and cost overruns. 

Conclusions: 

Given the enormous size of the federal government's investment in IT, 
it is critical that such investments are based on reliable estimates 
of program costs. While all of the selected agencies have established 
policies that at least partially addressed a requirement for programs 
to develop full life-cycle cost estimates, most of the agencies' 
policies have significant weaknesses. With the exception of DOD, these 
policies omit or lack sufficient guidance on several key components of 
a comprehensive policy, including, for example, management review and 
acceptance of program cost estimates, the type of work structure 
needed to effectively estimate costs, and training requirements for 
all relevant personnel. Without comprehensive policies, agencies may 
not have a sound basis for making decisions on how to most effectively 
manage their portfolios of projects. 

Most programs' estimates at least partially reflected cost-estimating 
best practices, such as documenting cost-influencing ground rules and 
assumptions; however, with the exception of DOD's Consolidated Afloat 
Networks and Enterprise Services program, the programs we reviewed had 
not established fully reliable cost estimates, increasing the 
likelihood that the estimates are incomplete and do not account for 
all possible costs. For example, without including costs for all 
phases of a program's life cycle and performing a comprehensive risk 
and uncertainty analysis, a program's estimated costs could be 
understated and subject to underfunding and cost overruns, putting it 
at risk of being reduced in scope or requiring additional funding to 
meet its objectives. Many of the weaknesses found in these programs 
can be traced back to inadequate agency cost-estimating policies. 
Without better estimates of acquisition life-cycle costs, neither the 
programs nor the agencies have reliable information for supporting 
program and budget decisions. Consequently, the likelihood of cost 
overruns, missed deadlines, and performance shortfalls is 
significantly increased. 

Recommendations for Executive Action: 

To address weaknesses identified in agencies' policies and practices 
for cost estimating, we are making the following recommendations: 

We recommend that the Secretaries of Agriculture, Commerce, Homeland 
Security, Labor, and Veterans Affairs, the Attorney General, and the 
Administrator of the Environmental Protection Agency direct 
responsible officials to modify policies governing cost estimating to 
ensure that they address the weaknesses that we identified. 

We also recommend that the Secretaries of Agriculture, Commerce, 
Homeland Security, Labor, and Veterans Affairs, the Attorney General, 
the Administrator of the Environmental Protection Agency, and the 
Director of the Pension Benefit Guaranty Corporation direct 
responsible officials to update future life-cycle cost estimates of 
the system acquisition programs discussed in this report using cost-
estimating practices that address the detailed weaknesses that we 
identified. 

Lastly, although DOD fully addressed the components of an effective 
cost-estimating policy, in order to address the weaknesses we 
identified with a key system acquisition discussed in this report, we 
recommend that the Secretary of Defense direct responsible officials 
to update future life-cycle cost estimates of the Tactical Mission 
Command program using cost-estimating practices that address the 
detailed weaknesses that we identified. 

Agency Comments and Our Evaluation: 

We provided the selected eight agencies and the Pension Benefit 
Guaranty Corporation with a draft of our report for review and 
comment. A management analyst in the Department of Justice's Internal 
Review and Evaluation Office, Justice Management Division, responded 
orally that the department had no comments. Six of the agencies and 
the Pension Benefit Guaranty Corporation provided written comments, 
and the Department of Labor provided oral and written comments. These 
agencies generally agreed with our results and recommendations, 
although EPA disagreed with our assessment of the cost-estimating 
practices used for one of its programs. These agencies also provided 
technical comments, which we incorporated in the report as appropriate. 

The comments of the agencies and the corporation are summarized below: 

* The U.S. Department of Agriculture's Acting Chief Information 
Officer stated that the department concurred with the content of the 
report. Agriculture's comments are reprinted in appendix IV. 

* The Acting Secretary of Commerce stated that the department fully 
concurred with our findings and recommendations. Among other things, 
the Acting Secretary described a number of ongoing actions to address 
the weaknesses we identified, such as modifying departmental policies 
governing cost estimating to include an additional cost-estimating 
training course and cost-estimating training requirements. In 
addition, the department stated that forthcoming policy and guidance 
are intended to ensure that the cost estimates for high-profile 
programs are comprehensive, accurate, credible, and well-documented. 
Commerce's comments are reprinted in appendix V. 

* DOD's Director of Cost Assessment and Program Evaluation stated that 
the department partially concurred with our recommendation but agreed 
with the criteria, methodology, and assessment of the DOD programs. 
The director added, however, that there is no plan to formally update 
the Tactical Mission Command life-cycle cost estimate, as the program 
is in the system deployment phase of its acquisition life cycle. We 
recognize that the programs included in our study are at varying 
stages of their acquisition life cycles and that updates to their cost 
estimates may not be justified. Accordingly, our recommendation to DOD 
is specific to only future life-cycle cost estimates. In this regard, 
if any significant changes occur in the program during deployment of 
the system that warrant an update to the cost estimate, it will be 
important that the program use best practices that address the 
weaknesses we identified. DOD's comments are reprinted in appendix VI. 

* EPA's Assistant Administrator of the Office of Solid Waste and 
Emergency Response and its Assistant Administrator and Chief 
Information Officer of the Office of Environmental Information stated, 
in regard to our assessment of cost-estimating policies, that EPA 
recognized that its policies did not require cost-estimating best 
practices and that the agency will update its Systems Life Cycle 
Management procedures accordingly. The officials acknowledged that 
sound fiscal management practices should be followed in all aspects of 
the agency's information technology operations, including cost 
estimating for the development of new systems. 

In regard to our assessment of cost-estimating practices for two 
system acquisition programs, EPA stated that it did not have any 
comments on our assessment of the Financial System Modernization 
Project; however, it did not believe our assessment accurately 
reflected the cost-estimating practices employed for the development 
of the Superfund Enterprise Management System. In particular, the 
Office of Solid Waste and Emergency Response stated in its written 
response and in technical comments that it believed it had met the 
spirit and intent of the cost-estimating best practices in GAO's cost 
guide, even though the program may have used different processes or 
documentation in order to do so. We recognize and agree that 
organizations should tailor the use of the cost-estimating best 
practices as appropriate based on, for example, the development 
approach being used, and we took this factor into consideration during 
our review of the 16 acquisition programs. However, we stand by our 
assessment of the Superfund Enterprise Management System program's 
cost estimate on the basis of the weaknesses described in appendix II 
of this report. In particular, as we discuss, the program's cost 
estimate lacked key supporting documentation: costs were not documented 
at a sufficient level of detail; the source data, calculations, and 
methodologies used to develop the estimate were not documented; and the 
source of and rationale for the inflation factor used were not 
documented. In addition, the lack of detailed cost-
estimate information precluded us from making the linkage between the 
cost estimate and other important program documents, such as the 
system's technical baseline and schedule, in order to determine 
whether the estimate reflects the current program and status. Because 
rigorous documentation is essential for justifying how an estimate was 
developed and for presenting a convincing argument for an estimate's 
validity, weaknesses in this area contributed significantly to 
weaknesses across multiple best practices areas, including the 
estimate's comprehensiveness and accuracy. Further, regarding the 
Office of Solid Waste and Emergency Response's comment that our cost-
estimating guide was not published until 3 years after development of 
the Superfund Enterprise Management System commenced, we disagree that 
this would preclude the program from satisfying cost-estimating best 
practices. Specifically, the program updated its cost estimate in 
2011, 2 years after the issuance of the GAO cost guide. At that time, 
the program could have revised its cost estimate using available best 
practice guidance. 

Lastly, we disagree that the draft report erroneously concluded that 
the Superfund Enterprise Management System cost estimate increased 
from $39.3 million to $62.0 million in just 2 years. In its written 
response, the Office of Solid Waste and Emergency Response stated that 
the revised cost estimate was a direct result of an increase in the 
duration of operations and maintenance from fiscal year 2013 (in the 
$39.3 million estimate) to fiscal year 2017 (in the $62.0 million 
estimate). However, according to documentation provided by the 
Superfund Enterprise Management System program, the $39.3 million 
estimate, which was completed in 2009, was based on a 10-year life 
cycle (from fiscal year 2007 to fiscal year 2017) and included costs 
for operations and maintenance through fiscal year 2017. Subsequently, 
in 2011, the program revised its estimate to approximately $62.0 
million, which was also based on a 10-year life cycle (from fiscal 
year 2007 to fiscal year 2017) and included operations and maintenance 
costs through 2017. The revised estimate is an increase of about $22.7 
million over the initial estimate. According to program documentation, 
this change in the cost estimate was primarily due to the inclusion of 
additional operations and maintenance costs for data and content 
storage and hosting for the fully integrated system between fiscal 
year 2014 and fiscal year 2017, which were erroneously omitted from 
the 2009 estimate. Based on these factors, we maintain that our report 
reflects this information appropriately. EPA's comments are reprinted 
in appendix VII. 

* The Department of Homeland Security's Director of the Departmental 
GAO-Office of the Inspector General Liaison Office stated that the 
department concurred with our recommendations. Among other things, the 
department stated that its Office of Program Accountability and Risk 
Management intends to develop a revised cost-estimating policy that 
will further incorporate cost-estimating best practices, as well as 
work to provide cost-estimating training to personnel on major 
programs throughout the department. Homeland Security's comments are 
reprinted in appendix VIII. 

* In oral comments, the Administrative Officer in the Department of 
Labor's Office of the Assistant Secretary for Administration and 
Management stated that the department generally agreed with our 
recommendations. Further, in written comments, the Assistant Secretary 
for Administration and Management stated that the department, through 
several initiatives, such as its Post Implementation Review process 
and training to IT managers, will continue to improve upon its IT cost 
estimation. The department also commented on certain findings in our 
draft report. In particular, the Assistant Secretary stated that, 
given the department's relatively small IT portfolio, establishing a 
central, independent office dedicated to cost estimating is not 
justified. We recognize that agency IT portfolios vary in size; 
however, as noted in our report, agencies should establish a central 
cost-estimating team commensurate with the size of their agency, which 
could consist of a few resident experts instead of a full independent 
office. Regarding our second recommendation, according to the 
Assistant Secretary, the Occupational Safety and Health Administration 
(OSHA) stated that it believes our assessment of the credibility of 
the OSHA Information System program's 2010 cost estimate was too low 
and did not reflect additional information provided in support of the 
program's 2008 cost estimate. In our assessment of the program's 2010 
estimate, we acknowledged evidence provided from the 2008 estimate; 
however, this evidence did not adequately show that important 
practices for ensuring an estimate's credibility, including making 
adjustments to account for risk and conducting a sensitivity analysis, 
were performed on the 2010 cost estimate. In addition, OSHA stated 
that an independent estimate was conducted at the outset of the 
program by an industry-leading IT consulting firm as recommended by 
the Department of Labor Office of the Inspector General. While we 
acknowledge that this was done in 2005, the resulting estimate was the 
only one developed at the time and thus was not used as a means of 
independent validation--i.e., to determine whether multiple estimating 
methods produced similar results. Therefore, the independent estimate 
conducted in 2005 would not increase the credibility of the program's 
current cost estimate. Labor's comments are reprinted in appendix IX. 

* The Director of the Pension Benefit Guaranty Corporation stated that 
the corporation was pleased that its selected IT investment met at 
least half, or a large portion, of our quality indicators for cost 
estimating. Further, the Director stated that the corporation will 
evaluate and improve future life-cycle cost estimates for the Benefit 
Administration investment. The Pension Benefit Guaranty Corporation's 
comments are reprinted in appendix X. 

* The Chief of Staff for the Department of Veterans Affairs stated 
that the department concurred with our recommendations and has efforts 
under way to improve its cost-estimating capabilities. Among other 
things, the Chief of Staff stated that the department plans to 
complete, by the end of the first quarter of fiscal year 2013, an 
evaluation of the utility of establishing an organizational function 
focused solely on multiyear cost estimation. In addition, to improve 
cost-estimating practices on its IT efforts, the department stated 
that it has additional training planned in early fiscal year 2013. 
Veterans Affairs' comments are reprinted in appendix XI. 

As agreed with your office, unless you publicly announce the contents 
of this report earlier, we plan no further distribution until 30 days 
from the report date. At that time, we will send copies of this report 
to the Secretaries of Agriculture, Commerce, Defense, Homeland 
Security, Labor, and Veterans Affairs; the Attorney General; the 
Administrator of the Environmental Protection Agency; the Director of 
the Pension Benefit Guaranty Corporation; and other interested 
parties. In addition, the report will be available at no charge on the 
GAO website at [hyperlink, http://www.gao.gov]. 

If you or your staff have any questions concerning this report, please 
contact me at (202) 512-6304 or by e-mail at melvinv@gao.gov. Contact 
points for our Offices of Congressional Relations and Public Affairs 
are on the last page of this report. Key contributors to this report 
are listed in appendix XII. 

Sincerely yours, 

Signed by: 

Valerie C. Melvin: 
Director, Information Management and Technology Resources Issues: 

[End of section] 

Appendix I: Objectives, Scope, and Methodology: 

Our objectives were to (1) assess the extent to which selected 
departments and agencies have appropriately implemented cost-
estimating policies and procedures, and (2) evaluate whether selected 
information technology (IT) investments at these departments and 
agencies have reliable cost estimates to support budget and program 
decisions. For this review, we assessed eight federal agencies and 16 
investments. 

To select these agencies and investments, we relied on the Office of 
Management and Budget's Fiscal Year 2010 Exhibit 53[Footnote 30] 
which, at the time we made our selections, contained the most current 
and complete data on 28 agencies' planned IT spending.[Footnote 31] To 
ensure that we selected agencies with varying levels of spending on 
IT, we sorted them into three ranges based on their planned spending 
in fiscal year 2010: 

* greater than or equal to $10 billion; 

* greater than or equal to $1 billion but less than $10 billion; and: 

* greater than $0, but less than $1 billion. 

The number of agencies selected from each range was based on the 
relative number of IT investments within each range, and the specific 
agencies selected were those with the highest amount of planned IT 
spending in fiscal year 2010. Specifically, we selected one agency 
with greater than $10 billion in planned IT spending,[Footnote 32] 
five agencies with between $1 billion and $10 billion in planned 
spending, and two agencies with less than $1 billion in planned 
spending. In doing so, we limited our selections to those agencies at 
which we could identify two investments that met our selection 
criteria for investments (see the following paragraph for a discussion 
of our investment selection methodology). These agencies were the 
Departments of Agriculture, Commerce, Defense, Homeland Security, 
Justice, Labor, and Veterans Affairs, and the Environmental Protection 
Agency. We excluded the Departments of Education, Health and Human 
Services, and the Treasury, and the General Services Administration 
from our selection, even though they initially met our agency 
selection criteria, because we could not identify two investments at 
these agencies that met our investment selection criteria. 

To ensure that we examined significant investments, we sorted each 
agency's planned IT investments based on its total planned spending 
for fiscal year 2010. Limiting the number of investments to two per 
agency, we then selected investments based on whether they were 
considered major, or mission critical, by the agencies[Footnote 33]; 
had significant development or technical refresh work under way; and 
were from different subcomponents of the agency.[Footnote 34] In doing 
so, we also excluded investments if they 
were a combination of smaller investments, were primarily an 
infrastructure investment, had a high percentage of steady-state 
[Footnote 35] spending versus development spending, had less than $5 
million in planned spending for fiscal year 2010, or were the subjects 
of recent or ongoing GAO audit work. 
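For illustration only, the agency-binning step of the selection 
approach described above can be expressed as a short sketch. The agency 
names and spending figures below are placeholders, not the actual 
Exhibit 53 data used in our review. 

[Illustrative example: 

# Placeholder data: agency planned fiscal year 2010 IT spending, in
# billions of dollars.
planned_it_spending = {
    "Agency A": 33.0,
    "Agency B": 6.2,
    "Agency C": 4.8,
    "Agency D": 0.7,
}

def spending_range(billions):
    # The three ranges used to group agencies by planned IT spending.
    if billions >= 10:
        return "greater than or equal to $10 billion"
    if billions >= 1:
        return "greater than or equal to $1 billion but less than $10 billion"
    return "greater than $0 but less than $1 billion"

# Group agencies into the three ranges, sorted by planned spending so
# that the agencies with the highest planned spending in each range are
# listed first.
bins = {}
for agency, spending in sorted(planned_it_spending.items(),
                               key=lambda item: item[1], reverse=True):
    bins.setdefault(spending_range(spending), []).append(agency)

for label, agencies in bins.items():
    print(f"{label}: {', '.join(agencies)}")

End of illustrative example]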

To assess the extent to which selected agencies had appropriately 
implemented cost-estimating policies and procedures, we analyzed 
agency policies and guidance for cost estimating. Specifically, we 
compared these policies and guidance documents to best practices 
recognized within the federal government and private industry for cost 
estimating. These best practices are contained in the GAO Cost Guide 
and include, for example, establishing a clear requirement for cost 
estimating, requiring management review and approval of cost 
estimates, and requiring and enforcing training in cost 
estimating.[Footnote 36] We assessed each policy component as not met--
the agency did not provide evidence that it 
addressed the policy component or provided evidence that it minimally 
addressed the policy component; partially met--the agency provided 
evidence that it addressed about half or a large portion of the policy 
component; or fully met--the agency provided evidence that it fully 
addressed the policy component. We also interviewed key agency 
officials to obtain information on their ongoing and future cost-
estimating plans. 

To evaluate whether the selected IT investments have reliable cost 
estimates to support budget and program decisions, we analyzed program 
documentation, including program life-cycle cost estimates, business 
cases, and budget documentation; program and management review 
briefings and decision memoranda; integrated master schedules; and 
earned value management and other reports. Specifically, we compared 
program documentation to cost-estimating best practices as identified 
in the GAO cost guide and assessed programs against the four 
characteristics of a reliable estimate--comprehensive, well-
documented, accurate, and credible. For each characteristic, we 
assessed multiple practices as being not met--the program did not 
provide evidence that it implemented the practices or provided 
evidence that it only minimally implemented the practices; partially 
met--the program provided evidence that it implemented about half or a 
large portion of the practices; or fully met--the program provided 
evidence that it fully implemented the practices. We then summarized 
these assessments by characteristic. We also interviewed program 
officials to obtain clarification on how cost-estimating practices are 
implemented and how the cost estimates are used to support budget and 
program decisions. 

We conducted this performance audit from July 2011 to July 2012 in 
accordance with generally accepted government auditing standards. 
Those standards require that we plan and perform the audit to obtain 
sufficient, appropriate evidence to provide a reasonable basis for our 
findings and conclusions based on our audit objectives. We believe 
that the evidence obtained provides a reasonable basis for our 
findings and conclusions based on our audit objectives. 

[End of section] 

Appendix II: Case Studies of Selected Programs' Cost-Estimating 
Practices: 

We conducted case studies of 16 major system acquisition programs 
(listed in table 5). For each of these programs, the remaining 
sections of this appendix provide the following: a brief description 
of the program and its life-cycle cost estimate, and an assessment of 
the program's cost estimate against the four characteristics of a 
reliable cost estimate--comprehensive, well-documented, accurate, and 
credible. 

Table 5: Case Study Programs: 

Agency: Agriculture; 
Program: Public Health Information System; 
Program: Web-Based Supply Chain Management. 

Agency: Commerce; 
Program: Comprehensive Large Array-data Stewardship System; 
Program: Patents End-to-End: Software Engineering. 

Agency: Defense; 
Program: Consolidated Afloat Networks and Enterprise Services; 
Program: Tactical Mission Command. 

Agency: Environmental Protection Agency; 
Program: Financial System Modernization Project; 
Program: Superfund Enterprise Management System. 

Agency: Homeland Security; 
Program: Integrated Public Alert and Warning System; 
Program: Rescue 21. 

Agency: Justice; 
Program: Next Generation Combined DNA Index System; 
Program: Unified Financial Management System. 

Agency: Labor; 
Program: OSHA[A] Information System; 
Program: PBGC[B] Benefit Administration. 

Agency: Veterans Affairs; 
Program: Health Data Repository; 
Program: Veterans Benefits Management System. 

Source: GAO analysis of program data. 

[A] Occupational Safety and Health Administration. 

[B] Pension Benefit Guaranty Corporation. PBGC is a wholly owned 
government corporation administered by a presidentially appointed, 
Senate-confirmed Director and overseen by a Board of Directors 
consisting of the Secretaries of Labor, the Treasury, and Commerce. 
Although not a component of the Department of Labor, for 
administrative purposes, PBGC is included within the department's 
budget submission documentation. Therefore, PBGC's IT investments 
(including Benefit Administration) were included among the Department 
of Labor's IT investments in the Office of Management and Budget 
Fiscal Year 2010 Exhibit 53, which provided the basis for our 
selection of the 16 case study programs. 

[End of table] 

The key below defines "fully met," "partially met," and "not met" as 
assessments of programs' implementation of cost-estimating best 
practices. 

Key description: The program provided evidence that it fully 
implemented the cost-estimating practices; 
Key: Fully met. 

Key description: The program provided evidence that it implemented 
about half or a large portion of the cost-estimating practices; 
Key: Partially met. 

Key description: The program did not provide evidence that it 
implemented the practices or provided evidence that it only minimally 
implemented the cost-estimating practices; 
Key: Not met. 

Public Health Information System: 

[Side bar: Investment Details: 
U.S. Department of Agriculture (Food Safety and Inspection Service): 
Program start date: 2007; 
Full operational capability: 
* Current: 2013; 
* Original: 2010; 
Total life-cycle cost: 
* Current: $82.3 million; 
* Original: Not applicable[A]; 
Current life-cycle phase: Mixed (development/operations and 
maintenance). 
Source: Agency data. End of side bar] 

[A] PHIS was originally part of the Public Health Information 
Consolidation Projects investment and, therefore, did not have a life-
cycle cost estimate at the time of origination. 

The Public Health Information System (PHIS) program is designed to 
modernize the Food Safety and Inspection Service's systems for 
ensuring the safety of meat, poultry, and egg products. According to 
the agency, the current systems environment includes multiple, 
disparate legacy systems that do not effectively support agency 
operations. PHIS is intended to replace these legacy systems with a 
single, web-based system that addresses major business areas such as 
domestic inspection, import inspection, and export inspection. The 
program intends to implement functionality to support domestic 
inspection and import inspection in 2012, and export inspection in 
2013. 

In 2007, PHIS was a development contract within the larger Public 
Health Information Consolidation Projects investment. In 2011, after 
PHIS was separated out as its own major investment and the program was 
rebaselined, the PHIS program developed its own cost estimate of $82.3 
million. This includes $71.4 million for development and $10.9 million 
for operations and maintenance over a 12-year life cycle. 

The PHIS program's current cost estimate does not exhibit all of the 
qualities of a reliable cost estimate. Specifically, while the 
estimate partially reflects key practices for developing a 
comprehensive estimate, it does not reflect key practices for 
developing a well-documented, accurate, or credible estimate. Table 6 
provides details on our assessment of the PHIS program's cost estimate. 

Table 6: Assessment of the PHIS Program's Cost Estimate: 

Characteristic: Comprehensive; 
Assessment: Partially met; 
Key examples of rationale for assessment: The estimate includes 
government and contractor costs over portions of the life cycle of the 
program (e.g., development and retirement), but does not include at 
least 10 years of operations and maintenance costs to account for at 
least one software technical refresh cycle. Further, while the 
estimate is supported by a limited technical baseline description, 
such as high-level schedule milestones, the information lacks adequate 
detail and cannot be used to determine whether the estimate completely 
defines and reflects the current program. A technical baseline 
description should provide a common definition of the program, 
including detailed technical, program, and schedule descriptions of 
the system. In addition, the estimate is only decomposed into yearly 
government and contractor costs rather than into a cost element 
structure with sufficient detail to provide assurance that cost 
elements are neither omitted nor double-counted. Lastly, the estimate 
does not include ground rules and assumptions (e.g., labor rates and 
base-year dollars). Documenting all assumptions is imperative to 
ensuring that management fully understands the conditions under which 
the estimate was structured. 

Characteristic: Well-documented; 
Assessment: Not met; 
Key examples of rationale for assessment: While the estimate was 
documented in the program's December 2011 request to the department's 
Office of the Chief Information Officer to revise the PHIS cost and 
schedule baseline, the document only included limited technical 
baseline information, such as high-level schedule milestones, and the 
resulting cost estimates. In particular, the program's documentation 
did not include the data sources, calculations and their results, and 
methodologies used in developing the estimate. Instead, PHIS program 
officials stated that the team relied on its expertise and previous 
experiences. However, without rigorous documentation, an analyst 
unfamiliar with the program would not be able to understand and 
replicate the cost estimate and the estimate is not useful for updates 
or information sharing. Further, although the estimate has been 
reviewed and approved by management, the information presented to 
management did not include adequate detail, such as information about 
how the estimate was developed and the risks associated with the 
underlying data and methods. Without such information, management 
cannot have confidence in the estimating process or the estimate 
produced by the process. 

Characteristic: Accurate; 
Assessment: Not met; 
Key examples of rationale for assessment: In developing the estimate, 
the program relied on the team's expertise and previous experiences in 
place of historical costs or actual experiences from comparable 
programs, which can be used to challenge optimistic assumptions and 
bring more realism to the estimate. Officials stated this was due to a 
lack of available historical cost data. Further, the estimate is not 
based on an assessment of most likely costs, because the program did 
not rely on historical data and a risk and uncertainty analysis was 
not conducted to determine where the estimate fell against the range 
of all possible costs. In addition, the estimate was not adjusted for 
inflation. Adjusting for inflation is important because cost data must 
be expressed in consistent terms, or cost overruns can result. Lastly, 
the estimate lacks adequate detail to be able to determine whether the 
program's recent updates to the estimate reflect the current status of 
the program. For example, the program expects to spend over $12 
million on development between 2014 and 2018, despite planning to 
reach full operational capability in 2013 and without establishing a 
technical basis for these costs. As a result, decision makers cannot 
have confidence that the estimate accurately represents the program's 
full costs. 

Characteristic: Credible; 
Assessment: Not met; 
Key examples of rationale for assessment: Estimated costs for key cost 
drivers were not cross-checked using different methodologies to see if 
the results were similar, a sensitivity analysis was not performed to 
better understand which variables most affected the cost estimate, and 
a risk and uncertainty analysis was not conducted to determine the 
confidence level associated with the estimate and recommend 
contingency reserves. Although program officials stated that 
contingency funding is held by the agency, best practices state that 
contingency funding should be incorporated into the program's estimate 
because, often, it can take months to receive additional funding to 
address an emerging program issue. Lastly, no steps were taken--such 
as an independent cost estimate--to independently validate the results 
of the program's estimate. Without taking these steps, the program 
lacks a full understanding of the limitations in the estimate and is 
not prepared to deal with unexpected contingencies. 

Source: GAO analysis of the PHIS program's cost estimate. 

[End of table] 
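
The assessments in this appendix repeatedly refer to a cost element 
structure in which lower-level subelements sum to each category total, 
helping ensure that elements are neither omitted nor double-counted. 
For illustration only, the following simplified sketch (written in 
Python) shows one way such a structure can be represented; all element 
names and dollar values are hypothetical and are not drawn from any 
program discussed in this report. 

# Illustrative sketch only: a hierarchical cost element structure in
# which lower-level subelements are summed to produce each category
# total. Element names and dollar values (in millions) are hypothetical.
cost_elements = {
    "Development": {
        "Software design": 10.0,
        "Software implementation": 25.0,
        "Testing": 8.0,
    },
    "Operations and maintenance": {
        "Help desk": 12.0,
        "Hardware refresh": 6.0,
    },
}

# Sum the subelements to produce each category total, then sum the
# categories to produce the life-cycle total.
category_totals = {
    category: sum(subelements.values())
    for category, subelements in cost_elements.items()
}
life_cycle_total = sum(category_totals.values())

for category, total in category_totals.items():
    print(f"{category}: {total:.1f} million")
print(f"Life-cycle total: {life_cycle_total:.1f} million")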

Web-Based Supply Chain Management: 

[Side bar: Investment Details: 
U.S. Department of Agriculture (Agricultural Marketing Service): 
Program start date: 2002; 
Full operational capability:
* Current: 2010; 
* Original: 2008; 
Total life-cycle cost: 
* Current: $378.4 million; 
* Original: $142.9 million; 
Current life-cycle phase: Operations and maintenance; 
Source: Agency data. End of side bar] 

The Web-Based Supply Chain Management (WBSCM) program is designed to 
modernize the U.S. Department of Agriculture's commodity management 
operations, including the purchase and distribution of approximately 
$2.5 billion in food products to needy recipients through domestic and 
foreign food programs. To accomplish this, the 
program is replacing a legacy system with a web-based commercial-off-
the-shelf solution. In 2010, the program achieved full operational 
capability. Ongoing efforts are focused on addressing a significant 
number of system defects identified since deployment. 

In 2003, WBSCM developed an initial cost estimate of $142.9 million. 
This included $105.5 million for development and $37.4 million for 
operations and maintenance over a 7-year life cycle. Subsequently, 
after revising the estimate each year as part of the program's Office of 
Management and Budget Exhibit 300[Footnote 37] submission, in 2011, 
WBSCM revised its cost estimate to $378.4 million, an increase of 
about $235.5 million over its initial cost estimate. This includes 
$104.9 million for development and $273.5 million for operations and 
maintenance over an 18-year life cycle. These changes are due to, among 
other things, incorporating additional years of operations and 
maintenance costs, a recently planned system upgrade, and additional 
costs associated with addressing system defects. 

The WBSCM program's current cost estimate does not exhibit any of the 
qualities of a reliable cost estimate. Specifically, the estimate did 
not reflect key practices for developing a comprehensive, well-
documented, accurate, or credible estimate. Table 7 provides details 
on our assessment of the WBSCM program's cost estimate. 

Table 7: Assessment of the WBSCM Program's Cost Estimate: 

Characteristic: Comprehensive; 
Assessment: Not met; 
Key examples of rationale for assessment: The cost estimate did not 
clearly include all life-cycle costs. While it includes costs for 
development and operations and maintenance of the system, it lacks 
sufficient detail to show whether costs associated with design, 
deployment, and retirement of the system are included. Further, the 
estimate lacks adequate detail to ensure that it completely defines 
and reflects the current program. Specifically, the estimate is not 
supported by a technical baseline, which would provide a common 
definition of the program, including detailed technical, program, and 
schedule descriptions of the system. In addition, the estimate does 
not have a cost element structure at a sufficient level of detail, 
which would provide assurance that cost elements are neither omitted 
nor double-counted, as well as improve traceability between estimated 
costs and the program's scope. Lastly, the estimate does not include 
ground rules and assumptions (e.g., labor rates and base-year 
dollars). Documenting all assumptions is imperative to ensuring that 
management fully understands the conditions under which the estimate 
was structured. Without a comprehensive cost estimate, decision makers 
cannot be assured of having a complete view of program costs. 

Characteristic: Well-documented; 
Assessment: Not met; 
Key examples of rationale for assessment: While the program's cost 
estimate was documented in its most recent Office of Management and 
Budget Exhibit 300 submission, this document only included the 
resulting cost estimates. In particular, this document did not provide 
the technical basis of the estimate, nor the data sources, 
calculations and their results, and methodologies used in developing 
the estimate. As a result, questions about the approach or data used 
to create the estimate cannot be answered, an analyst unfamiliar with 
the program would not be able to understand and replicate the 
program's cost estimate, and the estimate is not useful for updates or 
information sharing. Further, although officials stated that the cost 
estimate was provided to the department's Office of the Chief 
Information Officer for review and approval, the information presented 
to management did not include adequate detail, such as information 
about how the estimate was developed, and the program could not 
provide documentation demonstrating management's review and approval. 
Because a cost estimate is not considered valid until management has 
approved it, it is imperative that management understand how the 
estimate was developed. Without such information, management cannot 
have confidence in the estimating process and the estimate produced by 
the process. 

Characteristic: Accurate; 
Assessment: Not met; 
Key examples of rationale for assessment: In developing the estimate, 
the program did not rely on historical costs and actual experiences 
from comparable programs, which can be used to challenge optimistic 
assumptions and bring more realism to the estimate. Further, the 
estimate lacks adequate detail to ensure that updates to the estimate 
reflect the current status of the program. For example, the program's 
estimate recently increased by $90 million due to a planned software 
upgrade, but officials stated that there is no supporting 
documentation showing how these costs were derived, thus making it 
unclear whether the increase accurately reflects the planned work to 
be completed. In addition, the estimate was not properly adjusted for 
inflation. Adjusting for inflation is important because cost data must 
be expressed in consistent terms, or cost overruns can result. Lastly, 
the estimate is not based on an assessment of most likely costs, 
because the program relied heavily on the prime contractor and expert 
opinion in place of historical data, and a risk and uncertainty 
analysis was not conducted to determine where the estimate fell 
against the range of all possible costs. As a result, decision makers 
cannot have confidence that the estimate accurately represents the 
program's full life-cycle cost. 

Characteristic: Credible; 
Assessment: Not met; 
Key examples of rationale for assessment: The WBSCM estimate is not 
credible because steps were not taken to understand the limitations 
associated with the estimate. Specifically, costs were not cross-
checked using different methodologies to see if the results were 
similar, a sensitivity analysis was not performed to better understand 
which variables most affect the cost estimate, and a risk and 
uncertainty analysis was not conducted to determine the confidence 
level associated with the estimate. WBSCM program officials stated 
that they believed there were not any significant cost or schedule 
risks remaining; however, without taking steps to understand the 
limitations associated with the estimate, the program cannot have 
confidence that this is actually the case. Lastly, no steps were 
taken--such as an independent cost estimate--to independently validate 
the results of the program's estimate. 

Source: GAO analysis of the WBSCM program's cost estimate. 

[End of table] 
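
As noted in the accuracy assessments above, cost data must be 
expressed in consistent terms. For illustration only, the following 
simplified sketch (written in Python) converts costs stated in base-
year dollars to then-year (inflated) dollars; the inflation rate and 
cost figures are hypothetical and are not drawn from any program 
discussed in this report. 

# Illustrative sketch only: converting base-year dollars to then-year
# (inflated) dollars so that costs incurred in different years are
# expressed in consistent terms. The 2 percent inflation rate and the
# cost figures are hypothetical.
base_year = 2012
annual_inflation = 0.02

# Hypothetical development costs, in millions of base-year 2012 dollars.
base_year_costs = {2012: 20.0, 2013: 25.0, 2014: 15.0}

# Inflate each year's cost from the base year to the year in which it
# is expected to be spent.
then_year_costs = {
    year: cost * (1 + annual_inflation) ** (year - base_year)
    for year, cost in base_year_costs.items()
}

for year in sorted(then_year_costs):
    print(f"{year}: {then_year_costs[year]:.1f} million then-year dollars")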

Comprehensive Large Array-data Stewardship System: 

[Side bar: Investment Details: 
Department of Commerce (National Oceanic and Atmospheric 
Administration); 
Program start date: 2001; 
Full operational capability:
* Current: 2018; 
* Original: 2018; 
Total life-cycle cost:
* Current: $240.0 million; 
* Original: $195.5 million; 
Current life-cycle phase: Mixed (development/operations and 
maintenance); 
Source: Agency data. End of side bar] 

The Comprehensive Large Array-data Stewardship System (CLASS) is 
designed to provide environmental data archiving and access. The 
National Oceanic and Atmospheric Administration has been acquiring 
these data for more than 30 years, from a variety of observing systems 
throughout the agency and from a number of its partners. Currently, 
large portions of the nation's environmental data are stored and 
maintained in disparate systems, with nonstandard archive and access 
capabilities. With significant increases expected in both the data 
volume and the number and sophistication of users over the next 15 
years, CLASS is intended to provide a standard, integrated solution 
for environmental data archiving and access managed at the enterprise 
level. CLASS is currently developing satellite data archiving and 
access capabilities for several satellite programs, including the next 
generation of geostationary satellites--known as the Geostationary 
Operational Environmental Satellites-R Series, which are planned for 
launch beginning in 2015. 

In 2006, the National Oceanic and Atmospheric Administration developed 
the initial CLASS cost estimate of approximately $195.5 million. This 
included $118.3 million for development and $77.2 million for 
operations and maintenance over a 9-year life cycle. Subsequently, 
after revising the cost estimate three times, in 2011, CLASS 
established its current cost estimate of approximately $240.0 million, 
an increase of about $44.5 million over its initial cost estimate. 
This includes $176.0 million for development and $64.0 million for 
operations and maintenance over a 17-year life cycle. CLASS program 
officials stated that the increase in the estimate was due, in part, 
to additional data archiving requirements and external program delays. 

The CLASS program's current cost estimate does not exhibit all 
qualities of a reliable cost estimate. Specifically, while the 
estimate partially reflects key practices for developing a 
comprehensive estimate, it does not reflect key practices for 
developing a well-documented, accurate, or credible estimate. Table 8 
provides details on our assessment of the CLASS program's cost estimate. 

Table 8: Assessment of the CLASS Program's Cost Estimate: 

Characteristic: Comprehensive; 
Assessment: Partially met; 
Key examples of rationale for assessment: The cost estimate includes 
certain contractor costs, such as those for planning and development, 
and is based on a cost element structure that is at an appropriate 
level of detail, in that it includes multiple levels of cost 
subelements that are summed to produce the totals for each cost 
category. However, the estimate did not include any estimated costs 
for operations and maintenance after the program achieves full 
operational capability in 2018, nor did it include any government 
costs (e.g., personnel costs). Moreover, while the cost element 
structure is appropriately detailed, it is not product oriented and 
does not define the work activities included in each cost element. 
Without a product-oriented structure and clearly defined cost 
elements, the program will not be able to identify which deliverables, 
such as a hardware or software component, are causing cost or schedule 
overruns. In addition, it cannot be determined whether the estimate 
completely defines the program, in part, because the program's 
requirements have not been finalized. Lastly, no cost-influencing 
ground rules and assumptions (e.g., labor rates and inflation indexes) 
were documented. Documenting all assumptions is imperative to ensuring 
that management fully understands the conditions under which the 
estimate was structured. 

Characteristic: Well-documented; 
Assessment: Not met; 
Key examples of rationale for assessment: The cost estimate was not 
supported by detailed documentation that describes how it was derived. 
More specifically, the documentation did not capture in writing the 
source data used, the calculations performed and their results, and 
the estimating methodology used to derive each cost element. As a 
result, the estimate is not captured in a way such that it can be 
easily replicated and updated. Further, it cannot be determined 
whether the technical baseline is consistent with the cost estimate 
because, among other things, the program's requirements have not been 
baselined and finalized. Lastly, the program's current $240.0 million 
cost estimate has not been reviewed and approved by management. 
Because a cost estimate is not considered valid until management has 
approved it, it is imperative that management understand how the 
estimate was developed, including the risks associated with the 
underlying data and methods. 

Characteristic: Accurate; 
Assessment: Not met; 
Key examples of rationale for assessment: The cost estimate was not 
accurate because it was not based on an assessment of most likely 
costs. More specifically, the program did not conduct a risk and 
uncertainty analysis to determine where the estimate fell against the 
range of all possible costs, and to identify the most likely cost 
estimate. Moreover, the estimate was not adjusted for inflation. 
Adjusting for inflation is important because, in the development of an 
estimate, cost data must be expressed in consistent terms, or cost 
overruns can result. In addition, although officials stated that 
historical costs from similar contracts were used to develop the cost 
estimate, the supporting documentation provided did not provide 
evidence that these data were used. Lastly, the program's cost 
estimate does not reflect current status. More specifically, since 
documenting its initial estimate in 2006, the CLASS program has 
experienced cost, schedule, and scope changes, including changes to 
the program's requirements. However, the cost estimate documentation 
has not been regularly updated. The CLASS Program Manager stated that 
the program is currently updating the cost estimate, which is planned 
to be completed in fiscal year 2012. 

Characteristic: Credible; 
Assessment: Not met; 
Key examples of rationale for assessment: The program did not perform 
a sensitivity analysis or a risk and uncertainty analysis on the cost 
estimate. Because uncertainty cannot be avoided, it is necessary to 
identify the cost elements that represent the most risk. A sensitivity 
analysis reveals how the cost estimate is affected by a change in a 
single assumption, which helps the cost estimator understand which 
variables most affect the cost estimate. Moreover, a risk and 
uncertainty analysis can assess the variability in the cost estimate 
so that a level of confidence can be given about the estimate. In 
addition, cross-checks were not performed on major cost elements using 
different estimating methodologies to see if the results were similar. 
When cross-checks demonstrate that alternative methods produce similar 
results, then confidence in the estimate increases, leading to greater 
credibility. Lastly, while the program had an independent government 
cost estimate conducted in 2007, it only provided an independent 
assessment of the prime contractor's proposal, and not the program's 
full life-cycle cost estimate. In addition, as previously mentioned, 
CLASS has experienced cost, schedule, and scope changes since 2007, 
and an independent cost estimate has not been conducted since. 

Source: GAO analysis of the CLASS program's cost estimate. 

[End of table] 
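
As discussed in the credibility assessment above, a sensitivity 
analysis reveals how the cost estimate is affected by a change in a 
single assumption. For illustration only, the following simplified 
sketch (written in Python) varies one assumption at a time by plus or 
minus 20 percent while holding the others constant; the cost model, 
assumption names, and values are hypothetical and are not drawn from 
any program discussed in this report. 

# Illustrative sketch only: a one-at-a-time sensitivity analysis of a
# simplified cost model. All assumption names and values are
# hypothetical.
def total_cost(labor_rate, staff_years, hardware_cost):
    # labor_rate is millions of dollars per staff year; hardware_cost
    # is a fixed amount in millions of dollars.
    return labor_rate * staff_years + hardware_cost

baseline = {"labor_rate": 0.2, "staff_years": 500, "hardware_cost": 40.0}
baseline_total = total_cost(**baseline)

# Vary each assumption by plus or minus 20 percent, one at a time,
# holding the other assumptions at their baseline values.
for name in baseline:
    for factor in (0.8, 1.2):
        varied = dict(baseline, **{name: baseline[name] * factor})
        change = total_cost(**varied) - baseline_total
        print(f"{name} x {factor}: change of {change:+.1f} million")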

Patents End-to-End: Software Engineering: 

[Side bar: Investment Details: 
Department of Commerce (United States Patent and Trademark Office); 
Program start date: 2010; 
Full operational capability:
* Current: 2013; 
* Original: 2013; 
Total life-cycle cost:
* Current: $188.2 million; 
* Original: $130.2 million; 
Current life-cycle phase: Development; 
Source: Agency data. End of side bar] 

The Patents End-to-End: Software Engineering (PE2E-SE) program is 
designed to provide a fully electronic patent application process. 
According to the U.S. Patent and Trademark Office, the agency's 
current enterprise architecture is unable to meet current demands, and 
it has relied on inefficient and outdated automated legacy systems 
that inhibit the timely examination of patent applications. PE2E-SE 
intends to provide an electronic filing and processing application 
that enables examiners to meet current needs for the timely 
examination of patents. To accomplish this, PE2E-SE is following an 
Agile[Footnote 38] development approach and intends to implement a 
system using a text-based Extensible Markup Language (XML) standard that is 
flexible, scalable, and leverages modern technologies with open 
standards. In fiscal year 2012, the program plans to build new 
functionality, such as new text search tools, and deploy the system to 
a limited set of examiners. 

In 2010, PE2E-SE developed an initial cost estimate of $130.2 million. 
This estimate only included costs for development, over a 3-year life 
cycle. Subsequently, in 2012 and after multiple revisions, PE2E-SE 
revised its cost estimate to $188.2 million, an increase of $58.0 
million. This includes $122.8 million for development and $65.4 
million for operations and maintenance over a 7-year life cycle. 
According to program officials, these changes are primarily due to 
incorporating costs for operations and maintenance into the estimate. 

The PE2E-SE program's current cost estimate does not exhibit all of 
the qualities of a reliable cost estimate. Specifically, while the 
estimate partially reflects key practices for developing a 
comprehensive, well-documented, and accurate estimate, it does not 
reflect key practices for developing a credible estimate. Table 9 
provides details on our assessment of the PE2E-SE program's cost 
estimate. 

Table 9: Assessment of the PE2E-SE Program's Cost Estimate: 

Characteristic: Comprehensive; 
Assessment: Partially met; 
Key examples of rationale for assessment: The estimate includes 
government and contractor costs over most of the program's life cycle 
(e.g., design, development, and deployment) and completely defines the 
program and reflects the current schedule. Further, the estimate 
includes cost-influencing ground rules and assumptions, such as the 
types of technology to be used, and monitors the validity of these 
assumptions over time. However, the estimate does not include all 
costs associated with operations and maintenance (e.g., to provide for 
at least one hardware and software technical refresh beyond the end of 
development) and does not include costs associated with system 
retirement. Further, detailed estimates are structured by function 
instead of being product-oriented, which makes it difficult to plan 
and track costs by deliverables. As a result, the program will not be 
able to identify which deliverables, such as a hardware or software 
component, are causing cost or schedule overruns. 

Characteristic: Well-documented; 
Assessment: Partially met; 
Key examples of rationale for assessment: The program documented a 
technical baseline description that provides the technical, 
programmatic, and schedule basis for the estimate and documented 
review and approval of all aspects of the estimate, including 
estimates for program subcomponents (i.e., smaller projects within 
PE2E-SE lasting less than a year). Further, the documentation captures 
high-level source data, from the Patent and Trademark Office's 
previous failed effort to modernize the patent examining process, on 
which the estimate is based. However, detailed data to support 
estimates for program subcomponents are not captured, and the program 
also did not document the methodologies followed or the detailed 
calculations performed in completing the estimate. As a result, an 
analyst unfamiliar with the program would be unable to understand and 
use the program's cost estimate, and the estimate is less useful for 
information-sharing and updating. 

Characteristic: Accurate; 
Assessment: Partially met; 
Key examples of rationale for assessment: The estimate is updated with 
actual costs to reflect current program status. Further, at a high 
level, the estimate is based on historical cost data from the office's 
previous failed effort to modernize the patent examining process, 
although estimates for program subcomponents are based primarily on 
team expertise. By using more historical data to support detailed 
estimates, or data from comparable programs, PE2E-SE could more 
effectively challenge optimistic assumptions and bring more realism to 
the cost estimate. Without more supporting data, however, and without 
conducting a risk and uncertainty analysis to determine where the 
estimate fell against the range of all possible costs, the PE2E-SE 
program cannot be assured that the estimate represents the most likely 
costs to be incurred. Lastly, the estimate was not adjusted for 
inflation. Adjusting for inflation is important because cost data must 
be expressed in consistent terms, or cost overruns can result. 

Characteristic: Credible; 
Assessment: Not met; 
Key examples of rationale for assessment: Within the estimate, cost 
drivers were not cross-checked to see if different estimating 
methodologies produced similar results. Further, a risk and 
uncertainty analysis was not conducted to quantify the impact of risks 
to the estimate. While officials stated that some contingency funding 
is included, without conducting a risk and uncertainty analysis the 
program cannot be assured that adequate reserves exist to address 
contingencies that may arise. Additionally, a sensitivity analysis was 
not conducted to better understand which variables most affect the 
cost estimate. Lastly, no steps were taken--such as an independent 
cost estimate--to independently validate the results of the program's 
estimate. An independent estimate is considered one of the most 
reliable methods for validating the estimate and increases the 
likelihood that management will have confidence in the credibility of 
the program's cost estimate. As a result of these weaknesses, the 
program does not have an understanding of the limitations associated 
with the estimate and cannot know whether its estimate is realistic. 

Source: GAO analysis of the PE2E-SE program's cost estimate. 

[End of table] 

Consolidated Afloat Networks and Enterprise Services: 

[Side bar: Investment Details; 
Department of Defense (Department of the Navy); 
Program start date: 2008; 
Full operational capability:
* Current: 2023; 
* Original: 2016; 
Total life-cycle cost: 
* Current: $12.741 billion; 
* Original: $12.741 billion; 
Current life-cycle phase: Development; 
Source: Agency data. End of side bar] 

The Consolidated Afloat Networks and Enterprise Services (CANES) 
program is designed to consolidate and standardize the Department of 
the Navy's existing network infrastructures and services. According to 
the department, the current network infrastructure is highly segmented 
and includes several legacy environments that have created 
inefficiencies in the management and support of shipboard networks. 
The CANES program is intended to, among other things, reduce and 
eliminate existing standalone afloat networks, provide a technology 
platform that can rapidly adjust to changing warfighting requirements, 
and reduce the shipboard hardware footprint. To accomplish this, the 
program will rely primarily on commercial off-the-shelf software 
integrated with network infrastructure hardware components. The CANES 
program is currently planning to procure four limited fielding units 
and conduct preinstallation activities for them by the end of fiscal year 
2012, and achieve full operational capability in 2023. 

In 2010, the Navy's Space and Naval Warfare Systems Command Cost 
Analysis Division developed a program life-cycle cost estimate for the 
CANES program, and the Naval Center for Cost Analysis developed an 
independent cost estimate. Subsequently, these organizations worked 
collaboratively to develop the program's life-cycle cost estimate of 
approximately $12.7 billion. This included approximately $4.0 billion 
for development and approximately $8.8 billion for operations and 
maintenance over a 23-year life cycle. 

The CANES program's cost estimate exhibits all of the qualities of a 
reliable cost estimate. Specifically, the estimate reflects key 
practices for developing a comprehensive, well-documented, accurate, 
and credible estimate. Table 10 provides details on our assessment of 
the CANES program's cost estimate. 

Table 10: Assessment of the CANES Program's Cost Estimate: 

Characteristic: Comprehensive; 
Assessment: Fully met; 
Key examples of rationale for assessment: The estimate includes both 
the government and contractor costs specific to design, development, 
deployment, operation and maintenance, and retirement of the program. 
Moreover, the cost estimate reflects the current program and technical 
parameters, such as the acquisition strategy and physical 
characteristics of the system.[A] In addition, the estimate clearly 
describes how the various cost subelements are summed to produce the 
amounts for each cost category, thereby ensuring that all pertinent 
costs are included, and no costs are double counted. Lastly, cost-
influencing ground rules and assumptions, such as the program's 
schedule, labor rates, and inflation rates are documented. 

Characteristic: Well-documented; 
Assessment: Fully met; 
Key examples of rationale for assessment: The estimate captured in 
writing the source data used (e.g., historical data and program 
documentation), the calculations performed and their results, and the 
estimating methodology used to derive each cost element. The cost 
estimate is also well documented in that a technical baseline has been 
documented that includes, among other things, the relationships with 
other systems and planned performance parameters. Also, the cost 
estimate was reviewed both by the Naval Center for Cost Analysis and 
the Assistant Secretary of the Navy for Research, Development, and 
Acquisition, which ensures a level of confidence in the estimating 
process and the estimate produced. 

Characteristic: Accurate; 
Assessment: Fully met; 
Key examples of rationale for assessment: The cost estimate is based 
on an assessment of most likely costs. More specifically, a risk and 
uncertainty analysis was performed that determined that the estimate 
is at the 53 percent confidence level--meaning that there is a 53 
percent chance that the estimate will be met. Using this information, 
management can more proactively monitor the program's costs and better 
prepare contingencies to monitor and mitigate risks. In addition, the 
estimate was grounded in historical costs and actual experiences from 
other comparable programs, including the five legacy systems that 
CANES is intended to replace. Lastly, the estimate was adjusted for 
inflation--for example, the cost estimate is presented in both base-
year dollars (with the effects of inflation removed) as well as then-
year dollars (with inflation included). 

Characteristic: Credible; 
Assessment: Fully met; 
Key examples of rationale for assessment: The CANES program performed 
a complete uncertainty analysis (i.e., both a sensitivity analysis and 
Monte Carlo simulation[B]) on the estimate. More specifically, a 
sensitivity analysis was conducted that identified a range of possible 
costs based on varying key parameters, such as the technology refresh 
cycle and procurement costs. A risk and uncertainty analysis was also 
conducted using a Monte Carlo simulation that identified the 
distribution of total possible costs and the confidence level (53 
percent) associated with the cost estimate. As a result, decision 
makers are more informed of program cost, schedule, and technical 
risks and can better prepare mitigation strategies. Lastly, an 
independent cost estimate was conducted by the Naval Center for Cost 
Analysis and the results were reconciled with the CANES program's cost 
estimate. Because an independent cost estimate is considered one of 
the best and most reliable methods for validating an estimate, 
management can have increased confidence in the credibility of the 
resulting estimate. 

Source: GAO analysis of the CANES program's cost estimate. 

[A] The number of ships that the CANES system will be deployed to 
recently changed from 193 ships to 175 ships due to the 
decommissioning of certain ships earlier than anticipated. While the 
cost estimate does not fully reflect the current ship deployment 
schedule, the program continually monitors the deployment schedule, 
tracks changes between the deployment schedule and the cost estimate, 
and is in the process of updating the estimate for an upcoming 
milestone review. Therefore, we determined that the program adequately 
met the intent of this best practice. 

[B] A Monte Carlo simulation assesses the aggregate variability of the 
cost estimate to determine a confidence range around the cost estimate. 

[End of table] 
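
As described in footnote B above, a Monte Carlo simulation assesses 
the aggregate variability of a cost estimate so that a confidence 
level can be associated with it. For illustration only, the following 
simplified sketch (written in Python) simulates total costs from two 
cost elements and computes the share of simulated outcomes that fall 
at or below a point estimate; the distributions, dollar values, and 
point estimate are hypothetical and are not drawn from the CANES 
estimate or any other program discussed in this report. 

# Illustrative sketch only: a simplified Monte Carlo risk and
# uncertainty analysis. The triangular distributions, dollar values
# (in millions), and point estimate are hypothetical.
import random

random.seed(1)

def simulate_total_cost():
    # random.triangular(low, high, mode) draws one possible cost for
    # each cost element.
    development = random.triangular(3500, 4600, 4000)
    operations = random.triangular(8000, 10500, 8800)
    return development + operations

trials = [simulate_total_cost() for _ in range(10000)]

point_estimate = 12700  # hypothetical point estimate, in millions
# The confidence level is the share of simulated outcomes at or below
# the point estimate.
confidence = sum(1 for t in trials if t <= point_estimate) / len(trials)
print(f"Confidence level of the point estimate: {confidence:.0%}")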

Tactical Mission Command: 

[Side bar: Investment Details; 
Department of Defense (Department of the Army); 
Program start date: 2005; 
Full operational capability: 
* Current: 2016; 
* Original: 2018; 
Total life-cycle cost: 
* Current: $2.692 billion; 
* Original: $1.969 billion; 
Current life-cycle phase: Mixed (development/operations and 
maintenance); 
Source: Agency data. End of side bar] 

The Tactical Mission Command (TMC)[Footnote 39] is designed to be the 
tactical battle command system for commanders and staffs from 
battalions through the Army Service Component Commands. TMC is 
intended to provide commanders and staff with improved battle command 
capabilities, including increasing the speed and quality of command 
decisions. In the near term, TMC is to address gaps in the Army's 
tactical battle command capability by delivering enhanced 
collaborative tools and enterprise services, and, in the long term, 
TMC is to address rapid improvements in technological capabilities 
through technology refresh. A key component--known as the Command Post 
of the Future--is intended to provide commanders and key staff with an 
executive-level decision support capability enhanced with real-time 
collaborative tools. These capabilities are expected to enhance 
situational awareness and support an execution-focused battle command 
process. Currently, the program is working to complete development of 
Command Post of the Future 7.0, which the program plans to complete by 
the end of fiscal year 2012. 

In 2008, the TMC program developed an initial cost estimate of 
approximately $2.0 billion. This included approximately $1.9 billion 
for development and $116.5 million for maintenance over a 14-year life 
cycle. According to program officials, each subsequent year, in 
preparation for the annual Weapons System Review, the program updated 
its life-cycle cost estimate. In 2011 the TMC program established its 
current cost estimate of approximately $2.7 billion, an increase of 
approximately $723 million over its initial cost estimate. This 
included approximately $2.0 billion for development and $650.7 million 
for operations and maintenance over a 23-year life cycle. Program 
officials stated that the increase in the estimate was due, in part, 
to changes in the life-cycle time frames, fielding schedules, number 
of units planned for deployment, and other software development 
changes. 

The TMC program's current cost estimate does not exhibit all qualities 
of a reliable cost estimate. Specifically, while the estimate 
partially reflects key practices for developing a comprehensive, well-
documented, and accurate estimate, it does not reflect key practices 
for developing a credible estimate. Table 11 provides details on our 
assessment of the TMC program's cost estimate. 

Table 11: Assessment of the TMC Program's Cost Estimate: 

Characteristic: Comprehensive; 
Assessment: Partially met; 
Key examples of rationale for assessment: The estimate includes both 
the government and contractor costs specific to design, development, 
deployment, and operation and maintenance over the program's 23-year 
life cycle. Further, the estimate is based on a cost element structure 
that is at an appropriate level of detail, in that it includes 
multiple levels of cost subelements that are summed to produce the 
totals for each cost category. However, while the program documented 
the system's requirements in 2006, it cannot be determined whether the 
cost estimate fully reflects the current program and schedule. 
Specifically, officials stated that, in preparation for the program's 
annual Weapon System Review process, the cost estimate was updated to 
reflect changes to the 2006 requirements and the system deployment 
schedule; 
however, the program did not provide sufficient evidence showing these 
changes and how they impacted the estimate. Further, while the cost 
element structure is at an appropriate level of detail, this structure 
does not map to the program's work breakdown structure used for the 
day-to-day management of the program. Without consistency in these 
structures, the program cannot track estimated against realized costs. 
Lastly, the program documented certain cost-influencing assumptions 
(e.g., inflation indexes and base-year dollars used); 
however, it did not identify other important assumptions, such as cost 
limitations (e.g., unstable funding stream or staff constraints) and 
system quantities. 

Characteristic: Well-documented; 
Assessment: Partially met; 
Key examples of rationale for assessment: The cost estimate 
documentation captures the calculations performed and their results, 
and the methodologies used to derive certain cost elements. For 
example, the program's cost-estimating software tool contains a series 
of input variables, such as the number of active Army units, used to 
calculate certain hardware and software costs for the program. 
However, the software tool does not include the data sources, 
methodologies, or calculations for several important cost elements, 
such as those associated with development and software maintenance and 
assurance fees. Program officials stated that costs associated with 
these elements are estimated in spreadsheets outside of the cost-
estimating tool; 
however, the tool does not identify the specific spreadsheet 
associated with these cost elements or the data sources used to 
project those costs. As a result, the cost estimate is not captured in 
a way that can be easily updated and replicated. In addition, as 
previously mentioned, the program documented the system's requirements 
in 2006; 
however, it has not adequately documented changes to these 
requirements and their impact on the cost estimate. Lastly, the 
program did not provide evidence that the current cost estimate had 
been approved by senior management. 

Characteristic: Accurate; 
Assessment: Partially met; 
Key examples of rationale for assessment: The program provided 
supporting documentation showing that the estimate accounted for 
inflation and was grounded in historical data from the program. For 
example, in calculating software-related costs during deployment of 
the system, the program relied on software costs from fiscal years 
2009 and 2010, along with the anticipated fielding schedule, to 
project costs for future years. However, the program did not provide 
evidence that the estimate had been regularly updated. While program 
officials stated that the estimate is updated every 6 months based on 
a prioritized set of requirements as defined by the logistics team and 
other technical experts, the program did not provide evidence that 
this process is occurring. Lastly, the estimate does not reflect an 
assessment of most likely costs. More specifically, a risk and 
uncertainty analysis was not conducted to determine where the estimate 
fell against the range of all possible costs, and to identify the most 
likely estimate. 

Characteristic: Credible; 
Assessment: Not met; 
Key examples of rationale for assessment: The program did not perform 
a sensitivity analysis or a risk and uncertainty analysis. According 
to TMC program officials, "what if" drills were performed to assess 
the cost impact of potential changes to the program, such as a 
requirement to deploy 20 additional units per year. However, these 
drills do not gauge the sensitivity of program assumptions to identify 
a range of possible costs. Further, a risk and uncertainty analysis 
was not performed to assess the variability in the cost estimate so 
that a level of confidence could be determined. According to 
officials, there are currently no high risks in developing the 
system's software. However, without taking steps to understand the 
limitations associated with the estimate, the program cannot have 
confidence that this is actually the case. In addition, 
the program did not perform cross-checks on key cost drivers using 
different estimating methodologies to see if the results were similar. 
Lastly, an independent cost estimate was not conducted by a group 
outside of the acquiring organization to validate the program's cost 
estimate. 

Source: GAO analysis of the TMC program's cost estimate. 

[End of table] 

Financial System Modernization Project: 

[Side bar: Investment Details; 
Environmental Protection Agency (Office of the Chief Financial 
Officer); 
Program start date: 2002; 
Full operational capability: 
* Current: 2011; 
* Original: 2008; 
Total life-cycle cost: 
* Current: $169.3 million; 
* Original: $163.2 million; 
Current life-cycle phase: Operations and maintenance; 
Source: Agency data. End of side bar] 

The Financial System Modernization Project (FSMP) replaced the 
Environmental Protection Agency's legacy core financial system. The 
system is intended to address agency-identified shortcomings in its 
previous financial systems, such as inconsistent data, limited system 
interoperability, low system usability, and costly maintenance. FSMP 
includes key functionality for performing cost and project management, 
general ledger, payment management, and receivables management. 
According to the agency, the system is intended to, among other 
things, eliminate repetitive data entry, integrate legacy systems, and 
enable agency staff to manage workflow among the Office of the Chief 
Financial Officer and other business lines (e.g., acquisitions 
and grants management). The system was deployed in October 2011. 

In 2005, the FSMP program developed an initial cost estimate of 
approximately $163.2 million. This included $42.8 million for 
development and $120.4 million for operations and maintenance over a 
25-year life cycle. After revising the cost estimate three times, in 
2010 the program established its current cost estimate of 
approximately $169.3 million, an increase of approximately $6 million 
over its initial cost estimate. This includes $103.7 million for 
development and $65.7 million for operations and maintenance over a 15-
year life cycle. Program officials stated that the changes to the 
program's life-cycle cost estimate are due, in part, to changes in the 
Environmental Protection Agency's policies and guidance, such as using 
a 15-year program life cycle instead of the 25-year life cycle used in 
the program's original estimate. In addition, officials stated that 
the FSMP program has undergone significant schedule and scope changes, 
including delaying the system's deployment date from 2008 to 2011 and 
reducing the planned system components (e.g., budget formulation)--
all of which have impacted the program's life-cycle cost estimate. 

The FSMP program's current cost estimate does not exhibit all 
qualities of a reliable cost estimate. Specifically, while the 
estimate partially reflects key practices for developing a 
comprehensive, well-documented, and accurate estimate, it does not 
reflect key practices for developing a credible estimate. Table 12 
provides details on our assessment of the FSMP program's cost estimate. 

Table 12: Assessment of the FSMP Program's Cost Estimate: 

Characteristic: Comprehensive; 
Assessment: Partially met; 
Key examples of rationale for assessment: The estimate includes 
government and contractor costs of the program over most of its life 
cycle (e.g., design, development, and deployment). Further, the 
estimate is documented at an appropriate level of detail in that it 
includes multiple levels of cost subelements that are summed to 
produce the totals for each cost category. In addition, the estimate 
reflects the program's high-level schedule milestones, such as 
reaching full operational capability in 2011. However, the estimate 
does not include costs associated with the retirement of the program. 
In addition, it cannot be determined if the estimate completely 
defines the program because key documents, such as the Concept of 
Operations, have not been updated and program officials stated that 
hundreds of documents exist capturing the changing specifications of 
the system since they were determined in 2005. Further, while the 
estimate is based on documented assumptions from 2009, these 
assumptions have not been updated to account for material changes in 
the program that have occurred since then, including changes to the 
program's schedule, planned system components, and training 
requirements. 

Characteristic: Well-documented; 
Assessment: Partially met; 
Key examples of rationale for assessment: The program's documentation 
describes, at a summary level, the estimating methodologies used to 
derive major cost elements and the data sources used, such as subject 
matter experts and issued task orders. In addition, the cost estimate 
was reviewed and approved by management as part of the program's July 
2010 rebaseline effort. However, the cost estimate documentation lacks 
important details about the source data, such as the specific subject 
matter experts involved, circumstances affecting the data, and whether 
the data have been adjusted for inflation. Moreover, the data used to 
derive the estimate cannot easily be traced back to, and verified 
against, their sources so that the estimate can be easily replicated 
and updated. For example, there are differences in the cost estimates 
and their supporting task orders, and those differences, including the 
specific circumstances affecting the task orders, are not well 
documented. Lastly, while the estimate was approved, management was 
not briefed on how the estimate was developed, including enough detail 
to show whether it is accurate, complete, and high in quality. 

Characteristic: Accurate; 
Assessment: Partially met; 
Key examples of rationale for assessment: The estimate was regularly 
updated based on actual costs from the program, and to reflect 
material changes. More specifically, the program established an 
initial cost estimate in 2005, which it later updated in 2008, 2009, 
and 2010, to reflect significant changes to the program's scope and 
schedule, among other things. In addition, the estimate accounted for 
inflation in certain cost elements, such as operations and 
maintenance. However, the estimate was not grounded in a historical 
record of cost estimating and actual experiences on other comparable 
programs. For example, while the program did assess the scope, 
requirements, and software solutions selected for the financial system 
modernization projects at other federal agencies, the review did not 
look at the costs associated with these systems and subsequently use 
that information in developing the FSMP cost estimate. In addition, 
because a risk and uncertainty analysis has not been performed, the 
program did not determine where the estimate fell against the range of 
all possible costs and thus cannot be assured that the estimate 
represents the most likely costs to be incurred. 

Characteristic: Credible; 
Assessment: Not met; 
Key examples of rationale for assessment: The FSMP program did not 
perform a sensitivity analysis or a risk and uncertainty analysis on 
the cost estimate. Because uncertainty cannot be avoided, it is 
necessary to identify the cost elements that represent the most risk. 
A sensitivity analysis reveals how the cost estimate is affected by a 
change in a single assumption, which helps the cost estimator 
understand which variables most affect the cost estimate. Moreover, a 
risk and uncertainty analysis can assess the variability in the cost 
estimate so that a level of confidence can be given about the 
estimate. In addition, cross-checks were not performed on major cost 
elements using different estimating methodologies to see if the 
results were similar. When cross-checks demonstrate that alternative 
methods produce similar results, then confidence in the estimate 
increases, leading to greater credibility. Lastly, while the program 
had an independent government cost estimate conducted in 2005, it only 
provided an independent assessment of the prime contractor's proposal, 
and not the program's full life-cycle cost estimate. In addition, 
since 2005, the program has experienced cost, schedule, and scope 
changes, and an independent cost estimate has not been conducted since. 

Source: GAO analysis of the FSMP program's cost estimate. 

[End of table] 

Superfund Enterprise Management System: 

[Side bar: Investment Details; 
Environmental Protection Agency (Office of Solid Waste and Emergency 
Response); 
Program start date: Fiscal year 2007; 
Full operational capability: 
* Current: 2013; 
* Original: 2013; 
Total life-cycle cost: 
* Current: $62.0 million; 
* Original: $39.3 million; 
Current life-cycle phase: Mixed (development/operations and 
maintenance); 
Source: Agency data. End of side bar] 

The Superfund Enterprise Management System (SEMS) is to replace three 
legacy systems and multiple applications used to comply with the 
Comprehensive Environmental Response, Compensation, and Liability Act 
of 1980[Footnote 40]--commonly known as Superfund, which provides 
federal authority to respond directly to releases or threatened 
releases of hazardous substances that may endanger public health or 
the environment. In addition, SEMS is designed to implement innovative 
software tools that will allow for more efficient operation of the 
Superfund program. Of the three legacy systems expected to be replaced 
by SEMS, two have already been integrated, and the one remaining 
system is expected to be fully integrated in 2013, at which time SEMS 
is planned to achieve full operational capability. 

In 2009, the SEMS program developed an initial cost estimate of 
approximately $39.3 million. This included $20.8 million for 
development, $14.7 million for operations and maintenance, and $3.8 
million for government personnel costs over a 10-year life cycle. 
Subsequently, in 2011, the program revised its estimate to 
approximately $62.0 million, an increase of about $22.7 million over 
its initial cost estimate. This includes $22.8 million for development 
and $39.2 million for operations and maintenance over a 10-year life 
cycle. Program officials stated that the increase in the estimate was 
primarily due to incorporating additional operations and maintenance 
costs that were erroneously omitted from the initial estimate. 

The SEMS program's current cost estimate does not exhibit all 
qualities of a reliable cost estimate. Specifically, while the 
estimate partially reflects key practices for developing a credible 
estimate, it does not reflect key practices for developing a 
comprehensive, well-documented, or accurate estimate. Table 13 
provides details on our assessment of the SEMS program's cost estimate. 

Table 13: Assessment of the SEMS Program's Cost Estimate: 

Characteristic: Comprehensive; 
Assessment: Not met; 
Key examples of rationale for assessment: While the estimate included 
government and contractor costs for certain phases of the program's 
life cycle, such as planning and development of the system, it did not 
include at least 10 years of operations and maintenance costs beyond 
the program's planned deployment in 2013. According to program 
officials, only 4 years of operations and maintenance costs were 
included because agency guidance only requires program costs to be 
estimated through 2017. Moreover, it cannot be determined whether the 
estimate defines the program and reflects the current schedule because 
it is not supported by detailed documentation (see the assessment of 
well-documented below). This is due, in part, to the lack of a cost 
element structure at a sufficient level of detail. Such a structure 
would provide assurance that cost elements are neither omitted nor 
double counted, as well as provide improved traceability to the 
program's scope. Lastly, no cost-influencing ground rules and 
assumptions (e.g., labor rates and base-year dollars) were documented. 
Documenting all assumptions is imperative to ensuring that management 
fully understands the conditions under which the estimate was 
structured. 

Characteristic: Well-documented; 
Assessment: Not met; 
Key examples of rationale for assessment: The documentation of the 
estimate only includes the resulting cost estimates and does not 
capture in writing the source data used, the calculations performed 
and their results, and the estimating methodology used to derive each 
cost element. As a result, the estimate is not captured in such a way 
that it can be easily replicated and updated. Further, while the 
program has documented certain information about the system's 
technical baseline, such as physical characteristics of the system and 
planned interfaces with other systems, it cannot be determined whether 
the technical baseline is consistent with the cost estimate because 
the supporting details of the cost estimate are not documented. 
Lastly, while the high-level estimate was reviewed and approved by 
management, management was not briefed on how the estimate was 
developed, which is needed to convey a level of confidence in the 
estimating process and the estimate produced by the process. 

Characteristic: Accurate; 
Assessment: Not met; 
Key examples of rationale for assessment: The estimate relied largely 
on expert opinion as the basis of the cost estimate and did not use 
historical costs and actual experiences from comparable programs. 
While SEMS program officials stated that they relied on cost data from 
three legacy systems to estimate costs for operating and maintaining 
SEMS, the program did not have supporting documentation showing how 
the data were used. In addition, because a risk and uncertainty 
analysis has not been performed, the SEMS program did not determine 
where the estimate fell against the range of all possible costs, and 
cannot be assured that the estimate represents the most likely costs 
to be incurred. Further, while limited operations and maintenance 
costs were adjusted for inflation, the cost estimate documentation did 
not include information regarding the source or rationale of the 
inflation factors used. Lastly, while the cost estimate has been 
previously updated, it cannot be determined whether the estimate 
reflects current status information because the program has not 
adequately documented detailed supporting information of the cost 
estimate. 

Characteristic: Credible; 
Assessment: Partially met; 
Key examples of rationale for assessment: The SEMS program had a cost-
benefit analysis completed in September 2010 by a group outside of the 
program office to validate the SEMS cost estimate, which yielded a 
cost estimate within 1 percent of the SEMS program's cost estimate. 
Because an independent estimate is considered one of the most reliable 
methods for validating the estimate, management can have increased 
confidence in the credibility of the program's cost estimate. However, 
the program did not perform a sensitivity analysis or a risk and 
uncertainty analysis on the SEMS cost estimate. Because uncertainty 
cannot be avoided, it is necessary to identify the cost elements that 
represent the most risk. A sensitivity analysis reveals how the cost 
estimate is affected by a change in a single assumption, which helps 
the cost estimator understand which variables most affect the cost 
estimate. Further, a risk and uncertainty analysis can assess the 
variability in the cost estimate from such effects as schedule 
slippage and proposed solutions not meeting user needs. Lastly, cross-
checks were not performed on major cost elements using different 
estimating methodologies to see if the results were similar. When 
cross-checks demonstrate that alternative methods produce similar 
results, then confidence in the estimate increases, leading to greater 
credibility. 

Source: GAO analysis of the SEMS program's cost estimate. 

[End of table] 

Integrated Public Alert and Warning System: 

[Side bar: Investment Details; 
Department of Homeland Security (Federal Emergency Management Agency); 
Program start date: 2004; 
Full operational capability: 
* Current: 2017; 
* Original: 2018; 
Total life-cycle cost: 
* Current: $311.4 million; 
* Original: $259 million; 
Current life-cycle phase: Mixed (development/operations and 
maintenance); 
Source: Agency data. End of side bar] 

The Integrated Public Alert and Warning System (IPAWS) is designed to 
provide a reliable, integrated, and comprehensive system to alert and 
warn the American people before, during, and after disasters. To 
accomplish this, the program is developing the capability to 
disseminate national alerts to cellular phones and expanding the 
existing Emergency Alert System to cover 90 percent of the American 
public. In 2011, IPAWS established standards for alert messages, began 
cellular carrier testing, and conducted a nationwide test of the 
expanded Emergency Alert System capabilities. The program intends to 
deploy the cellular alerting capability nationwide in 2012 and 
complete its expansion of the Emergency Alert System in 2017. 

In 2009, IPAWS developed its initial estimate of $259 million, which 
included $252.1 million for development and $6.9 million for 
government personnel costs, but did not include operations and 
maintenance costs. In 2011, the program revised its estimate to $311.4 
million, an increase of about $52.3 million. This includes $268.9 
million for development and $42.5 million for operations and 
maintenance over an 11-year life cycle. According to program 
officials, the increase in the cost estimate is primarily due to the 
inclusion of costs to operate and maintain the system during 
development. 

The IPAWS program's current cost estimate does not exhibit all 
qualities of a reliable cost estimate. Specifically, while the 
estimate fully reflects key practices for developing an accurate 
estimate, it only partially reflects key practices for developing a 
comprehensive, well-documented, and credible estimate. Table 14 
provides details on our assessment of the IPAWS program's cost estimate. 

Table 14: Assessment of the IPAWS Program's Cost Estimate: 

Characteristic: Comprehensive; 
Assessment: Partially met; 
Key examples of rationale for assessment: The estimate includes 
government and contractor costs over most of the program's life cycle 
(e.g., design, development, and deployment); 
is supported by a technical baseline which provides the technical, 
programmatic, and schedule basis for the estimate; 
uses a detailed cost element structure to ensure that no cost elements 
are omitted or double-counted; 
and identifies cost-influencing ground rules and assumptions, such as 
inflation indices and government furnished equipment. However, the 
estimate does not include any operations and maintenance costs beyond 
the end of development in 2017 (to provide for at least one hardware 
and software technical refresh cycle) and does not include costs 
associated with system retirement. Furthermore, while the estimate 
uses a detailed cost element structure, the structure is not product-
oriented, which would allow costs to be planned and tracked by work 
products. Without a product-oriented structure, the program will not 
be able to identify which deliverables, such as a hardware or software 
component, are causing cost or schedule overruns. 

Characteristic: Well-documented; 
Assessment: Partially met; 
Key examples of rationale for assessment: The program documented a 
high-level mapping of cost elements to data sources and estimating 
methodologies, but the supporting detailed cost model is not aligned 
with this mapping and does not clearly relate data sources to specific 
calculations, identify estimating methodologies, or describe 
calculations step-by-step. As a result, an analyst unfamiliar with the 
program would find it difficult to understand and use the program's 
estimate and supporting cost model, making the estimate less useful 
for information sharing or updating. Additionally, while the program 
provided evidence that management reviewed and approved the estimate, 
key information about the estimate was not provided to management. For 
example, management was briefed on the estimate and technical- and risk-
related information, but this briefing did not include the confidence 
level associated with the point estimate. 

Characteristic: Accurate; 
Assessment: Fully met[A]; 
Key examples of rationale for assessment: The program relied on 
historical program costs and data from comparable programs in 
preparing and updating the estimate. For example, the program used 
data from a legacy disaster management system in estimating costs for 
part of its system deployment. Further, because the program relied on 
historical costs, and conducted a risk and uncertainty analysis to 
determine where the estimate fell against the range of all possible 
costs, the program has increased assurance that the estimate reflects 
the most likely costs to be incurred. Additionally, the estimate is 
properly adjusted for inflation--for example, the cost estimate is 
presented in both base-year dollars (with the effects of inflation 
removed) and then-year dollars (with inflation included); a simplified 
illustration of this conversion follows the table. 
Lastly, program officials regularly update the estimate with actual 
costs so that it reflects the current program. 

Characteristic: Credible; 
Assessment: Partially met; 
Key examples of rationale for assessment: The program conducted a 
sensitivity analysis to identify the cost drivers that most impacted 
the estimate and a risk and uncertainty analysis to determine where 
the estimate fell against the range of possible costs. However, 
despite this analysis showing the cost estimate having less than a 50 
percent chance of being achieved, no contingency reserve was 
identified. Officials stated that conservative assumptions in the 
analysis provide an informal risk reserve, and that if a significant 
risk was realized it would be an agency decision about how to fund a 
response. However, best practices state that contingency funding 
should be a risk-based decision because, often, it can take many 
months to receive additional funding to address an emerging program 
issue. Further, the program did not cross-check key cost drivers, 
which would show whether different estimating methodologies produced 
similar results, and an independent cost estimate was not conducted to 
validate the estimated costs. Without taking these steps, the program 
lacks a full understanding of the limitations in the estimate and may 
not be prepared to deal with unexpected contingencies. 

Source: GAO analysis of the IPAWS program's cost estimate. 

[A] The IPAWS estimate met all key practices for an accurate cost 
estimate. While the estimate is not fully comprehensive, well-
documented, or credible, in this case, the weaknesses in those areas 
do not preclude the estimate from meeting key practices representative 
of an accurate cost estimate. 

[End of table] 
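
The distinction between base-year and then-year dollars noted in the 
table's assessment of accuracy can be illustrated with a simple 
conversion, sketched below in Python. The 2 percent inflation rate and 
the spending profile are assumptions chosen only for illustration; they 
are not the IPAWS program's actual indices or costs. 

# Illustrative conversion of a spending profile from base-year dollars to
# then-year dollars; the inflation rate and profile are assumptions only.
base_year = 2011
annual_inflation = 0.02
base_year_profile = {2011: 20.0, 2012: 25.0, 2013: 30.0}   # millions, by fiscal year

then_year_profile = {
    year: amount * (1 + annual_inflation) ** (year - base_year)
    for year, amount in base_year_profile.items()
}

print(f"Base-year total: ${sum(base_year_profile.values()):.1f} million")
print(f"Then-year total: ${sum(then_year_profile.values()):.1f} million")

Presenting both forms keeps cost data in consistent terms while still 
showing the dollars expected to be requested in each budget year. 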

Rescue 21: 

[Side bar: Investment Details; 
Department of Homeland Security (U.S. Coast Guard); 
Program start date: 1997; 
Full operational capability: 
* Current: 2017; 
* Original: 2006; 
Total life-cycle cost: 
* Current: $2.662 billion; 
* Original: $250 million[A]; 
Current life-cycle phase: Mixed (deployment/operations and maintenance); 
Source: Agency data. 
[A] The Rescue 21 program's original cost estimate, developed in 1999, 
only included system acquisition costs and did not include costs for 
operating and maintaining the system. These costs were subsequently 
included in the program's 2005 revisions to the cost estimate. End of 
side bar] 

Rescue 21 is designed to modernize the U.S. Coast Guard's maritime 
search and rescue capability. According to the agency, the current 
system--the National Distress and Response System--does not meet the 
demands of the 21st century in that it does not provide complete 
coverage of the continental United States, cannot receive distress 
calls during certain transmissions, lacks interoperability with other 
government agencies, and is supported by outdated equipment. Rescue 21 
is intended to provide a modernized maritime distress and response 
communications system, with increased maritime homeland security 
capabilities that encompass coastlines, navigable rivers, and 
waterways in the continental United States, in addition to Hawaii, 
Guam, and Puerto Rico. Rescue 21 is currently undergoing regional 
deployment, which is planned to be completed in fiscal year 2017. 

In 1999, the Rescue 21 program developed an initial cost estimate of 
$250 million for acquisition of the system, but this estimate did not 
include any costs for operations and maintenance of the system. 
Following three rebaselines, in 2006 the Rescue 21 program revised the 
estimate to $1.44 billion, an increase of approximately $1.19 billion 
over the initial estimate. This included $730 million in development 
and $707 million in operations and maintenance over a 16-year life 
cycle. According to program documentation, these increases were due, 
in part, to incorporating costs for the operation and maintenance of 
the system. Subsequently, in 2008, the Rescue 21 program revised its 
cost estimate again to $2.66 billion, an increase of approximately 
$1.22 billion over the previous estimate, and approximately $2.41 
billion over the initial cost estimate. This includes $1.07 billion in 
development and $1.59 billion in operations and maintenance over a 16-
year life cycle. Program officials stated that the most recent 
increase in the cost estimate was primarily due to schedule delays, an 
extension of the program's life cycle by 6 years based on an expected 
increase in the system's useful life, and to reflect more realistic 
estimates of future costs for ongoing system technology refreshment. 

The Rescue 21 program's current cost estimate does not exhibit all 
qualities of a reliable cost estimate. Specifically, the estimate 
partially reflects key practices for developing a comprehensive, well-
documented, accurate, and credible estimate. Table 15 provides details 
on our assessment of the Rescue 21 program's cost estimate. 

Table 15: Assessment of the Rescue 21 Program's Cost Estimate: 

Characteristic: Comprehensive; 
Assessment: Partially met; 
Key examples of rationale for assessment: The cost estimate includes 
all government and contractor costs of the program over its full life 
cycle (e.g., development, operations and maintenance, and disposal) 
and documents cost-influencing ground rules and assumptions (e.g., 
budget constraints and inflation rates). Moreover, the cost estimate 
defines key program and technical parameters, such as the acquisition 
strategy, physical characteristics of the system, and relationships 
with predecessor systems. However, the estimate does not reflect the 
current schedule in that the deployment dates for several locations of 
the system have been delayed, but the estimate has not yet been 
updated. According to officials, the program is in the process of 
updating the estimate to reflect these changes, and it should be 
completed in fiscal year 2012. Further, while the program defined cost 
elements at an appropriate level of detail, the work breakdown 
structure was not product-oriented. 

Characteristic: Well-documented; 
Assessment: Partially met; 
Key examples of rationale for assessment: The cost-estimate 
documentation captures in writing the source data used, calculations 
performed and their results, and methodologies used to derive each of 
the cost elements. In addition, the cost estimate was reviewed and 
approved by management, and included key information regarding the 
Rescue 21 program's technical and program baseline, such as the 
completion date for full production. However, the program's estimate 
is from 2008 and has not been updated to reflect changes to the 
technical baseline of the program, such as additional deployment sites 
needed to address service gaps identified by local commanders at 
previously deployed locations. Moreover, while the estimate was 
reviewed and approved, management did not receive all of the 
information necessary to make an informed decision. More specifically, 
a risk and uncertainty analysis was performed that found that, on the 
range of all possible costs, the program's cost estimate fell at the 
12 percent confidence level--meaning that there is an 88 percent 
chance of a cost overrun. However, this confidence level was not 
identified or provided as part of the estimate's review and approval, 
which calls into question whether management had all the information 
needed to make an informed decision. 

Characteristic: Accurate; 
Assessment: Partially met; 
Key examples of rationale for assessment: The estimate accounted for 
inflation based on Office of Management and Budget guidance. In 
addition, the estimates for utilities, leases, and environmental 
permitting, among other things, were grounded in historical costs and 
actual experiences. Further, the estimate has been regularly updated 
from 1999 through 2008 to account for changes in the program. However, 
the program cost estimate was not based on an assessment of most 
likely costs. More specifically, as previously mentioned, a cost 
uncertainty analysis was performed that determined that the estimate 
is at the 12 percent confidence level--meaning that there is an 88 
percent chance of a cost overrun. Accepting such a confidence level 
means that the program has accepted an overly optimistic cost estimate 
rather than reflecting the most likely cost of the program. In 
addition, while the estimate has been updated in the past, the current 
estimate, dated January 2008, does not fully reflect the current 
status of the program. Specifically, as previously mentioned, the 
estimate does not reflect the current deployment schedule and, 
according to program officials, will also need to be updated to 
reflect increased contractor costs and updated time frames for system 
sustainment. 

Characteristic: Credible; 
Assessment: Partially met; 
Key examples of rationale for assessment: The Rescue 21 program 
performed a complete uncertainty analysis (i.e., both a sensitivity 
analysis and Monte Carlo simulation) on the estimate. More 
specifically, a sensitivity analysis was conducted that identified a 
range of possible costs based on varying key parameters, such as the 
technology refresh cycle and change control costs. A risk and 
uncertainty analysis was also conducted using a Monte Carlo simulation 
that identified the distribution of total possible costs and the 
confidence level (12 percent) associated with the cost estimate (a 
simplified illustration of how such a confidence level is derived 
follows the table). As a result, the program is more informed of cost, 
schedule, and technical 
risks and can better prepare mitigation strategies. However, cross-
checks were not performed on major cost elements using different 
estimating methodologies to see if the results were similar. Further, 
an independent cost estimate was not conducted by a group outside of 
the acquiring organization to validate the program's cost estimate. 

Source: GAO analysis of the Rescue 21 program's cost estimate. 

[End of table] 
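
The 12 percent confidence level discussed in the table comes from a 
risk and uncertainty analysis in which a distribution of possible costs 
is simulated and the program's point estimate is located within it. The 
Python sketch below shows the mechanics under wholly assumed inputs (a 
triangular cost distribution and a point estimate in notional cost 
units); it is not the Rescue 21 program's model. 

import random

# Illustrative Monte Carlo cost-risk simulation; the distribution and point
# estimate are assumptions in notional cost units, not Rescue 21 figures.
random.seed(1)
low, most_likely, high = 100.0, 130.0, 180.0   # assumed range of possible costs
point_estimate = 117.0                          # assumed program point estimate

simulated_costs = [random.triangular(low, high, most_likely) for _ in range(100_000)]

# The confidence level is the share of simulated outcomes at or below the
# point estimate; the remainder is the chance of a cost overrun.
confidence = sum(cost <= point_estimate for cost in simulated_costs) / len(simulated_costs)
print(f"Confidence level of the point estimate: {confidence:.0%}")
print(f"Chance of a cost overrun:               {1 - confidence:.0%}")

A point estimate that falls low in the simulated distribution, as in 
this hypothetical case, signals that the budgeted amount is more likely 
than not to be exceeded unless contingency reserves are added. 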

Next Generation Combined DNA Index System: 

[Side bar: Investment Details; 
Department of Justice (Federal Bureau of Investigation); 
Program start date: 2006; 
Full operational capability: 
* Current: 2011; 
* Original: 2012; 
Total life-cycle cost: 
* Current: $137.0 million; 
* Original: $128.4 million; 
Current life-cycle phase: Operations and maintenance; 
Source: Agency data. End of side bar] 

Since 1998, the Combined DNA Index System (CODIS) has supported the 
Federal Bureau of Investigation's mission by assisting criminal 
investigation and surveillance through DNA collection and examination 
capabilities. CODIS is an automated DNA information processing and 
telecommunications system that generates potential investigative leads 
in cases where biological evidence is recovered. Among other things, 
CODIS links crime scene evidence to other crimes and/or offenders, 
which can identify serial offenders and/or potential suspects. CODIS 
serves over 190 participating laboratories and 73 international 
laboratories representing 38 countries. According to the Federal 
Bureau of Investigation, the reliability and expandability of CODIS 
are critical to the agency's ability to effectively aid law 
enforcement investigations through the use of biometrics, prompting 
the decision in 2006 to initiate a modernization effort, referred to 
as Next Generation CODIS (NGCODIS). In 2011, the program achieved full 
operational capability for CODIS 7.0, a software release of NGCODIS, 
which included functionality for, among other things, implementing a 
software solution to comply with European Union legislation for DNA 
data exchange and maintaining DNA records of arrested persons. 
Additional functionality is expected in the future; however, all 
program development has been put on hold until the necessary funding 
is approved. 

In 2006, the CODIS program developed an initial cost estimate for 
NGCODIS of $128.4 million. This included approximately $69.6 million 
for development and $58.8 million for operations and maintenance over 
an 11-year life cycle. In 2009, the CODIS program developed an 
additional cost estimate of $58.6 million to account for operations 
costs associated with certain versions of NGCODIS. According to 
program officials, even though the program estimated additional 
operations costs of $58.6 million, the program's original cost 
estimate has increased by only $8.6 million because originally planned 
development work related to incorporating advancements in DNA 
technology was delayed and the costs associated with this work were 
removed from the cost estimate. 
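
The relationship among these figures can be made explicit: because the 
program added an estimated $58.6 million in operations costs while the 
total grew by only $8.6 million, the cited figures imply that roughly 
$50 million of originally planned work was removed. The Python sketch 
below reconstructs that arithmetic; the implied figure holds only under 
the assumption that no other changes were made to the estimate. 

# Reconstructing the net change in the NGCODIS estimate from the figures cited
# above, in millions of dollars; the implied value assumes no other changes.
initial_2006 = 128.4           # initial life-cycle estimate
additional_operations = 58.6   # 2009 estimate of added operations costs
current_total = 137.0          # current estimate cited in the investment details

net_increase = current_total - initial_2006                  # 8.6
implied_removed_work = additional_operations - net_increase  # about 50.0

print(f"Net increase in the estimate: ${net_increase:.1f} million")
print(f"Implied removed work:         ${implied_removed_work:.1f} million")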

The CODIS program's current cost estimate for NGCODIS does not exhibit 
all qualities of a reliable cost estimate. Specifically, the estimate 
partially reflects key practices for developing a comprehensive, well-
documented, accurate, and credible estimate. Table 16 provides details 
on our assessment of the NGCODIS cost estimate. 

Table 16: Assessment of the NGCODIS Cost Estimate: 

Characteristic: Comprehensive; 
Assessment: Partially met; 
Key examples of rationale for assessment: The estimate includes both 
the government and contractor costs specific to design, development, 
deployment, operation and maintenance, and disposal of the system. 
Further, the estimate is documented at an appropriate level of detail 
in that it includes multiple levels of cost subelements that are 
summed to produce the totals for each cost category. In addition, the 
estimate includes documented cost-influencing ground rules and 
assumptions (e.g., labor rates and inflation rates). However, while 
the estimate generally reflects the program's technical baseline, such 
as the acquisition plan and key performance parameters, this 
information is contained in multiple documents (instead of a single 
document). As a result, it may be difficult for the program to update 
the estimate and provide a verifiable trace to the new cost baseline 
as assumptions change during the course of the program's life cycle. 
In addition, while the estimate uses a detailed cost element 
structure, the structure is not product oriented, which would allow 
costs to be planned and tracked by work products. Without a product-
oriented structure, the program may not be able to identify which 
deliverables, such as a hardware or software component, are causing 
cost or schedule overruns. 

Characteristic: Well-documented; 
Assessment: Partially met; 
Key examples of rationale for assessment: The cost estimate 
documentation captures in writing most of the source data used, 
calculations performed, and methodologies used, and the estimate was 
reviewed and approved by management. In addition, as previously 
mentioned, the cost estimate documentation generally reflects the 
current technical characteristics of the program, such as the key 
performance parameters. However, the cost estimate documentation did 
not always describe the source data and methodologies used to estimate 
costs for certain aspects of the program. For example, the estimate 
includes the number of projected staff associated with the Support 
Contractor; 
however, there is no explanation for where these projections came 
from. Further, although the cost estimate was reviewed and approved by 
a Federal Bureau of Investigation Executive Steering Council, the 
information presented to management did not include adequate detail, 
such as information about how the estimate was developed and the risks 
associated with the underlying data and methods. Without such 
information, management cannot have confidence in the estimating 
process or the estimate produced by the process. 

Characteristic: Accurate; 
Assessment: Partially met; 
Key examples of rationale for assessment: The estimate accounted for 
inflation and has been updated in the past to reflect material changes 
to the program, such as the inclusion of new requirements for 
interacting with international DNA repositories. Further, the program 
projected operations and maintenance costs based on its experience 
with prior versions of CODIS. In addition, the program used a cost-
estimating software tool in projecting software development costs that 
program officials stated relied upon cost data from thousands of 
programs. However, the estimate only partially reflected an assessment 
of most likely costs. More specifically, while the program provided 
the results of its risk and uncertainty analysis and the most likely 
cost estimate, it did not provide evidence supporting how it performed 
the analysis and determined the range of all possible costs. 
Therefore, it cannot be determined what confidence levels may have 
been used or the degree of uncertainty given all of the risks 
considered. 

Characteristic: Credible; 
Assessment: Partially met; 
Key examples of rationale for assessment: The program provided a range 
of potential costs based on the level of risk facing the program and 
also provided information describing these risks. For example, the 
program identified the finalization of the system requirements and 
changes to the assumptions behind the cost model as two primary areas 
that drive the uncertainty of the cost estimate. In addition, the 
program conducted a sensitivity analysis for assumptions associated 
with the development of NGCODIS. Specifically, the program altered a 
series of input factors within the cost-estimating software tool used 
in order to test the sensitivity of the program's development costs 
based on each factor. However, although the program adjusted the 
development costs based on program risk factors built into the 
estimating tool, the program office did not provide evidence explaining 
which factors were adjusted or why, and did not determine the 
confidence level associated with the final cost 
estimate. Further, the sensitivity analysis performed only addressed 
costs associated with system development, and did not include an 
assessment of the sensitivity of assumptions associated with other 
aspects of the program, such as operations and maintenance of the 
system. Lastly, an independent cost estimate was not conducted by a 
group outside of the acquiring organization to validate the program's 
cost estimate. 

Source: GAO analysis of the NGCODIS cost estimates. 

[End of table] 

Unified Financial Management System: 

[Side bar: Investment Details; 
Department of Justice (Justice Management Division); 
Program start date: 2002; 
Full operational capability: 
* Current: 2014; 
* Original: 2010; 
Total life-cycle cost: 
* Current: $851.1 million; 
* Original: $357.2 million; 
Current life-cycle phase: Mixed (development/operations and 
maintenance); 
Source: Agency data. End of side bar] 

The Unified Financial Management System (UFMS) is to modernize the 
Department of Justice's financial management and procurement 
operations. To accomplish this, UFMS is to replace four legacy core 
accounting systems and multiple procurement systems with a commercial 
off-the-shelf product. Ultimately, the system is expected to 
streamline and standardize financial management and procurement 
processes and procedures across the department's component agencies. 
UFMS was deployed to two component agencies--the Drug Enforcement 
Administration and the Bureau of Alcohol, Tobacco, Firearms, and 
Explosives--in fiscal years 2009 and 2011, respectively. The system is 
planned to be deployed at other component agencies, including the U.S. 
Marshals Service and the Federal Bureau of Investigation, between 
fiscal years 2013 and 2014, and is expected to achieve full 
operational capability in fiscal year 2014. 

In 2002, the UFMS program developed an initial cost estimate of $357.2 
million. This included approximately $196.4 million for development 
and $160.8 million for maintenance over a 10-year life cycle. In 2009, 
the UFMS program revised the estimate to $1.05 billion, an increase of 
approximately $692.8 million. This included $469.5 million for 
development and $581.6 million for operations and maintenance over a 
20-year life cycle. Program officials stated that the increase in the 
estimate was due to extending the program's life cycle to include 
additional years of development work and operations and maintenance of 
the system. Subsequently, in 2011, the program revised its cost 
estimate to $851.1 million, a decrease of approximately $198.9 
million. This estimate includes $419.5 million for development and 
$431.6 million for operations and maintenance over a 20-year life 
cycle. Program officials stated that the decrease in the cost estimate 
was due to a reduction in the number of component agencies that 
planned to implement UFMS. Specifically, UFMS removed the Federal 
Bureau of Prisons; Offices, Boards and Divisions; and Office of 
Justice Programs from the system's deployment schedule in order to 
reduce the overall cost of the system. 

The UFMS program's current cost estimate does not exhibit all 
qualities of a reliable cost estimate. Specifically, while the 
estimate partially reflects key practices for developing a 
comprehensive, well-documented, and accurate estimate, it does not 
reflect key practices for developing a credible estimate. Table 17 
provides details on our assessment of the UFMS program's cost estimate. 

Table 17: Assessment of the UFMS Program's Cost Estimate: 

Characteristic: Comprehensive; 
Assessment: Partially met; 
Key examples of rationale for assessment: The cost estimate includes 
government and contractor costs of the program over most of its life 
cycle, including costs associated with design, development, and 
deployment of the system. In addition, the cost estimate defines costs 
at an appropriate level of detail. For example, the program relies on 
a cost element structure that is decomposed to three levels and 
describes the work to be performed. Further, the cost estimate 
documents cost-influencing ground rules and assumptions such as labor 
and inflation rates. However, the cost estimate does not account for 
all program costs, in that it excludes costs for retirement of the 
system and certain costs incurred by component agencies to implement 
the system. According to program officials, certain component agency 
implementation costs, such as the costs for help desk support, are not 
included because these costs are funded separately through the 
component agencies. However, according to best practices, a life-cycle 
cost estimate should encompass all past, present, and future costs for 
every aspect of the program, regardless of funding source. In 
addition, the estimate lacks important details needed to determine if 
it completely defines the current program. Specifically, in 2010, UFMS 
rescoped the program by removing planned component agency 
implementations and reducing the cost estimate from $1.05 billion to 
$851.1 million, a reduction of about $198.9 million, but did not 
update the cost estimate documentation to justify the associated 
change in cost. Finally, while the cost element structure used is at 
an appropriate level of detail, it does not align with the work 
breakdown structure being used to manage the current work activities 
to be completed by the program. 

Characteristic: Well-documented; 
Assessment: Partially met; 
Key examples of rationale for assessment: The cost estimate 
documentation captures in writing most of the source data used, 
calculations performed and their results, and methodologies used to 
derive each of the cost elements, and was reviewed and approved by 
management. However, the cost estimate documentation only supports the 
program as defined in 2009 and has not been updated to reflect 
significant changes to the program that occurred as part of the 2010 
rescoping effort. According to the UFMS Program Manager, the cost 
estimate documentation does not reflect the current program because 
the life-cycle cost estimate is considered a static document that is 
used as a basis for funding and budget requests, and is not planned to 
be updated. In addition, the estimating methodology and associated 
calculations are excluded for some of the work. 

Characteristic: Accurate; 
Assessment: Partially met; 
Key examples of rationale for assessment: The estimate accounted for 
inflation based on Office of Management and Budget guidance. In 
addition, the estimate was grounded in a historical record of actual 
experiences in that the program leveraged cost data from a prior 
implementation of the system. More specifically, in projecting system 
implementation costs at other Department of Justice component 
agencies, the program relied on historical data from the 
implementation of the UFMS system at the Department of Justice's Drug 
Enforcement Administration and applied scaling factors to estimate the 
implementation costs of the system at other component agencies. 
However, the estimate does not reflect an assessment of the most 
likely costs. More specifically, the program did not determine where 
the estimate fell against the range of all possible costs. In 
addition, the estimate has not been regularly updated to reflect 
material changes, such as the 2010 rescoping of the program. 

Characteristic: Credible; 
Assessment: Not met; 
Key examples of rationale for assessment: The estimate is not credible 
because a risk and uncertainty analysis and sensitivity analysis 
specific to the estimate were not performed. A risk and uncertainty 
analysis can be used to assess variability in the overall cost 
estimate, while a sensitivity analysis can reveal how the cost 
estimate is affected by a change in a single assumption, which helps 
the cost estimator understand which variables most affect the cost 
estimate. The UFMS Program Manager stated that such analyses were not 
conducted, in part, because the labor costs are fixed, and blanket 
purchase agreements allowed the UFMS team to extrapolate cost data for 
future years, thereby reducing the program's exposure to risk. 
However, without actually taking steps to understand the limitations 
associated with the estimate, the program cannot have confidence that 
this is actually the case. Further, cross-checks were not performed on 
major cost elements using different estimating methodologies to see if 
the results were similar. Lastly, an independent cost estimate was not 
conducted by a group outside of the acquiring organization to validate 
the program's cost estimate, which further calls into question the 
estimate's credibility. 

Source: GAO analysis of the UFMS program's cost estimate. 

[End of table] 

OSHA Information System: 

[Side bar: Investment Details; 
Department of Labor (Occupational Safety and Health Administration); 
Program start date: 2005; 
Full operational capability: 
* Current: 2011; 
* Original: 2009; 
Total life-cycle cost: 
* Current: $91.3 million; 
* Original: $72.3 million; 
Current life-cycle phase: Operations and maintenance; 
Source: Agency data. End of side bar] 

The OSHA Information System (OIS) is a management tool consisting of a 
suite of applications to reduce workplace fatalities, injuries, and 
illnesses through enforcement, compliance assistance, and 
consultation. According to the agency, OIS is intended to close 
performance gaps with existing legacy systems resulting from 
irreplaceable legacy hardware and software, the inability of legacy 
systems to fully support the agency's mission, and the absence of an 
application that supports key business process areas, such as 
compliance assistance. Ultimately, OIS is expected to provide a 
centralized web-based solution to be used by more than 5,900 users at 
the federal and state level, including approximately 4,200 enforcement 
officers and 500 safety and health consultants. The program completed 
development in 2011, and is working to complete deployment of the 
system while addressing operations and maintenance of the system, 
which the program plans to complete by the end of fiscal year 2016. 

In 2006, the OIS program developed an initial cost estimate of $72.3 
million. 
This included $42.0 million for development and $30.3 million for 
operations and maintenance over a 12-year life cycle. Subsequently, in 
2010, the OIS program revised its cost estimate to $91.3 million, an 
increase of $19.0 million. This includes $63.3 million for development 
and approximately $28.0 million for operations and maintenance over a 
12-year life cycle. The OIS Program Manager stated that the increase 
in the estimate was due, in part, to unanticipated changes to the OIS 
program's scope to better align with the Department of Labor's 
strategic goals, including securing safe and healthy workplaces, 
particularly in high-risk industries. For example, according to this 
official, the agency's methodology for penalty calculations for 
violators of occupational safety and health rules and regulations was 
modified, which required a redesign of OIS in order to capture and 
accurately calculate these changes. 

The OIS program's current cost estimate does not exhibit all qualities 
of a reliable cost estimate. Specifically, while the estimate 
partially reflects key practices for developing a comprehensive, well-
documented, and accurate estimate, it does not reflect key practices 
for developing a credible estimate. Table 18 provides details on our 
assessment of the OIS program's cost estimate. 

Table 18: Assessment of the OIS Program's Cost Estimate: 

Characteristic: Comprehensive; 
Assessment: Partially met; 
Key examples of rationale for assessment: The cost estimate includes 
government and contractor costs of the program over most of its life 
cycle, including planning, acquisition, and certain operations and 
maintenance costs. The estimate also includes cost-influencing ground 
rules and assumptions, such as user staffing levels and hardware-
hosting responsibilities. However, the cost estimate does not account 
for all applicable program life-cycle costs, including at least 10 
years of operations and maintenance costs beyond the program's 
deployment date in 2011. A program official stated that the OIS 
program will consider including the costs associated with additional 
years of operation and maintenance in a future update to the estimate. 
In addition, the current cost estimate is not structured at a 
sufficient level of detail. Specifically, while OIS program officials 
stated that the program's 2010 estimate is primarily supported by the 
program's 2005 cost model, the cost elements in these two estimates 
are inconsistent. 

Characteristic: Well-documented; 
Assessment: Partially met; 
Key examples of rationale for assessment: The cost estimate 
documentation partially captures in writing the source data used, 
calculations performed and their results, and the estimating 
methodologies used to derive the cost elements. However, the 
supporting cost model documentation reflects the program as defined in 
2005 and has not been updated to reflect the program's current $91.3 
million cost estimate. More specifically, in 2010, the estimated costs 
of the program increased approximately $12 million; 
however, the program did not update its 2005 cost model or document 
the supporting details for this increase. Further, the program did not 
provide documentation that the cost estimate was submitted to, or 
approved by, management. According to program officials, the estimate 
was approved as part of the budget process, during which the cost 
estimate was reviewed and approved by both OSHA management and the 
Department of Labor's Office of the Assistant Secretary for 
Administration and Management and the Office of the Chief Information 
Officer; 
however, this review and approval was not documented. 

Characteristic: Accurate; 
Assessment: Partially met; 
Key examples of rationale for assessment: The program relied on a 
contractor with access to historical data in developing the estimate 
and associated cost model. According to program officials, in 2005, 
the program hired a consulting firm based on its cost-estimating 
expertise and historical cost data repository. Further, at that time, 
the program stated that it used an estimating method in projecting 
software development costs that drew upon data from thousands of 
programs based on specific data points entered by the program. In 
addition, while the program updated its cost estimate in 2010 to 
reflect changes to the program and actual costs incurred since 2005, 
significant changes to the program have occurred since 2010 that have 
not been included in the cost estimate. For example, the 2010 cost 
estimate accounts for operations and maintenance costs at 
approximately $26.7 million; 
however, an operations and maintenance contract was recently awarded 
for approximately $39.9 million, an increase of about $13 million, 
which has not been accounted for in the cost estimate. According to 
program officials, the cost estimate is currently being updated; 
however, a completion date for this effort has not yet been 
determined. Further, the estimate is not based on an assessment of the 
most likely costs. Specifically, because a risk and uncertainty 
analysis has not been performed to determine where the estimate fell 
against the range of all possible costs, the OIS program cannot 
determine if the estimate represents the most likely costs to be 
incurred. 

Characteristic: Credible; 
Assessment: Not met; 
Key examples of rationale for assessment: The OIS program provided a 
set of risk-adjusted figures but could not provide evidence supporting 
those figures for the most recent cost estimate. According to program 
officials, the program used the Department of Labor's cost-benefit 
analysis tool which takes risks identified by the program and adjusts 
projected program costs based on underlying formulas embedded in the 
tool. However, officials could only provide evidence that this was 
used in a 2008 update to the cost estimate. In addition, while the 
program conducted 'what if' drills by varying the overall estimate to 
determine at what point it would be necessary to choose a different 
acquisition approach, it did not conduct a comprehensive sensitivity 
analysis that identified key program cost drivers and determined a 
range of possible costs by varying major assumptions and parameters. 
Further, cross-checks were not performed on major cost elements using 
different estimating methodologies to see if the results were similar, 
which further calls into question the estimate's credibility. Lastly, 
no steps were taken--such as an independent cost estimate--to 
independently validate the results of the program's estimate. 

Source: GAO analysis of the OIS program cost estimate. 

[End of table] 

PBGC Benefit Administration: 

[Side bar: Investment Details; 
Department of Labor (Pension Benefit Guaranty Corporation[A]); 
Program start date: 2007; 
Full operational capability: 
* Current: 2017; 
* Original: 2017; 
Total life-cycle cost: 
* Current: $155.9 million; 
* Original: $186.9 million; 
Current life-cycle phase: Mixed (development/operations and 
maintenance); 
Source: Agency data. 
[A] Although not a component of the Department of Labor, for 
administrative purposes, the Pension Benefit Guaranty Corporation is 
included within the department's budget submission documentation. End 
of side bar] 

The Pension Benefit Guaranty Corporation's (PBGC) Benefit 
Administration (BA) is a collection of IT systems and applications 
that allows PBGC to administer and service the approximately 1.5 
million participants in over 4,300 plans that have been terminated and 
trusteed as part of PBGC's insurance program for single-employer 
pensions. The BA program is intended to modernize and consolidate 
applications, retire legacy systems, and address performance gaps. To 
do this, the BA program is grouped into four projects--Customer Care, 
Document Management, Case Management, and Benefit Management--in 
support of paying accurate and timely payments and providing customer 
service to participants. The BA program is expected to offer multiple 
self-service channels to participants, reengineer benefit payment 
processes to increase efficiency and productivity, and implement 
enhanced reporting and document management systems. According to the 
agency, this modernization effort is ultimately expected to increase 
customer satisfaction, reduce operational costs, and improve data 
quality. Currently, the program is scheduled to complete modernization 
and decommission the remaining legacy applications in fiscal year 2015. 

In 2007, the BA program developed an initial cost estimate of $186.9 
million. This included $39.4 million for development and $147.5 
million for operations and maintenance over a 5-year life cycle. 
Subsequently, in 2010, BA revised its cost estimate to $155.9 million, 
a decrease of $31.0 million. This revised estimate includes $80.7 
million for development and approximately $75.2 million for operations 
and maintenance over a 10-year life cycle. Program officials stated 
that the decrease in the estimate was due to changes to the program's 
schedule milestones and changes to the system's architecture. 

The BA program's current cost estimate does not exhibit all qualities 
of a reliable cost estimate. Specifically, the estimate partially 
reflects key practices for developing a comprehensive, well-
documented, accurate, and credible estimate. Table 19 provides details 
on our assessment of the BA program's cost estimate. 

Table 19: Assessment of the BA Program's Cost Estimate: 

Characteristic: Comprehensive; 
Assessment: Partially met; 
Key examples of rationale for assessment: The cost estimate includes 
most contractor costs of the program over its life cycle, including 
planning, development, and operations and maintenance of the system. 
In addition, the cost estimate includes documented ground rules and 
assumptions, such as the assumed hourly contractor rate and the period 
of performance for the system. However, the cost estimate is not fully 
comprehensive because it does not account for all applicable program 
costs in that it excludes costs associated with government personnel 
and retirement of the system. According to program officials, costs 
that did not impact the acquisition strategy, as well as costs 
incurred by the program prior to 2010, were excluded. However, 
according to best practices, a life-cycle cost estimate should 
encompass all past, present, and future costs for every aspect of the 
program. In addition, estimated costs were assigned to high-level 
categories such as contractor development and testing/change 
management; 
however, the cost element structure is not at a sufficient level of 
detail or aligned with the program's work breakdown structure. 

Characteristic: Well-documented; 
Assessment: Partially met; 
Key examples of rationale for assessment: The cost estimate was 
reviewed and approved by the BA Program Manager and presented to the 
Information Technology Investment Review Board and Executive 
Management Committee. In addition, the program's cost estimate 
documentation describes, at a summary level, the types of source data 
and estimating methodologies used. For example, according to the cost 
estimate documentation, the program derived costs from past operations 
and maintenance costs, management support costs, vendor cost data, 
team subject matter experts, and current contracts. However, the 
specific source data, calculations and results, and methodologies used 
to estimate each cost element are not well documented and do not track 
to the final cost estimate. More specifically, in developing the cost 
estimate, the program relied on multiple project teams to develop the 
cost estimates specific to their areas of expertise. However, the 
source data, calculations and results, and methodologies used to 
determine these individual project cost estimates were not always 
documented and, in many cases, did not track between the project 
worksheets and the final cost estimate. 

Characteristic: Accurate; 
Assessment: Partially met; 
Key examples of rationale for assessment: The cost estimate accounted 
for inflation based on Office of Management and Budget guidance. In 
addition, the cost estimate was updated in 2010 to account for changes 
to the program that had occurred since it was initiated in 2007. 
However, it cannot be determined whether the estimate fully reflects 
the current status information because the program has not adequately 
documented detailed supporting information of the cost estimate (see 
the assessment of well-documented above). Lastly, program officials 
stated that the cost estimate is based on, among other things, 
historical operations and maintenance costs and management support 
costs. However, the program's supporting documentation did not provide 
evidence that these data were used. Further, the estimate is not based 
on an assessment of the most likely costs because the program did not 
perform a comprehensive risk and uncertainty analysis to determine 
where the estimate fell against the range of all possible costs, and 
to identify the most likely estimate. 

Characteristic: Credible; 
Assessment: Partially met; 
Key examples of rationale for assessment: The cost estimate included 
risk-adjusted figures. Specifically, program officials stated that 
brainstorming sessions were held during which the program relied on 
Office of Management and Budget risk categories to identify risks, and 
then adjusted the program's cost estimate to account for these risks. 
Further, officials stated that risks are continuously monitored for 
their potential impact on the program. However, the program did not 
provide supporting documentation for how the program arrived at the 
risk-adjusted cost figures in the cost estimate, nor evidence that a 
quantitative risk and uncertainty analysis was performed to assess the 
aggregate variability of the cost estimate to determine a confidence 
range around the estimate. The program also performed a sensitivity 
analysis for three scenarios. Specifically, the program assessed the 
potential impact of higher-than-anticipated contractor labor rates, a 
reduced life cycle for the program, and a change in the acquisition 
strategy. However, the scenarios did not provide a basis for the 
changes to the selected assumptions or a minimum and maximum range for 
the adjustments. Lastly, no steps were taken--such as an independent 
cost estimate--to independently validate the results of the program's 
estimate, and cross-checks were not performed on major cost elements 
using different estimating methodologies. 

Source: GAO analysis of the BA program's cost estimate. 

[End of table] 

Health Data Repository: 

[Side bar: Investment Details; 
Department of Veterans Affairs (Office of Information and Technology); 
Program start date: 2001; 
Full operational capability: 
* Current: 2017; 
* Original: 2006; 
Total life-cycle cost: 
* Current: $491.5 million; 
* Original: $126.7 million; 
Current life-cycle phase: Mixed (development/operations and 
maintenance); 
Source: Agency data. End of side bar] 

The Health Data Repository (HDR) is intended to support the 
integration of clinical data across the Department of Veterans Affairs 
and with external healthcare systems such as that of the Department of 
Defense. Specifically, the system is designed to provide a nationally 
accessible repository of clinical data by accessing and making 
available data from existing healthcare systems to support clinical 
and nonclinical decision-making for the care of the department's 
patients. The system is being developed using an Agile software 
development approach and, currently, the program is working on 
software releases to improve the ability to access data in VA's legacy 
healthcare information system, and intends to achieve full operating 
capability in 2017. 

In 2001, the HDR program developed an initial cost estimate of $126.7 
million. This included $105.9 million for development and $20.8 
million for operations and maintenance over a 7-year life cycle. 
According to officials, the program revised its estimate each year 
during the budget cycle; in 2011, HDR revised its cost estimate to 
$491.5 million, an increase of approximately $364.8 million over its 
initial cost estimate. This includes $281.9 million for development 
and $209.6 million for operations and maintenance over a 17-year life 
cycle. Program officials stated that the increase in the cost estimate 
was primarily due to the unplanned deployment and operation of a 
prototype system for 5 years, and the delay of the planned date for 
full operational capability from 2006 to 2017, in part, because of 
changes in the program's scope and technology refreshes (i.e., 
equipment and storage capacity). 

The HDR program's current cost estimate does not exhibit any of the 
qualities of a reliable cost estimate. Specifically, the estimate does 
not reflect key practices for developing a comprehensive, well-
documented, accurate, and credible estimate. Table 20 provides details 
on our assessment of the HDR program's cost estimate. 

Table 20: Assessment of the HDR Program's Cost Estimate: 

Characteristic: Comprehensive; 
Assessment: Not met; 
Key examples of rationale for assessment: The estimate does not 
include sufficient detail to show that costs for all life-cycle phases 
(e.g., design, development, and deployment) are fully accounted for. 
In addition, the estimate does not include operations and maintenance 
costs beyond the completion of system development work, or costs 
associated with retirement of the system. Further, the estimate does 
not contain technical baseline information to define the technical, 
program, and schedule aspects of the system being estimated. 
Additionally, the estimate only uses high-level budget codes rather 
than a detailed, product-oriented cost element structure to decompose 
the work. Without a cost element structure at sufficient detail, the 
program will lack assurance that cost elements are neither omitted nor 
double counted. Lastly, ground rules and assumptions (e.g., labor 
rates and base-year dollars) are not documented. As a result of these 
weaknesses, the estimate is unlikely to include all program costs, and 
is likely understated. 

Characteristic: Well-documented; 
Assessment: Not met; 
Key examples of rationale for assessment: The HDR program did not 
support the estimate with adequate technical baseline documentation, 
which would provide a technical, programmatic, and schedule 
description of the program. Further, the program did not document the 
data sources, calculations and their results, or the methodologies 
used in developing the estimate, so an analyst unfamiliar with the 
program would not be able to use or replicate the estimate. 
Additionally, the documentation does not provide evidence that 
management has reviewed and approved the program's total estimated 
costs of $491.5 million because the information presented to 
management only includes costs of project increments, the most recent 
of which only had estimated costs of about $39 million, and did not 
include adequate details, such as information about how the estimate 
was developed. Because a cost estimate is not considered valid until 
management has approved it, it is imperative that management 
understand how the estimate was developed, including risks associated 
with underlying data and methods. Without sufficient documentation, 
management and oversight organizations will not be convinced that the 
estimate is credible, and questions about the approach or data used to 
create the estimate cannot be answered. 

Characteristic: Accurate; 
Assessment: Not met; 
Key examples of rationale for assessment: The estimate is updated each 
year as part of the budget cycle, but the program lacks assurance that 
the cost estimate accurately reflects current program status due to, 
as described above, the lack of a comprehensive schedule and technical 
baseline. Further, the estimate is not based on historical costs or 
actual experiences from comparable programs. Such data can be used to 
challenge optimistic assumptions and bring more realism to the 
estimate. Additionally, the estimate was not properly adjusted for 
inflation. Adjusting for inflation is important because cost data must 
be expressed in consistent terms, or cost overruns can result. Lastly, 
the estimate is not based on an assessment of most likely costs, 
because the program did not rely on historical data and did not 
conduct a risk and uncertainty analysis to determine where the 
estimate fell against the range of all possible costs. As a result, 
decision makers cannot have confidence that the estimate accurately 
represents the program's full life-cycle cost. 

Characteristic: Credible; 
Assessment: Not met; 
Key examples of rationale for assessment: Key cost drivers were not 
cross-checked using different methodologies to see if the results were 
similar, which can be used to increase confidence in the estimates. 
Further, while a previous cost estimate developed by the program was 
adjusted for risk, a comprehensive risk and uncertainty analysis was 
not conducted for the current estimate to quantify the impact of risks 
and identify a confidence level associated with the estimate. While 
officials stated that some contingency funding is included, without 
conducting a risk and uncertainty analysis, the program cannot be 
assured that adequate reserves exist to address contingencies that may 
arise. Additionally, a sensitivity analysis was not conducted to 
better understand which variables most affect the cost estimate. 
Lastly, no steps were taken--such as an independent cost estimate--to 
independently validate the results of the program's estimate. As a 
result of these weaknesses, the program does not have an understanding 
of the limitations associated with the estimate and cannot know 
whether its estimate is realistic. 

Source: GAO analysis of the HDR program's cost estimate. 

[End of table] 

Veterans Benefits Management System: 

[Side bar: Investment Details; 
Department of Veterans Affairs (Veterans Benefits Administration); 
Program start date: 2008; 
Full operational capability: 
* Current: 2023; 
* Original: 2016; 
Total life-cycle cost: 
* Current: $934.8 million; 
* Original: $560.0 million; 
Current life-cycle phase: Development; 
Source: Agency data. End of side bar] 

The Veterans Benefits Management System (VBMS) is intended to provide 
a paperless claims processing system to support processing a growing 
volume of claims--for example, the number of compensation and pension 
claims submitted in a year passed 1 million for the first time in 
2009. According to the department, due to the reliance on paper-based 
processing, the current system is inefficient and costly, and carries 
risks to veterans' sensitive information. To address this, VBMS is 
designed to provide veterans a secure and accessible means to obtain 
benefits, reduce the claims backlog, implement standardized business 
practices, and support the integration with other veteran-facing 
systems. The program is currently developing functionality for 
compensation and pension claims processing, and plans to add 
additional lines of business in future years. 

In 2008, the VBMS program developed an initial, high-level cost 
estimate of $560.0 million for system development over a 5-year life 
cycle, which did not include costs for operations and maintenance. 
Subsequently, after revising the estimate each year as part of the 
program's Office of Management and Budget Exhibit 300 submission, in 
2011 VBMS revised its cost estimate to $934.8 million, an increase of 
approximately $374.8 million over its initial estimate. This includes 
$433.7 million for development and $501.1 million for operations and 
maintenance over an 11-year life cycle. Program officials stated that 
the increase in the estimate was primarily due to incorporating costs 
associated with operations and maintenance and effort spent on 
changing to an Agile development approach. 

The VBMS program's current cost estimate does not exhibit all of the 
qualities of a reliable cost estimate. Specifically, while the 
estimate partially reflects key practices for developing a 
comprehensive and well-documented estimate, it does not reflect key 
practices for developing an accurate and credible estimate. Table 21 
provides details on our assessment of the VBMS program's cost estimate. 

Table 21: Assessment of the VBMS Program's Cost Estimate: 

Characteristic: Comprehensive; 
Assessment: Partially met; 
Key examples of rationale for assessment: The estimate includes 
government and contractor costs over limited phases of the program's 
life cycle, such as initiation and development. However, the estimate 
does not include operations and maintenance costs beyond the end of 
development (to provide for at least one software and hardware 
technical refresh cycle) and does not include costs associated with 
system retirement. Further, the estimate is supported by technical 
baseline information contained in the program's Business Requirements 
Document, which provides the technical, schedule, and programmatic 
basis for the estimate, but some of this information is out of date. 
For example, the technical baseline only describes work through 2013, 
while the estimate describes work to be completed through 2017. 
Further, the estimate lacks sufficient detail to ensure that cost 
elements are neither omitted nor double-counted. Lastly, the estimate 
does not include cost-influencing ground rules and assumptions (e.g., 
labor rates or base-year dollars). Documenting all assumptions is 
imperative to ensuring that management fully understands the 
conditions under which the estimate was developed. Without a fully 
comprehensive cost estimate, decision makers cannot be assured of 
having a complete view of program costs. 

Characteristic: Well-documented; 
Assessment: Partially met; 
Key examples of rationale for assessment: The program documented 
certain aspects of the system's technical baseline, but, as described 
above, this information is out of date. Further, while the program 
described limited use of source data and documented certain 
calculations in estimates for near-term acquisition costs, the 
documentation does not capture all the data sources or the 
methodologies used in developing the estimate. As a result, an analyst 
unfamiliar with the program would find it difficult to use or 
replicate the estimate. Lastly, the documentation provides evidence 
that management approved the estimate; 
however, this was not done on the basis of confidence in the 
estimating process because management was not provided sufficient 
information about how the estimate was developed and the risks 
associated with the underlying data and methods. Without sufficient 
documentation, management and oversight organizations will not be 
convinced that the estimate is credible and the estimate is not useful 
for updates or information sharing. 

Characteristic: Accurate; 
Assessment: Not met; 
Key examples of rationale for assessment: The program lacks assurance 
that the cost estimate accurately reflects current program status due 
to, as described above, the lack of a comprehensive schedule and 
technical baseline. Further, although officials described limited use 
of historical cost data, the program did not have supporting 
documentation showing how the data were used. Such data can be used to 
challenge optimistic assumptions and bring more realism to the 
estimate. Additionally, the estimate was not adjusted for inflation. 
Adjusting for inflation is important because cost data must be 
expressed in consistent terms, or cost overruns can result. Lastly, 
the estimate is not based on an assessment of most likely costs, 
because the program did not rely on good source data and did not 
conduct a risk and uncertainty analysis to determine where the 
estimate fell within the range of all possible costs and what the most 
likely cost would be. As a result, decision makers cannot have confidence that 
the estimate accurately represents the program's full life-cycle cost. 

Characteristic: Credible; 
Assessment: Not met; 
Key examples of rationale for assessment: A risk and uncertainty 
analysis was not conducted to quantify the impact of risks to the 
estimate. While officials stated that some informal contingency 
funding is included to address risks, without conducting a risk and 
uncertainty analysis the program cannot be assured that adequate 
reserves exist to address contingencies that may arise. Further, a 
sensitivity analysis was not conducted to better understand which 
variables most affect the cost estimate. In addition, cost drivers 
were not cross-checked to see if different estimating methodologies 
produced similar results. Lastly, program officials stated that 
efforts to validate the results of the program's cost estimate with an 
independent cost estimate are in process and planned to be completed 
in May 2012. Until these gaps are addressed, the program will not have 
a full understanding of the limitations associated with the estimate 
and cannot know whether its estimate is realistic. 

Source: GAO analysis of the VBMS program's cost estimate. 

[End of table] 

[End of section] 

Appendix III: Original and Current Life-Cycle Cost Estimates for Case 
Study Programs: 

Collectively, 13 of the 16 case study programs have revised their cost 
estimates upward by almost $5 billion. More specifically, the 13 
programs have experienced cost increases ranging from about $6 million 
to over $2 billion. For example, in many cases, cost estimates had to 
be revised upward to reflect the incorporation of full costs for all 
life-cycle phases (e.g., development or operations and maintenance), 
which had not originally been included. Other reasons that programs 
cited for revising their life-cycle cost estimates upward included 
changes to program or system requirements, schedule delays, technology 
upgrades, and system defects, among other things. Among the remaining 
3 programs, 1 program's cost estimate had decreased, 1 had not 
changed, and 1 was not applicable because the program only had a 
current cost estimate (see table 22). 

Table 22: Original and Current Life-Cycle Cost Estimates for Case 
Study Programs (as of April 2012): 

Dollars in millions. 

Agency: Agriculture; 
Program: Public Health Information System; 
Original life-cycle cost estimate: n/a[A]; 
Current life-cycle cost estimate: $82.3[A]; 
Change in cost: n/a; 
Percentage Change: n/a. 

Agency: Agriculture; 
Program: Web-Based Supply Chain Management; 
Original life-cycle cost estimate: $142.9; 
Current life-cycle cost estimate: $378.4; 
Change in cost: $235.5; 
Percentage Change: 165%. 

Agency: Commerce; 
Program: Comprehensive Large Array-data Stewardship System; 
Original life-cycle cost estimate: $195.5; 
Current life-cycle cost estimate: $240.0; 
Change in cost: $44.5; 
Percentage Change: 23%. 

Agency: Commerce; 
Program: Patents End-to-End: Software Engineering; 
Original life-cycle cost estimate: $130.2; 
Current life-cycle cost estimate: $188.2; 
Change in cost: $58.0; 
Percentage Change: 45%. 

Agency: Defense; 
Program: Consolidated Afloat Networks and Enterprise Services; 
Original life-cycle cost estimate: $12,740.9; 
Current life-cycle cost estimate: $12,740.9; 
Change in cost: $0; 
Percentage Change: 0%. 

Agency: Defense; 
Program: Tactical Mission Command; 
Original life-cycle cost estimate: $1,968.9; 
Current life-cycle cost estimate: $2,691.5; 
Change in cost: $722.6; 
Percentage Change: 37%. 

Agency: Environmental Protection Agency; 
Program: Financial System Modernization Project; 
Original life-cycle cost estimate: $163.2; 
Current life-cycle cost estimate: $169.3; 
Change in cost: $6.1; 
Percentage Change: 4%. 

Agency: Environmental Protection Agency; 
Program: Superfund Enterprise Management System; 
Original life-cycle cost estimate: $39.3; 
Current life-cycle cost estimate: $62.1; 
Change in cost: $22.8; 
Percentage Change: 58%. 

Agency: Homeland Security; 
Program: Integrated Public Alert and Warning System; 
Original life-cycle cost estimate: $259.0; 
Current life-cycle cost estimate: $311.4; 
Change in cost: $52.4; 
Percentage Change: 20%. 

Agency: Homeland Security; 
Program: Rescue 21; 
Original life-cycle cost estimate: $250.0[B]; 
Current life-cycle cost estimate: $2,662.0; 
Change in cost: $2,412.0; 
Percentage Change: 965%. 

Agency: Justice; 
Program: Next Generation Combined DNA Index System; 
Original life-cycle cost estimate: $128.4; 
Current life-cycle cost estimate: $137.0; 
Change in cost: $8.6; 
Percentage Change: 7%. 

Agency: Justice; 
Program: Unified Financial Management System; 
Original life-cycle cost estimate: $357.2; 
Current life-cycle cost estimate: $851.1; 
Change in cost: $493.9; 
Percentage Change: 138%. 

Agency: Labor; 
Program: OSHA[C] Information System; 
Original life-cycle cost estimate: $72.3; 
Current life-cycle cost estimate: $91.3; 
Change in cost: $19.0; 
Percentage Change: 26%. 

Agency: Labor; 
Program: PBGC[D] Benefit Administration; 
Original life-cycle cost estimate: $186.9; 
Current life-cycle cost estimate: $155.9; 
Change in cost: ($31.0); 
Percentage Change: (17)%. 

Agency: Veterans Affairs; 
Program: Health Data Repository; 
Original life-cycle cost estimate: $126.7; 
Current life-cycle cost estimate: $491.5; 
Change in cost: $364.8; 
Percentage Change: 288%. 

Agency: Veterans Affairs;
Program: Veterans Benefits Management System; 
Original life-cycle cost estimate: $560.0[E]; 
Current life-cycle cost estimate: $934.8; 
Change in cost: $374.8; 
Percentage Change: 67%. 

Agency: Total; 
Original life-cycle cost estimate: $17,321.4; 
Current life-cycle cost estimate: $22,105.4; 
Change in cost: $4,784.0. 

Source: GAO analysis of program data. 

[A] The Public Health Information System was originally part of the 
Public Health Information Consolidation Projects investment and, 
therefore, did not have a life-cycle cost estimate at the time of 
origination. The program's current life-cycle cost estimate has been 
excluded from the total. 

[B] The Rescue 21 program's original cost estimate, developed in 1999, 
only included system acquisition costs and did not include costs for 
operating and maintaining the system. These costs were subsequently 
included in the program's 2005 revisions to the cost estimate. 

[C] Occupational Safety and Health Administration. 

[D] Pension Benefit Guaranty Corporation. PBGC is a wholly owned 
government corporation administered by a presidentially appointed, 
Senate-confirmed Director and overseen by a Board of Directors 
consisting of the Secretaries of Labor, the Treasury, and Commerce. 
Although not a component of the Department of Labor, for 
administrative purposes, PBGC is included within the department's 
budget submission documentation. Therefore, PBGC's IT investments 
(including Benefit Administration) were included among the Department 
of Labor's IT investments in the Office of Management and Budget 
Fiscal Year 2010 Exhibit 53, which provided the basis for our 
selection of the 16 case study programs. 

[E] The Veterans Benefits Management System program's original cost 
estimate, developed in 2008, only included system development costs 
and did not include costs for operating and maintaining the system. 
These costs were included in subsequent revisions to the cost estimate. 

[End of table] 
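
The change and percentage change columns in table 22 follow directly 
from the original and current estimates shown above. The short Python 
sketch below is illustrative only: it recomputes that arithmetic for a 
few of the programs in the table, using the table's figures (in 
millions of dollars) and rounding percentages to whole numbers, as the 
table does. 

# Illustrative recomputation of the change and percentage change
# columns in table 22; figures are in millions of dollars.
programs = {
    # program: (original estimate, current estimate)
    "Web-Based Supply Chain Management": (142.9, 378.4),
    "Rescue 21": (250.0, 2662.0),
    "Veterans Benefits Management System": (560.0, 934.8),
}

for name, (original, current) in programs.items():
    change = current - original
    percentage_change = change / original * 100
    print(f"{name}: change of ${change:,.1f} million ({percentage_change:.0f}%)")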

[End of section] 

Appendix IV: Comments from the Department of Agriculture: 

USDA: 
United States Department of Agriculture: 
Office of the Chief Information Officer: 
1400 Independence Avenue S.W. 
Washington, DC 20250: 

June 20, 2012: 

Valerie C. Melvin: 
Director: 
Information Technology Team: 
U.S. Government Accountability Office: 
441 G Street, N.W. 
Washington, DC 20548: 

Dear Ms. Melvin: 

The U.S. Department of Agriculture has reviewed the GAO draft report 
(IT Cost Estimation), GAO-12-629 (Job Code: 311245), July 2012. 

Thank you for the opportunity to respond to the GAO draft report. We 
concur with the content of the report and have no comments. 

For additional information, please contact Denice Lotson, Office of 
the Chief Information Officer Audit Liaison, at 202-720-9384. 

Sincerely, 

Signed by: 

Cheryl L. Cook: 
Acting Chief Information Officer: 

[End of section] 

Appendix V: Comments from the Department of Commerce: 

United States Department of Commerce: 
The Secretary of Commerce: 
Washington, D.C. 20230: 

June 22, 2012: 

Ms. Valerie C. Melvin: 
Director, Information Management and Technology Resources Issues: 
U.S. Government Accountability Office: 
441 G Street NW: 
Washington, DC 20548:  

Dear Ms. Melvin:  

Thank you for the opportunity to comment on the draft report from the 
U.S. Government Accountability Office (GAO) entitled Information 
Technology Cost Estimation: Agencies Need To Address Significant 
Weaknesses in Policies and Practices (GAO-12-629). 

We recognize several of the cost estimating shortfalls in the 
Department of Commerce (DOC). Since a Secretarial-directed Acquisition 
Improvement Study in 2010 to investigate issues contributing to 
problems in several high profile acquisitions, the Department has made 
great strides over the course of the last year to implement corrective 
actions to address several issues. The results of the study identified 
the need for a more comprehensive and corporate approach for 
overseeing and managing acquisitions, particularly with regard to 
requirements development, cost estimating, and the acquisition project 
management processes.  

A follow-on Department-wide Acquisition Improvement Project (AIP) was 
aimed at creating the approaches and infrastructure to sustain a 
healthy acquisition system, with special emphasis on high profile 
projects that merit Department-level oversight. The AIP developed the 
Scalable Acquisition Project Management Framework (the Framework) for 
use in managing future acquisition projects (Information Technology 
(IT) and non-IT). An interim acquisition framework policy has been 
drafted and should be issued shortly, after finalizing the last set of 
Bureau inputs. The DOC Office of Inspector General (OIG) has likewise 
cited the need for the Department to develop a high-profile systems 
acquisition policy and has recognized this in its OIG FY13 Top 
Management Challenges.  

We fully concur with the report's findings and recommendations to 
modify Departmental policies governing cost estimating and have been 
actively engaged in many of the actions.  

Specifically, we are addressing each of the seven policy area 
weaknesses (for both IT and non-IT programs) as follows.  

* Clear Requirement for Cost Estimating: As part of the DOC newly 
developed acquisition framework policy and guidance, there are clear 
processes and documentation required for project cost estimates and 
Independent Cost Estimates (ICE) at the various Framework milestones. 

* Compliance with Cost-Estimating Best Practices: The final 
acquisition framework policy anticipated by the end of 2012 will be 
accompanied by specific cost estimating guidance referencing GAO Cost 
Estimating and Assessment Guide: Best Practices for Developing and 
Managing Capital Program Costs, GAO-09-3SP. We will ensure that high-
profile program and project estimates (those overseen by DOC 
headquarters) are comprehensive, accurate, credible, and well 
documented. 

* Management Review and Approval: The proposed DOC milestone review 
process includes an Integrated Product Team (IPT) approach to review 
the corresponding milestone materials in advance of and to inform each 
milestone decision meeting. The IPT will ensure rigorous compliance 
with the framework, review materials for adequacy, and raise 
any unreconciled issues at each milestone decision meeting for final 
arbitration and resolution. The review of each program's cost estimate 
will include content, cost, schedule, and risk assessments. 

* Training Requirements: We are developing an on-line cost estimating 
course, to be fully completed in September 2012, that addresses 
specific DOC cost estimating requirements but is applicable across 
multiple government agencies. The Department has been working on 
identifying its full community of practice of Program/Project Managers 
(P/PM), to include non-IT and subordinate P/PMs in high-profile 
programs. Once the community is identified, mandatory training 
requirements for cost estimating will include this course. 

* Central, Independent Cost Estimating Team: The Office of Acquisition 
Management (OAM) which has been leading the effort in developing the 
acquisition framework has also brought in senior federal staff experts 
to assist programs and bureaus to succeed in program and project 
management, systems engineering, and cost estimating. Recent examples 
of projects benefiting from OAM's involvement and leadership are the 
National Telecommunications and Information Administration's Federal 
Spectrum Management System as well as the National Oceanic and
Atmospheric Administration's (NOAA) Geostationary Operational 
Environmental Satellite-R Series (GOES-R) program and their Fisheries 
Survey Vessel 6 (FSV6). 

* Standard Structure for Defining Work Products: In developing its 
acquisition framework, the Department has leveraged policies 
throughout the federal government and the private sector to construct 
an acquisition process tailored to the DOC requirements, capabilities, 
and program variety. The same premise is also being applied to the 
standard templates and the work breakdown structures to be used. Our 
intent is to leverage Military Standard 881, National Aeronautics and 
Space Administration, and commodity-specific products as we find them, 
recognizing their need to be tailored as circumstances warrant. 

* Process to Collect and Store Cost-Related Data: As programs progress 
through the acquisition process, data will be collected on a variety 
of program types. This has already begun. For example, historical 
flight and ground segment costs have been collected and cataloged for 
the GOES-R program and will similarly be completed for the Joint Polar 
Satellite System. In addition, analogous data are being collected and 
cataloged from outside the Department for comparison across the cost 
community. For example, cost estimates for NOAA's FSV6 are being 
compared with historical, analogous DoD and Coast Guard development 
costs. Also we are collecting case study projects to populate and 
refine the cost estimating course. We also plan to use cost data 
identified by the US Patent and Trademark Office as they are made 
available to us. 

With regard to the specific DOC IT investment programs evaluated by GAO,
Comprehensive Large Array-data Stewardship System (CLASS) and Patents 
End-to-End (PE2E), we concur with the report's findings and will 
ensure that the future cost estimates for these programs follow GAO's 
cost estimating best practices to correct the weaknesses identified.
Our forthcoming acquisition framework policy and guidance will also 
include the appropriate elements to ensure that cost estimates for 
high-profile programs are comprehensive, accurate, credible, and well-
documented. 

Please contact Barry Berkowitz at 202-482-4248, if you have questions 
regarding this response. 

Sincerely, 

Signed by: 

Rebecca M. Blank: 
Acting Secretary of Commerce: 

[End of section] 

Appendix VI: Comments from the Department of Defense: 

Office of The Secretary Of Defense: 
Cost Assessment and Program Evaluation: 
1800 Defense Pentagon: 
Washington, DC. 20301-1800: 

June 26, 2012: 

Ms. Valerie Melvin: 
Director, Information Management and Technology Resources Issues: 
U.S. Government Accountability Office: 
441 G Street, NW: 
Washington, DC 20548: 

Dear Ms. Melvin, 

This is the Department of Defense (DoD) response to the Government 
Accountability Office (GAO) draft report GAO-12-629, "Information 
Technology Cost Estimation: Agencies Need to Address Significant 
Weaknesses in Policies and Practices," dated May 22, 2012 (GAO Code 
311245). The Department partially concurs with the recommendation 
addressed to DoD. The Department's full response to the report's 
recommendation is attached. 

The Department appreciates the opportunity to respond to your draft 
report. Should you have any questions please contact my primary action 
officer, Mr. William E. Raines, at 571-256-1426 or 
william.raines@osd.mil. 

Sincerely, 

Signed by: 

Christine H. Fox: 
Director: 

Attachment(s): 

1. DoD Comments to GAO Recommendations: 

[End of letter] 

GAO Draft Report Dated May 22, 2012: 
GAO-12-629 (GAO Code 311245): 

"Information Technology Cost Estimation: Agencies Need To Address 
Significant Weaknesses In Policies And Practices" 

Department Of Defense Comments To The GAO Recommendations: 

Recommendation 1: The GAO recommends that the Secretary of Defense 
direct responsible officials to update future life-cycle cost 
estimates of the Tactical Mission Command program using cost-
estimating practices that address the weaknesses that we identified. 

DoD Response: DoD partially concurs with GAO's recommendation. DoD 
agrees with the criteria, methodology, and assessment of the DoD 
programs. However, the Tactical Mission Command program made a 
fielding decision in January 2009 and is currently performing system 
deployment, and there is currently no plan to formally update the life-
cycle cost estimate for this program based on where the Tactical 
Mission Command program is in the acquisition lifecycle. DoD does, 
however, recognize the need to use the cost-estimating policies and 
practices identified in this report across all of our Major Automated
Information Systems. Through the Weapons Systems Acquisition Reform 
Act of 2009, the Director of Cost Assessment and Program Evaluation 
has a responsibility to review the cost estimates for all Major 
Automated Information Systems. As a part of that estimate review 
process the Director of Cost Assessment and Program Evaluation 
assesses the use of standard cost-estimating practices and adherence 
to cost-estimating policies in the preparation of these cost estimates. 

[End of section] 

Appendix VII: Comments from the Environmental Protection Agency: 

United States Environmental Protection Agency: 
Washington, D.C. 20460: 

June 29, 2012: 

Valerie C. Melvin, Director: 
Information Technology, Human Capital and Management Issues: 
U.S. Government Accountability Office: 
441 G Street, N.W. 
Washington, D.C. 20548: 

Dear Ms. Melvin: 

Thank you for the opportunity to comment on the draft report entitled 
"Information Technology Cost Estimation: Agencies Need to Address 
Significant Weaknesses in Policies and Practices (GAO-12-629)." Sound 
fiscal management practices should be followed in all aspects of the
Agency's information technology operations, including cost estimating 
for the development of new systems. 

EPA recognizes GAO's comment that "agency policies did not require 
cost-estimating best practices." We believe that the GAO Cost 
Estimating and Assessment Guide: Best Practices for Developing and 
Managing Capital 
Program Costs, GAO-09-3SP (Washington, DC: March 2009) is a valuable 
resource. In recognition of GAO's comment, EPA will update our Systems 
Life Cycle Management (SLCM) procedures, as suggested. 

With regard to GAO's assessment of the Financial System Modernization 
Project (FSMP), EPA does not have specific comments. 

The remainder of this letter includes our general comments on the 
approach and findings of this assessment as it relates to the 
Superfund Enterprise Management System (SEMS). Detailed responses to 
the findings are included in the enclosure. 

General Comments regarding SEMS: 

We do not believe that the GAO assessment accurately reflects the cost 
estimating practices employed for the development of SEMS. The Office 
of Solid Waste and Emergency Response (OSWER) believes that the SEMS 
project has met the spirit and intent of the cost estimating 
guidelines outlined in GAO Cost Estimating Guide: Best Practices for 
Developing and Managing Capital Program Costs, GAO-09-3SP (Washington, 
DC: March 2009). However, EPA may have used different processes or 
documentation in order to do so. 

Although we believe that the 2009 GAO cost estimation guide is a 
valuable resource, given the variation, uncertainty, and widely 
different contexts that cost estimation occurs, we feel it can only 
serve as a guide. First, the GAO guide was not published until three 
years after the SEMS development commenced. The Agency maintains a 
rigorous approach to system development and system operations. SEMS is 
rich in documentation, as demonstrated by more than 200 documents that 
were shared with the GAO review team. The production has followed EPA 
system life cycle management policy, and has withstood intense 
internal and external review.  

The draft GAO report erroneously concludes that the SEMS cost estimate 
increased from $39.3 million to $62.0 million in just two years. As 
has been previously explained to GAO, this revised cost estimate was a 
direct result of a change in the duration of O&M included in each 
calculation. The $39.3 million figure represented an estimate until 
the end of FY 2013, while the $62.0 million figure represents an 
estimate through FY 2017 and was accepted as part of formal 
rebaselining under the Capital Planning and Investment Control (CPIC) 
process governed by the Office of Management and Budget (OMB).  

Conclusion:  

As the enclosed detailed response demonstrates, the SEMS project has 
been subject to a multitude of independent reviews, internal senior 
management reviews, cost benefit analyses, sensitivity and risk 
analyses, and CPIC approvals by OMB. These processes and documents 
demonstrate a clear commitment to effective planning and management of 
SEMS cost estimation that is not recognized by the GAO assessment. We 
suggest that the assessment team evaluate these actions against the 
spirit and intent of the cost estimating guidelines.  

While we do not agree with the findings of this assessment relative to 
SEMS, we do feel that the GAO cost estimation guide is a valuable tool 
for future IT cost estimation efforts. We will also continue to comply 
with applicable Agency policy and procedures when developing cost 
estimates for other systems, and when updating estimates with respect 
to SEMS.  

Please feel free to contact Vaughn Noga, Director of the Office of 
Technology Operations and Planning in the Office of Environmental 
Information on (202) 566-0300 or Robin Richardson, Director of the 
Resources Management Division in the Office of Superfund Remediation 
and Technology Innovation, at (703) 603-9048, if you would like to 
discuss these points further. 

Sincerely, 

Signed by: 

Mathy Stanislaus 
Assistant Administrator: 
Office of Solid Waste and Emergency Response:  

Signed by: 

Malcolm D. Jackson: 
Assistant Administrator and  Chief Information Officer: 
Office of Environmental Information:  

Enclosure: 

[End of section] 

Appendix VIII: Comments from the Department of Homeland Security: 

U.S. Department of Homeland Security: 
Washington, DC 20528: 

June 22, 2012: 

Valerie C. Melvin: 
Director, Information Management and Technology Resources Issues: 
U.S. Government Accountability Office: 
441 G Street, NW: 
Washington, DC 20548: 

Re: Draft Report GAO-12-629, "Information Technology Cost Estimation:
Agencies Need To Address Significant Weaknesses in Policies and 
Practices" 

Dear Ms. Melvin: 

Thank you for the opportunity to review and comment on this draft 
report. The U.S. Department of Homeland Security (DHS) appreciates the 
U.S. Government Accountability Office's (GAO) work in conducting its 
review and issuing this report. 

The Department is pleased to note GAO's positive acknowledgment of 
DHS' continued progress in establishing and implementing effective 
cost estimating processes. In particular, we appreciate GAO's 
recognition that DHS has "fully addressed" several of the key 
components of effective cost-estimating policy, such as adherence to 
the use of best practices. This includes relying extensively on the 
GAO Cost Estimating and Assessment Guide, and working through
Acquisition and Program Management Centers of Excellence to provide 
tailored training workshops for Component personnel. 

The draft report contained two recommendations directed at DHS, with 
which the Department concurs. Specifically, GAO recommended that the 
Secretary of Homeland Security direct responsible officials to: 

Recommendation 1: Modify policies governing cost estimating to ensure 
that they address the weaknesses that we identified. 

Response: Concur. The Office of Program Accountability and Risk 
Management (PARM) is currently developing a process to transform the 
Department's Acquisition Management Directive (D) 102-01 and 
accompanying instructions into a more usable and flexible structure. 
The transformation will include development of a revised cost 
estimating policy that will further incorporate GAO best practices, as 
appropriate. 

Recommendation 2: Update future life-cycle cost estimates of the 
system acquisition programs discussed in this report using cost-
estimating practices that address the detailed weaknesses that we 
identified. 

Response: Concur. PARM is developing a scorecard for assessing 
programs' Life Cycle Cost Estimates based on GAO best practices. PARM 
is also working through the Acquisition and Program Management Centers 
of Excellence to provide training workshops on cost estimating to 
address the weaknesses identified by GAO and any gaps identified by 
the scorecard process. 

The workshops are available to personnel in all major programs 
throughout the Department. 

Again, thank you for the opportunity to review and comment on this 
draft report. Please feel free to contact me if you have any 
questions. We look forward to working with you in the future. 

Sincerely, 

Signed by: 

Jim H. Crumpacker: 
Director: 
Departmental GAO-OIG Liaison Office: 

[End of section] 

Appendix IX: Comments from the Department of Labor: 

U.S. Department of Labor: 
Office of the Assistant Secretary for Administration and Management: 
Washington, D.C. 20210: 

June 22, 2012: 

Ms. Valerie C. Melvin: 
Director: 
Information Management and Technology Resources Issues: 
Government Accountability Office: 
441 G Street, NW: 
Washington, D.C. 20548: 

Dear Ms. Melvin: 

Thank you for the opportunity to review and comment on the Draft 
Government Accountability Office (GAO) Report #GAO-12-629, Information 
Technology Cost Estimation: Agencies Need to Address Significant 
Weaknesses in Policies and Practices. 

The Department of Labor (DOL) appreciates the recommendations provided 
by GAO regarding cost estimating within the Department. As DOL has 
previously stated, we have a relatively small IT portfolio which does 
not justify the cost of establishing a central, independent office 
dedicated to cost estimating. Through our existing policy, as 
reflected in our Baseline Management Guide (BMG), the IT Cost 
Estimation Guide (an Appendix within the BMG), and our Post 
Implementation Review (PIR) process--and reinforced through training 
to agency IT managers--we continue to document and improve our IT cost 
estimation. 

The DOL BMG further describes standard, detailed requirements for 
creating and modifying a Project Plan with the appropriate level and 
type of work products, and directs the collection of lessons learned 
related to cost estimates in the Integrated Baseline Review and then 
again in the PIR stages of an IT investment's lifecycle. 

To ensure agency IT acquisitions are managed as strategic business 
resources, the Department has created an IT Acquisition Review Board 
(ITARB). The ITARB is accountable for the approval of funds for all IT 
acquisitions, including infrastructure, products, commodities, and 
services to ensure alignment with the Department's IT Modernization 
and strategic sourcing initiatives. This effort will assist the 
agencies in the long term by working towards a more streamlined and 
consolidated approach to IT goods and services acquisitions, thus 
saving time and money agencies could use to meet their program goals. 

It is also important to note that the Office of the Chief Information 
Officer (OCIO) participates in the budget formulation process at the 
Departmental level for the annual budget submission to the
Office of Management and Budget. 

Additionally, the Occupational Safety and Health Administration (OSHA) 
offers the following comments: 

OSHA believes that GAO's "Not Met" Assessment of the "Credible" 
characteristic for the OIS program is too low and should be assessed 
as at least "Partially Met." The focus of GAO's assessment is on the 
2010 cost estimate to the exclusion of all other evidence provided for 
the program. The complete set of estimating tools for the 2010 
estimate was not available so, at GAO's request, OSHA provided 
additional supporting evidence for OSHA cost estimating practices, 
particularly the 2008 documentation. However, GAO's assessment does not 
give consideration to this additional documentation. 

OSHA follows DOL's standard cost estimating policies and practices 
using standardized cost estimating tools with sensitivity analysis, 
risk adjustments, alternatives analysis, and financial analysis built 
into the templates as demonstrated in the 2008 documentation. 
Furthermore, an independent cost estimate was conducted at the outset 
of the program by an industry-leading IT consulting firm as 
recommended by the DOL OIG using the most widely respected and time-
tested software development estimating methodology.  

Sincerely, 

Signed by: 

T. Michael Kerr: 
Assistant Secretary for Administration and Management: 

[End of section] 

Appendix X: Comments from the Pension Benefit Guaranty Corporation: 

Pension Benefit Guaranty Corporation: 
Office of the Director: 
1200 K Street, NW: 
Washington, DC 20005-4026: 

June 18, 2012: 

Valerie C. Melvin, Director: 
Information Management and Technology Resource Issues: 
U.S. Government Accountability Office: 
Washington, D.C. 20548: 

Re: GAO Study of Cost Estimating Practices for Selected Information
Technology Investments (Job Code: 3112451): 

Dear Ms. Melvin: 

Thank you for the opportunity to comment on your draft report. 

We are pleased that GAO concluded that PBGC met a large portion, or at 
least half, of GAO's quality indicators in each of its four categories 
of cost estimating: Comprehensive, Well-documented, Accurate, and 
Credible. 

This is even more significant considering that PBGC is much smaller 
than the other agencies referenced in the report, and does not have 
the same level of resources to dedicate to cost estimating 
infrastructure. PBGC takes estimating system costs seriously, and 
places a great deal of emphasis on system planning, implementation, 
and oversight to help ensure that we receive an appropriate return on 
our system investments. 

As well as PBGC performs, we understand that there is always room for 
improvement. Consistent with your recommendation, PBGC will evaluate 
and improve future life-cycle cost estimates for the Benefit 
Administration (BA) investment in ways that appropriately consider 
GAO's framework, while balancing the resources available in a small 
agency. 

The extent of the improvements will also reflect the context of the BA 
investment moving to a predominately Operations & Maintenance (O&M) 
stage. Along the way, Development, Modernization & Enhancements (DM&E) 
releases will address critical audit findings, prudently keep pace 
with technology, support legislative changes relating to the Pension
Protection Act of 2006, and accommodate unforeseen complexities in 
pension plans that will be trusteed in the future. Additionally, the 
findings outlined in this report will be considered as we continue to 
refine our existing Total Cost of Ownership (TCO) Guidance document and 
as we apply the TCO concept to other IT Projects in addition to the BA 
investment. Our updated TCO guidance document and related TCO 
training will be available in September 2013, and an updated plan 
on "Improving BA Investments" is slated for completion in September 
2014. As Table 22 of the draft report indicates, we plan on reducing 
total life-cycle costs of the BA program over the next few years. 

Please contact Martin O. Boehm at 202-326-4161, ext. 3901, should you 
have any questions. 

Sincerely, 

Signed by: 

Josh Gotbaum: 

cc: 

Barbara Bovbjerg, GAO, Director EWIS; 
Patricia Kalb, CFO: 
Richard Macy, CIO: 
Vince Snowbarger, DDO: 
Judith Starr, General Counsel: 
Martin O. Boehm, CCRD: 

[End of section] 

Appendix XI: Comments from the Department of Veterans Affairs: 

Department of Veterans Affairs: 
Washington DC 20420: 

June 26, 2012: 

Ms. Valerie C. Melvin: 
Director, Information Management and Technology Resource Issues: 
U.S. Government Accountability Office: 
441 G Street, NW: 
Washington, DC 20548: 

Dear Ms. Melvin: 

The Department of Veterans Affairs (VA) has reviewed the Government
Accountability Office's (GAO) draft report, "Information Technology Cost
Estimation: Agencies Need To Address Significant Weaknesses in 
Policies and Practices" (GAO-12-629) and generally agrees with GAO's 
conclusions. 

The enclosure specifically addresses GAO's two recommendations, 
provides an action plan, and includes a technical comment. VA 
appreciates the opportunity to comment on your draft report. 

Sincerely, 

Signed by: 

John R. Gingrich: 
Chief of Staff: 

Enclosure: 

[End of letter] 

Enclosure: 

Department of Veterans Affairs (VA) Comments to Government 
Accountability Office (GAO) Draft Report, "Information Technology Cost 
Estimation: Agencies Need To Address Significant Weaknesses in 
Policies and Practices" (GAO-12-629): 

GAO Recommendation 1: We recommend that the Secretary of Veterans 
Affairs direct responsible officials to modify policies governing cost 
estimating to ensure that they address the weaknesses that we 
identified. 

VA Comment: VA concurs with the recommendation for clarifying VA 
policy for cost estimating at an appropriate level. As a long-term 
solution to support strategic cost estimation, the Office of 
Information and Technology (OI&T) is evaluating the utility of 
establishing an organizational function focused solely on multiyear 
estimation. This function is presently distributed to multiple sub-
units within OI&T based on responsibility for different parts of the 
Software Development Lifecycle. The organization element would be 
charged with supporting the articulation of multiyear programs via the 
delivery of cost and performance metrics gathered during increment and 
project delivery. This decentralization took place because VA, as part 
of our transformational efforts in IT, focused intensely on 
incremental delivery to force an adoption of an agile methodology for 
project success across the enterprise. The current environment focuses 
on incremental delivery of business requirements first and then 
management of programs (page 15 of this GAO report); consolidating 
this function could be a step toward maturing a capability to estimate 
both increment-based and lifecycle costs. VA seeks to complete this 
evaluation by the end of the first quarter, fiscal year 2013. 

While this evaluation is underway, VA is improving cost-estimating 
practices for IT efforts at the increment level in support of program 
management. Efforts presently underway include implementing 
obligation, cost, and staff time capture at the increment level as 
well as developing an increment-type taxonomy. With availability and 
utilization of this increment-level information, Program Managers and 
Platform/Product Managers will be better equipped to assist in forward 
projection of costs. 

VA is completing our transition from a Waterfall-only approach to 
project delivery to an approach based on an incremental delivery of 
customer facing functionality utilizing Agile principles. In a 
Waterfall approach, the development of component pieces of a larger 
software package is carried out in sequential phases with no 
consistent, fixed duration to complete full development. With an 
incremental approach, initial planning regarding cost, scope, and 
timing is conducted at a high level. Following initial high-level 
planning, specific plans are developed for each iteration of the 
project. High-level plans are then modified as necessary. This process 
results in better estimates of both increment-based and lifecycle cost 
estimation than is typically the case with a Waterfall approach. But 
more importantly, an incremental approach has a significantly higher 
success rate in the delivery of customer facing functionality to 
enable mission accomplishment. 

GAO Recommendation 2: We recommend that the Secretary of Veterans 
Affairs direct responsible officials to update future life-cycle cost 
estimates of the system acquisition programs discussed in this report 
using cost-estimating practices that address the detailed weaknesses 
that we identified. 

VA Comment: VA concurs with the recommendation. VA is improving cost-
estimating practices for IT efforts at the increment level in support 
of program management. Initial project manager training was conducted 
within the last ninety days, and additional training is scheduled 
early in fiscal year 2013. Other efforts presently underway include
implementing obligation, cost, and staff time capture at the increment 
and project levels. The results will be utilized to improve future 
life-cycle estimates. 

[End of section] 

Appendix XII: GAO Contact and Staff Acknowledgments: 

GAO Contact: 

Valerie C. Melvin, (202) 512-6304 or melvinv@gao.gov: 

Staff Acknowledgments: 

In addition to the contact name above, individuals making 
contributions to this report included Eric Winter (Assistant 
Director), Mathew Bader, Carol Cha, Jennifer Echard, J. Christopher 
Martin, Lee McCracken, Constantine Papanastasiou, Karen Richey, 
Matthew Snyder, and Jonathan Ticehurst. 

[End of section] 

Footnotes: 

[1] Office of Management and Budget, Report on IT Spending for the 
Federal Government, February 2012. 

[2] The eight agencies were the Departments of Agriculture, Commerce, 
Defense, Homeland Security, Justice, Labor, and Veterans Affairs, and 
the Environmental Protection Agency. We did not review the cost-
estimating policies at these agencies' components or smaller agencies. 

[3] We relied on the Office of Management and Budget Fiscal Year 2010 
Exhibit 53, which contained data on the planned IT spending at 28 
agencies, to select the 8 agencies for review. At the time the 8 
agencies were selected, the Office of Management and Budget Fiscal 
Year 2010 Exhibit 53 was the most current source with complete data on 
agencies' planned IT spending. 

[4] Only one agency, the Department of Defense, had greater than $10 
billion in IT spending in fiscal year 2010. 

[5] GAO, GAO Cost Estimating and Assessment Guide: Best Practices for 
Developing and Managing Capital Program Costs, [hyperlink, 
http://www.gao.gov/products/GAO-09-3SP] (Washington, D.C.: March 2009). 

[6] One investment selected from the Department of Labor is the 
responsibility of the Pension Benefit Guaranty Corporation (PBGC). 
PBGC is a wholly owned government corporation administered by a 
presidentially appointed, Senate-confirmed Director and overseen by a 
Board of Directors consisting of the Secretaries of Labor, the 
Treasury, and Commerce. Although not a component of the Department of 
Labor, for administrative purposes, PBGC is included within the 
department's budget submission documentation. Therefore, PBGC's IT 
investments were included among the Department of Labor's IT 
investments in the Office of Management and Budget Fiscal Year 2010 
Exhibit 53, which provided the basis for our selection of the 16 case 
study programs. 

[7] The Office of Management and Budget defines a major IT investment 
as a system or an acquisition requiring special management attention 
because it has significant importance to the mission or function of 
the agency, a component of the agency, or another organization; is for 
financial management and obligates more than $500,000 annually; has 
significant program or policy implications; has high executive 
visibility; has high development, operating, or maintenance costs; is 
funded through other than direct appropriations; or is defined as 
major by the agency's capital planning and investment control process. 

[8] Steady state refers to operating and maintaining systems at 
current levels (i.e., without major enhancements). 

[9] [hyperlink, http://www.gao.gov/products/GAO-09-3SP]. 

[10] See for example, GAO, Defense Infrastructure: Navy Can Improve 
the Quality of Its Cost Estimate to Homeport an Aircraft Carrier at 
Naval Station Mayport, [hyperlink, 
http://www.gao.gov/products/GAO-11-309] (Washington, D.C.: Mar. 3, 
2011); Secure Border Initiative: DHS Needs to Reconsider Its Proposed 
Investment in Key Technology Program, [hyperlink, 
http://www.gao.gov/products/GAO-10-340] (Washington, D.C.: May 5, 
2010); and DOD Business Systems Modernization: Key Marine Corps System 
Acquisition Needs to Be Better Justified, Defined, and Managed, 
[hyperlink, http://www.gao.gov/products/GAO-08-822] (Washington, D.C.: 
July 28, 2008). 

[11] See for example, GAO, IRS Management: Cost Estimate for New 
Information Reporting System Needs to be Made More Reliable, 
[hyperlink, http://www.gao.gov/products/GAO-12-59] (Washington, D.C.: 
Jan. 31, 2012); Information Technology: Better Informed Decision 
Making Needed on Navy's Next Generation Enterprise Network 
Acquisition, [hyperlink, http://www.gao.gov/products/GAO-11-150] 
(Washington, D.C.: Mar. 11, 2011); Department of Energy: Actions 
Needed to Develop High-Quality Cost Estimates for Construction and 
Environmental Cleanup Projects, [hyperlink, 
http://www.gao.gov/products/GAO-10-199] (Washington, D.C.: Jan. 14, 
2010); VA Construction: VA Is Working to Improve Initial Project Cost 
Estimates, but Should Analyze Cost and Schedule Risks, [hyperlink, 
http://www.gao.gov/products/GAO-10-189] (Washington, D.C.: Dec. 14, 
2009); DOD Business Systems Modernization: Planned Investment in Navy 
Program to Create Cashless Shipboard Environment Needs to Be Justified 
and Better Managed, [hyperlink, 
http://www.gao.gov/products/GAO-08-922] (Washington, D.C.: Sept. 8, 
2008); and 2010 Census: Census Bureau Should Take Action to Improve 
the Credibility and Accuracy of Its Cost Estimate for the Decennial 
Census, [hyperlink, http://www.gao.gov/products/GAO-08-554] 
(Washington, D.C.: June 16, 2008). 

[12] OMB, Circular No. A-11, Preparation, Submission, and Execution of 
the Budget (Washington, D.C.: Executive Office of the President, June 
2006) and Capital Programming Guide: Supplement to Circular A-11, Part 
7, Planning, Budgeting, and Acquisition of Capital Assets (Washington, 
D.C.: Executive Office of the President, June 2006). OMB first issued 
the Capital Programming Guide as a supplement to the 1997 version of 
Circular A-11, Part 3. We refer to the 2006 version. OMB later updated 
this guide again in August 2011. 

[13] [hyperlink, http://www.gao.gov/products/GAO-09-3SP]. 

[14] [hyperlink, http://www.gao.gov/products/GAO-12-59]. 

[15] [hyperlink, http://www.gao.gov/products/GAO-10-199]. 

[16] [hyperlink, http://www.gao.gov/products/GAO-10-189]. 

[17] OMB, Capital Programming Guide, v.3.0, Supplement to OMB Circular 
A-11: Planning, Budgeting, and Acquisition of Capital Assets 
(Executive Office of the President, Washington, D.C.: August 2011). 

[18] [hyperlink, http://www.gao.gov/products/GAO-09-3SP]. 

[19] [hyperlink, http://www.gao.gov/products/GAO-09-3SP]. 

[20] An integrated baseline review is an evaluation of a program's 
baseline plan to determine whether all program requirements have been 
addressed, risks have been identified, mitigation plans are in place, 
and available and planned resources are sufficient to complete the 
work. 

[21] The Defense Acquisition Workforce Improvement Act, 10 U.S.C. §§ 
1701-1764. This act recognized acquisition as a multidisciplinary 
career field for DOD, which now identifies 16 career fields/paths, of 
which one is cost estimating and financial management. 

[22] OMB established the Federal Acquisition Certification for Program 
and Project Managers program in 2007 to support skill development of 
program and project managers. The program applies to all civilian 
agencies. 

[23] GAO has ongoing work to assess DOD's cost-estimating office 
(known as Cost Assessment and Program Evaluation), including the 
implementation of its responsibilities under the Weapon Systems 
Acquisition Reform Act of 2009 (10 U.S.C. § 2334). 

[24] Earned value management is a project management tool that 
integrates the technical scope of work with schedule and cost elements 
for investment planning and control. It compares the value of work 
accomplished in a given period with the value of the work expected in 
that period. 
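
A hypothetical illustration of that comparison, in Python (the dollar 
figures below are invented for the example and are not drawn from any 
program discussed in this report): 

# Earned value management comparison for a single reporting period.
# All figures are hypothetical and in millions of dollars.
planned_value = 10.0  # budgeted cost of the work scheduled for the period
earned_value = 8.0    # budgeted cost of the work actually accomplished
actual_cost = 9.5     # amount actually spent during the period

schedule_variance = earned_value - planned_value  # negative means behind schedule
cost_variance = earned_value - actual_cost        # negative means over cost

print(f"Schedule variance: {schedule_variance:+.1f} million")
print(f"Cost variance: {cost_variance:+.1f} million")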

[25] OMB, Circular No. A-11, Preparation, Submission, and Execution of 
the Budget and Capital Programming Guide, v.3.0, Supplement to OMB 
Circular A-11: Planning, Budgeting, and Acquisition of Capital Assets. 

[26] According to OMB's Circular A-11, the Exhibit 300 is used to, 
among other things, make decisions about budgetary resources; the 
circular further states that agencies should have the supporting evidence used 
to produce the Exhibit 300 readily available as part of project-
specific documentation. 

[27] A Monte Carlo simulation assesses the aggregate variability of 
the cost estimate to determine a confidence range around the cost 
estimate. 

[28] Because uncertainty cannot be avoided, it is necessary to conduct 
a risk and uncertainty analysis to determine the level of confidence 
associated with the cost estimate. The level of confidence is the 
probability that the cost estimate will actually be met. For example, 
if the confidence level is 80 percent, there is an 80 percent chance 
that the final cost will be at or below the cost estimate and a 20 
percent chance that costs will exceed the cost estimate. 
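
A minimal Python sketch of the Monte Carlo approach described in 
footnote 27 and the confidence level described above; the cost-element 
distributions below are assumed purely for illustration and are not 
drawn from any program in this report: 

# Monte Carlo simulation of a total cost estimate. Each trial draws the
# cost elements from assumed triangular distributions (low, high, most
# likely, in millions of dollars) and sums them; the 80th percentile of
# the simulated totals is the cost that would be met or underrun with
# 80 percent confidence.
import random

random.seed(1)
trials = 100_000
totals = []
for _ in range(trials):
    development = random.triangular(400, 700, 500)
    operations_and_maintenance = random.triangular(350, 800, 500)
    totals.append(development + operations_and_maintenance)

totals.sort()
cost_at_80_percent_confidence = totals[int(0.8 * trials)]
print(f"80 percent confidence cost: about ${cost_at_80_percent_confidence:,.0f} million")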

[29] [hyperlink, http://www.gao.gov/products/GAO-09-3SP]. The GAO cost 
guide was first released as an exposure draft in July 2007. See GAO, 
Cost Assessment Guide: Best Practices for Estimating and Managing 
Program Costs, Exposure Draft, [hyperlink, 
http://www.gao.gov/products/GAO-07-1134SP] (Washington, D.C.: July 
2007). 

[30] While the Office of Management and Budget's Fiscal Year 2011 
Exhibit 53 was available at the time we made our agency and investment 
selections, it did not contain a complete set of data. Specifically, 
the IT investment spending data for two agencies--the Department of 
Defense and the Department of Energy--were not included. Therefore, we 
relied on the Fiscal Year 2010 Exhibit 53 for our agency and 
investment selection because, at the time, it was the most current and 
complete set of data. 

[31] The 28 departments and agencies included in the Office of 
Management and Budget Fiscal Year 2010 Exhibit 53 are the departments 
of Agriculture, Commerce, Defense, Education, Energy, Health and Human 
Services, Homeland Security, Housing and Urban Development, the 
Interior, Justice, Labor, State, Transportation, the Treasury, and 
Veterans Affairs; the Environmental Protection Agency, General 
Services Administration, National Aeronautics and Space 
Administration, National Archives and Records Administration, National 
Science Foundation, Nuclear Regulatory Commission, Office of 
Management and Budget, Office of Personnel Management, Small Business 
Administration, Smithsonian Institution, Social Security 
Administration, U.S. Agency for International Development, and U.S. 
Army Corps of Engineers. 

[32] Only one agency, the Department of Defense, had greater than $10 
billion in IT spending in fiscal year 2010. 

[33] The Office of Management and Budget defines a major IT investment 
as a system or an acquisition requiring special management attention 
because it has significant importance to the mission or function of 
the agency, a component of the agency, or another organization; is for 
financial management and obligates more than $500,000 annually; has 
significant program or policy implications; has high executive 
visibility; has high development, operating, or maintenance costs; is 
funded through other than direct appropriations; or is defined as 
major by the agency's capital planning and investment control process. 

[34] One investment selected from the Department of Labor is the 
responsibility of the Pension Benefit Guaranty Corporation (PBGC). 
PBGC is a wholly owned government corporation administered by a 
presidentially appointed, Senate-confirmed Director, overseen by a 
Board of Directors consisting of the Secretaries of Labor, the 
Treasury, and Commerce. Although not a component of the Department of 
Labor, for administrative purposes, PBGC is included within the 
department's budget submission documentation. Therefore, PBGC's IT 
investments were included among the Department of Labor's IT 
investments in the Office of Management and Budget Fiscal Year 2010 
Exhibit 53, which provided the basis for our selection of the 16 case 
study programs. 

[35] Steady state refers to operating and maintaining systems at 
current levels (i.e., without major enhancements). 

[36] GAO, GAO Cost Estimating and Assessment Guide: Best Practices for 
Developing and Managing Capital Program Costs, [hyperlink, 
http://www.gao.gov/products/GAO-09-3SP] (Washington, D.C.: March 2009). 

[37] According to the Office of Management and Budget's Circular A-11, 
the Exhibit 300 is used to, among other things, make decisions about 
budgetary resources; the circular further states that agencies should have the 
supporting evidence used to produce the Exhibit 300 readily available 
as part of project-specific documentation. 

[38] Agile software development is not a set of tools or a single 
methodology, but a philosophy based on selected values, such as 
prioritizing customer satisfaction through early and continuous 
delivery of valuable software; delivering working software frequently, 
from every couple of weeks to every couple of months; and making 
working software the primary measure of progress. For more information 
on Agile software development, see [hyperlink, 
http://www.agilealliance.org]. 

[39] According to the program's fiscal year 2013 budget submission 
documentation, this investment is also referred to as the Maneuver 
Control System. 

[40] 42 U.S.C. § 9604. 

[End of section] 

GAO’s Mission: 

The Government Accountability Office, the audit, evaluation, and 
investigative arm of Congress, exists to support Congress in meeting 
its constitutional responsibilities and to help improve the 
performance and accountability of the federal government for the 
American people. GAO examines the use of public funds; evaluates 
federal programs and policies; and provides analyses, recommendations, 
and other assistance to help Congress make informed oversight, policy, 
and funding decisions. GAO’s commitment to good government is 
reflected in its core values of accountability, integrity, and 
reliability. 

Obtaining Copies of GAO Reports and Testimony: 

The fastest and easiest way to obtain copies of GAO documents at no 
cost is through GAO’s website [hyperlink, http://www.gao.gov]. Each 
weekday afternoon, GAO posts on its website newly released reports, 
testimony, and correspondence. To have GAO e-mail you a list of newly 
posted products, go to [hyperlink, http://www.gao.gov] and select 
“E-mail Updates.” 

Order by Phone: 

The price of each GAO publication reflects GAO’s actual cost of 
production and distribution and depends on the number of pages in the 
publication and whether the publication is printed in color or black 
and white. Pricing and ordering information is posted on GAO’s 
website, [hyperlink, http://www.gao.gov/ordering.htm]. 

Place orders by calling (202) 512-6000, toll free (866) 801-7077, or 
TDD (202) 512-2537. 

Orders may be paid for using American Express, Discover Card, 
MasterCard, Visa, check, or money order. Call for additional 
information. 

Connect with GAO: 

Connect with GAO on facebook, flickr, twitter, and YouTube.
Subscribe to our RSS Feeds or E-mail Updates. Listen to our Podcasts.
Visit GAO on the web at [hyperlink, http://www.gao.gov]. 

To Report Fraud, Waste, and Abuse in Federal Programs: 

Contact: 
Website: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm]; 
E-mail: fraudnet@gao.gov; 
Automated answering system: (800) 424-5454 or (202) 512-7470. 

Congressional Relations: 

Katherine Siggerud, Managing Director, siggerudk@gao.gov, (202) 512-4400
U.S. Government Accountability Office, 441 G Street NW, Room 7125
Washington, DC 20548. 

Public Affairs: 
Chuck Young, Managing Director, youngc1@gao.gov, (202) 512-4800
U.S. Government Accountability Office, 441 G Street NW, Room 7149 
Washington, DC 20548.