This is the accessible text file for GAO report number GAO-10-199 
entitled 'Department Of Energy: Actions Needed to Develop High-Quality 
Cost Estimates for Construction and Environmental Cleanup Projects' 
which was released on January 14, 2010. 

This text file was formatted by the U.S. Government Accountability 
Office (GAO) to be accessible to users with visual impairments, as 
part of a longer term project to improve GAO products' accessibility. 
Every attempt has been made to maintain the structural and data 
integrity of the original printed product. Accessibility features, 
such as text descriptions of tables, consecutively numbered footnotes 
placed at the end of the file, and the text of agency comment letters, 
are provided but may not exactly duplicate the presentation or format 
of the printed version. The portable document format (PDF) file is an 
exact electronic replica of the printed version. We welcome your 
feedback. Please E-mail your comments regarding the contents or 
accessibility features of this document to Webmaster@gao.gov. 

This is a work of the U.S. government and is not subject to copyright 
protection in the United States. It may be reproduced and distributed 
in its entirety without further permission from GAO. Because this work 
may contain copyrighted images or other material, permission from the 
copyright holder may be necessary if you wish to reproduce this 
material separately. 

Report to the Subcommittee on Energy and Water Development, Committee 
on Appropriations, House of Representatives: 

United States Government Accountability Office: 
GAO: 

January 2010: 

Department Of Energy: 

Actions Needed to Develop High-Quality Cost Estimates for Construction 
and Environmental Cleanup Projects: 

GAO-10-199: 

GAO Highlights: 

Highlights of GAO-10-199, a report to the Subcommittee on Energy and 
Water Development, Committee on Appropriations, House of 
Representatives. 

Why GAO Did This Study: 

The Department of Energy (DOE) spends billions of dollars on 
construction projects—those that maintain nuclear weapons, conduct 
research, and process nuclear waste—and projects that clean up nuclear 
and hazardous wastes at DOE’s sites; these projects are largely 
executed by contractors. DOE has struggled to keep these projects 
within cost and schedule estimates. GAO was asked to assess (1) DOE’s 
cost-estimating policies and guidance, (2) the extent to which 
selected projects’ cost estimates reflect best practices identified in 
GAO’s cost-estimating guide, and (3) DOE’s recent actions to improve 
cost estimating. GAO reviewed relevant documents, including support 
for cost estimates at three major construction projects—those costing 
$750 million or more—and one environmental cleanup project, and 
interviewed DOE officials. 

What GAO Found: 

DOE has not had a policy that establishes standards for cost 
estimating in place for over a decade, and its guidance is outdated 
and incomplete, making it difficult for the department to oversee the 
development of high-quality cost estimates by its contractors. DOE’s 
only cost-estimating direction resides in its project management 
policy that does not indicate how cost estimates should be developed. 
In addition, DOE’s outdated cost-estimating guide assigns 
responsibilities to offices that no longer exist and does not fully 
include most of the best practices from government and industry in 
GAO's cost-estimating guide. Lacking a documented policy and associated 
guidance that contain best practices, DOE does not have appropriate 
internal controls in place that would allow its project managers to 
provide contractors a standard method for building high-quality cost 
estimates. DOE has drafted a new cost-estimating policy and guide but 
the department expects to miss its deadline for issuing them by more 
than a year. 

The cost estimates for the four projects we reviewed did not exemplify 
the four characteristics of high-quality cost estimates as established 
by best practices—credible, well-documented, accurate, and 
comprehensive. The four estimates lacked credibility because DOE did 
not sufficiently identify the level of confidence associated with the 
estimates, adequately examine the effects of changing key assumptions 
on the estimates, or cross-check the estimates with an independent 
cost estimate (ICE)—an estimate created by an entity with no vested 
interest in the project. In 
addition, the four estimates were only partially documented, in part 
because the projects did not ensure that the contractors thoroughly 
documented the details of how they developed the estimates. Moreover, 
all four estimates lacked accuracy because they were not based on a 
reliable assessment of costs most likely to be incurred. Finally, none 
of the four estimates were comprehensive; for example, three of the 
estimates did not include costs associated with the full life cycle of 
the projects, and the estimating teams’ expertise and compositions did 
not reflect best practices. 

Although DOE has undertaken some actions to improve cost estimating, 
the department may undercut their impact by limiting the role and 
effectiveness of its new Office of Cost Analysis (OCA). In contrast to 
best practices and DOE’s stated mission for OCA, DOE’s draft cost-
estimating policy does not require OCA to conduct ICEs at project 
milestones unless requested by senior management. As a result, major 
projects are likely to continue to be approved without this 
independent check, limiting their credibility. Further, locating OCA 
apart from the existing DOE office that performs a similar but broader 
review function may lead to duplication of efforts and does not 
reflect best practices. That is, centralizing a cost-estimating team, 
rather than maintaining separate teams, facilitates sharing resources 
and using standard processes. Finally, placing OCA under the office 
that manages DOE’s finances may limit OCA’s independence and its 
access to relevantly skilled staff. It is also inconsistent with 
Congress’ recent action to establish an independent cost-estimating 
office at the Department of Defense, whose project management 
responsibilities are similar to those of DOE. 

What GAO Recommends: 

GAO is making six recommendations to improve DOE’s cost estimating. 
Among other things, GAO recommends that DOE (1) ensure its new policy 
and guide fully reflect cost-estimating best practices, in part by 
requiring independent cost estimates (ICEs) for its major projects, (2) 
create a centralized, independent cost-estimating capability within 
the department, and (3) conduct ICEs for those major projects that 
have not received one. In commenting on a draft of this report, DOE 
generally agreed with GAO’s recommendations. 

View [hyperlink, http://www.gao.gov/products/GAO-10-199] or key 
components. For more information, contact Gene Aloise at (202) 512-
3841, or aloisee@gao.gov. 

[End of section] 

Contents: 

Letter: 

Background: 

DOE Lacks a Cost-Estimating Policy and Its Guidance Is Outdated and 
Incomplete, Impeding the Department's Development of High-Quality Cost 
Estimates: 

The Four Project Cost Estimates We Reviewed Were Not High-Quality: 

DOE Has Begun Taking Actions to Improve Its Cost Estimating, but 
Actions May Be Hampered by Limited Role and Organizational Location of 
New Cost Estimating Office and Lack of Coordination between DOE and 
Its Program Offices: 

Conclusions: 

Recommendations for Executive Action: 

Agency Comments and Our Evaluation: 

Appendix I: Scope and Methodology: 

Appendix II: Four Characteristics of a High-Quality Cost Estimate with 
Their Corresponding 12 Key Cost-Estimating Steps: 

Appendix III: Assessments of Four Project Cost Estimates Reviewed: 

Appendix IV: Comments from the Department of Energy: 

Appendix V: GAO Contact and Staff Acknowledgments: 

Tables: 

Table 1: Extent to Which DOE's Cost-Estimating Guide Contains 12 Key 
Steps for Developing High-Quality Cost Estimates: 

Table 2: Cost-Estimating Criteria for NSLS-II: 

Table 3: Cost-Estimating Criteria for Uranium Processing Facility: 

Table 4: Cost-Estimating Criteria for Salt Waste Processing Facility: 

Table 5: Cost-Estimating Criteria for EM Cleanup at Y-12: 

Figures: 

Figure 1: Extent to Which Four Cost Estimates We Reviewed Were 
Credible, Well-Documented, Accurate, and Comprehensive: 

Figure 2: National Synchrotron Light Source-II: 

Figure 3: Uranium Processing Facility: 

Figure 4: Salt Waste Processing Facility: 

Figure 5: EM Cleanup at Y-12: 

Abbreviations: 

CFO: Chief Financial Officer: 

DOD: Department of Defense: 

DOE: Department of Energy: 

EM: Office of Environmental Management: 

ICE: independent cost estimate: 

NNSA: National Nuclear Security Administration: 

NSLS: National Synchrotron Light Source: 

OCA: Office of Cost Analysis: 

OECM: Office of Engineering and Construction Management: 

Science: Office of Science: 

WBS: work breakdown structure: 

Weapons Systems Act: Weapons Systems Acquisition Reform Act of 2009: 

[End of section] 

United States Government Accountability Office: 

Washington, DC 20548:
January 14, 2010: 

The Honorable Peter J. Visclosky:
Chairman:
The Honorable Rodney P. Frelinghuysen:
Ranking Member:
Subcommittee on Energy and Water Development: 
Committee on Appropriations:
House of Representatives: 

The Department of Energy (DOE) is the largest civilian contracting 
agency in the federal government, spending about 90 percent of its 
annual budget to operate its laboratories, nuclear production 
facilities, and environmental cleanup sites. DOE's current projects 
include over 100 construction projects--those that, among other 
things, help maintain the nuclear weapons stockpile, conduct research 
and development, and process nuclear waste so it can be disposed of--
at an estimated total cost of nearly $90 billion and more than 90 
nuclear and hazardous waste cleanup projects at an estimated total 
cost of more than $220 billion. DOE's two largest program offices, the 
Office of Environmental Management (EM) and the Office of Science 
(Science), and the National Nuclear Security Administration (NNSA), a 
separately organized agency within DOE,[Footnote 1] manage the vast 
majority of this work, which is almost entirely conducted by 
contractors at DOE's sites. 

For years, DOE has had difficulty managing its contractor-run 
projects, and, despite repeated recommendations from us and others 
regarding specific steps that would improve project management, DOE 
continues to struggle to keep its projects within their cost, scope, 
and schedule estimates. For example, we reported in March 2007 that 8 
of DOE's 12 major construction projects had exceeded their initial 
cost estimates by a total of nearly $14 billion;[Footnote 2] we also 
reported in September 2008 that 9 of the 10 major EM cleanup projects 
had experienced cost increases and that DOE had estimated that it 
needed an additional $25 billion to $42 billion to complete these 
projects over the initial cost estimates.[Footnote 3] Because of DOE's 
history of inadequate management and oversight of its contractors, we 
have included DOE contract and project management on our list of 
government programs at high risk for fraud, waste, abuse, and 
mismanagement since the list's inception in 1990.[Footnote 4] In 
response to its continued presence on our list, in 2008, DOE examined 
the root causes of its contract and project management problems and 
found that independent cost estimating was one of the top five areas 
needing improvement. 

In March 2009, we issued a cost-estimating guide, a compilation of 
cost-estimating best practices drawn from across industry and 
government.[Footnote 5] Specifically, the cost guide identifies four 
characteristics of a high-quality--that is, reliable--cost estimate. 
Such an estimate would be credible, well-documented, accurate, and 
comprehensive.[Footnote 6] In addition, the guide lays out 12 key 
steps that, when followed correctly, should result in high-quality 
cost estimates. 

In this context, you asked us to assess cost estimating at DOE. This 
report determines (1) the extent to which DOE's policies and guidance 
support the creation of high-quality cost estimates, (2) the extent to 
which cost estimates of selected DOE projects reflect the four key 
characteristics of high-quality cost estimates, and (3) the actions 
DOE has taken recently to improve its cost estimating. 

For the first objective, we analyzed the policies and guidance in 
effect across DOE and compared them with the best practices identified 
in GAO's cost-estimating guide. We also interviewed several project 
managers to identify the policies and guidance they followed when 
overseeing their contractors' work in generating their cost estimates. 
For the second objective, we reviewed the most recent total project 
cost estimates from each of four selected DOE projects: three major 
construction projects--Science's National 
Synchrotron Light Source-II at Brookhaven National Laboratory in New 
York, NNSA's Uranium Processing Facility at Y-12 National Security 
Complex in Tennessee, and EM's Salt Waste Processing Facility at the 
Savannah River Site in South Carolina--and one environmental cleanup 
project, EM's decontamination and decommissioning project for the Y-12 
National Security Complex in Tennessee (EM Cleanup at Y-12). We 
compared the methods used to develop these estimates with cost-
estimating best practices. For the third objective, we reviewed 
documentation of proposed and recently implemented DOE actions and 
evaluated it against the best practices in our cost-estimating guide. 
We also interviewed department-level management officials, as well as 
officials from NNSA, EM, and Science. Appendix I contains 
additional information on our scope and methodology. 

We conducted this performance audit from September 2008 to January 
2010 in accordance with generally accepted government auditing 
standards. Those standards require that we plan and perform the audit 
to obtain sufficient, appropriate evidence to provide a reasonable 
basis for our findings and conclusions based on our audit objectives. 
We believe that the evidence obtained provides a reasonable basis for 
our findings and conclusions based on our audit objectives. 

Background: 

DOE relies on its contractors to operate its sites and carry out its 
diverse missions, including developing, maintaining, and securing the 
nation's nuclear weapons capability; cleaning up the nuclear and 
hazardous wastes resulting from more than 50 years of weapons 
production; and conducting basic energy and scientific research, such 
as mapping the human genome. This mission work is carried out under 
the direction of DOE's program offices, including EM and Science, and 
NNSA. 

Project cost estimates are a necessary part of DOE's work for many 
reasons--for example, to support decisions about funding one project 
over another, to evaluate resource requirements at key project 
milestones, and to develop performance baselines. Having a realistic, 
up-to-date estimate of projected costs--one that is continually 
revised as the project matures--supports the development of annual 
budget requests, supports effective resource allocation, and increases 
the probability of a project's success. 

DOE's Project Management Order: 

In 2000, DOE issued Order 413, which established a process for 
managing projects, from identification of need through project 
completion.[Footnote 7] This project management order applies to 
construction projects--projects that build large complexes that often 
house unique equipment and technologies such as those that process 
waste or other radioactive material--and environmental cleanup 
projects, also referred to as "capital assets." Specifically, the 
order establishes five major milestones--or "critical decision 
points"--that span the life of a project. 

* Milestone 0: Approve mission need. DOE formally establishes the 
project and begins the process of conceptual planning and identifying 
a range of alternative approaches to meet the identified need. 

* Milestone 1: Approve an approach and cost range. At this milestone, 
DOE completes the conceptual design, selects its preferred approach, 
and approves the project's preliminary cost range. 

* Milestone 2: Approve the performance baseline--defined as a 
project's cost, schedule, and scope (the activities needed to achieve 
project goals). At this milestone, DOE completes its preliminary 
design and develops a definitive cost estimate. For construction 
projects, the cost estimate is now a point estimate and no longer a 
range; for cleanup projects, the cost estimate is also a point 
estimate but includes only the near-term scope of the project, 
covering about a 5-year period. 

* Milestone 3: Approve the start of construction. At this milestone, 
design and engineering are essentially complete and have been 
reviewed, and project construction or implementation begins. 

* Milestone 4: Approve the start of operations or project completion. 
For construction projects, at this milestone DOE completes the project 
and begins the transition to operations. For cleanup projects, this 
milestone represents completion of the project's activities and 
turnover of responsibility for management to another organization. 

Order 413 specifies the requirements that must be met, along with the 
documentation necessary, to move past each milestone; the order also 
requires that DOE senior management review the supporting 
documentation and approve the project at each milestone. DOE also 
provides suggested approaches for meeting the requirements contained 
in Order 413 through additional guidance. 

DOE's project management order also requires that the department 
conduct a variety of independent reviews of projects at the major 
milestones. For example, external independent reviews examine a 
project's estimated cost, scope, and schedule and are intended to 
provide reasonable assurance that the project can be successfully 
executed on time and within budget. These reviews are to be conducted 
by DOE's Office of Engineering and Construction Management (OECM)--an 
independent office outside the program offices--at milestone 2 for 
projects that cost $100 million or more and at milestone 3 for the 
major projects. For projects estimated to cost less than $100 million, 
Order 413 requires independent project reviews, which serve a similar 
function. Independent project reviews are required to be conducted at 
milestones 2 and 3 by reviewers from the respective program office or 
NNSA, who have no association with the project being reviewed. Order 
413 also requires a technical independent project review for nuclear 
projects approaching milestone 1; this review focuses on ensuring that 
the project's design integrates safety and security measures. 

In validating a project's cost estimate as part of an external 
independent review or independent project review, reviewers have a 
number of methods available to them. According to the best practices 
compiled in our cost guide, the most rigorous method is the 
independent cost estimate (ICE). Generated by an entity that has no 
stake in the approval of the project, an ICE provides an independent 
view of expected project costs. An ICE is usually developed based on 
the same technical parameters as the project team's estimate, so the 
estimates are comparable. Conducting an ICE is especially important at 
major milestones because it provides senior decision makers with a 
more objective assessment of the likely cost of a project. A second, 
less rigorous method for validating a project's cost estimate--an 
independent cost review--focuses on examining the estimate's 
supporting documentation and interviewing relevant staff. Moreover, 
independent cost reviews address the cost estimate's high-value, high-
risk, and high-interest aspects without evaluating the remainder of 
the estimate. 

Evolution of Cost Estimating at DOE: 

DOE's approach to managing the work its contractors perform, including 
developing cost estimates, has changed substantially several times 
over the past 30 years. In 1982, we reported that DOE lacked 
sufficient guidance to provide to its contractors for developing cost 
estimates.[Footnote 8] DOE subsequently implemented a cost-estimating 
policy that increased oversight by, among other things, placing a 
headquarters-based office in charge of cost estimating and requiring 
it to conduct independent cost estimates. The policy also directed DOE 
to establish guidance that outlined procedures to be used by 
contractors when generating estimates and by DOE officials reviewing 
them. In the mid-1990s, however, as part of a governmentwide 
management reform movement, DOE rescinded its cost-estimating policy 
and replaced it with a less prescriptive one that did not contain 
specifics on cost estimating but rather focused on managing the life 
cycles of the department's physical assets. DOE has acknowledged that 
some of its actions likely went too far in removing oversight of the 
work of its contractors, and, over the past several years, the 
department has taken steps to reintroduce some cost-estimating 
oversight functions. However, some of these efforts were never 
officially implemented or were abandoned. For example, DOE proposed to 
create a "cost engineering group" in 2002 with a mission to improve 
DOE's cost estimating, but the effort was never fully implemented. 

In late 2007, DOE initiated an effort to address its contract and 
project management challenges, which involved identifying issues that 
significantly impeded the department's ability to complete projects 
within budget and on schedule. DOE undertook this exercise--known as a 
root cause analysis--as part of its effort to be removed from our list 
of agencies at high risk for fraud, waste, abuse, and mismanagement. 
DOE senior staff identified an insufficient independent cost 
estimating capability as one of the top five reasons that DOE was 
unable to complete projects on cost and schedule--the other top 
reasons were inadequate front-end planning, biased identification and 
management of risks associated with projects, failure to request and 
obtain adequate funding for projects, and inadequate federal 
personnel, including cost estimators. DOE officials identified root 
causes associated with the department's challenges--for cost 
estimating, these included a lack of policy or standards, lack of 
personnel with appropriate skills, and lack of databases with 
historical cost information. In mid-2008, DOE adopted a corrective 
action plan designed to mitigate the issues identified in the root 
cause analysis. The corrective action plan includes a set of actions 
designed to establish and implement a "federal independent government 
cost estimating capability" to address the issues related to cost 
estimating. 

Cost-Estimating Best Practices: 

Drawing from federal cost-estimating organizations and industry, our 
cost-estimating guide includes four characteristics of a high-quality 
cost estimate that management can use for making informed decisions. 
[Footnote 9] A high-quality cost estimate is credible, well- 
documented, accurate, and comprehensive. Following are some of the 
criteria that the cost-estimating guide cites as central to achieving 
these characteristics. An estimate is: 

* credible when it has been cross-checked with independent cost 
estimates, the level of confidence associated with the point estimate 
has been identified,[Footnote 10] and a sensitivity analysis has been 
conducted--that is, the project has examined the effect of changing 
one assumption related to each project activity while holding all 
other variables constant in order to identify which variable most 
affects the cost estimate (see the sketch following this list); 

* well-documented when supporting documentation is accompanied by a 
narrative explaining the process, sources, and methods used to create 
the estimate and contains the underlying data used to develop the 
estimate; 

* accurate when it is not overly conservative or too optimistic and 
based on an assessment of the costs most likely to be incurred; and: 

* comprehensive when it accounts for all possible costs associated 
with a project, is structured in sufficient detail to ensure that 
costs are neither omitted nor double-counted, and the estimating 
teams' composition is commensurate with the assignment. 
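
To illustrate the sensitivity analysis described in the first bullet, 
the following minimal sketch varies one assumption at a time while 
holding the others constant. The cost model, activity names, and 
dollar values are hypothetical, not drawn from any DOE project: 

# One-at-a-time sensitivity analysis over a hypothetical cost model.

def total_cost(assumptions):
    # Illustrative model: labor, materials, and equipment costs.
    return (assumptions["labor_hours"] * assumptions["labor_rate"]
            + assumptions["materials"]
            + assumptions["equipment"])

baseline = {"labor_hours": 200_000, "labor_rate": 85.0,
            "materials": 40_000_000, "equipment": 25_000_000}
base = total_cost(baseline)
print(f"Baseline estimate: ${base:,.0f}")

# Vary each assumption by +10 percent, holding the others constant,
# and record the resulting change in the total estimate.
for name in baseline:
    varied = dict(baseline)
    varied[name] *= 1.10
    delta = total_cost(varied) - base
    print(f"+10% {name:<12} -> +${delta:,.0f}")

# The assumption with the largest delta is the variable that most
# affects the estimate and the natural target for risk mitigation.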

In addition, our cost guide lays out 12 key steps that should result 
in high-quality cost estimates. The guide also contains hundreds of 
best practices drawn from across industry and government for carrying 
out these steps. Appendix II shows how these 12 key steps relate to 
each characteristic of a high-quality cost estimate. 

DOE Lacks a Cost-Estimating Policy and Its Guidance Is Outdated and 
Incomplete, Impeding the Department's Development of High-Quality Cost 
Estimates: 

DOE has not had a cost-estimating policy in place for more than 10 
years, one factor that makes it difficult for the department to 
oversee the development of high-quality cost estimates. A cost-
estimating policy would establish roles and responsibilities for those 
preparing, reviewing, and updating all types of cost estimates, 
including independent cost estimates. A policy would also identify 
when different cost estimates would be conducted, while also serving 
as a mechanism for providing standard cost-estimating procedures to 
agency officials and contractors. As we have previously reported, the 
lack of a cost-estimating policy at other agencies has led to cost 
estimates of poor quality.[Footnote 11] 

The only direction DOE currently provides related to project cost 
estimating is contained in its project management order, which states 
that cost estimates should be developed and reviewed at most of the 
major milestones that span the life of a project. For example, the 
order simply states that the performance baseline developed for 
milestone 2 should be a "definitive cost estimate," encompassing the 
total project cost; it does not specify how the cost estimate should 
be developed. In addition, the order sets out requirements for the 
type of cost estimate project teams need to produce at each milestone--
for example, ranges versus detailed point estimates--but does not 
identify which phases of a project should be included in each estimate 
(e.g., research and development, operations or decommissioning) or how 
the estimate should be maintained throughout the life of the project. 
Moreover, guidance accompanying the order contains a description of 
the process a project team could follow when developing the cost 
portion of a performance baseline, but that description is limited as 
well. For example, the guidance discusses the importance of including 
certain information in the estimate's supporting documentation, such 
as the cost-estimating methodology used, but does not offer direction 
on how to choose the appropriate methodology. 

Regarding the direction on reviewing cost estimates, while DOE's 
project management order specifies that its projects receive either an 
external independent review or an independent project review prior to 
approval of both milestone 2 (the approval of the performance 
baseline) and milestone 3 (the approval of the start of construction), 
the order is not clear on when an external independent review should 
include an ICE, and the order is silent on the method to be used for 
an independent project review. Although, as explained in our cost 
guide, an ICE is considered a best practice, the order states that 
ICEs should be done only for major projects where "complexity, risk, 
cost or other factors create a significant cost exposure" for the 
department; however, it does not define what is meant by 
"significant." Moreover, even though DOE officials have acknowledged 
that the department's major projects would benefit from ICEs, OECM has 
not conducted them, and instead has traditionally examined project 
cost estimates through independent cost reviews, a less rigorous 
approach. We recently reported that the usefulness of these 
independent cost reviews is questionable given that 4 of the 10 DOE 
major cleanup projects we reviewed in 2008 had significantly increased 
their cost estimates even though these reviews had found the estimates 
were valid.[Footnote 12] Moreover, the project management order does 
not require an independent review of any project's cost estimate prior 
to milestone 1, even though, according to senior DOE officials, there 
is significant risk to the project at this early milestone given the 
large number of unknowns that could affect the project team's ability 
to complete the project within cost and on schedule. 

Although DOE lacks a cost-estimating policy, the cost-estimating guide 
it developed in the 1990s remains in effect;[Footnote 13] this guide 
is out of date and lacks important components. For example, the guide 
assigns responsibilities to offices that no longer exist and is based 
on policies that have been canceled. More specifically, the guide's 
description of DOE's project management system is based on the policy 
that preceded Order 413, and the guide states that it serves as a 
companion to DOE's defunct cost-estimating policy. In addition, the 
guide does not contain sufficient information to help ensure that a 
cost estimator following the guide will successfully create a high- 
quality cost estimate. Table 1 shows the extent to which DOE's guide 
contains information on the steps identified as best practices for 
developing high-quality cost estimates. 

Table 1: Extent to Which DOE's Cost-Estimating Guide Contains 12 Key 
Steps for Developing High-Quality Cost Estimates: 

12 key steps: 

1. Define estimate's purpose, scope, and schedule: Fully. 

2. Develop the estimating plan: Fully. 

3. Define the program: Partially. 

4. Determine the estimating approach: Somewhat. 

5. Identify ground rules and assumptions: Somewhat. 

6. Obtain the data: Mostly. 

7. Develop the point estimate and compare it to an independent cost 
estimate: Mostly. 

8. Conduct sensitivity analysis: Not. 

9. Conduct risk and uncertainty analysis: Somewhat. 

10. Document the estimate: Mostly. 

11. Present the estimate to management: Not. 

12. Update the estimate to reflect actual costs and changes: Somewhat. 

Source: GAO Analysis of DOE Guide 430.1-1 (1997). 

Note: The ratings we used in this analysis are as follows: "Fully" 
means that the guide included information that satisfied the 
criterion; "Mostly" means that the guide included the majority of the 
information to satisfy the criterion; "Partially" means that the guide 
included information satisfying part of the criterion; "Somewhat" 
means that the guide included information satisfying a minor part of 
the criterion; and "Not" means that the guide did not include 
information that satisfied the criterion. 

[End of table] 

Specifically, DOE's guide contains all the information necessary to 
fully carry out 2 of the 12 key steps identified in our cost guide as 
necessary for developing high-quality cost estimates: defining the 
estimate's purpose, scope, and schedule and developing an estimating 
plan. Defining the estimate's purpose, scope, and schedule is 
important to ensure that the estimate supports the department's 
missions, goals, and strategic objectives; that it will meet the 
customer's needs; and that there will be sufficient time to develop 
the estimate. In addition, developing a written estimating plan that 
details a master schedule of specific tasks, responsible parties, and 
due dates helps ensure that all stakeholders are involved and aware of 
their responsibilities and deadlines. Moreover, although the guide 
includes varying degrees of information for mostly, partially, or 
somewhat carrying out 8 of the 12 steps, it contains no specifics 
about 2 other key steps: conducting a sensitivity analysis and 
presenting the estimate to management for approval. A sensitivity 
analysis is important because it determines the effects of changing 
key assumptions underlying the cost estimate and allows project 
managers to develop appropriate mitigation measures, where warranted. 
In addition, according to best practices, management should be briefed 
on how a cost estimate was developed before it is approved, and that 
briefing should include risks associated with the underlying data and 
methods. While DOE's project management order identifies the decision 
maker responsible for approving project cost estimates, the DOE guide 
does not provide information on what to include when briefing 
management. 

Without a documented cost-estimating policy and related guidance, DOE 
project managers do not have a standard method to provide to their 
contractors to help them build high-quality cost estimates or for DOE 
to use as a basis for evaluating those estimates. According to the 
Standards for Internal Control in the Federal Government, federal 
agencies are to employ internal control activities, such as reviews by 
management, to help ensure that management's directives are carried 
out and to determine if agencies are effectively and efficiently using 
resources.[Footnote 14] However, DOE project managers do not have a 
documented standard to use when comparing actual performance of its 
contractors--in terms of the cost estimates the contractors deliver--
to planned or expected results. As a result, for the four projects we 
reviewed, contractors had developed cost estimates using their own 
company policies, along with the minimal direction provided by DOE's 
project management order, and some of the project managers we spoke 
with looked elsewhere for guidance. For example, one DOE project 
manager said he still used the cost-estimating policy DOE canceled in 
the mid-1990s to help oversee the contractor; another project manager 
said he relied on cost-estimating guidance that had been drafted but 
never formalized. Specifically, in 2004, a team drafted an update to 
DOE's cost-estimating guide that, according to one of its authors, was 
ready to be submitted for agency-wide approval when senior-level 
support for it evaporated. According to this official, although it was 
never officially adopted, the draft guide is still shared among DOE 
staff, especially those within EM. In the corrective action plan DOE 
developed to address its contract and project management issues, the 
department identified the need to update its cost-estimating guide and 
to re-establish a cost-estimating policy. According to DOE officials, 
DOE has begun drafting the new policy and guide. 

The Four Project Cost Estimates We Reviewed Were Not High-Quality: 

Our analysis of the four DOE project cost estimates we reviewed found 
that they did not fully achieve the four characteristics of high- 
quality estimates as identified by the professional cost-estimating 
community and documented in our cost guide--credible, well-documented, 
accurate, and comprehensive. More specifically, the four estimates 
were only somewhat credible and only partially well-documented, 
accurate, and comprehensive, as shown in figure 1. Appendix III 
contains more information about each project, including the project 
cost estimate, stage of development, and how well the project followed 
the 12 key steps of the cost-estimating process that lead to high-
quality estimates. 

Figure 1: Extent to Which Four Cost Estimates We Reviewed Were 
Credible, Well-Documented, Accurate, and Comprehensive: 

[Refer to PDF for image: table] 

Projects and characteristics: 

National Synchrotron Light Source-II: 
Credible: Somewhat; 
Well-documented: Partially; 
Accurate: Somewhat; 
Comprehensive: Partially. 

Uranium Processing Facility: 
Credible: Somewhat; 
Well-documented: Partially; 
Accurate: Somewhat; 
Comprehensive: Partially. 

Salt Waste Processing Facility: 
Credible: Somewhat; 
Well-documented: Partially; 
Accurate: Partially; 
Comprehensive: Partially. 

EM Cleanup at Y-12: 
Credible: Somewhat; 
Well-documented: Partially; 
Accurate: Partially; 
Comprehensive: Mostly. 

Source: GAO analysis. 

Note: The ratings we used in this analysis are as follows: "Fully" 
means that the program provided documentation that satisfied the 
criterion; "Mostly" means that the program provided the majority of 
the documentation to satisfy the criterion; "Partially" means that the 
program provided documentation satisfying part of the criterion; 
"Somewhat" means that the program provided documentation satisfying a 
minor part of the criterion; and "Not" means that the program did not 
provide documentation that satisfied the criterion. 

[End of figure] 

The Four Project Cost Estimates We Reviewed Were Somewhat Credible: 

The cost estimates of the four projects we reviewed lacked credibility 
because DOE did not sufficiently cross-check the projects' cost 
estimates with ICEs, use best practices when identifying the level of 
confidence associated with the estimates, or sufficiently analyze 
project sensitivities. More specifically, DOE did not conduct ICEs for 
three of the four projects--National Synchrotron Light Source-II, 
Uranium Processing Facility, and EM Cleanup at Y-12. Instead, these 
three projects received independent cost reviews as part of external 
independent reviews or independent project reviews. An independent 
cost review is less rigorous than an ICE because it addresses the cost 
estimate's high-value, high-risk, and high-interest aspects without 
evaluating the remainder of the estimate. In some cases, the project 
teams or program offices conducted additional reviews beyond what was 
required under DOE's project management order. For example, a team 
from a number of DOE's national labs and Science officials performed a 
peer review of the National Synchrotron Light Source-II estimate, and 
its contractor hired a firm to conduct two independent estimates of 
the construction portion of the project's scope, though not for the 
entire project. These additional reviews add value, but are not as 
independent as the best practice--although the contractor obtained an 
independent estimate for the construction portion, it was not 
conducted by an entity without a stake in the approval of the project, 
compromising the estimate's independence. 

In contrast, although DOE conducted an ICE at the fourth project, Salt 
Waste Processing Facility, after DOE's Deputy Secretary requested it, 
DOE did not follow best practices when reconciling the ICE's results 
with the project team's estimate, contributing to the project 
estimate's lack of credibility. By extrapolating costs from a smaller-
scale, similar project already operational near the Salt Waste 
facility, the ICE team estimated the cost for the project could reach 
$2.7 billion, more than twice as much as the project team's estimate 
of $1.3 billion. According to our cost guide, ICEs are usually higher 
and more accurate than baseline estimates, which are created by 
project teams; if a project's estimate is close to an ICE's results, 
one can be more confident that it is accurate. It is also a best 
practice that, after the ICE is completed, the ICE team and project 
team identify the major differences between the two estimates and, 
where possible, reconcile those differences; a synopsis of the 
estimates and their differences should then be provided to management. 
According to officials from the ICE team and the project team, this 
formal process did not occur. The difference between the ICE and the 
project team's estimate primarily stemmed from the ICE's incorporation 
of additional costs to cover risks the ICE team felt were 
insufficiently addressed in the project team's estimate.[Footnote 15] 
The ICE team provided the project with a high-level summary of its 
findings but did not provide the supporting details of its estimate. 
Based on the information provided, the project team increased its 
estimated total cost by $100 million. According to project officials, 
by increasing the estimate by this amount, the estimate included 
sufficient funding to mitigate the risks raised by the ICE team. 
Moreover, during the time DOE conducted the ICE, OECM conducted an 
independent cost review as part of its external independent review 
and, after taking the ICE's conclusions into consideration, validated 
the project team's final estimate of $1.3 billion. It is too soon to 
tell whether the risks identified by the ICE will materialize; the 
project was approved to begin full construction in January 2009 and 
has had some challenges with quality assurance in constructing the 
foundation for the building, but according to the project team, these 
issues are not expected to have a significant impact on performance. 
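
The report does not detail how the ICE team extrapolated costs from 
the smaller facility. One common technique for such extrapolation is 
the capacity-factor method, sketched below; the reference cost, 
capacities, and exponent are illustrative assumptions only: 

# Hypothetical capacity-factor extrapolation from a smaller, similar
# facility; all values are illustrative.

def scale_cost(ref_cost, ref_capacity, new_capacity, exponent=0.6):
    # new_cost = ref_cost * (new_capacity / ref_capacity) ** exponent;
    # exponents below 1.0 reflect economies of scale.
    return ref_cost * (new_capacity / ref_capacity) ** exponent

# Reference facility: $400 million at 1 million gallons per year;
# new facility sized at 6 million gallons per year.
estimate = scale_cost(ref_cost=400e6, ref_capacity=1.0, new_capacity=6.0)
print(f"Extrapolated cost: ${estimate:,.0f}")  # roughly $1.2 billion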

Additionally, the methods DOE used to identify the level of confidence 
associated with the cost estimates for the four projects only 
partially reflected best practices, which limited the estimates' 
credibility. Although each project conducted a risk analysis to 
identify the confidence level associated with its estimate--if 
calculated correctly, a confidence level tells the likelihood of the 
project being completed at or under a specific cost--none of the 
projects used best practices when constructing the computer models 
used to support the risk analysis and the resulting confidence level. 
[Footnote 16] For example, one significant problem common across all 
four projects was that, when building their models, the projects did 
not correlate--or link--different project activities that are 
dependent on or tied to one another. Correlation captures the fact 
that, for example, technical performance problems experienced by one 
activity could result in unexpected design changes and unplanned 
testing for other activities. Similarly, a schedule slip experienced 
by one activity could have a cascading effect on other activities; for 
example, if a supplier delivers an item late, other scheduled 
deliveries could be missed, resulting in additional cost. According to 
best practices, ignoring correlation, as the four projects did, can 
significantly affect the results of a risk analysis, creating a false 
sense of confidence in the resulting estimate. As a result, the 
confidence levels associated with the projects' cost estimates--
ranging from 80 percent to 95 percent confidence--are likely 
overstated. 
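
A minimal simulation shows why ignoring correlation creates a false 
sense of confidence. In the sketch below, the three activity costs, 
their uncertainties, and the correlation value are illustrative 
assumptions; the point is only that a budget set at the 80th 
percentile of an uncorrelated model carries well under 80 percent 
confidence once correlation is accounted for: 

# Effect of ignoring correlation in a Monte Carlo cost-risk model.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
means = np.array([300e6, 500e6, 200e6])  # three activity costs
sigmas = 0.15 * means                    # 15 percent uncertainty each

# Case 1: activities modeled as independent (correlation ignored).
indep = rng.normal(means, sigmas, size=(n, 3)).sum(axis=1)

# Case 2: activities positively correlated (rho = 0.7), as when a
# slip in one activity cascades into the others.
rho = 0.7
cov = np.outer(sigmas, sigmas) * rho
np.fill_diagonal(cov, sigmas ** 2)
corr = rng.multivariate_normal(means, cov, size=n).sum(axis=1)

budget = np.percentile(indep, 80)  # "80 percent confidence" budget
print(f"Budget at 80th percentile, correlation ignored: ${budget:,.0f}")
print(f"Actual confidence once correlated: {(corr <= budget).mean():.0%}")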

A second problem with how the estimates' levels of confidence were 
identified was that two of the projects did not use best practices in 
determining their contingency reserves.[Footnote 17] According to best 
practices, the difference in cost between the project team's estimate 
and the desired confidence level should determine the required 
contingency amount. For example, if a project team estimates its cost 
and determines, through its risk analysis, that it has 50 percent 
confidence in completing the project for no more than that cost, in 
order to increase the confidence level of the estimate, it should add 
contingency reserves based on the statistical output from the risk 
analysis.[Footnote 18] In contrast, two of the projects we reviewed 
did not add contingency reserves to their estimates using this method. 
For example, the Uranium Processing Facility contractor conducted a 
risk analysis, in accordance with many best practices, showing that, 
by adding contingency reserves, the project would have 95 percent 
confidence of completing the project within a $2.3 billion estimate. 
However, contrary to best practices, NNSA headquarters 
officials then added more than $1 billion in contingency to the 
contractor's estimate, bringing the high end of the project's 
estimated cost range up to $3.5 billion. This billion-dollar 
"allowance" was added, according to a senior NNSA official, because 
the experts reviewing the project thought it would require 
significantly more than the project team's $2.3 billion estimate to 
complete the project, and NNSA did not want to exceed the high end of 
its cost estimate range in the future. The allowance was supported by 
a memo outlining risks to the project it was intended to cover that 
were not included in the initial risk analysis, including material and 
commodity cost growth and schedule impacts associated with delayed 
decision-making or arrival of expected funds; the high-level nature of 
the memo stands in sharp contrast to the detail involved in the 
project team's risk analysis. Further, because the risks in the memo 
were not incorporated into the project team's risk analysis, the 
confidence level associated with the $3.5 billion high end of the 
estimate range is not known, leaving decision makers within DOE and 
Congress without a sound basis for determining appropriate funding 
levels for the project. 
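
The best practice described above reduces to a short computation: the 
contingency reserve is the difference between the point estimate and 
the cost at the desired confidence level, both read from the risk 
analysis's statistical output. The cost distribution below is a 
hypothetical stand-in for a project's simulated outcomes: 

# Contingency reserve derived from risk-analysis output; the
# distribution parameters are illustrative, not project data.
import numpy as np

rng = np.random.default_rng(7)
outcomes = rng.lognormal(mean=np.log(1.0e9), sigma=0.20, size=100_000)

point_estimate = np.percentile(outcomes, 50)  # about 50% confidence
target = np.percentile(outcomes, 80)          # desired 80% confidence
contingency = target - point_estimate

print(f"Point estimate (50%):   ${point_estimate:,.0f}")
print(f"Cost at 80% confidence: ${target:,.0f}")
print(f"Required contingency:   ${contingency:,.0f}")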

A third problem with how projects identified the level of confidence 
associated with their estimates was that two of the projects we 
reviewed--EM Cleanup at Y-12 and Uranium Processing Facility--included 
contingency reserves in their cost estimates, but their program 
offices did not budget for all of these reserves, limiting the funding 
available to cover 
the costs associated with risks that may materialize once the project 
is under way. According to best practices, having adequate funding is 
paramount for optimal project execution, since it can take many months 
to obtain necessary funding to address an emergent issue. Without 
readily available risk funding, additional cost growth is likely. For 
example, the EM Cleanup at Y-12 project's near-term cost estimate 
included more than $50 million in contingency reserves, but EM has not 
committed to funding any of it. Without this contingency reserve 
available, according to the project baseline, the likelihood of 
completing the project's near-term scope within its budget is 50 
percent. As we previously reported, although EM project managers build 
contingency funding into their near-term and out-year estimates, EM 
management does not generally include funding in its budget requests 
to cover contingency for cleanup projects until after it is actually 
needed to address a problem. We also reported that this practice was 
likely a contributing factor to the cost increases and schedule delays 
recently experienced by EM's major cleanup projects.[Footnote 19] 
Similarly, even though the approved upper end of Uranium Processing 
Facility's cost estimate range is $3.5 billion, NNSA does not intend 
to include the $1 billion allowance in the project's budget.[Footnote 
20] According to the memo explaining the allowance, if any of the 
risks covered by the allowance were to occur, they would be beyond 
NNSA's budget parameters and would have to be funded on a case-by-case 
basis. 

Finally, none of the four projects conducted a sensitivity analysis, 
further undermining their estimates' credibility. A sensitivity 
analysis reveals which variable most affects the cost estimate, 
thereby allowing project managers to develop risk mitigation steps 
specific to that variable. Because this analysis can 
help decision makers choose among alternatives, it is especially 
important early in a project's life cycle while assumptions can still 
be changed. For the one project we reviewed that was at such an early 
stage of development--Uranium Processing Facility--not conducting a 
sensitivity analysis meant that project managers were not able to give 
decision makers an understanding of the impacts on project cost of, 
for example, varying the square footage of the building. 

The Four Project Estimates Were Only Partially Documented, Accurate, 
and Comprehensive: 

The four estimates we reviewed lacked complete documentation. For 
example, three of the four projects did not generate a narrative 
summary explaining the process, sources, and methods used to create 
the estimates, and outlining clearly and concisely the cost estimate 
results, including information about cost drivers and high-risk areas. 
Documenting the estimate in a narrative at this level of detail 
provides enough information so that someone unfamiliar with the 
project could easily recreate or update it. In addition, none of the 
four projects systematically included the underlying data on which the 
estimates were based in the documentation sets, which can cause an 
estimate's credibility to suffer because the rationale supporting the 
specific costs is not clear. More specifically, at the EM Cleanup at Y-
12 project, while the project team created notebooks containing 
supporting documentation for the estimate that included detailed 
descriptions of how each cost was derived, these notebooks did not 
consistently contain evidence of the source data--for example, quotes 
from vendors or historical data from another project--used for each 
calculation. Similarly, cost estimators at Uranium Processing Facility 
did not collect evidence of all the data that supported costs 
contained in the estimate, which led to a lack of transparency of what 
work activities were included in the estimate. As a result, after the 
estimate was approved, as project engineers continued to identify 
activities necessary for constructing the facility, there was no 
documented record for them to refer to in order to determine which 
activities were already present in the estimate or whether they 
represented new work that would increase the cost of the project. 

Moreover, the four project cost estimates we reviewed lacked accuracy 
because they were not based on a reliable assessment of costs most 
likely to be incurred. For example, two of the projects--Uranium 
Processing Facility and National Synchrotron Light Source-II--did not 
always use appropriate estimating methodologies. These projects both 
used a highly detailed method that is appropriate for a project whose 
design is stable and not anticipated to change. However, this was not 
the case for Uranium Processing Facility or for the portions of 
National Synchrotron Light Source-II that had not been fully designed 
yet. As a result, according to best practices, a less detailed 
methodology focused more on using statistical relationships to 
extrapolate costs would have been more appropriate. Further, NNSA's 
technical independent review of the Uranium Processing Facility 
estimate echoed this sentiment, stating that the project's cost 
estimate range was unsupported in part because it was prepared with 
significant detail--for example, the estimate provided a count of 
piping and fittings for the facility--despite the fact that there had 
been no design of technical systems or of the building on which to 
base these details. In addition, three of the four project estimates--
Salt Waste Processing Facility, National Synchrotron Light Source-II, 
and Uranium Processing Facility--did not use adequate data to estimate 
the projects' costs. According to best practices, basing an estimate 
largely on valid and useful historical data is a key step in 
developing a sound cost estimate; however, these three estimates were 
not primarily based on relevant historical actual costs. For example, 
at National Synchrotron Light Source-II, only 12 percent of the cost 
estimate was based on historical costs. For the remainder of the 
estimate, the project team relied heavily on the professional opinion 
of the technical experts working on the project. Although these 
individuals had significant experience working with other light 
sources, relying on judgment lacks objectivity and introduces bias 
into the estimate. These projects did not use historical data, in 
part, because, in contrast to best practices, DOE does not have a 
database of historical costs from previously completed projects 
available for newer projects to use, nor does the department 
explicitly require projects to use historical data when generating 
cost estimates. In addition, even if historical data did exist, they 
would not always be available for use. For example, at Uranium 
Processing Facility, historical costs from a comparable project that 
was built next door would have been directly relevant but were not 
available because of their proprietary nature. In addition, although 
the EM Cleanup at Y-12 project had directly relevant historical actual 
costs to draw from, its cost estimate lacked accuracy in part because 
the project team did not determine the validity of the statistical 
relationships it used when calculating the out-year portion of its 
estimate. 
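
The less detailed, statistics-based methodology referred to above is 
commonly called parametric estimating: a relationship is fitted to 
historical actual costs and then used to extrapolate the new project's 
cost. The historical data points in the sketch below are invented for 
illustration, not drawn from any DOE project: 

# Parametric estimate from (hypothetical) historical actual costs.
import numpy as np

# Completed projects: size in square feet and actual cost.
size = np.array([50_000, 80_000, 120_000, 200_000], dtype=float)
cost = np.array([180e6, 260e6, 390e6, 610e6])

# Fit cost = a * size + b by least squares.
a, b = np.polyfit(size, cost, deg=1)

new_size = 150_000
estimate = a * new_size + b
print(f"Parametric estimate for {new_size:,} sq ft: ${estimate:,.0f}")
# Best practices also call for validating the fitted relationship
# (for example, checking R-squared and residuals) before relying on
# it -- the step the EM Cleanup at Y-12 team skipped.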

Finally, none of the four estimates were fully comprehensive, in part 
because they did not account for all possible costs. Specifically, the 
estimates for Salt Waste Processing Facility, National Synchrotron 
Light Source-II, and Uranium Processing Facility did not include costs 
associated with the full life cycle of their projects. According to 
best practices, life cycle 
costing--a "cradle to grave" approach that includes costs from design 
and construction through operations, decommissioning, and disposal-- 
enhances decision making and provides valuable information about how 
much projects are expected to cost over time. However, DOE's Order 413 
does not require construction projects to produce life cycle cost 
estimates at every major milestone; as a result, the three 
construction project estimates we reviewed represented a more limited 
scope of activities.[Footnote 21] For instance, the estimate for Salt 
Waste Processing Facility did not include costs to maintain and 
operate the facility, including the time during which the facility is 
turned on and tested to see whether it will work as designed--known as 
"hot commissioning." According to DOE officials, these costs were 
captured under a separate operations project. As a result, the full 
life cycle cost of the facility, including the operations it supports, 
is not transparent or easily identified. The ICE for Salt Waste 
Processing Facility recommended that DOE at least incorporate the hot 
commissioning costs into the total project cost, since it more 
completely captures the work it takes to prepare the facility for full 
operations. In response, according to agency officials, DOE is 
considering changing its policy to include these costs in the scope of 
future construction projects. 
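
A life cycle cost rollup is conceptually a "cradle to grave" sum 
across phases, as the sketch below shows; the phase names and dollar 
values are hypothetical, not Salt Waste Processing Facility data: 

# Hypothetical life cycle cost rollup versus a construction-only scope.
phases = {
    "design": 150e6,
    "construction": 900e6,
    "hot commissioning": 120e6,
    "operations": 2_000e6,
    "decommissioning and disposal": 300e6,
}

scoped = {"design", "construction"}  # phases the estimate covered
covered = sum(c for p, c in phases.items() if p in scoped)
life_cycle = sum(phases.values())

print(f"Estimate as scoped: ${covered:,.0f}")
print(f"Full life cycle:    ${life_cycle:,.0f}")
# Reporting only the scoped estimate leaves the facility's full cost
# over time opaque to decision makers.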

In addition, the cost estimates of National Synchrotron Light Source-
II and Uranium Processing Facility left out significant portions of 
scope required to complete construction of their facilities, further 
limiting their comprehensiveness. In particular, although National 
Synchrotron Light Source-II was designed to include 58 points at which 
the X-rays generated by the light source are directed into 
experimental facilities--known as beamlines--the project's scope, and 
thus its 
cost estimate, only included funding for 6 of those beamlines. Based 
on the data supplied by a senior project official, we estimate that 
including funding for the rest of the beamlines would add roughly $400 
million to $500 million to the estimate--about 50 percent more than 
the approved total project cost. According to project officials, the 
costs for the beamlines were not included in the scope of the project 
because other agencies are expected to contribute funding for them. 
However, excluding them from the project resulted in a cost estimate 
that did not include the facility's full--or even partial--capability 
and did not represent the total cost to the taxpayers. At the Uranium 
Processing Facility, the cost estimate did not include costs 
associated with developing technologies that are critical to the 
facility's functioning but were not yet mature enough to be included 
in such a facility. Although not part of the total project cost for 
the facility, these technology development costs are managed and 
funded by the contractor running the Y-12 site. However, this 
situation presents a challenge to the project's managers, since the 
funding to develop these technologies is not under their control. 
[Footnote 22] Moreover, Uranium Processing Facility's $1.4 billion 
cost estimate for the low end of its range includes less of the 
project's scope than the high end of its range. For example, while the 
$3.5 billion high end of the range includes costs associated with a 
tunnel that will be used to safely transport dangerous materials 
between Uranium Processing Facility and an adjacent storage building, 
the low end does not include this tunnel. Because the low end of the 
estimate range does not include the same scope as the high end, 
presenting $1.4 billion as a possible cost of the facility is 
misleading, especially given that the project's risk analysis shows 
there is a zero percent likelihood of constructing the facility for 
that amount. 
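
To illustrate the arithmetic above, the following minimal Python 
sketch--ours, for illustration only--checks the National Synchrotron 
Light Source-II beamline figures against the approved total project 
cost of $912 million reported in appendix III: 

```python
# Illustrative check (not DOE's analysis) of the NSLS-II beamline figures.
approved_tpc = 912                # approved total project cost, $ millions
added_low, added_high = 400, 500  # estimated cost of the remaining beamlines

for added in (added_low, added_high):
    print(f"${added}M added is {added / approved_tpc:.0%} of the ${approved_tpc}M TPC")
# Prints 44% and 55%--consistent with "about 50 percent more than the
# approved total project cost."
```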

Finally, the four projects' estimates lacked comprehensiveness because 
the teams that generated the estimates were not composed in accordance 
with best practices. Although each of these teams was led by or 
included some experienced and trained cost analysts, the teams were 
generally made up of scientists and engineers who did not appear to 
have such experience or training. Further, it was the contractors' 
staff--not federal staff--who developed the cost estimates for all 
four of the projects we reviewed. As we reported in our cost guide, 
reliance on support contractors raises questions from the cost-
estimating community about whether the numbers and qualifications of 
federal personnel are sufficient to provide oversight of and insight 
into contractor cost estimates. At DOE, they appear not to be--as part 
of its effort to address its contract and project 
management challenges, DOE found that one of the root causes of these 
problems was its lack of federal personnel, including cost estimators, 
to oversee its contractors. 

DOE Has Begun Taking Actions to Improve Its Cost Estimating, but 
Actions May Be Hampered by Limited Role and Organizational Location of 
New Cost Estimating Office and Lack of Coordination between DOE and 
Its Program Offices: 

DOE recently initiated a number of actions at the department-wide 
level to improve its cost estimates, the first of which was to 
establish the Office of Cost Analysis (OCA) in 2008. OCA has started 
implementing a number of actions that are designed to improve cost 
estimating, but these actions may be hampered for various reasons. In 
addition, some program offices have taken independent steps to improve 
the quality of their cost estimates, some of which reflect best 
practices; however, a lack of coordination on some actions may lead to 
duplication of effort. 

DOE Has Begun Taking Actions to Improve Cost Estimating at the 
Department-wide Level: 

DOE established OCA in order to improve the department's cost- 
estimating capabilities and better ensure that its project cost 
estimates are reliable by providing a new independent cost-estimating 
function for the department. In addition, as outlined in its recently 
developed corrective action plan, DOE has given OCA the primary 
responsibility for implementing the department's cost-estimating 
improvement efforts. Specifically, DOE tasked OCA with the following 
actions: 

* developing a new cost-estimating policy for the department and 
updating its guidance on cost estimating; 

* conducting independent cost estimates and analyses for major 
projects; 

* developing escalation rates to help program managers and DOE 
contractors estimate future costs of commodities and labor (a simple 
application of such rates is sketched after this list); 

* developing a historical cost database, designed to improve cost- 
estimating accuracy by allowing project managers and contractors 
access to historical costs of completed DOE projects; 

* developing cost-estimating skills of field office staff, in part 
through new training courses; 

* creating a common work activity structure, known as a work breakdown 
structure, to better enable side-by-side comparisons of project 
estimates and to facilitate collecting comparable cost data; and: 

* identifying lessons learned from external independent reviews and 
developing relevant corrective actions. 
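
To illustrate the escalation-rate task referenced in the list above, 
the following minimal Python sketch shows how published annual 
escalation rates might be compounded to express a base-year estimate 
in then-year dollars. The rates and the base amount are hypothetical 
placeholders, not OCA's published values: 

```python
# Minimal sketch of applying annual escalation rates (hypothetical values).
base_year_cost = 100.0  # estimate in millions of base-year dollars
annual_escalation = {2011: 0.021, 2012: 0.024, 2013: 0.026}  # hypothetical

cost = base_year_cost
for year in sorted(annual_escalation):
    cost *= 1 + annual_escalation[year]  # compound each year's rate
    print(f"{year}: {cost:.1f}M then-year dollars")
```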

These actions appear to represent a comprehensive approach to 
improving cost estimates, and although OCA is making progress in 
completing some of these actions, it has fallen behind on other tasks. 
Most notably, OCA has developed ICEs for several projects, including 
the ICE for Salt Waste Processing Facility, and has begun publishing 
annual escalation rates that projects will be required to use and that 
are based on OCA's evaluation of economic conditions and industry 
trends.[Footnote 23] According to OCA's director, the office also held 
its first training course on cost estimating in August 2009 and is 
establishing an online "Cost Analysis Community Portal" where DOE 
staff can access training and electronic links to professional 
development tools. In contrast, OCA missed its December 2008 deadline 
for completing the new cost-estimating policy and guide. According to 
OCA's Director, the new cost-estimating policy--Order 415--and the 
cost-estimating guide are now anticipated to be issued sometime in the 
first quarter of 2010. Also according to the OCA Director, the office 
has experienced a delay in completing its historical cost database, 
which is expected to provide project managers with better data for 
building future cost estimates. The database was originally to be 
completed in June 2009, but OCA recently changed its approach for 
developing it, and the Director of OCA now expects completion in May 
2010. 

The Limited Role and Organizational Location of DOE's New Independent 
Cost Estimating Office May Undercut Improvements: 

While DOE has made some progress on its cost-estimating improvement 
efforts, the most recent draft of its cost-estimating policy does not 
reflect best practices and falls short of fulfilling DOE's stated 
mission for OCA--to conduct ICEs of major projects before approval of 
milestones 1, 2, and 3. In fact, the current draft policy does not 
require OCA to conduct ICEs of any project at any milestone, in 
contrast to best practices. Instead, the draft directs OCA to conduct 
ICEs only at the request of DOE senior management. Senior DOE 
officials told us that when OCA could not obtain the support necessary 
within the department for mandatory ICEs at these milestones, it 
eliminated the requirement from the draft policy. Consequently, many 
of DOE's projects will continue to be approved without independent 
cost estimates, thereby limiting their credibility. In contrast, for 
many years, Congress has required that the Department of Defense's 
(DOD) independent cost office, the Office of the Deputy Director for 
Cost Assessment,[Footnote 24] perform ICEs of major projects before 
DOD's equivalent of milestones 2 and 3. Furthermore, Congress recently 
moved to improve DOD's cost estimating by passing the Weapons Systems 
Acquisition Reform Act of 2009 (Weapons Systems Act), in which it 
expanded the requirement that DOD conduct ICEs to include DOD's 
milestone A, the equivalent of DOE's milestone 1.[Footnote 25] 

DOE's decision about where to locate the new cost-estimating office 
within the organization limits OCA's ability to function effectively 
as intended in several ways. First, instead of locating OCA within an 
office that has staff with similar skills and expertise and that 
performs similar functions, such as OECM, DOE located OCA under the 
Chief Financial Officer's (CFO) Office--an office that serves as the 
principal advisor for DOE's financial, budgeting, and strategic 
planning issues. According to best practices, an agency's cost- 
estimating team should be centralized--that is, consolidated--in order 
to facilitate the use of standardized processes and more effective 
sharing of resources. Under the current configuration, however, DOE's 
cost-estimating functions are split across the two offices. Further, 
the National Academy of Public Administration reported in July 2009 
that DOE's mission-support offices--which include OCA and OECM--need 
better integration. While the report stated that the department's 
mission-support offices communicate with one another, they tend to 
operate independently, and there is no ongoing mechanism to coordinate 
common efforts. Moreover, complicating the fact that OCA and OECM are 
in separate offices, their roles and responsibilities are not clear 
and may overlap. Specifically, the draft cost-estimating policy does 
not clearly delineate the roles of these two offices, stating only 
that, "the Office of Cost Analysis will not duplicate cost reviews 
performed by the Office of Engineering and Construction Management." A 
senior DOE program office official has raised concerns that without a 
clearer articulation of the roles of these offices, an additional 
layer of review could be added, leading to duplication of work. This 
type of duplication occurred at Salt Waste Processing Facility when, 
prior to milestone 3 and within weeks of each other, OECM performed an 
independent cost review as part of an external independent review and 
OCA conducted a separate ICE. Because the reviews were not 
coordinated, the project team spent extra time separately briefing and 
providing many of the same documents to each independent team, taking 
away from the time it could spend executing the project. In addition, 
the two teams drew very different conclusions. DOE's Deputy Secretary 
ultimately approved the project at the level recommended by OECM. DOE 
has developed a draft memorandum of understanding between OCA and OECM 
to better articulate the roles of each office; however, although the 
memorandum contains additional details about the responsibilities for 
each office, it still leaves the department's independent cost 
estimating function divided between them. 

The second way in which locating OCA under the CFO may not effectively 
support OCA's ability to fulfill its mission relates to the skills 
that will be available to OCA within the Office of the CFO. In 
contrast to OCA, staff in an agency's CFO office--including the CFO--
tend to have accounting and financial analysis skills used to develop 
and evaluate budgets, while OCA staff need to have strong analytical 
skills to understand engineering and technical details that are used 
as the basis for developing independent cost estimates. Although DOE's 
current CFO comes from an analytical background that meshes well with 
OCA's mission, a more traditional CFO trained as an accountant may not 
be willing to support OCA's independent cost-estimating function as 
effectively. 

Moreover, the House Appropriations Committee has weighed in on the 
issue of locating OCA under the Office of the CFO. More specifically, 
the House Appropriations Committee Chairman's Explanatory Statement 
that accompanies the Omnibus Appropriations Act of 2009 directed DOE 
to move OCA's staff and function out of the CFO Office, to be managed 
by OECM.[Footnote 26] Subsequently, in its fiscal year 2010 budget 
request, DOE indicated it had decided against this, instead opting to 
maintain OCA where it was initially placed, in the Office of the CFO. 
In response, 
the Committee Report accompanying the recently enacted Energy and 
Water Development and Related Agencies Appropriations Act of 2010 
noted DOE's decision and reiterated the Committee's direction that OCA 
be moved from the Office of the CFO.[Footnote 27] Further, the report 
stated 
that the Committee generally opposes creating a new office to address 
an issue with existing mission functions. 

Finally, subsuming OCA under another office whose primary 
responsibilities do not include cost estimation and analysis also 
diverges from the approach Congress has taken in establishing an 
independent cost-estimating office at DOD. DOE is similar to DOD in 
that both agencies rely heavily on contractors to construct complex 
and unique projects, and both have struggled to develop accurate cost 
estimates for these projects. The Weapons Systems Act moved DOD's 
independent cost-estimating office out from under another office 
within DOD to increase its independence by having it report directly 
to the Secretary and Deputy Secretary. According to the legislative 
history, Congress mandated the change to ensure that this office would 
have the independence and authority to make objective determinations 
and be able to report these determinations directly to the Secretary 
and Deputy Secretary.[Footnote 28] In recent testimony, we also 
supported the idea of having an independent cost-estimating office at 
DOD, noting that establishing an independent office that reports 
directly to DOD's Secretary or Deputy Secretary would provide an 
increased level of accountability.[Footnote 29] 

Some Program Offices' Actions Offer Potential to Improve Quality of 
Estimates, but Lack of Coordination on Some Actions May Lead to 
Duplication of Effort and Rework: 

Recognizing that their projects' cost performance needed improvement, 
EM and NNSA recently initiated actions--some of which reflect best 
practices--to improve their cost estimating, largely independently of 
the actions being taken concurrently by OCA. At EM, these actions 
included placing cost estimators at its large sites and establishing 
an internal cost-estimating office capable of providing cost-
estimating assistance primarily to its smaller sites, but also to its 
large sites on an as-needed basis, including conducting ICEs and 
training on cost estimating. EM's cost-estimating office has already 
conducted a number of independent estimates and reviews, including 
many to support EM's American Recovery and Reinvestment Act work. It 
has also built a database--Environmental Cost Analysis System--
containing historical actual costs from two of EM's cleanup projects 
and has helped develop a standard way of organizing work scope within 
a project to ensure all cleanup projects collect actual costs in a way 
that is consistent with the new database. In addition, NNSA has 
recently implemented actions to improve its cost-estimating 
capability. In early 2009, NNSA adopted a new independent cost-
estimating policy. Among other things, the policy requires either NNSA 
or OCA to conduct an ICE prior to approval of milestone 2 for NNSA's 
major projects--those estimated to cost more than $750 million. 
[Footnote 30] By requiring ICEs for these projects at this milestone, 
NNSA's policy better reflects best practices than DOE's new draft cost-
estimating policy. The policy also designates the NNSA project 
management office to serve as the focal point for all cost-estimating 
policy and standards within NNSA. Finally, in contrast to the 
improvements that EM and NNSA are undertaking, Science does not have 
efforts under way specific to improving its cost estimating. According 
to the director of Science's project management office, Science has 
had a process in place for several years to conduct independent 
project reviews of its projects on an ongoing basis that, in addition 
to other oversight it performs, already sufficiently validates its 
cost estimates. We reported in 2008 that although Science has 
generally achieved its projects' original cost and schedule targets, 
sometimes it has done so by trimming selected components from some 
projects' scope.[Footnote 31] OECM and DOE's Inspector General have 
expressed concern that such changes in scope may not always preserve a 
project's technical goals. 

Although several of these efforts show potential to improve cost 
estimating, some of them have not been well coordinated with 
department-level efforts, which may lead to duplication of effort or 
inefficiencies. Specifically, we saw two examples of efforts that were 
not well-coordinated. First, we found that NNSA's new independent cost-
estimating policy may conflict with OCA's draft policy on cost 
estimating. Although NNSA's new policy requires either NNSA or OCA to 
conduct an ICE for NNSA's projects before approving milestone 2, 
according to a senior NNSA official, NNSA intended to avoid 
duplicating department-level independent review efforts at that 
milestone by having the ICE replace the cost review portion of OECM's 
external independent review. However, OECM officials recently told us 
they were not aware of NNSA's new policy, and they intend to continue 
conducting the cost review portion of the external independent review 
for NNSA's projects in addition to any ICE that NNSA or OCA may 
conduct, which may lead to a duplication of effort. Second, although 
EM's new database has the potential to provide useful historical cost 
data to cleanup projects developing cost estimates, according to a 
senior OCA official, it may be challenging to share data between EM's 
database and OCA's planned historical cost database, representing a 
missed opportunity for collaboration. According to this OCA official, 
OCA's database will use a different structure for organizing its data, 
one that is better suited to DOE's projects. As a result, project 
teams interested in using information from both databases may find it 
difficult to gather compatible information from the databases to use 
in their estimates. 

Conclusions: 

DOE's responsibilities include overseeing billions of dollars worth of 
environmental cleanup, scientific research, nuclear weapons 
management, and other mission work vital to the nation's safety, 
security, and energy supply. Given the task of managing nearly 200 
projects expected to cost hundreds of billions of dollars, along with 
DOE's history of struggling to complete its projects within their cost 
estimates, obtaining realistic estimates from the contractors carrying 
out the projects is increasingly critical to inform officials' project 
management decisions. 

However, DOE's lack of both a policy for estimating the costs of 
projects and current guidance containing best practices to help 
contractors implement the policy has left the department without the 
benefit of the internal controls specified in federal standards. More 
specifically, without a way to ensure that its contractors use best 
practices in generating cost estimates, and without adequate federal 
personnel to gauge the quality of the contractors' cost estimates, DOE 
has effectively ceded a significant portion of its control of this 
process to its contractors. Further, because DOE's draft cost- 
estimating policy does not require the department's new independent 
cost-estimating office--or any other entity within DOE--to conduct 
ICEs for any projects, including the major projects, at any of the 
milestones, DOE is not using the most rigorous method available for 
validating its cost estimates. As a result, some project estimates are 
likely to continue to lack credibility, and DOE will not have a sound 
basis for making decisions on how to most effectively manage its 
portfolio of projects. 

Despite its policy and guidance deficiencies, DOE has taken an 
important step in recognizing its need for an independent cost- 
estimating capability. However, by creating a new office, OCA, that 
has review functions that are similar to those of an office that 
already exists, DOE has increased the likelihood that duplicative 
independent review efforts will continue. Further, because OCA reports 
to the CFO, OCA lacks the independence necessary to dictate its own 
agenda and remain focused on its core mission and may not be fully 
able to provide top management with objective determinations. A recent 
change at DOD serves as a model in this case: at the direction of 
Congress, DOD provided its centralized, cost-estimating office with 
greater independence by having it report directly to the Secretary and 
Deputy Secretary. 

Finally, because the cost estimates of the four projects we reviewed 
were not high quality, the projects are more likely to exceed their 
estimates, similar to DOE's eight major construction projects that, as 
we recently reported, exceeded their initial estimates by a total of 
nearly $14 billion, and DOE's nine major cleanup projects that 
exceeded their initial estimates by $25 billion to $42 billion. 
Without better estimates of what construction and cleanup projects 
will cost, neither DOE nor Congress has reliable information for 
supporting funding decisions at all stages of the department's project 
life cycles. Moreover, given the shortcomings of the four cost 
estimates we reviewed, we are concerned that DOE's cost estimates for 
its remaining major construction projects and major environmental 
cleanup projects--which represent tens of billions of dollars 
combined--have the potential to be similarly problematic. If this is 
the case, DOE may have greatly miscalculated the amount of funding it 
will require to complete its portfolio of projects. 

Recommendations for Executive Action: 

To better ensure that DOE is able to develop high-quality project cost 
estimates, we are making the following six recommendations. 

First, we recommend that the Secretary of Energy issue the 
department's forthcoming cost-estimating policy and updated guidance 
as soon as possible, ensuring that: 

* the policy requires DOE and its contractors to generate cost 
estimates in accordance with best practices; 

* the policy requires that ICEs be conducted for major projects at 
milestones 1, 2, and 3; and: 

* the guidance fully reflects best practices. 

In addition, to minimize duplication of effort and promote the 
independence of the cost-estimating review process, we recommend that 
the Secretary of Energy: 

* create a centralized cost-estimating capability by combining the 
functions that OCA and OECM have in common; and: 

* consider the structure recently adopted by the Department of 
Defense, under which its independent cost-estimating office reports 
directly to the Secretary and Deputy Secretary. 

Finally, given the limitations of the cost estimates of the four 
projects we reviewed, we recommend that the Secretary of Energy direct 
OCA to conduct an ICE for each major project that has not received 
one, including three of the four projects we reviewed, all major 
projects that have not yet started construction or operations, and all 
future major projects. 

Agency Comments and Our Evaluation: 

In commenting on a draft of this report, the Deputy Secretary of 
Energy said that DOE concurs with our recommendations in general. 
However, DOE did not fully concur with two of our recommendations: to 
require ICEs for major projects at milestones 1, 2, and 3 and to 
conduct an ICE for each major project that has not received one. 

Specifically, regarding requiring ICEs for major projects at 
milestones 1, 2, and 3, DOE said that the department's new independent 
cost-estimating policy will require ICEs for these projects at 
milestones 1 and 2 but that the department does not intend to require 
ICEs at milestone 3 unless "warranted by risk and performance 
indicators or required by senior officials." DOE provided no reason 
for treating milestone 3 differently from milestones 1 and 2. We 
continue to believe that requiring ICEs at all three milestones is 
important. According to cost-estimating best practices, conducting an 
ICE at major milestones, such as milestone 3, is critical because it 
provides an independent check--and thus a more objective assessment--
of the project team's cost estimate. In particular, because DOE 
projects generate a new or updated cost estimate for milestone 3, 
which is the final approval milestone before construction or cleanup 
operations begin, it is essential to cross-check a project team's 
updated estimate with a new ICE to help ensure the estimate's 
credibility. 

In addition, DOE only partially agreed with our recommendation to 
conduct ICEs for all major projects that have not received one, 
including for three of the four projects we reviewed--National 
Synchrotron Light Source-II, the Uranium Processing Facility, and EM 
Cleanup at Y-12. We have two concerns about DOE's response to this 
recommendation. First, although DOE said it will conduct an ICE for 
the Uranium Processing Facility before it reaches milestone 2, the 
agency noted that since both National Synchrotron Light Source-II and 
EM Cleanup at Y-12 have passed milestone 3, it will conduct ICEs for 
these projects only if they encounter significant performance issues. 
We believe, however, that checking these two projects' most recent 
cost estimates by conducting ICEs is both warranted and important and 
would improve the credibility of their cost estimates. 

Second, although the scope of our recommendation is broader than the 
three projects we reviewed that have not received an ICE, DOE did not 
address an important component of our recommendation: that DOE should 
conduct ICEs for "each major project that has not received one." As a 
result, it is not clear whether DOE intends to immediately begin 
conducting ICEs for all major projects already under development that 
have not yet started construction or operations as well as all future 
major projects. We have amended our recommendation to clarify the need 
to conduct these additional ICEs--without waiting until the department 
has passed its new cost-estimating policy. Moreover, given DOE's 
statement that it does not intend to require ICEs at milestone 3, we 
reiterate that our recommendations include conducting ICEs for all 
major projects at all three milestones. 

In addition, although DOE agreed with our recommendation to issue its 
forthcoming cost-estimating policy and guidance in accordance with 
best practices, we have two concerns. First, DOE did not state when it 
plans on issuing the policy and guidance. We believe DOE should issue 
the policy and guidance as soon as possible since they are already 
more than a year overdue. Second, DOE did not indicate whether it 
intends to require that its contractors generate cost estimates in 
accordance with the department's new guidance and the best practices 
contained therein, as we recommended. We are concerned that, without 
this requirement, DOE's contractors may continue to develop project 
cost estimates according to their own individual policies, as they did 
at the four projects we reviewed--undermining a DOE-defined, standard 
approach to cost estimating. Although having an updated DOE cost-
estimating guide in effect is critical for establishing a standard 
method for generating and evaluating cost estimates, its impact will 
be limited unless the department ensures that its contractors generate 
project cost estimates based on the standard set forth within the 
guide. 

Finally, in responding to our recommendations that DOE create a 
centralized cost-estimating capability and consider the organizational 
structure adopted by the Department of Defense, DOE agreed. However, 
the department did not address whether it will do so by combining the 
functions that OCA and OECM have in common, as we recommend. It is 
unclear whether DOE plans instead to implement its draft memorandum of 
understanding between the two offices without combining their 
functions--a decision that would not meet cost-estimating best 
practices. 

In addition to its written comments, which are reprinted in appendix 
IV, DOE provided detailed technical comments, which we incorporated as 
appropriate. 

We are sending copies of this report to other interested congressional 
committees and to the Secretary of Energy. The report will also be 
available at no charge on the GAO Web site at [hyperlink, 
http://www.gao.gov]. 

If you or your staff have any questions regarding this report, please 
contact me at (202) 512-3841 or aloisee@gao.gov. Contact points for 
our Offices of Congressional Relations and Public Affairs may be found 
on the last page of this report. GAO staff who made major 
contributions to this report are listed in appendix V. 

Signed by: 

Gene Aloise: 
Director, Natural Resources and Environment: 

[End of section] 

Appendix I: Scope and Methodology: 

To determine the extent to which the Department of Energy's (DOE) cost-
estimating policies and guidance support the development of high- 
quality cost estimates, we analyzed policies and guidance in effect 
across the department containing specifics on cost estimating. We then 
compared them with the best practices identified in our cost 
guide[Footnote 32] and identified differences. We also interviewed 
several DOE project directors and asked them to identify the policies 
and guidance they provided to their contractors before the contractors 
created the projects' cost estimates in addition to the policies and 
guidance the project directors followed in overseeing the contractors' 
estimating work. 

To determine the extent to which selected cost estimates reflected the 
four key characteristics of high-quality cost estimates--credible, 
well-documented, accurate, and comprehensive--we chose three major 
construction projects--the Office of Science's (Science) National 
Synchrotron Light Source-II at Brookhaven National Laboratory in New 
York, the National Nuclear Security Administration's (NNSA) Uranium 
Processing Facility at Y-12 National Security Complex in Tennessee, 
and the Office of Environmental Management's (EM) Salt Waste 
Processing Facility at the Savannah River Site in South Carolina--and 
one environmental cleanup project, EM's decontamination and 
decommissioning project for the Y-12 National Security Complex in 
Tennessee (EM Cleanup at Y-12), to include in our review. We selected 
these projects because they require a significant commitment of 
resources by DOE, were at different stages in the milestone approval 
process, and were managed by different program offices. We then 
analyzed the supporting documents related to each project's most 
recently approved total project cost estimate. These documents 
included independent review reports, risk analysis outputs, project 
execution plans, summaries of project assumptions, and design and 
technical documents. We visited each site, and while there, 
interviewed DOE project managers and contractor officials about the 
process used to prepare the cost estimates. We shared our cost guide 
and the criteria against which we would be evaluating the projects' 
cost estimates with DOE officials. We then compared DOE's methods and 
approaches for preparing the estimates with the best practices 
contained in our cost guide. 

To assess the actions DOE has taken recently to improve its cost 
estimating, we analyzed documentation of proposed and recently 
implemented actions at the department level and at EM, NNSA, and 
Science and evaluated these actions against the best practices found 
in our cost-estimating guide. This documentation included DOE's 
proposed new cost-estimating policy, Order 415, as drafted by the 
Office of Cost Analysis (OCA), as well as draft guidance on cost 
estimating. At the program offices, we reviewed NNSA's Independent 
Cost Estimate Policy and EM's Cost Estimating Strategy. In addition, 
we interviewed department-level officials from the Office of the Chief 
Financial Officer, OCA, and Office of Management, including officials 
from the Office of Engineering and Construction Management and the 
Office of Procurement and Assistance Management, as well as various 
officials within EM, NNSA, and Science to obtain their perspective on 
efforts being taken to improve cost estimating. We also interviewed 
officials with the Energy Facility Contractors Group, an organization 
of DOE contractors, to obtain their perspective on DOE's proposed 
actions to enhance cost estimating. Finally, to further inform our 
assessment, we reviewed recently passed legislation--the Weapons 
Systems Acquisition Reform Act of 2009--that includes direction 
intended to improve cost estimating at the Department of Defense (DOD) 
by making changes to the structure and function of the Cost Analysis 
Improvement Group, DOD's cost estimating office, and interviewed that 
office's director. 

We conducted this performance audit from September 2008 to January 
2010, in accordance with generally accepted government auditing 
standards. Those standards require that we plan and perform the audit 
to obtain sufficient, appropriate evidence to provide a reasonable 
basis for our findings and conclusions based on our audit objectives. 
We believe that the evidence obtained provides a reasonable basis for 
our findings and conclusions based on our audit objectives. 

[End of section] 

Appendix II: Four Characteristics of a High-Quality Cost Estimate with 
Their Corresponding 12 Key Cost-Estimating Steps: 

Characteristic: Credible; 
Step: 
* Develop the point estimate and compare it to an independent cost 
estimate[A]; 
* Conduct sensitivity analysis; 
* Conduct risk and uncertainty analysis. 

Characteristic: Well-documented; 
Step: 
* Define the estimate's purpose, scope, and schedule; 
* Define the program; 
* Identify ground rules and assumptions; 
* Obtain the data; 
* Document the estimate; 
* Present the estimate to management. 

Characteristic: Accurate; 
Step: 
* Develop the point estimate and compare it to an independent cost 
estimate; 
* Update the estimate to reflect actual costs and changes. 

Characteristic: Comprehensive; 
Step: 
* Develop the estimating plan; 
* Determine the estimating approach. 

Source: GAO. 

[A] This step applies to two of the characteristics--credibility and 
accuracy. 

[End of table] 

[End of section] 

Appendix III: Assessments of Four Project Cost Estimates Reviewed: 

This appendix provides a project-by-project assessment of the four DOE 
project cost estimates we reviewed in detail. Each assessment provides: 

* a brief description of the project's mission; 

* project facts, including the cost estimate we reviewed, and the 
status of the project--the milestone, or "Critical Decision" (CD) 
point, most recently approved; and: 

* our analysis of the extent to which the project's cost-estimating 
processes and methodologies included the 12 key steps necessary for 
preparing high-quality cost estimates, and some key examples of the 
rationale behind our analysis. 

National Synchrotron Light Source-II: 

Office of Science:
Brookhaven National Laboratory, New York: 

National Synchrotron Light Source (NSLS)-II is a next generation 
electron synchrotron light source intended to replace the current, 27- 
year-old NSLS-I facility. It is designed to deliver ultra-high 
brightness radiation--10,000 times brighter than NSLS-I--and will help 
meet the nation's need for a high brightness, medium energy X-ray 
source. The Office of Science intends for the light source to serve a 
large and diverse scientific user community, including nanoscale 
imaging for energy, biological, medical, chemical, and environmental 
research. 

Figure 2: National Synchrotron Light Source-II: 

[Refer to PDF for image: illustration] 

Project facts: 

* Total Project Cost: $912 million.
* Status: Critical Decision 3 approved on January 9, 2009; 
construction is scheduled to be completed in fiscal year 2015. 

Project timeline: 

CD-1: 07/07; 
CD-2: 01/08; 
CD-3 construction: 01/09; 
GAO review: 11/09; 
CD-4 completion: 06/15. 

Source: DOE. 

[End of figure] 

Table 2: Cost-Estimating Criteria for NSLS-II: 

Four characteristics of high-quality cost estimates and 12 key steps: 
Well documented: Partially met. 

1. Define the estimate's purpose; Partially met; 
Key examples of rationale for assessment: Estimate was prepared at a 
very low level of detail, consistent with the level of design maturity 
of one part of the project but not of others, including the 
accelerator and experimental systems. Also, personnel contributing to 
the estimate lacked significant cost analysis experience, and there 
was little experienced supervision available to assist. 

3. Define the program: Mostly met; 
Key examples of rationale for assessment: Technical baseline is 
sufficiently addressed in documentation, but it is spread across many 
documents rather than a single document. Additionally, the 
quantitative information contained within the technical documents 
could not be traced to the cost estimate documentation provided for 
the CD-3 milestone. 

5. Identify ground rules and assumptions; Partially met; 
Key examples of rationale for assessment: Ground rules and assumptions 
were documented; however, they lacked thorough rationale, sourcing, 
and traceability to specific work breakdown structure (WBS) elements, 
cost estimate documentation, and risk analysis models. 

6. Obtain the data; Partially met; 
Key examples of rationale for assessment: Little of the data 
underpinning the estimate came from historical sources; only 12 
percent of the total project cost was based on such data. The 
remainder of the estimate was based on professional judgment, vendor 
quotes, and catalog prices. For portions of the estimate not based on 
historical data, the data used were not subjected to validation checks. 

10. Document the estimate; Somewhat met; 
Key examples of rationale for assessment: The cost estimate 
documentation outlines the buildup of cost in a logical format and is 
stored in an electronic, collaborative format that is accessible by 
authorized personnel. However, it insufficiently describes the 
underlying data and the bases for the calculations used; lacks a 
narrative describing the cost estimate process, data sources, and 
methods; 
and does not document linkages to technical baseline documentation. 

11. Present the estimate to management; Partially met; 
Key examples of rationale for assessment: The project team briefed 
management, but only at a high level that fell short of the content 
usually presented to management. In addition, the briefing did not 
provide all 
assumptions and methodologies and did not address cost drivers. 

Four characteristics of high-quality cost estimates and 12 key steps: 
Comprehensive: Partially met. 

2. Develop the estimating plan; Partially met; 
Key examples of rationale for assessment: The estimating plan lacked 
the detail to address specific cost-estimating tasks. While most of 
the team members responsible for generating the cost estimate inputs 
had significant technical experience in their assigned project areas, 
they 
lacked significant experience in cost estimating and were not from a 
centralized cost-estimating organization. 

4. Determine the estimating approach; Mostly met; 
Key examples of rationale for assessment: The estimate is generally 
structured well, employing a product-oriented WBS. However, the WBS 
dictionary lacks the detail necessary to describe the resources and 
functional activities required to produce each element. 

Four characteristics of high-quality cost estimates and 12 key steps: 
Accurate; Somewhat met. 

7. Develop the point estimate and compare to an independent cost 
estimate[A]; Somewhat met; 
Key examples of rationale for assessment: The point estimate relied 
largely on expert opinion as its basis. The estimate did not use 
statistical techniques or cost-estimating relationships. While the 
point estimate was aggregated logically according to the WBS, we found 
no references relating the results to cross-checks or accuracy checks. 

12. Update the estimate to reflect actual costs and changes; Somewhat 
met; 
Key examples of rationale for assessment: The project team did not 
update the point estimate to reflect actual costs and changes, and did 
not document lessons learned. 

Four characteristics of high-quality cost estimates and 12 key steps: 
Credible; Somewhat met. 

7. Develop the point estimate and compare to an independent cost 
estimate; Somewhat met; 
Key examples of rationale for assessment: Although the estimate 
received independent project reviews and two independent cost 
estimates (ICEs) for the conventional facilities, the ICEs covered only 
a portion of the overall project estimate and were not developed 
outside the project approval chain. DOE did not perform a project-
level ICE. 

8. Conduct sensitivity analysis; Somewhat met; 
Key examples of rationale for assessment: Although a sensitivity 
analysis was conducted, it did not assess effects of changing discrete 
performance, physical, or programmatic parameters, preventing the 
analysis of design changes. The sensitivity analysis did not follow 
the typical steps for preparing a credible sensitivity analysis, and 
as a result does not provide a range of possible costs or the means 
for performing what-if analyses. 

9. Conduct risk and uncertainty analysis; Partially met; 
Key examples of rationale for assessment: Although a risk analysis was 
performed, there was no quantifiable linkage between key cost driver 
assumptions and factors and the probability of occurrence and impact 
values. The risk analysis input parameters are subjectively based, 
lacking any historical basis or statistical derivation. Additionally, 
the risk analysis fails to consider correlation between cost elements 
(the effect of this omission is sketched after this table) and 
improperly forecasts risk at the individual WBS element level as well 
as year by year. 

Sources: DOE (data), GAO (analysis). 

Note: The ratings we used in this analysis are as follows: "Fully" 
means that the program provided documentation that satisfied the 
criterion; "Mostly" means that the program provided the majority of 
the documentation to satisfy the criterion; "Partially" means that the 
program provided documentation satisfying part of the criterion; 
"Somewhat" means that the program provided documentation satisfying a 
minor part of the criterion; and "Not" means that the program did not 
provide documentation that satisfied the criterion. 

[A] This step applies to both accuracy and credibility. 

[End of table] 
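
The risk-analysis criterion above notes that the analysis failed to 
consider correlation between cost elements. The following Monte Carlo 
sketch--ours, using hypothetical element costs and uncertainties, not 
DOE's model--illustrates why that omission matters: when element costs 
move together, the spread of possible total costs is much wider than 
an independence assumption suggests. 

```python
# Illustrative Monte Carlo sketch (not DOE's model) of why ignoring
# correlation between cost elements understates total-cost uncertainty.
# Element costs and the 15 percent uncertainty are hypothetical.
import random

N = 100_000
means = [300.0, 400.0, 200.0]  # hypothetical WBS element costs, $ millions
sigma = 0.15                   # hypothetical uncertainty per element

def simulate(correlated):
    totals = []
    for _ in range(N):
        if correlated:
            shock = random.gauss(0, 1)  # one shared shock drives all elements
            draws = [m * (1 + sigma * shock) for m in means]
        else:
            draws = [m * (1 + sigma * random.gauss(0, 1)) for m in means]
        totals.append(sum(draws))
    return totals

for label, flag in (("independent", False), ("fully correlated", True)):
    totals = simulate(flag)
    mean = sum(totals) / N
    sd = (sum((t - mean) ** 2 for t in totals) / N) ** 0.5
    print(f"{label}: mean ~{mean:.0f}, standard deviation ~{sd:.0f}")
# Independent shocks partly cancel (std dev near 81 here); fully
# correlated shocks widen the spread (near 135), fattening the
# high-cost tail that a risk analysis is meant to expose.
```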

Uranium Processing Facility: 

National Nuclear Security Administration:
Y-12 National Security Complex, Tennessee: 

NNSA is planning to build the Uranium Processing Facility in order to 
ensure the long-term viability, safety, and security of its enriched 
uranium. The project is expected to consolidate all of NNSA's enriched 
uranium operations at Y-12 into a single, modern facility with new 
technologies and safeguards, replacing current operations that are 
located in deteriorating facilities that do not meet modern safety and 
security standards. The effort is expected to support the nation's 
nuclear weapons stockpile and nonproliferation activities, and provide 
fuel for navy reactors. 

Figure 3: Uranium Processing Facility: 

[Refer to PDF for image: illustration] 

Project facts: 

* Preliminary Cost Range: $1.4 billion to $3.5 billion. 

* Status: Critical Decision 1 approved on July 25, 2007; the first of 
multiple Critical Decision 2 milestones is projected for September 
2010. 

Project timeline: 

CD-1: 7/07; 
GAO review: 11/09; 
CD-2/3 Site prep and long lead procurement: 9/10; 
CD-3: 10/13; 
CD-4 completion: 9/18. 

Source: DOE. 

[End of figure] 

Table 3: Cost-Estimating Criteria for Uranium Processing Facility: 

Four characteristics of high-quality cost estimates and 12 key steps: 
Well documented: Partially met. 

1. Define the estimate's purpose; Mostly met; 
Key examples of rationale for assessment: Project clearly defined the 
estimate's purpose, but the level of detail at which the estimate was 
conducted was not consistent with the available design of the project. 

3. Define the program; Mostly met; 
Key examples of rationale for assessment: The typical elements found 
in a technical baseline were addressed by multiple documents; 
however, it is not clear which key portions of the technical baseline 
were reflected in the estimate. 

5. Identify ground rules and assumptions; Partially met; 
Key examples of rationale for assessment: Although the project 
published a document listing key assumptions, no rationales or backup 
data were provided, nor was there a clear trace of specific 
assumptions to the underlying estimate. 

6. Obtain the data; Somewhat met; 
Key examples of rationale for assessment: The foundation of the 
estimate was not based on data from primary sources, and the estimate 
of the total number of design documents and design labor hours was 
based 
on professional judgment. However, estimators had access to a 
commercial construction cost database and historical costs from a 
contractor who had done work at the site in the past. 

10. Document the estimate; Partially met; 
Key examples of rationale for assessment: There is no step-by-step 
centralized document that ties together the underlying input data used 
to construct the estimate with the estimating methodology used. It 
would be very difficult for an analyst unfamiliar with 
the project to replicate the estimate. 

11. Present the estimate to management; Partially met; 
Key examples of rationale for assessment: The formal presentation 
provided to DOE management contained a summary of the cost estimate 
that was not detailed--for example, it did not address cost drivers or 
contain information on the confidence levels associated with the 
estimate. 

Four characteristics of high-quality cost estimates and 12 key steps: 
Comprehensive: Partially met. 

2. Develop the estimating plan; Partially met; 
Key examples of rationale for assessment: Contractor's cost estimators 
were from a centralized cost-estimating group; however, cost estimate 
documentation stated that the team of 30 to 40 people who worked on 
the estimate included two certified cost consultants, a certified cost 
engineer, and 8 to 10 estimators. It was not clear which team members 
were experienced and trained cost analysts. 

4. Determine the estimating approach; Mostly met; 
Key examples of rationale for assessment: The WBS appeared to contain 
all work needed to complete the project, with the exception of 
technology development costs, but it was not based on a standard WBS. 

Four characteristics of high-quality cost estimates and 12 key steps: 
Accurate; Somewhat met. 

7. Develop the point estimate and compare to an independent cost 
estimate[A]; Somewhat met; 
Key examples of rationale for assessment: Contractor did not employ a 
variety of cost-estimating methods, but rather relied solely on a 
detailed, bottom-up method. Contractor's estimating system was very 
detailed and did not lend itself to a more top-down, statistical 
approach. 

12. Update the estimate to reflect actual costs and changes; Somewhat 
met; 
Key examples of rationale for assessment: Since approval of CD-1, the 
contractor has been collecting performance data on the design portion 
of the 
project's life cycle, but no one is tracking the effect of this 
performance on total project cost. Estimate will not be updated until 
CD-2. 

Four characteristics of high-quality cost estimates and 12 key steps: 
Credible; Somewhat met. 

7. Develop the point estimate and compare to an independent cost 
estimate; Somewhat met; 
Key examples of rationale for assessment: Project did not receive an 
ICE; NNSA conducted a technical independent review, which included 
examining costs. The project team cross-checked the estimate with data 
from other projects, although those data reflected estimated costs, 
not actual costs. 

8. Conduct sensitivity analysis; Not met; 
Key examples of rationale for assessment: Project team did not conduct 
a sensitivity analysis (a minimal example of such an analysis is 
sketched after this table). 

9. Conduct risk and uncertainty analysis; Partially met; 
Key examples of rationale for assessment: Project team conducted a 
risk analysis that examined technical and programmatic, cost, and 
schedule risk to the project; however, this risk analysis did not 
account for correlation between cost elements. Separate from the risk 
analysis, NNSA headquarters added an additional $1.1 billion unfunded 
"programmatic allowance" to the upper end of the cost range to account 
for additional risks not incorporated in the analysis. 

Sources: DOE (data), GAO (analysis). 

Note: The ratings we used in this analysis are as follows: "Fully" 
means that the program provided documentation that satisfied the 
criterion; "Mostly" means that the program provided the majority of 
the documentation to satisfy the criterion; "Partially" means that the 
program provided documentation satisfying part of the criterion; 
"Somewhat" means that the program provided documentation satisfying a 
minor part of the criterion; and "Not" means that the program did not 
provide documentation that satisfied the criterion. 

[A] This step applies to both accuracy and credibility. 

[End of table] 
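
Criterion 8 above notes that the project team did not conduct a 
sensitivity analysis. As a minimal illustration--with entirely 
hypothetical cost drivers and ranges, not figures from the Uranium 
Processing Facility estimate--a one-factor-at-a-time sensitivity 
analysis in Python might look like this: 

```python
# Illustrative one-factor-at-a-time sensitivity sketch (hypothetical
# drivers and ranges): vary each driver while holding the others at
# baseline, and record the swing in total cost.
baseline = {"labor_hours": 1_000_000, "labor_rate": 95.0, "material": 250e6}

def total_cost(p):
    return p["labor_hours"] * p["labor_rate"] + p["material"]

for driver, low, high in [("labor_hours", 0.90, 1.25),
                          ("labor_rate", 0.95, 1.15),
                          ("material", 0.85, 1.30)]:
    costs = []
    for factor in (low, high):
        params = dict(baseline)
        params[driver] = baseline[driver] * factor
        costs.append(total_cost(params))
    print(f"{driver}: ${min(costs) / 1e6:,.0f}M to ${max(costs) / 1e6:,.0f}M")
# The drivers with the widest swings are the estimate's cost drivers
# and the natural focus for what-if analysis.
```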

Salt Waste Processing Facility: 

Office of Environmental Management:
Savannah River Site, South Carolina: 

This project will construct a facility to treat large quantities of 
waste from reprocessing and other liquids generated by nuclear 
materials production operations at the Savannah River Site, converting 
the waste into a stable form for eventual disposal in a geological 
repository. Approximately 37 million gallons of this waste are being 
stored on an interim basis in 49 underground storage tanks at the 
site--of this, about 34 million gallons are salt waste slated for 
treatment at the new facility. 

Figure 4: Salt Waste Processing Facility: 

[Refer to PDF for image: illustration] 

Project facts: 

* Total Project Cost: $1.34 billion. 

* Status: Critical Decision 3 approved on January 2, 2009; 
construction is expected to be completed in October 2015. 

Project timeline: 

CD-0: 06/01; 
CD-1: 10/04; 
CD-2/3A: 09/07; 
CD-3B: 01/09; 
CD-4 completion: 10/15. 

Source: DOE. 

[End of figure] 

Table 4: Cost-Estimating Criteria for Salt Waste Processing Facility: 

Four characteristics of high-quality cost estimates and 12 key steps: 
Well documented: Partially met. 

1. Define the estimate's purpose; Mostly met; 
Key examples of rationale for assessment: Although the estimate and 
scope were clearly defined and based on a bottom-up review, it is not 
clear that there were sufficient time and resources to develop and 
review 
the estimate. 

3. Define the program; Mostly met; 
Key examples of rationale for assessment: Although the contractor 
supplied several documents that appear to sufficiently address the 
technical baseline, the contractor did not fully demonstrate that the 
technical baseline was developed by qualified personnel. Furthermore, 
it was not possible to evaluate the technical baseline because 
underlying data used to develop the cost estimate were not provided. 

5. Identify ground rules and assumptions; Mostly met; 
Key examples of rationale for assessment: Although the estimate 
defines and documents most of the ground rules and assumptions it 
makes, it does not provide the historical data for some of its key 
assumptions to back up its claims. In addition, although the schedule 
is 
assessed for impacts, the estimate does not model schedule activity 
uncertainties. 

6. Obtain the data; Somewhat met; 
Key examples of rationale for assessment: Although the contractor 
claims that historical data were used to develop the estimate, 
historical cost data and data from other technical sources were not 
provided. As a result, we could not analyze the data to determine 
whether they were reasonable. 

10. Document the estimate; Somewhat met; 
Key examples of rationale for assessment: The estimate insufficiently 
describes the underlying data and bases for the calculations used. As 
a result, we could not analyze the data to determine whether they 
contain 
elements of a high-quality estimate. 

11. Present the estimate to management; Partially met; 
Key examples of rationale for assessment: The project team briefed 
management, and obtained DOE approval to continue the project; 
however, the briefing was at a high level and did not provide detailed 
information about assumptions or methodologies. 

Four characteristics of high-quality cost estimates and 12 key steps: 
Comprehensive; Partially met. 

2. Develop the estimating plan; Partially met; 
Key examples of rationale for assessment: The estimating approach and 
schedule are documented; however, it is not clear that team members 
were from a centralized cost-estimating group, had the proper 
experience, or had access to subject-area experts knowledgeable about 
nuclear construction projects of the scale and complexity of this 
facility. 

4. Determine the estimating approach; Mostly met; 
Key examples of rationale for assessment: Although the estimate has a 
WBS, it is only partially product oriented with product-oriented 
structures embedded in a functionally oriented structure. 

Four characteristics of high-quality cost estimates and 12 key steps: 
Accurate; Partially met. 

7. Develop the point estimate and compare to an independent cost 
estimate[A]; Partially met; 
Key examples of rationale for assessment: It is not clear what 
estimation methodologies were used because there were few details 
about methodology and no supporting data. 

12. Update the estimate to reflect actual costs and changes; Mostly 
met; 
Key examples of rationale for assessment: The project team had a 
process for updating the estimate, and updated the point estimate to 
reflect actual costs and changes, but did not document lessons learned. 

Four characteristics of high-quality cost estimates and 12 key steps: 
Credible; Somewhat met. 

7. Develop the point estimate and compare to an independent cost 
estimate; Partially met; 
Key examples of rationale for assessment: OCA performed an independent 
cost estimate at the request of the Deputy Secretary of Energy, 
resulting in an estimate of $2.7 billion--more than twice as high as 
the project team's estimate. However, no formal reconciliation process 
occurred between OCA and the project team to determine where and why 
there were differences in their estimates. 

8. Conduct sensitivity analysis; Not met; 
Key examples of rationale for assessment: The contractor did not 
conduct a sensitivity analysis. 

9. Conduct risk and uncertainty analysis; Partially met; 
Key examples of rationale for assessment: Although a risk and 
uncertainty analysis was performed and cost drivers were identified, 
the correlation between cost elements was not accounted for, and the 
probability associated with the point estimate was not identified. In 
addition, an S-curve of alternative cost estimate probabilities was 
not provided (a sketch of such an S-curve follows this table). 

Sources: DOE (data), GAO (analysis). 

Note: The ratings we used in this analysis are as follows: "Fully" 
means that the program provided documentation that satisfied the 
criterion; "Mostly" means that the program provided the majority of 
the documentation to satisfy the criterion; "Partially" means that the 
program provided documentation satisfying part of the criterion; 
"Somewhat" means that the program provided documentation satisfying a 
minor part of the criterion; and "Not" means that the program did not 
provide documentation that satisfied the criterion. 

[A] This step applies to both accuracy and credibility. 

[End of table] 
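
The risk-analysis criterion above notes that no S-curve of alternative 
cost estimate probabilities was provided. The sketch below--ours, 
using a hypothetical cost distribution rather than Salt Waste 
Processing Facility data--shows how an S-curve is read: the cumulative 
share of simulated outcomes at or below a given amount is the 
confidence level that amount carries. 

```python
# Illustrative S-curve sketch (hypothetical distribution, not DOE data):
# the cumulative probability that total cost comes in at or below a
# given amount, built from Monte Carlo draws around a point estimate.
import bisect
import random

draws = sorted(random.gauss(1340, 150) for _ in range(50_000))  # $ millions

def prob_at_or_below(cost):
    # One point on the S-curve: share of outcomes at or below `cost`.
    return bisect.bisect_right(draws, cost) / len(draws)

for amount in (1100, 1340, 1600):
    print(f"P(total cost <= ${amount}M) = {prob_at_or_below(amount):.0%}")
# A point estimate sitting far in the left tail of the S-curve (near 0
# percent) is unlikely to be achievable--the situation this report
# describes for the low end of the Uranium Processing Facility range.
```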

Nuclear Facility Decontamination and Decommissioning at Y-12 (EM 
Cleanup at Y-12): 

Office of Environmental Management:
Y-12 National Security Complex, Tennessee: 

This project involves the cleanup of the Y-12 National Security 
Complex, a significant source of environmental contamination. 
Specifically, it includes construction and operation of on-site 
landfills and the Environmental Management Waste Facility disposal 
facility; decontamination and decommissioning of contaminated 
facilities, including the Alpha 4 Facility; soil, sediment, scrap, and 
burial ground remediation; and environmental monitoring of soils and 
water sources to assess the effectiveness of cleanup actions. Much of 
the project's scope will be transferred to EM's Integrated Facility 
Disposition Project, a large project encompassing decontamination and 
decommissioning and remediation of soil and groundwater at Y-12 and 
the neighboring Oak Ridge National Laboratory. 

Figure 5: EM Cleanup at Y-12: 

[Refer to PDF for image: illustration] 

Project facts: 

* Near-term Baseline (NTB): $338 million. 

* Total life cycle cost: $1.1 billion to $1.2 billion. 

* Status: Critical Decision 2/3 approved on February 13, 2008; NTB 
period is from fiscal year 2008-2012. 

Project timeline: 

CD-2/3 for NTB: 02/08; 
GAO review: 11/09; 
End of NTB: 09/12; 
CD-4 completion: 9/22. 

Source: DOE. 

[End of figure] 

Table 5: Cost-Estimating Criteria for EM Cleanup at Y-12: 

Four characteristics of high-quality cost estimates and 12 key steps: 
Well documented: Partially met. 

1. Define the estimate's purpose; Fully met; 
Key examples of rationale for assessment: Fully meets all assessment 
criteria. 

3. Define the program; Mostly met; 
Key examples of rationale for assessment: The technical baseline 
exists and is updated as changes become known, and it was mostly 
developed by qualified personnel. However, the baseline provided was 
only at a summary level. 

5. Identify ground rules and assumptions; Partially met; 
Key examples of rationale for assessment: Although the estimate 
defines and documents the ground rules and assumptions, there is no 
evidence that they have been approved by upper management and there is 
no rationale provided to support some of the assumptions. 

6. Obtain the data; Partially met; 
Key examples of rationale for assessment: The data used to prepare the 
cost estimates were based on conceptual design as well as actual data, 
historical data, quotes from vendors, expert opinion, and experience. 
However, the program was not able to provide documentation regarding 
the source of the data used. 

10. Document the estimate; Partially met; 
Key examples of rationale for assessment: The program has formal 
documentation books for each cost element. These books contain 
technical and programmatic information. While the books contain the 
methodology used to create the estimate, only some examples of the 
data sources used were provided. 

11. Present the estimate to management; Partially met; 
Key examples of rationale for assessment: The program held a formal 
briefing in January 2008. The briefing contains summary information 
about the project but does not contain detailed information about the 
cost estimate methodology. 

Four characteristics of high-quality cost estimates and 12 key steps: 
Comprehensive; Mostly met. 

2. Develop the estimating plan; Partially met; 
Key examples of rationale for assessment: The cost estimators who 
developed the cost estimates are qualified estimators. However, there 
was no schedule developed for creating the estimate. 

4. Determine the estimating approach; Mostly met; 
Key examples of rationale for assessment: The program has a WBS that 
is product-oriented and reflects all work that needs to be 
accomplished. However, the agency does not have a standardized WBS. 

Four characteristics of high-quality cost estimates and 12 key steps: 
Accurate; Partially met. 

7. Develop the point estimate and compare to an independent cost 
estimate[A]; Partially met; 
Key examples of rationale for assessment: A variety of cost-estimating 
methodologies were used to develop the cost estimate. However, limited 
statistical testing was done on the underlying data. 

12. Update the estimate to reflect actual costs and changes; Mostly 
met; 
Key examples of rationale for assessment: Cost estimates are updated 
for major events such as when a new contractor is selected, at the end 
of the near-term baseline, or if a major funding change is approved. 
The program does not have a formal process for capturing lessons 
learned. 

Four characteristics of high-quality cost estimates and 12 key steps: 
Credible; Somewhat met. 

7. Develop the point estimate and compare to an independent cost 
estimate; Somewhat met; 
Key examples of rationale for assessment: In November 2007, an 
independent assessment was performed. However, the independent 
assessment reviewed only a selected portion of the project and is of 
the kind typically used to validate the technical approach. 

8. Conduct sensitivity analysis; Somewhat met; 
Key examples of rationale for assessment: A sensitivity analysis was 
not performed. However, the project does consider variations in cost 
elements. (A minimal illustrative sketch of a sensitivity analysis 
follows this table.) 

9. Conduct risk and uncertainty analysis; Partially met; 
Key examples of rationale for assessment: Although a risk analysis was 
performed, the analysis did not address correlation. Also, the 
documentation and rationale behind the risk analysis were not provided. 

Sources: DOE (data), GAO (analysis). 

Note: The ratings we used in this analysis are as follows: "Fully" 
means that the program provided documentation that satisfied the 
criterion; "Mostly" means that the program provided the majority of 
the documentation to satisfy the criterion; "Partially" means that the 
program provided documentation satisfying part of the criterion; 
"Somewhat" means that the program provided documentation satisfying a 
minor part of the criterion; and "Not" means that the program did not 
provide documentation that satisfied the criterion. 

[A] This step applies to both accuracy and credibility. 

[End of table] 
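
The sensitivity criterion above (step 8) can be illustrated with a 
simple one-at-a-time variation of cost drivers. The following Python 
sketch uses hypothetical drivers and dollar values, not this project's 
data. 

  # Vary each cost driver by +/-20 percent, holding the others fixed,
  # and record the resulting swing in total cost. All figures are
  # hypothetical.
  base = {"labor": 150.0, "materials": 90.0, "waste_disposal": 60.0}  # $M

  total = sum(base.values())
  for driver, cost in base.items():
      low = total - 0.20 * cost              # driver at -20 percent
      high = total + 0.20 * cost             # driver at +20 percent
      print(f"{driver:>15}: total ranges {low:.1f} to {high:.1f} ($M)")

Ranking the drivers by the size of their swings shows which inputs the 
estimate is most sensitive to. 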

[End of section] 

Appendix IV: Comments from the Department of Energy: 

The Deputy Secretary of Energy: 
Washington, DC 20585: 

December 17, 2009: 

Mr. Gene Aloise: 
Director, Natural Resources and Environment: 
Government Accountability Office: 
Washington, DC 20548: 

Dear Mr. Aloise: 

Thank you for the opportunity to comment on your draft report on the 
Department of Energy's (DOE) cost estimating function, Actions Needed 
to Develop High-Quality Cost Estimates for Construction and 
Environmental Cleanup Projects. While the Department's high-risk, 
unique projects make the development of accurate cost estimates 
particularly challenging, DOE is committed to improving its cost 
estimating capability. 

The Department has begun several initiatives to improve cost 
estimating practices as a result of its Corrective Action Plan for 
contract and project management issues. These include development of a 
DOE-wide cost database, additional training courses, and development 
of policies and guidance relating to independent cost estimates. 

DOE program offices and the National Nuclear Security Administration 
(NNSA) have made substantial progress toward improving their cost 
estimates. NNSA established a policy to require independent cost 
estimates when appropriate. The Office of Environmental Management 
established the Office of Cost Estimating and Analysis within its 
Consolidated Business Center and is developing a historical cost 
database to facilitate and improve cost estimates. 

While the Department concurs with GAO's recommendations in general, 
specific responses to the report recommendations are enclosed. Also, 
we are submitting technical and factual comments for your 
consideration in preparing the final report. 

Sincerely yours, 

Signed by: 

Daniel B. Poneman: 

Enclosures: Response to recommendations: 
Departmental and NNSA technical and factual comments: 

Enclosure: 

U.S. Department of Energy: 
GAO-10-199 – "Department Of Energy: Actions Needed to Develop High-
Quality Cost Estimates for Construction and Environmental Cleanup 
Projects" 

Response to GAO Recommendations for Executive Action: 

Recommendation 1: The Secretary of Energy should issue the 
department's forthcoming cost estimating policy and guidance as soon 
as possible to ensure that DOE and its contractors generate cost 
estimates in accordance with best practices. 

DOE Response: The Department concurs with GAO's recommendation. DOE 
will issue its updated policy, requirements, and guidance in the 
revision to DOE Order 413.3A, Program and Project Management for the 
Acquisition of Capital Assets; Order 415.X, Cost Estimating for DOE 
Programs and Projects; and companion Guides. The policy and guidance 
documents will reflect best practices noted by GAO in its Cost 
Estimating and Assessment Guide. Additionally, the policy and guidance 
documents will clarify roles and responsibilities for cost estimating 
and assessment within the Department. 

Recommendation 2: The Department's forthcoming cost-estimating policy 
and guidance should require that independent cost estimates be 
conducted for major projects at milestones 1, 2, and 3. 
(Note: Milestones 1, 2, and 3, as defined by GAO, refer to Alternative 
Selection and Cost Range, Performance Baseline, and Start of 
Construction, respectively.) 

DOE Response: The Department partially concurs with GAO's 
recommendation. The Department's pending cost estimating order (415.X) 
will require independent cost estimates for major projects prior to 
approval of Alternative Selection and Performance Baseline (milestones 
1 and 2). These independent cost estimates will be consistent with the 
project phase. For milestone 1, the Department will identify a cost 
range using parametric cost methods (or extrapolation from actual 
costs for similar projects when available.) For milestone 3—start of 
construction—DOE will conduct an independent cost estimate if 
warranted by risk and performance indicators or required by senior 
officials. 

Recommendation 3: To minimize duplication of effort and promote the 
independence of the cost-estimating review process, GAO recommends 
that the Secretary of Energy create a centralized cost-estimating 
capability by combining the functions that the Office of Cost Analysis 
(OCA) and the Office of Engineering and Construction Management (OECM) 
have in common. In centralizing the cost estimating functions in one 
office, GAO recommends that the Secretary consider the organizational 
structure recently adopted by the Department of Defense (DOD), where 
the cost-estimating office reports directly to the Secretary and 
Deputy Secretary. 

DOE Response: The Department concurs with GAO's recommendation. DOE 
will centralize its cost estimating functions and will consider the 
organizational structure adopted by the Department of Defense. 
Specific organizational roles and responsibilities will be defined in 
the Department's pending cost estimating order. 

Recommendation 4: The Secretary of Energy should direct the Office of 
Cost Analysis to conduct an independent cost estimate for each major 
project that has not received one, including three of the four 
projects that GAO reviewed. (Note: The three projects refer to the 
National Synchrotron Light Source II (NSLS II), Uranium Processing 
Facility (UPF), and EM Cleanup at Y-12). 

DOE Response: The Department partially concurs with GAO's 
recommendation. The Department will conduct independent cost estimates 
for the projects reviewed by GAO as appropriate for the next project 
milestone. This is consistent with GAO's prior recommendation to 
conduct independent cost estimates at project milestones 1, 2, and 3. 
Regarding the specific projects identified by GAO: 

* UPF: The Department will perform an independent cost estimate for 
this project prior to establishing the Performance Baseline 
(anticipated in mid- to late 2010); 

* NSLS-II: This project has started construction; the Department will 
continue to monitor cost, schedule, and performance for this project 
on a quarterly basis, and will perform an independent cost estimate 
only if there are significant performance issues; 

* Y-12: This project has started remediation. As with NSLS-II, the 
Department will evaluate performance of this project and will only 
perform an independent cost estimate if there are significant 
performance issues. 

Additionally, for the Salt Waste Processing Facility project reviewed 
by GAO, the Department will continue to monitor cost, schedule, and 
performance for this project on a quarterly basis, and will perform an 
independent cost estimate only if a performance baseline change is 
required. 

[End of section] 

Appendix V: GAO Contact and Staff Acknowledgments: 

GAO Contact: 

Gene Aloise, (202) 512-3841 or aloisee@gao.gov: 

Staff Acknowledgments: 

In addition to the individual named above, Daniel Feehan, Assistant 
Director; Brian Bothwell; Rudy Chatlos; Nancy Crothers; Tisha 
Derricotte; Jennifer Echard; Mike Gallo; Kristen Massey; Brian Octeau; 
Cheryl Peterson; Leslie Kaas Pollock; and Jacqueline Wade made key 
contributions to this report. 

[End of section] 

Footnotes: 

[1] NNSA was created by the National Defense Authorization Act for 
Fiscal Year 2000, Pub. L. No. 106-65 (1999), with responsibility for 
the nation's nuclear weapons, nonproliferation, and naval reactors 
programs. 

[2] Major construction projects are those with a total cost of more 
than $750 million; major cleanup projects are those whose costs exceed 
$1 billion in the near term--usually a 5-year window of the project's 
total estimated life cycle. See GAO, Department of Energy: Major 
Construction Projects Need a Consistent Approach for Assessing 
Technology Readiness to Help Avoid Cost Increases and Delays, 
[hyperlink, http://www.gao.gov/products/GAO-07-336] (Washington, D.C.: 
Mar. 27, 2007). 

[3] GAO, Nuclear Waste: Action Needed to Improve Accountability and 
Management of DOE's Major Cleanup Projects, [hyperlink, 
http://www.gao.gov/products/GAO-08-1081] (Washington, D.C.: Sept. 26, 
2008). 

[4] Because of progress DOE has made in this area since 2007, in 2009 
we narrowed the scope of this high-risk area to focus on NNSA and EM, 
although projects across DOE continue to receive scrutiny. 

[5] GAO, GAO Cost Estimating and Assessment Guide: Best Practices for 
Developing and Managing Capital Program Costs, [hyperlink, 
http://www.gao.gov/products/GAO-09-3SP] (Washington, D.C.: March 2009). 

[6] In the context of our cost guide, a cost estimate is the summation 
of individual cost elements, using established methods and valid data, 
to estimate the future costs of a project, based on what is known 
today. 

[7] DOE Order 413.3A was approved in 2006 and changed in 2008. This 
order cancels DOE Order 413.3, which was issued in 2000. For this 
report, we use Order 413 to refer to the order in effect, unless 
otherwise specified. 

[8] GAO, Further Improvements Needed in the Department of Energy for 
Estimating and Reporting Project Costs, [hyperlink, 
http://www.gao.gov/products/GAO/MASAD-82-37] (Washington, D.C.: May 
26, 1982). 

[9] [hyperlink, http://www.gao.gov/products/GAO-09-3SP]. 

[10] A point estimate is the best guess or most likely value for the 
cost estimate, given the underlying data. The level of confidence for 
the point estimate is the probability that the point estimate will 
actually be met. For example, if the confidence level for a point 
estimate is 80 percent, there is an 80 percent chance that the final 
cost will be at or below the point estimate and a 20 percent chance 
that costs will exceed the point estimate. 
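
As an illustration of this footnote's arithmetic only, with made-up 
simulated outcomes rather than any project's data: 

  # The confidence level is the share of possible final costs that fall
  # at or below the point estimate; the outcomes below are hypothetical.
  outcomes = [95, 100, 102, 108, 110, 115, 121, 130, 140, 155]  # $ millions
  point_estimate = 130.0
  confidence = sum(c <= point_estimate for c in outcomes) / len(outcomes)
  print(f"confidence level = {confidence:.0%}")  # 80 percent in this example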

[11] GAO, Information Technology: FBI Following a Number of Key 
Acquisition Practices on New Case Management System, but Improvements 
Still Needed, [hyperlink, http://www.gao.gov/products/GAO-07-912] 
(Washington, D.C.: July 30, 2007); Telecommunications: GSA Has 
Accumulated Adequate Funding for Transition to New Contracts but Needs 
Cost Estimation Policy, [hyperlink, 
http://www.gao.gov/products/GAO-07-268] (Washington, D.C.: Feb. 23, 
2007); Homeland Security: Recommendations to Improve Management of Key 
Border Security Program Need to Be Implemented, [hyperlink, 
http://www.gao.gov/products/GAO-06-296] (Washington, D.C.: Feb. 14, 
2006). 

[12] [hyperlink, http://www.gao.gov/products/GAO-08-1081]. 

[13] DOE, Cost Estimating Guide, DOE G 430.1-1, 3/28/1997. 

[14] GAO, Standards for Internal Control in the Federal Government, 
[hyperlink, http://www.gao.gov/products/GAO/AIMD-00-21.3.1] 
(Washington, D.C.: November 1999). 

[15] The $2.7 billion estimated cost from the ICE also included $360 
million to cover costs associated with the demonstration phase of the 
project, in which a facility is turned on and tested to see whether it 
will work as designed. According to project officials, these costs are 
captured by a separate project that will fund the operation of the 
facility once construction is complete. Even without the hot 
commissioning costs, the ICE was more than $1 billion higher than the 
project team's estimate. 

[16] Before conducting a risk analysis, each of the four projects 
identified risks associated with executing the project that could 
limit the project teams' ability to deliver the project on cost and 
schedule, such as increases in commodity prices, unexpected need for 
greater design complexity, or technology uncertainty. 

[17] Contingency reserves are funds that may be needed to cover 
potential cost increases stemming from a variety of project risks. 

[18] As we reported in our cost guide, how much contingency should be 
allocated to a program beyond the 50 percent confidence level depends 
on the program cost growth an agency is willing to risk. While no 
specific confidence level is considered a best practice, experts agree 
that project cost estimates should be budgeted to at least the 50 
percent confidence level, but budgeting to a higher level (for 
example, 70 percent to 80 percent, or the mean) is now common practice. 
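
As an illustration of budgeting to a confidence level, using a 
hypothetical simulated cost distribution (the lognormal parameters 
below are assumptions, not any project's data): 

  # Contingency to reach a chosen confidence level is that percentile
  # of the simulated cost distribution minus the point estimate.
  import numpy as np

  rng = np.random.default_rng(1)
  costs = rng.lognormal(mean=np.log(200.0), sigma=0.15, size=50_000)  # $M
  point_estimate = float(np.median(costs))   # roughly the 50 percent level
  for level in (50, 70, 80):
      contingency = np.percentile(costs, level) - point_estimate
      print(f"{level} percent confidence: contingency = {contingency:.1f} ($M)")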

[19] [hyperlink, http://www.gao.gov/products/GAO-08-1081]. 

[20] The cost estimate we reviewed of the Uranium Processing Facility 
project was a cost estimate range because the project's most recently 
approved estimate was for milestone 1. 

[21] Although not required at every milestone, Order 413 and its 
accompanying guidance direct DOE's construction projects to generate a 
life cycle cost estimate to inform the process of selecting the 
preferred alternative before milestone 1. According to project 
officials, the three construction projects we reviewed, including 
Uranium Processing Facility, developed life cycle costs as part of 
this process. 

[22] Although the estimate does not include the cost for the 
technology development, it does include contingency reserves to 
mitigate the risk of the technologies not being ready when they are 
needed. 

[23] Escalation is the provision in a cost estimate that captures 
increases in the cost of equipment, material, and labor due to 
continuing price changes over time. Escalation rates and indexes are 
used to forecast future project costs or to bring historical costs to 
the present. Most cost estimating is done in current-year dollars and 
then escalated to the time when the project will be accomplished. 
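
As a worked illustration of this footnote, with assumed escalation 
rates and index values rather than actual DOE escalation data: 

  # Escalate a current-year estimate to the year of expenditure by
  # compounding annual escalation rates, and bring a historical cost to
  # the present with an index ratio. All figures are hypothetical.
  current_year_cost = 100.0                  # $ millions, today's dollars
  rates = [0.03, 0.03, 0.04]                 # assumed annual escalation
  escalated = current_year_cost
  for r in rates:
      escalated *= 1.0 + r                   # compound over three years
  print(f"year-of-expenditure cost: {escalated:.1f} ($M)")

  historical_cost = 80.0                     # $ millions, in 2005 dollars
  index_2005, index_now = 180.0, 210.0       # assumed price index values
  present_cost = historical_cost * index_now / index_2005
  print(f"historical cost in today's dollars: {present_cost:.1f} ($M)")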

[24] Legislation recently changed the name of DOD's independent cost-
estimating office from the Cost Analysis Improvement Group to the 
Office of Cost Assessment and Program Evaluation; see Pub. L. No. 
111-23 (2009), the Weapons Systems Acquisition Reform Act of 2009. 

[25] Pub. L. No. 111-23 (2009). 

[26] Pub. L. No. 111-85 (2009); see 155 Cong. Rec. H1962. 

[27] Pub. L. No. 111-85 (2009), H.R. Rep. No. 111-203 at 124-125 
(2009). 

[28] 155 Cong. Rec. S5205-5224 (2009). 

[29] GAO, Defense Acquisitions: DOD Must Balance Its Needs with 
Available Resources and Follow an Incremental Approach to Acquiring 
Weapon Systems, [hyperlink, http://www.gao.gov/products/GAO-09-431T] 
(Washington, D.C.: Mar. 3, 2009). 

[30] According to NNSA's policy, NNSA should conduct an independent 
estimate for projects estimated to cost between $100 million and $750 
million, and OCA should conduct an independent estimate for projects 
estimated to cost greater than $750 million. 

[31] GAO, Department of Energy: Office of Science Has Kept Majority of 
Projects within Budget and on Schedule, but Funding and Other 
Challenges May Grow, [hyperlink, 
http://www.gao.gov/products/GAO-08-641] (Washington, D.C.: May 30, 
2008). 

[32] GAO, GAO Cost Estimating and Assessment Guide: Best Practices for 
Developing and Managing Capital Program Costs, [hyperlink, 
http://www.gao.gov/products/GAO-09-3SP] (Washington, D.C.: March 2009). 

[End of section] 

GAO's Mission: 

The Government Accountability Office, the audit, evaluation and 
investigative arm of Congress, exists to support Congress in meeting 
its constitutional responsibilities and to help improve the performance 
and accountability of the federal government for the American people. 
GAO examines the use of public funds; evaluates federal programs and 
policies; and provides analyses, recommendations, and other assistance 
to help Congress make informed oversight, policy, and funding 
decisions. GAO's commitment to good government is reflected in its core 
values of accountability, integrity, and reliability. 

Obtaining Copies of GAO Reports and Testimony: 

The fastest and easiest way to obtain copies of GAO documents at no 
cost is through GAO's Web site [hyperlink, http://www.gao.gov]. Each 
weekday, GAO posts newly released reports, testimony, and 
correspondence on its Web site. To have GAO e-mail you a list of newly 
posted products every afternoon, go to [hyperlink, http://www.gao.gov] 
and select "E-mail Updates." 

Order by Phone: 

The price of each GAO publication reflects GAO’s actual cost of
production and distribution and depends on the number of pages in the
publication and whether the publication is printed in color or black and
white. Pricing and ordering information is posted on GAO’s Web site, 
[hyperlink, http://www.gao.gov/ordering.htm]. 

Place orders by calling (202) 512-6000, toll free (866) 801-7077, or
TDD (202) 512-2537. 

Orders may be paid for using American Express, Discover Card,
MasterCard, Visa, check, or money order. Call for additional 
information. 

To Report Fraud, Waste, and Abuse in Federal Programs: 

Contact: 

Web site: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm]: 
E-mail: fraudnet@gao.gov: 
Automated answering system: (800) 424-5454 or (202) 512-7470: 

Congressional Relations: 

Ralph Dawn, Managing Director, dawnr@gao.gov: 
(202) 512-4400: 
U.S. Government Accountability Office: 
441 G Street NW, Room 7125: 
Washington, D.C. 20548: 

Public Affairs: 

Chuck Young, Managing Director, youngc1@gao.gov: 
(202) 512-4800: 
U.S. Government Accountability Office: 
441 G Street NW, Room 7149: 
Washington, D.C. 20548: