This is the accessible text file for GAO report number GAO-04-460 
entitled 'Yucca Mountain: Persistent Quality Assurance Problems Could 
Delay Repository Licensing and Operation' which was released on April 
30, 2004.

This text file was formatted by the U.S. General Accounting Office 
(GAO) to be accessible to users with visual impairments, as part of a 
longer-term project to improve GAO products' accessibility. Every 
attempt has been made to maintain the structural and data integrity of 
the original printed product. Accessibility features, such as text 
descriptions of tables, consecutively numbered footnotes placed at the 
end of the file, and the text of agency comment letters, are provided 
but may not exactly duplicate the presentation or format of the printed 
version. The portable document format (PDF) file is an exact electronic 
replica of the printed version. We welcome your feedback. Please E-mail 
your comments regarding the contents or accessibility features of this 
document to Webmaster@gao.gov.

This is a work of the U.S. government and is not subject to copyright 
protection in the United States. It may be reproduced and distributed 
in its entirety without further permission from GAO. Because this work 
may contain copyrighted images or other material, permission from the 
copyright holder may be necessary if you wish to reproduce this 
material separately.

Report to Congressional Requesters:

April 2004:

YUCCA MOUNTAIN:

Persistent Quality Assurance Problems Could Delay Repository Licensing 
and Operation:

[Hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-04-460]:

GAO Highlights:

Highlights of GAO-04-460, a report to congressional requesters 

Why GAO Did This Study:

The Department of Energy (DOE) must obtain a license from the Nuclear 
Regulatory Commission (NRC) to construct a nuclear waste repository at 
Yucca Mountain, Nevada. In licensing, a quality assurance program helps 
ensure that the information used to demonstrate the safety of the 
repository is defensible and well documented. DOE developed a 
corrective action plan in 2002 to fix recurring problems with the 
accuracy of such information. This report assesses the status of 
corrective actions and the adequacy of DOE’s plan to measure the 
effectiveness of actions taken.

What GAO Found:

DOE has reportedly implemented most of the actions in its 2002 
corrective action plan, but recent audits and assessments have 
identified lingering quality problems with data, models, and software 
and continuing management weaknesses. Audits revealed that some data 
sets could not be traced back to their sources, model development and 
validation procedures were not followed, and some processes for 
software development and validation were inadequate or not followed. 
DOE believes these problems have not affected the technical basis of 
the project; however, they could adversely affect the licensing 
process. Recent assessments identified continuing management weaknesses 
in the areas of roles and responsibilities, quality assurance policies 
and procedures, and a work environment that did not foster employee 
confidence in raising concerns without fear of reprisal. NRC has 
acknowledged DOE’s effectiveness in identifying quality problems, but 
recently concluded that quality problems could delay the licensing 
process. 

DOE cannot assess the effectiveness of its 2002 plan because the 
performance goals to assess management weaknesses lack objective 
measurements and time frames for determining success. The goals do not 
specify the amount of improvement expected, how quickly the improvement 
should be achieved, or how long the improvement should be sustained 
before the problems can be considered corrected. DOE recently developed 
a measurement tool that incorporates and revises some of the goals from 
the action plan, but most of the revised goals still lack the time 
frames needed to determine whether the actions have 
corrected the recurring problems. A recently completed DOE review of 
the 2002 plan found that the corrective actions have been fully 
implemented. However, the review also noted the effectiveness of the 
actions could not be evaluated because many of the plan’s goals lacked 
the level of objectivity and testing needed to measure effectiveness.

Quality Problems with Data, Models, and Software: 

[See PDF for image]

[End of table]

What GAO Recommends:

GAO recommends that DOE revise action plan goals and close the plan 
once sufficient evidence exists showing that the actions have 
succeeded. In commenting on the report, DOE disagreed with the findings 
and recommendations, stating, among other things, that GAO 
mischaracterized the action plan and the results of independent 
reviews. GAO disagrees--the report correctly describes the plan and the 
findings of the reviews. NRC agreed with GAO’s conclusions but 
suggested that DOE be given the flexibility to choose alternative 
approaches to achieve and measure performance. GAO agrees, provided 
that any approach include objective measures and time frames to assess 
effectiveness.

www.gao.gov/cgi-bin/getrpt?GAO-04-460.

To view the full product, including the scope and methodology, click on 
the link above. For more information, contact Robin M. Nazzaro at (202) 
512-3841 or nazzaror@gao.gov.

[End of section]

Contents:

Letter: 

Results in Brief: 

Background: 

Quality Assurance Problems Persist at the Yucca Mountain Project: 

Corrective Action Plan Lacks Measurable Goals: 

Conclusions: 

Recommendations for Executive Action: 

Agency Comments and Our Evaluation: 

Scope and Methodology: 

Appendixes:

Appendix I: Role of Quality Assurance in the Licensing Process: 

Appendix II: Employee Concerns Programs at the Yucca Mountain Project: 

Appendix III: 2002 Corrective Action Plan Process and Status: 

Appendix IV: Comments from the Department of Energy: 

GAO Comments: 

Appendix V: Comments from the Nuclear Regulatory Commission: 

GAO Comment: 

Appendix VI: GAO Contact and Staff Acknowledgments: 

GAO Contact: 

Staff Acknowledgments: 

Tables: 

Table 1: Comparison of Goals in the July 2002 Corrective Action Plan to 
Goals in the December 2003 Performance Tool: 

Table 2: Employee Concerns Investigated by DOE and Bechtel in 2003: 

Figures: 

Figure 1: License Application Review Process and Timeline: 

Figure 2: 2002 Corrective Action Plan Process and Status: 

Abbreviations: 

DOE: Department of Energy:

NRC: Nuclear Regulatory Commission:

OCRWM: Office of Civilian Radioactive Waste Management:

OMB: Office of Management and Budget:

Letter:

April 30, 2004:

The Honorable Harry Reid: 
United States Senate:

The Honorable John Ensign: 
United States Senate:

High-level nuclear waste, created as a by-product of the nuclear power 
process in reactors, can remain highly radioactive for hundreds of 
thousands of years, endangering the public if not properly disposed of. 
Storing this waste safely is therefore of vital interest to the nation. 
Currently, more than 50,000 metric tons of this waste is being stored 
at 72 sites across the country. In 2002, Congress approved the 
President's recommendation of Yucca Mountain, Nevada, 90 miles from Las 
Vegas, as a suitable site for the Department of Energy (DOE) to 
construct and operate a geologic repository to safely and permanently 
dispose of this waste. To construct and operate the repository, DOE 
must obtain a license from the Nuclear Regulatory Commission (NRC). As 
part of the license application, DOE must, among other things, 
demonstrate an effective quality assurance program that ensures the 
safe construction and operation of a repository, protecting public 
health and safety. DOE plans to submit a license application by 
December 2004 and is following a demanding schedule to meet this date. 
NRC is reviewing an extensive amount of data as part of a prelicensing 
agreement with DOE.

Before granting a license, NRC requires nuclear facilities to develop a 
quality assurance program that ensures that the technical information 
submitted in support of a license application--such as scientific data, 
models, and details on design and construction--is well documented and 
defensible. The quality assurance program involves a two-part process 
that (1) requires program staff to follow procedures to help ensure the 
reliability of information and (2) uses quality assurance auditors to 
verify that the procedures have been followed. Both program staff and 
quality assurance auditors are required to identify when procedures are 
not being followed or when they encounter problems with the procedures. 
DOE and contractor quality assurance auditors periodically assess 
compliance with procedures.[Footnote 1] In cases where a significant 
problem is found (DOE's criteria refer to significant problems as 
significant conditions adverse to quality), quality assurance personnel 
and program managers follow specific steps to analyze and correct the 
problem; an illustrative sketch of this lifecycle follows the steps 
below:

1. Quality assurance auditors or program personnel complete a 
corrective action report that describes the problem and the need for 
corrective action.

2. Program managers conduct a root-cause analysis of the problem.

3. Program managers identify corrective actions that address the root 
cause(s) to prevent the problem from recurring--these actions are 
included in a corrective action plan.

4. Program managers implement these corrective actions and quality 
assurance personnel verify that they have been implemented.

5. Quality assurance personnel close the corrective action report, and 
program managers later conduct an effectiveness review.

a. If actions are determined ineffective, the process begins again with 
the issuance of a new corrective action report.

b. In cases involving more significant problems, an effectiveness review 
may be conducted prior to closing the corrective action report.

In 1998, DOE's quality assurance auditors identified significant 
problems with data sources, validation of scientific models, and 
software development and issued three corrective action reports. For 
data sources, DOE reported that it could not ensure that all data 
needed to support the scientific models could be traced back to 
original sources or that the data had been properly collected. For 
validation of models, DOE reported that it had no standardized process 
to develop the scientific models needed to simulate geological events. 
For software, DOE reported that it had no process for ensuring that the 
software being developed to support the models would work. As required 
by DOE's quality assurance procedures, the department conducted a root-
cause analysis and issued a corrective action plan in 1999 that 
identified the needed corrective actions. Following implementation of 
the actions, DOE considered the issues resolved and closed the 
corrective action reports. However, problems with models and software 
resurfaced during 2001 quality assurance audits. As a result, new 
corrective action reports were completed in May and June 2001, 
beginning another iteration of the corrective action process.

Recognizing the need to correct these recurring problems, DOE (1) 
conducted a comprehensive root-cause analysis that included reviews of 
numerous past self-assessments and independent program assessments and 
(2) identified weaknesses in management systems, quality processes, and 
organizational roles and responsibilities. As a result, DOE issued a 
corrective action plan in July 2002 that addressed both the quality 
problems with data and models and the management weaknesses.[Footnote 
2] In addition to the 37 actions in the 2002 plan that addressed models 
and software, DOE added 35 corrective actions to address management 
weaknesses that it found in five key areas:

* roles and responsibilities,

* quality assurance processes,

* written procedures,

* corrective action plans, and

* a work environment that allows employees to raise quality concerns 
without fear of reprisal.

To correct these weaknesses, DOE completed a management reorganization 
and issued new policy statements to clarify roles and responsibilities, 
revised the primary quality assurance implementing document, reviewed 
and revised program procedures, revised the system to correct quality 
problems, and provided new training for employees to encourage them to 
raise concerns about quality. To assess the effectiveness of its 
actions in correcting the management weaknesses, DOE developed 13 goals 
to determine whether the corrective actions were successful, such as 
achieving decreasing trends in problems attributed to unclear roles and 
responsibilities, reducing the time required to revise procedures and 
complete corrective actions, and reducing the number of employee 
concerns related to the work environment. Because of the significance 
of these problems, DOE stated in the 2002 corrective action plan that 
an effectiveness review would be completed prior to closing the 
corrective action reports and reporting the results to NRC.

In May 2003, at a congressional field hearing, we provided preliminary 
observations on the Yucca Mountain quality assurance program.[Footnote 
3] Specifically, we noted DOE's poor track record in correcting 
recurring quality assurance problems and discussed recent actions 
taken to correct these problems. You 
requested that we continue our evaluation of the quality assurance 
program at Yucca Mountain, focusing on DOE's actions to correct the 
recurring quality problems. As agreed with your offices, this report 
(1) assesses the status of DOE's corrective actions to resolve 
recurring problems and (2) determines the adequacy of DOE's plan to 
measure the effectiveness of these actions.

In conducting our work, we met with DOE and contractor officials, 
assessed the status of DOE's corrective actions, reviewed audits and 
deficiency reports, and visited the Yucca Mountain project office. We 
met with NRC officials and reviewed NRC-prepared documents, including 
observation audits, on-site representative reports, and correspondence 
between DOE and NRC. We attended several DOE-NRC quarterly quality 
assurance meetings and met with representatives of the State of Nevada 
Agency for Nuclear Projects and with representatives of the Nuclear 
Waste Technical Review Board. Our work was performed from April 2003 to 
April 2004 in accordance with generally accepted government auditing 
standards. Our scope and methodology for this review are presented at 
the end of this letter.

Results in Brief:

DOE reports that it has implemented almost all of the corrective 
actions detailed in its 2002 plan, but recent audits and assessments 
show that these actions have not solved the quality assurance problems 
or corrected management weaknesses, and that further actions are 
needed. Quality assurance audits found continuing problems with data, 
models, and software, including unqualified sources for the data used 
in modeling repository performance, noncompliance with processes used 
for the development and validation of models, and ineffective processes 
for developing software. DOE officials have stated that these findings 
represent problems with procedures and documentation and do not 
invalidate the technical products produced using the data, models, and 
software. However, the persistence of these problems could adversely 
affect the licensing process because DOE must demonstrate an effective 
quality assurance program as part of this process. Recent assessments 
show that management weaknesses remain despite DOE's actions. For 
example, one assessment notes that staff roles and responsibilities 
remain poorly defined, and that personnel are still not following 
procedures. Another assessment concluded that despite communication 
mechanisms, DOE had not established a climate of trust in the 
workplace. NRC has acknowledged the ability of DOE's quality assurance 
auditors to effectively identify quality problems; however, a recent 
NRC evaluation concluded that quality problems could adversely affect 
the licensing process.

DOE cannot formally assess the overall effectiveness of its corrective 
actions because the plan's performance goals to assess management 
weaknesses lack objective measurements and time frames to determine 
whether corrective actions have been successful. Most of these goals 
fail to specify the amount of improvement expected, and none of them 
specify how quickly the improvement should be achieved or how long the 
improvement should be sustained before the problems can be considered 
corrected. For example, one goal calls for a decreasing trend in the 
average time needed to make revisions in procedures, but it does not 
specify the desired amount of the decrease, the length of time needed 
to achieve the decrease, or how long the decrease must be sustained. 
DOE has recently developed a project measurement tool that incorporates 
and revises some of the goals from the action plan, but most of the 
revised goals still lack the time frames needed to 
determine whether the actions have corrected the recurring problems. A 
DOE independent review of the corrective action plan completed in March 
2004 found that the corrective actions from the 2002 plan to address 
management weaknesses have been fully implemented. However, the review 
also noted the effectiveness of corrective actions under the plan could 
not be evaluated because many of the goals in the performance 
measurement tool that are linked to the 2002 plan lacked the level of 
objectivity and testing needed to measure effectiveness.

To ensure proper assessment of the plan, we are recommending that DOE 
(1) revise the performance goals associated with the 2002 plan to 
ensure that they are measurable with specific time frames for achieving 
and maintaining success in each area of the plan and (2) close the plan 
after it develops evidence to show that the recurring quality assurance 
problems have been successfully corrected.

In commenting on the report, DOE disagreed with the findings and 
recommendations, stating, among other things, that we mischaracterized 
the action plan and the results of several independent reviews. We 
disagree--the report correctly describes the plan and properly 
specifies the findings of the reviews. NRC agreed with our conclusions 
but suggested that DOE be given the flexibility to choose alternative 
approaches to achieve and measure performance. We agree, provided that 
any approach include objective measurements and time frames for 
reaching and sustaining desired performance and include an end point 
for closing out the 2002 plan.

Background:

In 2002, after more than 15 years of scientific investigation, Congress 
approved the Yucca Mountain site in Nevada as a suitable location for 
the development of a long-term permanent repository for high-level 
nuclear waste. DOE is responsible for developing and operating the 
repository, and NRC is responsible for licensing the repository. DOE is 
currently preparing an application to submit to NRC by December 2004 
for a license to construct the repository. To obtain a license, DOE 
must, among other things, demonstrate to NRC that the repository will 
not exceed Environmental Protection Agency health and safety standards 
over a 10,000-year period. An ineffective quality assurance program 
runs the risk of introducing unknown errors into the design and 
construction of the repository that could lead to adverse health and 
safety consequences.

To demonstrate compliance with the health standards over this 10,000-
year period, DOE must rely primarily on a "performance assessment" 
computer model that incorporates over 1,000 data sources, approximately 
60 scientific models, and more than 400 computer software codes to 
simulate the performance of the repository. Given the prominence of 
computer modeling in the licensing of the repository, one of DOE's most 
important tasks is to demonstrate the adequacy of the data, models, and 
software used to perform the simulation. In addition, as part of the 
licensing process, DOE must demonstrate that its quality assurance 
program can effectively identify and correct deficiencies in areas 
important to the safe operation and long-term performance of the 
repository, such as the natural and engineered barriers of the 
repository and the program's data, models, and software. See appendix I 
for more information on the role of quality assurance in the licensing 
process.
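
To make concrete why qualification matters in this architecture, the 
following is a minimal sketch, assuming a simplified dependency chain 
in which data sources feed scientific models and models feed the 
performance assessment. The names and structure are hypothetical, not 
DOE's.

# Hypothetical sketch: a result is defensible only if every input
# upstream of it is qualified (traceable, validated, or tested).
from dataclasses import dataclass

@dataclass
class Artifact:
    name: str
    qualified: bool   # traceable to its sources, validated, or tested
    inputs: list      # upstream artifacts this one depends on

def defensible(artifact: Artifact) -> bool:
    # One unqualified input taints all downstream work.
    return artifact.qualified and all(defensible(i) for i in artifact.inputs)

data = Artifact("data set", qualified=False, inputs=[])
model = Artifact("scientific model", qualified=True, inputs=[data])
assessment = Artifact("performance assessment", qualified=True,
                      inputs=[model])
assert not defensible(assessment)   # the unqualified input propagates

With over 1,000 data sources, roughly 60 models, and more than 400 
software codes feeding the performance assessment, this propagation is 
why audit findings about individual data sets or codes bear directly 
on the license application.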

DOE has a long-standing history of attempting to correct quality 
assurance problems. In 1988, we identified significant problems with 
the quality assurance program, noting that NRC had identified many 
specific concerns about the Yucca Mountain program, including:

* DOE's heavy reliance on contractors and inadequate oversight would 
increase the likelihood that DOE might encounter quality-related 
problems;

* the possibility that Nevada would contest the licensing proceedings, 
thereby increasing the probability that DOE would have to defend its 
quality assurance program;

* additional expense and time-consuming delays to correct program 
weaknesses if DOE could not properly defend the quality of its work; 
and

* DOE staff's and contractors' negative attitude toward quality 
assurance.[Footnote 4]

Since the late 1990s, DOE has attempted to correct continuing quality 
assurance problems in three areas critical to the repository's 
successful performance: the adequacy of the data sources, the validity 
of scientific models, and the reliability of computer software 
developed at the site. These problems surfaced in 1998 when 
DOE began to run the initial versions of its performance assessment 
model. Specifically, DOE was unable to ensure that critical project 
data had been properly collected and traced back to original sources. 
In addition, the department lacked a standardized process for 
developing scientific models used to simulate a variety of geologic 
events and an effective process for ensuring that computer software 
used to support the scientific models would work properly. DOE 
implemented actions in 1999 to correct these deficiencies and prevent 
their recurrence.

In 2001, similar deficiencies associated with models and software 
resurfaced. DOE attributed the recurrence to ineffective procedures and 
corrective actions, improper implementation of quality procedures by 
line managers, and personnel who feared reprisal for expressing quality 
concerns. To ensure that it adequately addressed the problems to 
prevent future recurrence, DOE developed a more comprehensive 
corrective action plan in July 2002, concentrating on actions needed to 
address the causes of the recurring problems while improving the 
organizational culture and instilling a strong commitment to quality in 
all project personnel. The plan detailed specific actions for both DOE 
and its contractor, Bechtel/SAIC Company, LLC (Bechtel), to strengthen 
the roles, responsibilities, accountability, and authority of project 
personnel; develop clearer quality assurance requirements and 
processes; improve program procedures; create an improved programwide 
corrective action process; and improve processes for ensuring that 
employees can raise project concerns without fear of reprisals.

Quality Assurance Problems Persist at the Yucca Mountain Project:

DOE reports that it has implemented almost all of the actions 
identified in its 2002 corrective action plan; however, recent audits 
and assessments indicate that recurring quality assurance problems have 
not been corrected. In 2003, DOE conducted three audits to evaluate the 
effectiveness of the corrective actions taken to address recurring 
problems with data, models, and software. Because each audit identified 
additional quality assurance problems, DOE concluded that there was 
insufficient evidence to demonstrate that the recurring problems had 
been corrected. DOE recently closed the corrective action reports for 
data and software, but did so without determining whether corrective 
actions have been effective. To examine actions taken to correct some 
of the management weaknesses identified in the 2002 corrective action 
plan, DOE conducted four management assessments late in 2003. 
Collectively, these assessments found continuing management weaknesses 
that DOE had identified as root causes of the recurring problems. NRC 
also conducted an assessment that was issued in April 2004. NRC's 
assessment noted some improvements but also found continuing weaknesses 
and noted that quality assurance problems could hinder the licensing 
process.

Audits Have Found Recurring Problems with Data, Models, and Software:

In 2003, DOE's audits of data, models, and software identified 
continuing quality problems that could impede DOE's license 
application. As a result, DOE could not close corrective action reports 
for models and software for nearly 3 years. In a June 2003 audit, DOE 
found quality problems in developing and validating software. In 
September 2003, DOE quality assurance auditors found that some data 
sets were still not qualified or traceable to their sources. In October 
2003, a DOE audit found continuing quality problems in model 
documentation and validation. DOE officials have stated that these 
findings represent problems with procedures and documentation and do 
not invalidate the technical products produced using the data, models, 
and software. In March 2004, DOE closed the corrective action reports 
for data and software but did so without evaluating the effectiveness 
of corrective actions--according to agency officials, they will 
evaluate effectiveness at a later date. DOE anticipates closing the 
corrective action report for models in August 2004 but also plans to do 
so without evaluating the effectiveness of corrective actions.

Data Qualification and Traceability Problems Are Still Being Corrected:

In April 2003, DOE again reported significant problems similar to those 
originally identified in 1998 with the qualification and traceability 
of data sets. In response, DOE implemented corrective actions to 
recheck all of its data sets to confirm that they were traceable and 
qualified. However, a September 2003 audit identified similar data 
problems and new problems in addition to those noted in the corrective 
action report.[Footnote 5] The audit found that some data sets did not 
have the documentation needed to trace them back to their sources; the 
critical process of data control and management was not satisfactory; 
and, as in 1998, faulty definitions were developed for data procedures, 
which allowed unqualified data to be used. In addition, DOE found that 
overall compliance with procedures was unsatisfactory. Similarly, the 
April 2003 corrective action report noted a lack of management 
leadership, accountability, and procedural compliance, issues that are 
closely related to the key improvement area of roles and 
responsibilities. DOE officials noted that these findings represented 
noncompliance with procedures, and that the procedures and processes 
were effective in producing defensible technical products if properly 
followed. As of February 2004, DOE had not finished rechecking all of 
its data sets or correcting the problems found in them. However, DOE 
closed the corrective action report in March 2004 by making the 
rechecking process a continuing part of the Yucca Mountain repository's 
work. The corrective action report was closed without DOE evaluating 
the effectiveness of the rechecking process in correcting problems with 
data. DOE officials stated that they plan to evaluate effectiveness at 
a later date.
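
The two recurring data findings, missing source documentation and use 
of unqualified data, lend themselves to a simple automated check. The 
following sketch is illustrative only; the record fields and 
identifiers are hypothetical and are not taken from DOE's data 
management system.

def audit_data_sets(data_sets):
    # Flag each data set that lacks the documentation needed to trace
    # it to its sources or that is in use without being qualified.
    findings = []
    for ds in data_sets:
        if not ds.get("source_documents"):
            findings.append((ds["id"], "not traceable to its sources"))
        if ds.get("status") != "qualified":
            findings.append((ds["id"], "unqualified data in use"))
    return findings

sample = [
    {"id": "DS-001", "source_documents": [], "status": "qualified"},
    {"id": "DS-002", "source_documents": ["field log"],
     "status": "unqualified"},
]
print(audit_data_sets(sample))
# [('DS-001', 'not traceable to its sources'),
#  ('DS-002', 'unqualified data in use')]

Making such rechecking a continuing part of the repository's work 
institutionalizes the check, but, as noted above, it does not by 
itself demonstrate that the rechecking has been effective.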

Models Still Lack Proper Validation:

An October 2003 DOE quality assurance audit found continuing problems 
with the documentation and validation of models that DOE plans to use 
in its license application.[Footnote 6] Although auditors reported that 
processes were effective in producing defensible models to support the 
license application, they found that for some models sampled, project 
personnel did not properly follow model validation procedures. These 
problems were similar to those identified by audits conducted in 2001. 
Auditors compared results from the 2003 audit with actions taken to 
correct problems identified in 2001 and found that procedures still 
were not being satisfactorily implemented in the areas of model 
documentation and traceability, model validation, and checking and 
review. For example, a 2001 corrective action sought to improve the 
self-identification of problems before new model reports were issued, 
by allowing sufficient schedule time for model checking and review. 
However, 
the 2003 audit concluded that instances of new errors in model reports 
were evidence that the previous actions may not have been fully 
implemented. As a result, DOE has been unable to close the May 2001 
model corrective action report for almost 3 years. DOE recently 
directed a team of industry experts to review its models and revise 
them to ensure consistency, traceability, and procedural compliance. 
DOE anticipates closing the corrective action report in August 2004 but 
will do so without conducting another audit of models to determine if 
corrective actions have been effective.

Software Development Problems Persist:

In a June 2003 audit, DOE auditors discovered recurring software 
problems that could affect confidence in the adequacy of software 
codes.[Footnote 7] Specifically, the auditors found ineffective 
software processes in five areas: technical reviews, software 
classification, planning, design, and testing. The auditors found 
several of the software development problems to be similar to 
previously identified problems, indicating that previous actions were 
ineffective in correcting the problems. For example, auditors again 
noted instances of noncompliance with software procedures. They also 
concluded that technical reviews during software development were 
inadequate, even though documentation indicated that corrective actions 
for this condition had been completed 3 months before the 2003 audit. 
Auditors also noted poorly defined roles and responsibilities as a 
cause of problems identified in the technical review of software, even 
though DOE had taken actions under its 2002 corrective action plan to 
clarify roles and responsibilities. Because of these results, DOE was 
unable to close the June 2001 software corrective action report. DOE 
employed a team of industry professionals in the fall of 2003 to 
examine software quality problems identified from 1998 through 2003. 
The professionals' February 2004 report concluded that software 
problems recurred because DOE did not assess the effectiveness of its 
corrective actions and did not adequately identify the root causes of 
the problems. In a January 2004 follow-up audit of software, auditors 
verified that unqualified software was used to run approved models, and 
noted that procedural controls for determining the adequacy of software 
were inadequate. In March 2004, without evaluating the effectiveness of 
corrective actions, DOE closed the software corrective action report. 
DOE officials plan to evaluate the effectiveness of the corrective 
actions for software at a later date.

Assessments Indicate Continuing Management Weaknesses:

DOE reported in the fall of 2003 that it had implemented most of the 
actions identified in the plan focusing on management weaknesses, but 
four DOE management assessments of the Yucca Mountain project completed 
between September and November 2003 found that some of the identified 
management weaknesses had yet to be properly addressed. These 
assessments included one requested by project management comparing 
DOE's management practices at Yucca Mountain with external industry 
best practices,[Footnote 8] one required as an annual assessment of the 
adequacy and effectiveness of the quality assurance program,[Footnote 
9] one requested by the project director that examined the 
effectiveness of selected DOE and contractor management 
systems,[Footnote 10] and one examining the project work 
environment.[Footnote 11] Collectively, these assessments identified 
continuing weaknesses in the areas of roles and responsibilities, 
quality assurance procedures, and a work environment that did not 
foster employee confidence in raising concerns without fear of 
reprisal. DOE officials stated that they are presently reviewing the 
findings of these assessments, and have recently initiated additional 
corrective actions.

Unclear Roles and Responsibilities:

Three of the four management assessments conducted late in 2003 
identified significant continuing problems with the delineation and 
definition of roles and responsibilities for carrying out the quality 
assurance program. In its 2002 corrective action plan, DOE stated that 
it was not possible to build accountability into management without 
clearly and formally defining roles and responsibilities for DOE and 
its contractors. DOE's planned actions included clarification of roles 
and responsibilities within DOE and Bechtel through policy statements, 
communications, a new program manual, and realignment of the 
organization to support performance accountability. DOE reported that 
it had completed all corrective actions in this area by May 2003. The 
assessments noted that these actions had resulted in some improvements, 
but that some management weaknesses remained. The assessments found 
that the Yucca Mountain project:

* lacked formal mechanisms for defining and communicating roles and 
responsibilities that meet both DOE and NRC requirements;

* did not have a systematic process for assigning authorities to DOE 
and Bechtel organizations and individuals;

* relied on program managers who had not fully assumed ownership and 
responsibility for quality assurance;

* lacked formal control of documents outlining roles and 
responsibilities to ensure that they reflect the current organization;

* lacked clear reporting relationships between the project and 
supporting national laboratories;

* had not adequately established processes for reviewing procedures 
when needed;

* had few systematic and effective approaches in place for assigning 
accountability to individuals and organizations; and

* did not effectively plan and communicate reorganizations and assign 
appropriate authority levels, in the opinion of many project employees.

As a result of findings from these assessments, DOE is pursuing further 
corrective actions. For example, DOE plans to formally control the 
high-level document that defines its organizational structure. Also, 
Bechtel has initiated a management system improvement project, which 
includes issuing a new document defining roles and responsibilities. 
DOE officials expect that roles and responsibilities will remain a 
challenge, but they state that improvement efforts will continue.

Ineffective Procedures:

Three of the four management assessments identified continuing problems 
with project procedures, one of the areas of management weaknesses 
addressed by the 2002 corrective action plan. Although the assessments 
noted that DOE and Bechtel had made improvements in the procedure 
management system and DOE had reportedly reviewed existing procedures, 
issued new or revised procedures, and ensured that personnel using the 
procedures were properly trained, they found that:

* procedures were overly prescriptive,

* procedures did not cover all required processes, and

* continuing noncompliance with procedures remained a problem.

Although DOE completed actions under the 2002 plan to revise project 
procedures, DOE has initiated further corrective actions, including a 
plan to again revise Yucca Mountain project procedures by June 2005.

Inadequate Work Environment:

Three of the four assessments identified continuing problems with 
efforts by DOE and Bechtel to ensure a work environment in which 
employees can freely raise concerns without fear of reprisal--one of 
the key areas of management weaknesses identified in the corrective 
action plan. DOE and Bechtel implemented corrective actions to improve 
the work environment by revising and expanding policies, modifying DOE 
contracts to require implementation of program requirements, decreasing 
the backlog of employee concerns, and providing programwide training 
that is based on industry practices. However, the assessments revealed 
continuing problems with the work environment, including both DOE's and 
Bechtel's employee concerns programs, which provide personnel with an 
opportunity to formally raise concerns about the project outside the 
normal chain-of-command without fear of reprisal. Appendix II describes 
the requirements of the Yucca Mountain employee concerns programs. 
Although the assessments noted ongoing management actions to strengthen 
the implementation of the concerns programs, they also noted that:

* neither DOE nor Bechtel has effectively controlled corrective 
actions under the employee concerns programs, sometimes closing cases 
on the basis of anticipated actions;

* the DOE and contractor employee concerns programs are not being 
used to their fullest;

* there is a general lack of employee confidence in reporting safety 
issues to management;

* DOE and Bechtel have not made effective resources available for 
determining the root causes of problems identified;

* DOE and Bechtel have not established a climate of trust despite 
communication mechanisms and messages; and

* a majority of DOE and contractor employees either do not consider the 
project's corrective action process to be effective or are not sure of 
its effectiveness.

Although the plan's actions to improve the work environment were 
completed in November 2003, DOE plans to take additional actions to 
improve employee confidence in raising issues without fear of reprisal.

NRC Is Concerned That Recurring Problems Could Adversely Affect 
Licensing:

NRC has commented on DOE's lack of progress in making improvements to 
the quality assurance program. At an April 2003 management meeting with 
DOE, an NRC official commented that the quality assurance program had 
not produced the outcomes necessary to ensure that the program is 
compliant with NRC requirements. In response, DOE outlined the steps it 
was taking to ensure that its license application would meet NRC 
expectations for completeness, accuracy, and compliance with quality 
assurance requirements. The steps included additional actions to 
improve performance in five areas: license application, procedural 
compliance, the corrective action program, the work environment, and 
accountability. In October 2003, DOE reported to NRC that it had 
completed some of the actions and was making progress in the remaining 
open action items. While NRC officials noted that DOE's actions might 
enhance performance, they found that significant implementation issues 
persist. NRC officials stated that they were seeking evidence of 
incremental DOE progress in the implementation of the quality assurance 
program in order to gain confidence in the adequacy of data, models, 
and software supporting the potential license application. In a 
November 2003 management meeting with DOE, NRC officials said they 
were encouraged by DOE's progress in implementing an improved 
corrective action process, its continued performance of effective 
audits, and its identification of areas for improvement. However, the 
NRC staff continued to express concerns with DOE's lack of progress in 
correcting repetitive quality problems with models and software.

NRC recently completed an evaluation of DOE's technical documents and 
supporting activities at Yucca Mountain. This prelicensing evaluation 
focused on an analysis of the technical information supporting three 
important repository models and the processes for developing and 
controlling the models. In addition, NRC evaluated the effectiveness of 
recent corrective actions in the areas of data, models, and software. 
The NRC report, released in April 2004, found that technical support 
for DOE's repository models was greatly improved, current models are 
more comprehensive and contain more data than those presented for site 
recommendation, software documentation was extensive, the management of 
databases was outstanding, and the trending program has been 
improved.[Footnote 12] However, the report noted concerns regarding the 
clarity and sufficiency of the technical information used to support 
the models. The NRC evaluation again found instances where data could 
not readily be traced back to their sources, unqualified data were used 
as direct inputs to the models, unqualified software was used to 
generate data supporting a model, and the model development process 
relied on inadequate checking and review procedures. In addition, NRC 
reported that DOE and Bechtel have not been fully successful in 
carrying out effective actions to eliminate recurring quality problems. 
The report states that DOE and Bechtel had not integrated human 
performance concerns in their root-cause and corrective action efforts 
in response to past quality problems. The NRC report concluded the 
following:

"…if DOE continues to use its existing policies, procedures, methods, 
and practices at the same level of implementation and rigor, the 
license application may not contain information sufficient to support 
some technical positions in the application. This could result in a 
large volume of requests for additional information in some areas which 
could extend the review process, and could prevent NRC from making a 
decision regarding issuing a construction authorization to DOE within 
the time required by law."

Corrective Action Plan Lacks Measurable Goals:

DOE cannot formally assess the overall effectiveness of its 2002 
corrective action plan because the performance goals to assess 
management weaknesses in the plan lack objective measurements and time 
frames for determining success. The goals do not specify 
the amount of improvement expected, how quickly the improvement should 
be achieved, or how long the improvement should be sustained before the 
problems can be considered corrected. For example, although 1 goal calls 
for a decreasing trend in the average time needed to make revisions in 
procedures, it does not specify the desired amount of the decrease, the 
length of time needed to achieve the decrease, or how long the decrease 
must be sustained. DOE recently developed a management tool to measure 
overall project performance that includes more than 200 performance 
indicators with supporting goals, including 17 goals linked to the 13 
goals in the 2002 corrective action plan. These 17 goals 
specify the desired amount of improvement, but most still lack the time 
frames needed for achieving and sustaining the goals. DOE officials 
told us they intend to use this performance measurement tool to track 
the progress of the project, including actions taken under the 2002 
corrective action plan. A DOE independent review of the corrective 
action plan completed in March 2004 found that the corrective actions 
from the 2002 plan to address management weaknesses have been fully 
implemented. However, the review also noted the effectiveness of 
corrective actions under the plan could not be evaluated because many 
of the goals in the performance measurement tool that are linked to the 
2002 plan lacked the level of objectivity and testing needed to measure 
effectiveness.

Goals Are Not Objectively Measurable and Lack Specific Time Frames:

DOE's 2002 plan included 13 goals to be used to determine the 
effectiveness of the corrective actions that addressed the five areas 
of management weaknesses. However, these goals were poorly defined, 
thus limiting DOE's ability to evaluate the effectiveness of actions 
taken. Both GAO[Footnote 13] and the Office of Management and Budget 
(OMB)[Footnote 14] have stated that performance goals need to be 
measurable, and time frames need to be established in order to track 
progress and demonstrate that deficiencies have been corrected. Of the 
13 goals in the corrective action plan, 3 indicated how much 
improvement was expected. For example, 1 of the 3 goals specified that 
the number of significant quality problems self-identified by program 
managers should be at least 80 percent of all significant quality 
problems, including those identified by program managers, quality 
assurance auditors, or other employees. In contrast, 1 of the other 10 
goals called for a decreasing trend in the time needed to revise 
procedures, but did not specify how much of a decrease was expected. 
Further, none of the 13 goals specified the 
length of time needed to reach and maintain the desired goal to 
demonstrate that the actions taken were effective. For example, the 
goal calling for self-identified significant quality problems to be at 
least 80 percent of all significant quality problems did not indicate 
the length of time needed to achieve the goal or how long this goal 
should be sustained in order to demonstrate effectiveness. DOE does not 
intend to revise the goals of the 2002 corrective action plan to 
include quantifiable measures and time frames. Without such 
quantifiable measures to determine whether a goal has been met, and 
without a specified time for the goal to be maintained, DOE cannot use 
these goals to determine the effectiveness of the actions taken.
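
To illustrate what a fully measurable goal would require, the 
following sketch evaluates the 80 percent self-identification goal 
against the three elements the plan's goals lack: a target amount, a 
deadline for reaching it, and a period over which it must be 
sustained. The deadline, sustainment period, and monthly rates shown 
are hypothetical.

def goal_met(monthly_rates, target=0.80, achieve_within=6, sustain_for=12):
    # monthly_rates: fraction of significant quality problems that were
    # self-identified, one value per month after actions were completed.
    for month, rate in enumerate(monthly_rates):
        if month >= achieve_within:
            break   # the target was not reached by the deadline
        if rate >= target:
            window = monthly_rates[month:month + sustain_for]
            if len(window) == sustain_for and min(window) >= target:
                return True   # reached on time and sustained long enough
    return False

rates = [0.62, 0.71, 0.83, 0.85, 0.81, 0.84, 0.86, 0.82,
         0.88, 0.85, 0.87, 0.90, 0.84, 0.86]
print(goal_met(rates))   # True: reached in month 3 and held for 12 months

Without all three elements specified, there is no objective point at 
which DOE can declare the corrective actions effective or declare the 
plan closed.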

Subsequent Efforts to Improve Goals Still Lack Time Frames:

DOE's recent efforts to improve performance measurement have not 
allowed it to adequately measure the effectiveness of its corrective 
action plan. DOE has developed a projectwide performance measurement 
tool that includes over 200 performance indicators with supporting 
goals. At our 
request, Bechtel was able to link 17 of the supporting goals to 12 of 
the 13 goals of the 2002 corrective action plan. Although these linked 
goals improved quantifiable measurement for 11 of the plan's goals by 
specifying the amount of improvement expected, most did not include the 
necessary time frames for meeting the goals and sustaining the desired 
performance. DOE officials stated that this tool was not specifically 
tailored to evaluate the corrective action plan's effectiveness, but 
that they have decided to use it in lieu of the original 13 goals to 
monitor improvements and progress in correcting the management 
weaknesses identified in the plan. Table 1 provides a comparison of the 
supporting goals in the performance tool with the 2002 corrective 
action plan goals.

Table 1: Comparison of Goals in the July 2002 Corrective Action Plan to 
Goals in the December 2003 Performance Tool:

Key area of management weakness: Roles, responsibilities, 
accountability, authority; 
Original goals from corrective action plan, July 2002: 
(1) Improving trend in quality and schedule performance; 
Goals related to the corrective action plan in projectwide performance 
tool, December 2003: 
(1) Amount of actual work completed is 98 to 115 percent of the amount 
of work scheduled; 
GAO comments/observations: Quality performance is not included in new 
goal. Partial improvement for schedule - quantitative measure added, 
time frame to meet and sustain goal lacking.

Key area of management weakness: Roles, responsibilities, 
accountability, authority; 
Original goals from corrective action plan, July 2002: 
(2) Decreasing trend in quality problems related to roles and 
responsibilities; 
Goals related to the corrective action plan in projectwide performance 
tool, December 2003: Goal is not covered in performance tool; 
GAO comments/observations: Not applicable.

Key area of management weakness: Quality assurance process; 
Original goals from corrective action plan, July 2002: 
(3) Number of high-priority (significant) quality problems that are 
self-identified is at least 80 percent of all significant quality 
problems; 
Goals related to the corrective action plan in projectwide performance 
tool, December 2003: 
(2) At least 80 percent of quality problems are self-identified; 
GAO comments/observations: Quantitative measure remains the same; 
goal is no longer focused on high-priority problems. Time frame to meet 
and sustain goal lacking.

Key area of management weakness: Quality assurance process; 
Original goals from corrective action plan, July 2002: 
(4) Decreasing trend in average time to resolve significant quality 
problems and in number of delinquent corrective actions for significant 
quality problems; 
Goals related to the corrective action plan in projectwide performance 
tool, December 2003: 
(3) At least 90 percent of quality problems are closed in 60 days; 
(4) At least 90 percent of significant quality problems are closed in 
100 days; 
(5) At least 80 percent of all problems are screened in 5 days; 
(6) At least 90 percent of all problems have corrective action plans in 
30 days; 
(7) At least 80 percent of corrective action plans are on schedule; 
(8) 1 to 1.2 ratio of new problems to closed problems; 
GAO comments/observations: Partial improvement - quantitative measures 
added, time frames to meet and sustain goals lacking.

Key area of management weakness: Written procedures; 
Original goals from corrective action plan, July 2002: 
(5) Decreasing number of quality problems related to ineffective 
procedures; 
Goals related to the corrective action plan in projectwide performance 
tool, December 2003: 
(9) 15 percent or less of all quality problems are based on ineffective 
procedures; 
GAO comments/observations: Partial improvement - quantitative measure 
added, time frame to meet and sustain goal lacking.

Key area of management weakness: Written procedures; 
Original goals from corrective action plan, July 2002: 
(6) Decreasing trend in time needed to revise procedures; 
Goals related to the corrective action plan in projectwide performance 
tool, December 2003: 
(10) Procedure revisions are made in 75 days or less; 
GAO comments/observations: Partial improvement - quantitative measure 
added, time frame to meet and sustain goal lacking.

Key area of management weakness: Written procedures; 
Original goals from corrective action plan, July 2002: 
(7) Decreasing trend in average age of interim procedure changes; 
Goals related to the corrective action plan in projectwide performance 
tool, December 2003: 
(11) Interim procedure changes are made in less than 15 days; 
GAO comments/observations: Partial improvement - quantitative measure 
added, time frame to meet and sustain goal lacking.

Key area of management weakness: Corrective action plans; 
Original goals from corrective action plan, July 2002: 
(8) Decreasing trend in number of repetitive quality problems; 
Goals related to the corrective action plan in projectwide performance 
tool, December 2003: 
(12) 5 percent or less of all corrective actions still have quality 
problems; 
(13) 5 percent or less of all quality problems are repeated; 
GAO comments/observations: Partial improvement - quantitative measures 
added, time frames to meet and sustain goals lacking.

Key area of management weakness: Corrective action plans; 
Original goals from corrective action plan, July 2002: 
(9) Decreasing trend in average time to resolve significant quality 
problems; 
Goals related to the corrective action plan in projectwide performance 
tool, December 2003: 
(same as goals 3 and 4 in this column); 
GAO comments/observations: Partial improvement - quantitative measure 
added, time frames to meet and sustain goals lacking.

Key area of management weakness: Corrective action plans; 
Original goals from corrective action plan, July 2002: 
(10) Less than 10 percent of quality problems are resolved late; 
Goals related to the corrective action plan in projectwide performance 
tool, December 2003: 
(same as goals 3 and 4 in this column); 
GAO comments/observations: Partial improvement - quantitative measures 
added, time frames to meet and sustain goals lacking.

Key area of management weakness: Work environment; 
Original goals from corrective action plan, July 2002: 
(11) Decreasing number of substantiated employee concerns for 
harassment, retaliation, intimidation, and discrimination; 
Goals related to the corrective action plan in projectwide performance 
tool, December 2003: 
(14) Zero concerns related to harassment, intimidation, retaliation, or 
discrimination are substantiated; 
GAO comments/observations: Partial improvement - quantitative measure 
added, time frame to meet and sustain goal lacking.

Key area of management weakness: Work environment; 
Original goals from corrective action plan, July 2002: 
(12) Evaluation of routine employee concerns in less than 30 days, or 
90 days for complex employee concerns involving harassment or 
intimidation; 
Goals related to the corrective action plan in projectwide performance 
tool, December 2003: 
(15) 25-day or less response time for routine concerns; 
(16) 80-day or less response time for complex concerns or harassment, 
retaliation, intimidation, or discrimination concerns; 
GAO comments/observations: Partial improvement - quantitative measures 
changed, time frames to meet and sustain goals lacking.

Key area of management weakness: Work environment; 
Original goals from corrective action plan, July 2002: 
(13) External evaluation of work environment shows positive changes; 
Goals related to the corrective action plan in projectwide performance 
tool, December 2003: 
(17) At least 80 percent favorable response rates to six employee 
survey questions; 
GAO comments/observations: Partial improvement - quantitative measure 
added, time frame to meet and sustain goal lacking. 

Source: GAO analysis of DOE data.

Note: Performance goals in projectwide tool related to the corrective 
action plan represent a small fraction of the more than 200 goals being 
used for the project.

[End of table]

DOE Is Unable to Evaluate the Effectiveness of Corrective Actions:

DOE has recently assessed the implementation of corrective actions, but 
it has not yet assessed the effectiveness of these actions in 
correcting recurring problems. In December 2003, DOE outlined the 
approach it used to determine whether corrective actions have been 
implemented.[Footnote 15] This approach is part of the overall process 
described in the 2002 action plan--appendix III provides an overview of 
the action plan and the status of the process. To determine if 
corrective actions had been implemented, DOE relied on the collective 
judgment of project managers regarding how effectively they have 
incorporated corrective actions into their regular project activities. 
A March 2004 DOE review analyzed the implementation of corrective 
actions for each of the management weaknesses but was not able to 
evaluate the effectiveness of the corrective actions.[Footnote 16]

DOE's March 2004 review noted strong management commitment to 
improvement and described recent actions taken to ensure that work 
products meet quality objectives for a successful license application. 
However, the review identified continuing weaknesses in DOE's ability 
to determine the effectiveness of the actions it has taken. The review 
team attempted to measure how effectively DOE had met each of the 
plan's original 13 goals. The team was unable to measure whether 10 of 
the 13 goals had been met, but concluded that the project had met 2 of 
the goals and made progress toward another goal, based on an analysis 
of trends in quality problems identified. However, these conclusions 
were not based on an evaluation of quantifiable goals with time frames 
for meeting and sustaining the desired performance. The review also 
concluded that the performance indicators developed to evaluate the 
success of the actions lacked the level of objectivity and testing 
needed to measure effectiveness and that some lacked the data needed to 
assess effectiveness. The review recommended that DOE continue its 
corrective actions and refine performance indicators so that the 
effectiveness of corrective actions in meeting the plan's goals can be 
more readily measured.

In April 2004, DOE notified NRC that it had completed, validated, and 
independently assessed the commitments it made in the 2002 corrective 
action plan, institutionalized the corrective actions, and established 
a baseline to foster and sustain continuous improvement. DOE officials 
stated they have achieved the initial goals of the 2002 plan through 
these actions. These officials indicated they would continue to refine 
and improve project tools used to evaluate the effectiveness of 
corrective actions. However, because of the limitations noted in its 
March 2004 review, DOE has not yet evaluated the effectiveness of 
corrective actions.

Conclusions:

Although DOE has worked for nearly 3 years to address recurring quality 
assurance problems, recent audits and assessments have found that 
problems continue with data, models, and software and that management 
weaknesses remain. As NRC has noted, quality assurance problems could 
delay the licensing process. Despite recurring quality problems, DOE 
has recently closed the corrective action reports for data and software 
and intends to close the corrective action report for models in August 
2004 without first evaluating the effectiveness of the corrective 
actions taken to address the problems in these areas. DOE also does not 
intend to improve the goals of the 2002 plan associated with management 
weaknesses so that they can be adequately measured. Instead, DOE 
continues to plan and implement further actions to correct its quality 
problems and management weaknesses. This approach provides no 
indication regarding when DOE may be in a position to show that 
corrective actions have been successful. Entering into the licensing 
phase of the project without resolving the recurring problems could 
impede the application process, which at a minimum could lead to time-
consuming and expensive delays while weaknesses are corrected and could 
ultimately prevent DOE from receiving authorization to construct a 
repository. Moreover, recurring problems could create the risk of 
introducing unknown errors into the design and construction of the 
repository that could lead to adverse health and safety consequences. 
Because it lacks evidence that its actions have been successful, 
DOE is not yet in a position to demonstrate to NRC that its quality 
assurance program can ensure the safe construction and long-term 
operation of the repository.

Recommendations for Executive Action:

To better evaluate the effectiveness of management actions in 
correcting recurring quality problems, we recommend that the Secretary 
of Energy direct the Director, Office of Civilian Radioactive Waste 
Management, to:

* revise the performance goals in the 2002 action plan to include 
quantifiable measures of the performance expected and time frames for 
achieving and maintaining this expected level of performance; and:

* close the 2002 plan once sufficient evidence shows that the recurring 
quality assurance problems and management weaknesses that are causing 
them have been successfully corrected.

Agency Comments and Our Evaluation:

We provided a draft of this report to DOE and NRC for their review and 
comments. DOE's written comments, which are reproduced in appendix IV, 
expressed disagreement with the report's findings and recommendations. 
DOE commented that the report did not properly acknowledge improvements 
the department has made in the quality assurance program; failed to 
properly characterize the 2002 Management Improvement Initiatives as a 
"springboard" to address management issues; did not consider DOE's use 
of the full range of performance indicators related to quality 
assurance; and mischaracterized the results of several independent, 
external reviews, taking a solely negative view of the findings.

We disagree with most of DOE's comments. Our draft report acknowledged 
that DOE has taken a number of actions to address past problems in the 
quality assurance program, but to ensure clarity on this point, we have 
incorporated additional language to this effect in the report. However, 
our primary focus for this review was to evaluate the effectiveness of 
DOE's corrective actions in addressing the recurring quality problems. 
Despite the many actions taken to improve the quality assurance 
program, the management weaknesses and quality problems with data, 
models, and software have continued, indicating that the corrective 
actions have not been fully effective. Regarding DOE's views on our 
treatment of the 2002 Management Improvement Initiatives, DOE itself 
characterized the initiative as a "comprehensive corrective action 
plan." DOE stated that the implementation of the plan has been 
successful based on the evidence that responsible managers have taken 
agreed-upon action. This approach can be misleading, however, because 
it does not incorporate a determination of whether these actions have 
been effective. In fact, DOE has not evaluated the effectiveness of 
these actions in solving recurring problems. DOE further stated that we 
did not consider the full range of performance indicators related to 
quality assurance that DOE uses to manage the project. We agree. We 
asked DOE staff to compare their new performance indicators to the 
goals in the 2002 plan, and those are the goals that we presented for 
comparison in table 1 of our report. A discussion of the hundreds of 
other goals was beyond the scope of our review and 
would not have added to an understanding of the overall problems with 
DOE's goals. Finally, we disagree with DOE's comment that we 
mischaracterized the results of recent independent reviews. We noted 
instances in these reports where improvements were found. However, we 
also devoted appropriate attention to evidence in these reports that 
addresses whether DOE's corrective actions have been effective. As our 
report states, the reports consistently found that these actions have 
not yet had their intended effect.

In NRC's written comments, reproduced in appendix V, the agency agreed 
with our conclusions but suggested that DOE be given the flexibility to 
choose alternative approaches to achieve and measure quality assurance 
program performance. We agree that alternative approaches could be used 
to measure performance; however, to ensure the success of any 
approaches, DOE must include objective measurements and time frames for 
reaching and sustaining desired performance and include an end point 
for closing out the corrective action plan.

Scope and Methodology:

To assess the status of DOE's corrective actions to resolve recurring 
quality problems, we reviewed audits and deficiency reports written by 
the program over the past 5 years that identified problems with data, 
models, and software. We did not independently assess the adequacy of 
data, models, and software, but rather relied on the results of the 
project's quality assurance audits. In addition, we reviewed numerous 
documents that NRC prepared as part of its prelicensing activities at 
Yucca Mountain, including observations of quality assurance audits, NRC 
on-site representative reports, and correspondence between NRC and DOE 
on quality matters. We also observed an out-briefing of a quality 
assurance audit to obtain additional knowledge of how quality problems 
are identified and reported. To document the status of actions taken, 
we reviewed evidence used by DOE's Office of Civilian Radioactive Waste 
Management to prove corrective actions had been implemented and 
interviewed officials with DOE, at the Yucca site and in headquarters, 
and officials with Bechtel, the primary contractor. We also reviewed 
the results of four DOE assessments completed in the fall of 2003 that 
included the quality assurance program, interviewing the authors of the 
assessment reports to obtain a clear understanding of the problems 
identified. We attended quarterly meetings held between DOE and NRC to 
discuss actions taken under the plan and met with representatives of 
the State of Nevada Agency for Nuclear Projects and with 
representatives of the Nuclear Waste Technical Review Board, which was 
established to advise DOE on scientific and technical aspects of the 
Yucca Mountain project.

To determine the adequacy of DOE's plan to measure the effectiveness of 
the actions it has taken, we examined the July 2002 corrective action 
plan and subsequent project performance measurement documents to 
determine how DOE intended to use goals and performance measures to 
evaluate the plan's effectiveness. We asked Bechtel officials to assist 
us in identifying and matching performance goals in the projectwide 
performance measurement tool with those in the 2002 corrective action 
plan. We compared DOE's approach in its corrective action plan and 
subsequent projectwide tool with GAO and OMB guidance on performance 
measurement. We discussed the implementation of the corrective action 
plan and methods for measuring its effectiveness with various DOE and 
NRC officials and DOE contractors in Washington, D.C., and at the Yucca 
Mountain project office in Las Vegas, Nevada. We also interviewed other 
GAO personnel familiar with performance measurement to more fully 
understand the key elements needed for effective assessments.

We will send copies of this report to the appropriate congressional 
committees, the Secretary of Energy, and the Chairman of the Nuclear 
Regulatory Commission. We will also make copies available to others on 
request. In addition, this report will be available at no charge on the 
GAO Web site at [Hyperlink, http://www.gao.gov].

If you or your staffs have any questions about this report, please call 
me at (202) 512-3841. Major contributors to this report are listed in 
appendix VI.

Signed by: 

Robin M. Nazzaro 
Director, Natural Resources and Environment:

[End of section]

Appendixes: 

Appendix I: Role of Quality Assurance in the Licensing Process:

After the Department of Energy (DOE) submits its license application to 
the Nuclear Regulatory Commission (NRC), NRC will review it to 
determine whether all NRC requirements have been met and whether the 
repository is likely to operate safely as designed. NRC's review will 
be guided by its Yucca Mountain Review Plan, which NRC developed to 
ensure the quality, uniformity, and consistency of NRC reviews of the 
license application and of any requested amendments.[Footnote 17] The 
review plan is not a regulation, but it does incorporate the licensing 
criteria contained in federal regulations.[Footnote 18] DOE's 
application is to include general, scientific, and administrative 
information contained in two major sections: (1) a general information 
section that provides an overview of the engineering design concept for 
the repository and describes aspects of the Yucca Mountain site and its 
environs that influence repository design and performance, and (2) a 
detailed safety analysis section that provides a review of compliance 
with regulatory performance objectives that are based on permissible 
levels of radiation doses to workers and the public, established on the 
basis of acceptable levels of risk. The general information section 
covers such topics as proposed schedules for construction, receipt, and 
emplacement of waste; the physical protection plan; the material 
control and accounting program; and a description of site 
characterization work. The detailed safety analysis is the major 
portion of the application and includes DOE's detailed technical basis 
for the following areas:

* the repository's safety performance before permanent closure in 100 
to 300 years;

* the repository's safety performance in the 10,000 years after 
permanent closure, on the basis of the "performance assessment" 
computer model;

* a research and development program describing safety features or 
components for which further technical information is required to 
confirm the adequacy of design and engineered or natural barriers;

* a performance confirmation program that includes tests, experiments, 
and analyses that evaluate the adequacy of information used to 
demonstrate the repository's safety over thousands of years; and:

* administrative and programmatic information about the repository, 
such as the quality assurance program, records and reports, training 
and certification of personnel, plans for start-up activities, 
emergency planning, and control of access to the site.

After DOE submits the license application (currently planned for 
December 2004), NRC plans to take 90 days to examine the application 
for completeness to determine whether DOE has addressed all NRC 
requirements in the application. One of the reviews for completeness 
will include an examination of DOE's documentation of the quality 
assurance program to determine if it addresses all NRC criteria. These 
criteria include, among other things, organization, design and document 
control, corrective actions, quality assurance records, and quality 
audits. If it deems any part of the application incomplete, NRC may 
either reject the application or require that DOE furnish the necessary 
documentation before proceeding with the detailed technical review of 
the application. If it deems the application complete, NRC will docket 
the application, indicating its readiness for a detailed technical 
review.

Once the application is docketed, NRC will conduct a detailed technical 
review of the application over the following 18 months to determine if the 
application meets all NRC requirements, including the soundness of 
scientific analyses and preliminary facility design, and the NRC 
criteria established for quality assurance. If NRC discovers problems 
with the technical information used to support the license application, 
it may conduct specific inspections to determine the extent and effect 
of the problem. Because the data, models, and software used in modeling 
repository performance are integral parts of this technical review, 
quality assurance plays a key role since it is the mechanism used to 
verify the accuracy of the information DOE presents in the application. 
NRC may conduct inspections of the quality assurance program if 
technical problems are identified that are attributable to quality 
problems. According to NRC, any technical problems and subsequent 
inspections could delay the licensing of the repository or, in rare 
instances, ultimately lead to rejection of the application. NRC will hold 
public hearings chaired by its Atomic Safety and Licensing Board to 
examine specific topics. Finally, within 3 to 4 years from the date 
that NRC dockets the application, NRC will make a decision to grant the 
application, reject the application, or grant it with 
conditions.[Footnote 19] Figure 1 presents the licensing process and 
timeline.
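
As a rough worked timeline--an illustration only, assuming the planned 
December 2004 submission and the nominal review periods described 
above, with no extensions--the milestones would fall approximately as 
follows:

\begin{align*}
\text{application submitted} &\approx \text{December 2004};\\
\text{docketing, after the 90-day completeness review} &\approx \text{March 2005};\\
\text{detailed technical review, 18 months} &\approx \text{through September 2006};\\
\text{decision, 3 to 4 years after docketing} &\approx \text{March 2008 to March 2009}.
\end{align*}

Quality-related inspections, or a fourth year added for hearings, would 
push these dates later.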

Figure 1: License Application Review Process and Timeline:

[See PDF for image]

[End of figure]

[End of section]

Appendix II: Employee Concerns Programs at the Yucca Mountain Project:

DOE and Bechtel/SAIC Company, LLC (Bechtel), have each established an 
employee concerns program to allow employees to raise concerns about 
the work environment without fear of reprisal. NRC requires licensees 
to establish a safe work environment where (1) employees are encouraged 
to raise concerns either to their own management or to NRC without fear 
of retaliation and (2) employees' concerns are resolved in a timely and 
appropriate manner according to their importance. DOE and contractor 
employees at Yucca Mountain have various means through which to raise 
concerns about safety, quality, or the work environment, including:

* normal supervisory channels;

* a corrective action program--a process through which any employee can 
formally cite a problem on the project, including a problem with the 
work environment, for investigation and corrective action;

* a DOE or contractor employee concerns program; or:

* filing a concern directly with NRC.

NRC encourages, but does not require, licensees to establish employee 
concerns programs. Both the DOE and Bechtel concerns programs at Yucca 
Mountain have three main steps:

1. An employee notifies concerns program staff about an issue that he 
or she feels should be corrected, such as a safety or health issue, 
harassment or retaliation, or a quality assurance problem.

2. The concerns program staff documents and investigates the employee's 
concern.

3. The concerns program notifies the employee of the results of the 
investigation and notifies management of any need for corrective 
actions.

DOE and Bechtel each have established a communication network to allow 
employees to register concerns. These networks include brochures and 
regular newsletters on the program and numerous links to the program on 
the contractor's intranet, where employees can obtain concerns program 
forms online.

Recent statistics released by DOE show that most of the 97 concerns 
investigated by the DOE and Bechtel concerns programs in 2003 related 
to complaints against management. A summary of the concerns 
investigated in 2003 is shown in table 2.

Table 2: Employee Concerns Investigated by DOE and Bechtel in 2003:

Category of concern: Management problems or claims of mismanagement; 
Substantiated concerns: 26; Concerns not substantiated: 24; Total 
number of concerns: 50.

Category of concern: Human resources; 
Substantiated concerns: 8; 
Concerns not substantiated: 6; 
Total number of concerns: 14.

Category of concern: Harassment, intimidation, retaliation, or 
discrimination; 
Substantiated concerns: 4; 
Concerns not substantiated: 8; 
Total number of concerns: 12.

Category of concern: Quality; 
Substantiated concerns: 5; 
Concerns not substantiated: 3; 
Total number of concerns: 8.

Category of concern: Fraud, waste, and abuse; 
Substantiated concerns: 2; 
Concerns not substantiated: 1; 
Total number of concerns: 3.

Category of concern: Safety; 
Substantiated concerns: 0; 
Concerns not substantiated: 2; 
Total number of concerns: 2.

Category of concern: Equal employment opportunity; 
Substantiated concerns: 0; 
Concerns not substantiated: 1; 
Total number of concerns: 1.

Category of concern: Security; 
Substantiated concerns: 0; 
Concerns not substantiated: 0; 
Total number of concerns: 0.

Category of concern: Health; 
Substantiated concerns: 0; 
Concerns not substantiated: 0; 
Total number of concerns: 0.

Category of concern: Environment; 
Substantiated concerns: 0; 
Concerns not substantiated: 0; 
Total number of concerns: 0.

Category of concern: Workplace violence; 
Substantiated concerns: 0; 
Concerns not substantiated: 0; 
Total number of concerns: 0.

Category of concern: Other; 
Substantiated concerns: 5; 
Concerns not substantiated: 2; 
Total number of concerns: 7.

Category of concern: Total; 
Substantiated concerns: 50; 
Concerns not substantiated: 47; 
Total number of concerns: 97. 

Source: DOE.

Note: Three concerns filed in 2003 were not included in this table. A 
concerns program official told us that two of these concerns were 
addressed by other organizations, and the resolution of the remaining 
concern was limited to providing information to management, as 
requested by the concerned individual.

[End of table]
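
Using the table's own totals, a back-of-the-envelope check of the 
characterization above--that most concerns related to complaints 
against management--is:

\[
\frac{50 \text{ management-related concerns}}{97 \text{ concerns investigated}} \approx 52 \text{ percent}.
\]

Management-related complaints thus made up a slim majority of the 
concerns investigated in 2003; coincidentally, the overall 
substantiation rate was also 50 of 97, or about 52 percent.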

[End of section]

Appendix III: 2002 Corrective Action Plan Process and Status:

DOE has established a process for completing corrective actions 
associated with the 2002 corrective action plan and evaluating their 
effectiveness. According to this process, after managers report they 
have taken actions to correct management weaknesses and specific 
problems with models and software, a confirmation team of DOE and 
contractor personnel verifies that the actions have been completed. Once 
this step is completed, DOE conducts internal and external 
effectiveness reviews to determine if the actions have been effective 
in correcting the reported conditions. After the reviews of 
effectiveness, the results are assessed and reported to the Director of 
the Office of Civilian Radioactive Waste Management (OCRWM). The 
director then notifies NRC officials of the results of the 
effectiveness reviews, and the 2002 corrective action plan is closed. 
Figure 2 shows the corrective action plan process and the status of 
each step.

Figure 2: 2002 Corrective Action Plan Process and Status:

[See PDF for image]

[End of figure]

[End of section]

Appendix IV: Comments from the Department of Energy:

Department of Energy 
Washington, DC 20585:

April 19, 2004:

QA: N/A:

Ms. Robin Nazzaro:

Director, Natural Resources and Environment 
U.S. General Accounting Office:
441 G Street, NW 
Washington, D.C. 20548:

Dear Ms. Nazzaro:

Thank you for the opportunity to provide comments on the General 
Accounting Office (GAO) draft report, "Persistent Quality Assurance 
Problems Could Delay Repository Licensing and Operation." 
Unfortunately, the Department must respectfully disagree with the 
report's findings and recommendations, and therefore with the 
conclusion that forms its title.

I want to emphasize that the Office of Civilian Radioactive Waste 
Management (OCRWM) has shaped its quality assurance (QA) program to be 
consistent with Nuclear Regulatory Commission (NRC) requirements and 
standard industry practices. The role of QA is to verify that 
activities important to safety and waste isolation have been correctly 
performed. Detailed procedures address how technical work is documented 
so that the work and its results are reproducible, retrievable, 
transparent, and traceable. QA checks, audits, and inspections identify 
variances from procedures, and corrective actions are managed through a 
structured process. A nuclear QA program is effective if all personnel 
(not just those assigned to QA) proactively identify conditions that 
may affect quality or require attention, and if the organization can 
plan, implement, complete, and verify appropriate corrective actions in 
a timely manner. Evaluated against these criteria, I believe the OCRWM 
QA program has made significant progress and is operating effectively.

In the Department's view, the major deficiencies of the draft report 
are as follows:

* The report authors did not acknowledge clear QA improvements we have 
made. By the measures of effectiveness that are important in the 
nuclear regulatory context, OCRWM has made substantial QA improvements. 
A few examples:

In the last 15 months, self-identification by line management of 
conditions adverse to quality has increased by approximately 100 
percent (Reference 1). This is a very positive development, showing 
that line managers are consistently reviewing their own work and 
ensuring that it is properly documented, reproducible, retrievable, 
transparent, and traceable, prior to being finalized.

Some corrective actions require the modification of procedures (e.g., 
to better define requirements for documentation of scientific work), 
and the procedural change itself must follow strict QA processes. We 
have improved our ability to ensure that QA procedures are appropriate 
to the quality objectives they support, reducing the average time it 
takes to modify a procedure from a past average of seven months to a 
current average of three months (Reference 2).

Line managers are increasingly effective in developing corrective 
actions for identified issues. Since October 2003, 75% of line 
managers' corrective action plans were accepted upon the first review 
by QA personnel (Reference 1).

Our ability to address and close quality issues was demonstrated by the 
closure of two major, longstanding Corrective Action Reports on data 
management and software qualification in March 2004.

Our safety conscious work environment - in the nuclear context, an 
environment where employees feel free to raise concerns about quality 
or safety without fear of reprisal - has been strengthened, as shown by 
internal surveys, performance indicators, and a comprehensive 
independent survey conducted in August 2003 by International Survey 
Research (Reference 3). That independent survey characterized OCRWM as 
"significantly and largely more positive" in the area of safety 
conscious work environment than other Federal agencies associated with 
research and technology.

* The report authors failed to properly characterize the 2002 Management 
Improvement Initiatives (MII) effort, which is referred to inaccurately 
throughout the GAO draft report as the "2002 corrective action plan," 
and its relationship to ongoing QA and management activities. The draft 
report asserts that although the 2002 MII is complete, there are 
"lingering quality problems with data, models, and software, and 
continuing management weaknesses." As noted above, the corrective 
actions associated with data management and software were verified and 
closed in March 2004. The corrective action report on validation of 
model reports is on track for closure within four months. With regard 
to alleged management weaknesses, the 2002 MII was initiated by the 
Department as an aggressive "springboard" effort to address management 
issues and transition improvements into day-to-day line management 
activities. The 2002 MII addressed five areas: roles, responsibilities, 
authority, and accountability; quality assurance programs and 
processes; program procedures; the Corrective Action Program; and 
safety conscious work environment. The focus on roles, 
responsibilities, authority, and accountability was important because 
lack of clarity in those areas had been cited in the past as a root 
cause of QA problems. Implementation of the 2002 MII has been 
successful: responsible managers demonstrated with objective evidence 
that they completed the commitments set out in MII action plans, and an 
independent assessment by Longenecker and Associates confirmed that the 
MII action statements had been appropriately completed (References 4, 
5, and 6). On 
April 5, 2004, I wrote to the NRC to indicate that we have completed, 
validated, and independently assessed MII implementation and have 
transitioned the 2002 MII goals to ongoing line management activities 
(Reference 7).

* The full range of performance indicators used by OCRWM to manage QA-
related issues was not considered. The draft report suggests that the 
Department cannot assess the effectiveness of the 2002 MII because 
performance goals lack objective measures and timeframes. The 
effectiveness indicators that we defined as part of the 2002 MII were 
management metrics that supported improvement goals by setting high 
expectations and describing a desired future state to work toward. Work 
execution metrics, by contrast, quantify performance and set timeframes 
as appropriate - GAO included a small selection of these metrics in its 
chart on page 20 of the draft. OCRWM has in fact more than 300 
performance indicators that we use to assess progress and identify 
issues on a continuous basis (Reference 8). Performance indicators are 
evaluated in detail at Monthly Operating Reviews (Reference 9).

These elements of the OCRWM management tool inventory were not 
adequately addressed in the draft GAO report.

* The draft report mischaracterizes the results of several independent, 
external reviews. The report does not acknowledge the positive findings 
that external evaluators have made in several independent assessments 
and seems to take a solely negative view of the recommendations made by 
those evaluators (References 6, 10, 11, 12). We view the identification 
of issues as positive opportunities that should be routinely sought, 
listened to, and acted upon. Where GAO sees "continuing problems," we 
see a measurable record of progress to date and a commitment to 
continuing improvement in the future. It is understood by the 
Department, by the NRC, and by knowledgeable outside observers that the 
repository program must meet rigorous quality assurance expectations 
for our license application to be acceptable to the Commission. The 
fact is, we are on schedule to submit our license application in 
December 2004, and we have an effective quality assurance program in 
place that will enable us to meet that objective.

In summary, we have demonstrated steady and significant progress. We 
initiated the MII in 2002 to provide a special focus on specific 
improvement targets; we achieved the objectives of MII and have 
transitioned improvement initiatives to day-to-day management. Our 
continuous improvement culture means that we expect progress to 
continue, and our performance metrics enable us to assess that progress 
and direct management attention as needed. Based on these facts, we 
cannot concur with the findings and recommendations of the draft 
report.

I urge GAO to further examine available information about our quality 
assurance program, performance indicators, safety conscious work 
environment, and other relevant aspects of the Program. Some highly 
pertinent information that was available during the time of GAO's 
audit, between April 2003 and March 2004, is not reflected in the draft 
report. More recent documentation - for instance, the MII Independent 
Review Report, which was published on March 19, 2004 - is also 
significant. The enclosed list of references identifies documents that, 
we believe, are critical for GAO to review and fully consider prior to 
working further on the draft report. Without full consideration of this 
information, GAO's findings on the Department's progress in addressing 
quality assurance issues are incomplete, and its conclusions are 
broadly inaccurate.

I strongly urge you to review and incorporate additional information in 
your final report. You are welcome to revisit our offices, and we will 
provide any documentation you may require.

Sincerely,

Signed by: 

Margaret S.Y. Chu, Ph.D. 
Director:

Office of Civilian Radioactive Waste Management:

Enclosure:

ENCLOSURE: LIST OF REFERENCES:

1. DOE/NRC Quarterly Quality Assurance Meeting report, February 18, 
2004.

2. Metric Definition Sheet 2.5.1.2 (part of Yucca Mountain Project 
Performance Indicators Database), March 2004.

3. International Survey Research, Survey Summary Report, October 2003.

4. Memorandum, John Arthur to Margaret Chu, April 2, 2004.

5. Management Improvement Initiatives Transition Approach, December 
2003.

6. Longenecker and Associates, Inc., Management Improvement Initiatives 
Independent Review Report, March 19, 2004.

7. Letter, Margaret Chu to Martin Virgilio, April 5, 2004.

8. Yucca Mountain Project Performance Indicators Database, ongoing 
internal management tool.

9. Office of Repository Development, Monthly Operating Review 
Annunciator Panel, ongoing internal management tool.

10. DOE Office of Independent Oversight and Performance Assurance, 
Management Assessment of the Office of Repository Development, November 
2003.

11. D.L. English Consulting, Inc., FY 2003 Quality Assurance Management 
Assessment of the 
Office of Civilian Radioactive Waste Management, November 2003.

12. Booz Allen Hamilton, Performance Management Assessment: DOE Office 
of Civilian Radioactive Waste Management, September 30, 2003.


The following are GAO's comments on the Department of Energy's letter 
dated April 19, 2004.

GAO Comments:

1. We disagree. Our report states that the recent independent 
assessments have shown improvements in the key management areas 
identified in the 2002 corrective action plan. However, the assessments 
also showed that problems remain in these areas and thus the corrective 
actions have not yet been successful in correcting these weaknesses. 
DOE's examples of progress illustrate our point regarding improperly 
specified goals. For example, DOE states in its comments that line 
management's self-identification of conditions adverse to quality has 
increased approximately 100 percent in the last 15 months (as opposed 
to the identification of such conditions by quality assurance 
auditors). However, despite this seemingly dramatic increase, DOE has 
yet to meet its goal of line management's self-identifying 80 percent 
of all quality problems. (DOE's 100 percent increase brought them up to 
about 50 percent of all quality problems being self-identified by line 
managers.) Further, the goal continues to lack a time frame for when 
the 80 percent goal should be attained and for how long it should be 
sustained before the corrective actions can be judged successful. As 
our report points out, without such specificity, improvements cannot be 
evaluated in terms of overall success.
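
The arithmetic implied by the percentages above--reconstructed here 
only as an illustration--shows the size of the remaining gap. If $b$ is 
the baseline share of quality problems self-identified by line 
management 15 months earlier, then

\[
b \times (1 + 100\%) \approx 50\% \quad\Rightarrow\quad b \approx 25\%,
\]

so even after doubling, line management's self-identification rate 
remains roughly 30 percentage points short of the 80 percent goal.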

2. We disagree. The 2002 Management Improvement Initiatives document 
clearly states that it was a "comprehensive corrective action plan 
necessary to 
address weaknesses in the implementation of [DOE's] quality assurance 
requirements and attain a level of performance expected of an NRC 
license applicant." Contrary to DOE's assertion, the initiative does 
not indicate it was a "springboard effort to address management issues 
and transition improvements into day-to-day line management 
activities." Although the transitioning of improvements to the line is 
laudable, the initiative focused on implementing corrective actions and 
evaluating the effectiveness of the actions in correcting problems. 
This approach is consistent with DOE's criteria for correcting 
significant conditions adverse to quality, and these are the criteria we 
relied on to determine whether the corrective actions specified in the 
initiatives were successful.

3. We agree. We did not include the full range of performance 
indicators (goals) that have recently been developed--and that continue 
to change--to assess the 2002 plan. Instead, of the hundreds of indicators 
that are being developed to manage the project, we relied on those few 
that Bechtel officials told us were connected to the goals of the 2002 
plan. As table 1 shows, some improvements have been made in specifying 
the quantitative aspects of the goals, but weaknesses continue to exist 
in the new goals. In fact, table 1 shows that DOE no longer has a goal 
in its performance tool that specifically tracks the trend in problems 
related to roles and responsibilities. This omission is particularly 
important because the area of roles and responsibilities was noted in 
the 2002 plan as one of the biggest sources of problems in the quality 
assurance process, and, as the recent assessments have found, this is 
an area with continuing problems.

4. We disagree. We acknowledge that these reviews found positive 
improvements in a number of management areas. However, we also note 
that continuing problems were found with management weaknesses despite 
all corrective actions having been implemented in 2003.

5. While DOE believes that it has achieved the objectives of the 2002 
plan, it lacks evidence that its actions have been effective in 
addressing the management weaknesses and correcting the recurring 
quality problems with data, models, and software. Evaluating 
performance against measurable goals with time frames for meeting and 
sustaining the goals would provide the needed evidence.

6. The draft report that we sent to DOE for review included reviews of 
9 of the 12 documents listed in the enclosure of DOE's letter. We have 
since reviewed the 3 remaining documents. The information in the 3 
documents did not change our assessment of DOE's efforts to correct its 
quality assurance program.

After full consideration of the information included in DOE's comments, 
we believe that our findings are complete and our conclusions are 
accurate.

[End of section]

Appendix V: Comments from the Nuclear Regulatory Commission:

UNITED STATES NUCLEAR REGULATORY COMMISSION 
WASHINGTON, D.C. 20555-0001:

April 16, 2004:

Ms. Robin M. Nazzaro 
Director, Science Issues 
Natural Resources and Environment 
United States General Accounting Office 
441 G Street, NW:

Washington, D.C. 20548:

Dear Ms. Nazzaro:

I would like to thank you for the opportunity to review and submit 
comments on the draft report, "YUCCA MOUNTAIN: Persistent Quality 
Assurance Problems Could Delay Repository Licensing and Operation" 
(GAO-04-460). The U.S. Nuclear Regulatory Commission (NRC) appreciates 
the time and effort that you and your staff have taken to review this 
important topic.

The NRC agrees with the GAO conclusion that the Department of Energy 
(DOE) should continue to improve the Quality Assurance Program for the 
proposed Yucca Mountain Repository. With respect to the specific GAO 
recommendations contained in the draft report, NRC suggests that DOE be 
given the flexibility to choose alternative approaches to achieve and 
measure improved Quality Assurance Program performance, since 
alternatives may be more suitable for the situation as DOE nears, then 
moves beyond, submittal of the license application.

Two minor clarifying comments on the draft report are enclosed. If you 
have any questions, please contact Mr. Tom Matuia at (301) 415-6700 or 
Mr. Ted Carter at (301) 415-6684, of my staff.

Sincerely,

Signed by: 

William D. Travers 
Executive Director for Operations:

Enclosure:

Specific Comments on Draft Report GAO-04-460:

The following is GAO's comment on the U.S. Nuclear Regulatory 
Commission's letter dated April 16, 2004.

GAO Comment:

1. We agree that alternative approaches could be used to measure 
performance; however, to ensure the success of any approaches, DOE must 
include objective measurements and time frames for reaching and 
sustaining desired performance and include an end point for closing out 
the corrective action plan.

[End of section]

Appendix VI: GAO Contact and Staff Acknowledgments:

GAO Contact:

Daniel Feehan, (303) 572-7352:

Staff Acknowledgments:

In addition to the individual named above, Robert Baney, Lee Carroll, 
Thomas Kingham, Chalane Lechuga, Jonathan McMurray, Judy Pagano, 
Katherine Raheb, Anne Rhodes-Kline, and Barbara Timmerman made key 
contributions to this report.

(360268):


FOOTNOTES

[1] DOE and its subcontractor, Navarro Quality Services, which is a 
division of Navarro Research and Engineering, Inc., are responsible for 
carrying out quality assurance oversight activities, including 
conducting audits. DOE's primary contractor at the site, Bechtel/SAIC 
Company, LLC, is responsible for implementing DOE's quality assurance 
requirements related to ongoing project activities and for conducting 
audits of line activities. 

[2] Department of Energy, Office of Civilian Radioactive Waste 
Management, Management Improvement Initiatives (Washington, D.C.: July 
19, 2002).

[3] U.S. General Accounting Office, Nuclear Waste: Preliminary 
Observations on the Quality Assurance Program at the Yucca Mountain 
Repository, GAO-03-826T (Washington, D.C.: May 28, 2003).

[4] U.S. General Accounting Office, Nuclear Waste: Repository Work 
Should Not Proceed Until Quality Assurance Is Adequate, GAO/RCED-88-159 
(Washington, D.C.: Sept. 29, 1988).

[5] Department of Energy, Office of Civilian Radioactive Waste 
Management, Report for Performance-Based Audit OQAP-BSC-03-14 of 
Technical Product Inputs at Bechtel SAIC Company, LLC, September 8-19, 
2003 (Las Vegas, NV: Nov. 6, 2003).

[6] Department of Energy, Office of Civilian Radioactive Waste 
Management, Report for Performance-Based Audit OQAP-BSC-03-10 of 
Analysis Model Report Processes and Products at Bechtel SAIC Company, 
LLC, October 21-31, 2003 (Las Vegas, NV: Jan. 20, 2004).

[7] Department of Energy, Office of Civilian Radioactive Waste 
Management, Report for Audit OQAP-BSC-03-07 of Software and Software 
Activities at Bechtel SAIC Company, LLC, Lawrence Berkeley National 
Laboratory, and Lawrence Livermore National Laboratory, June 3-13, 2003 
(Las Vegas, NV: Aug. 13, 2003). 

[8] Booz Allen Hamilton, Inc., Performance Management Assessment: DOE 
Office of Civilian Radioactive Waste Management (Las Vegas, NV: Sept. 
30, 2003).

[9] D.L. English Consulting, Inc., FY 2003 Quality Assurance Management 
Assessment of the Office of Civilian Radioactive Waste Management 
(South Dartmouth, MA: October 2003).

[10] Department of Energy, Office of Independent Oversight and 
Performance Assurance, Management Assessment: Office of Repository 
Development (Washington, D.C.: November 2003).

[11] International Survey Research, OCRWM 2003 Safety Conscious Work 
Environment Survey (Walnut Creek, CA: Nov. 7, 2003).

[12] U.S. Nuclear Regulatory Commission, U.S. Nuclear Regulatory 
Commission Staff Evaluation of U.S. Department of Energy Analysis Model 
Reports, Process Controls, and Corrective Actions (Washington, D.C.: 
Apr. 7, 2004).

[13] U.S. General Accounting Office, Internal Control Standards: 
Internal Control Management and Evaluation Tool, GAO-01-1008G 
(Washington, D.C.: August 2001).

[14] Executive Office of the President, Office of Management and 
Budget, Circular No. A-123 (Washington, D.C.: June 24, 1995).

[15] Department of Energy, Office of Civilian Radioactive Waste 
Management, Management Improvement Initiatives Transition Approach, 
Revision 1 (Washington, D.C.: December 2003).

[16] Longenecker & Associates, Inc., under contract to Booz Allen 
Hamilton, Inc., OCRWM Management Improvement Initiatives (MII) 
Independent Review Report (Las Vegas, NV: Mar. 19, 2004).

[17] Nuclear Regulatory Commission, Office of Nuclear Material Safety 
and Safeguards, Yucca Mountain Review Plan Final Report, NUREG-1804, 
Revision 2 (Washington, D.C.: July 2003).

[18] U.S. Code of Federal Regulations, Title 10, Part 63.

[19] A fourth year can be added to the process if NRC decides that the 
additional time is needed for hearings.

GAO's Mission:

The General Accounting Office, the investigative arm of Congress, 
exists to support Congress in meeting its constitutional 
responsibilities and to help improve the performance and accountability 
of the federal government for the American people. GAO examines the use 
of public funds; evaluates federal programs and policies; and provides 
analyses, recommendations, and other assistance to help Congress make 
informed oversight, policy, and funding decisions. GAO's commitment to 
good government is reflected in its core values of accountability, 
integrity, and reliability.

Obtaining Copies of GAO Reports and Testimony:

The fastest and easiest way to obtain copies of GAO documents at no 
cost is through the Internet. GAO's Web site (www.gao.gov) contains 
abstracts and full-text files of current reports and testimony and an 
expanding archive of older products. The Web site features a search 
engine to help you locate documents using key words and phrases. You 
can print these documents in their entirety, including charts and other 
graphics.

Each day, GAO issues a list of newly released reports, testimony, and 
correspondence. GAO posts this list, known as "Today's Reports," on its 
Web site daily. The list contains links to the full-text document 
files. To have GAO e-mail this list to you every afternoon, go to 
www.gao.gov and select "Subscribe to e-mail alerts" under the "Order 
GAO Products" heading.

Order by Mail or Phone:

The first copy of each printed report is free. Additional copies are $2 
each. A check or money order should be made out to the Superintendent 
of Documents. GAO also accepts VISA and Mastercard. Orders for 100 or 
more copies mailed to a single address are discounted 25 percent. 
Orders should be sent to:

U.S. General Accounting Office
441 G Street NW, Room LM
Washington, D.C. 20548:

To order by Phone:

Voice: (202) 512-6000:

TDD: (202) 512-2537:

Fax: (202) 512-6061:

To Report Fraud, Waste, and Abuse in Federal Programs:

Contact:

Web site: www.gao.gov/fraudnet/fraudnet.htm
E-mail: fraudnet@gao.gov

Automated answering system: (800) 424-5454 or (202) 512-7470:

Public Affairs:

Jeff Nelligan, Managing Director, NelliganJ@gao.gov, (202) 512-4800:

U.S. General Accounting Office, 441 G Street NW, Room 7149, Washington, 
D.C. 20548: