This is the accessible text file for GAO report number GAO-11-127R 
entitled 'America COMPETES Act: It Is Too Early to Evaluate Programs' 
Long-Term Effectiveness, but Agencies Could Improve Reporting of High- 
Risk, High-Reward Research Priorities' which was released on October 
8, 2010. 

This text file was formatted by the U.S. Government Accountability 
Office (GAO) to be accessible to users with visual impairments, as 
part of a longer term project to improve GAO products' accessibility. 
Every attempt has been made to maintain the structural and data 
integrity of the original printed product. Accessibility features, 
such as text descriptions of tables, consecutively numbered footnotes 
placed at the end of the file, and the text of agency comment letters, 
are provided but may not exactly duplicate the presentation or format 
of the printed version. The portable document format (PDF) file is an 
exact electronic replica of the printed version. We welcome your 
feedback. Please E-mail your comments regarding the contents or 
accessibility features of this document to Webmaster@gao.gov. 

This is a work of the U.S. government and is not subject to copyright 
protection in the United States. It may be reproduced and distributed 
in its entirety without further permission from GAO. Because this work 
may contain copyrighted images or other material, permission from the 
copyright holder may be necessary if you wish to reproduce this 
material separately. 

GAO-11-127R: 

United States Government Accountability Office:
Washington, DC 20548: 

October 7, 2010: 

The Honorable John D. Rockefeller, IV:
Chairman:
The Honorable Kay Bailey Hutchison:
Ranking Member:
Committee on Commerce, Science, and Transportation:
United States Senate: 

The Honorable Jeff Bingaman:
Chairman:
The Honorable Lisa Murkowski:
Ranking Member:
Committee on Energy and Natural Resources:
United States Senate: 

The Honorable Bart Gordon:
Chairman:
The Honorable Ralph M. Hall:
Ranking Member:
Committee on Science and Technology:
House of Representatives: 

Subject: America COMPETES Act: It Is Too Early to Evaluate Programs' 
Long-Term Effectiveness, but Agencies Could Improve Reporting of High- 
Risk, High-Reward Research Priorities: 

Scientific and technological innovation and a workforce educated in 
advanced technology are critical to the long-term economic 
competitiveness and prosperity of the United States. In recent years, 
leaders in government, business, and education have reported their 
concerns that declining federal funding for basic scientific research 
could diminish the United States' future economic competitiveness. 
These leaders have also reported their concerns that our educational 
system is producing too few students trained in the fields of science, 
technology, engineering, and mathematics (STEM), a shortfall they 
believe may drive jobs in technical fields--followed by jobs in 
manufacturing, 
administration, and finance--from the United States to other countries. 

Congress passed the America Creating Opportunities to Meaningfully 
Promote Excellence in Technology, Education, and Science Act (COMPETES 
Act) of 2007[Footnote 1] with the overall goal of increasing federal 
investment in scientific research to improve U.S. economic 
competitiveness. To that end, the act also increased support for 
education in STEM fields. Specifically, the act authorized $33.6 
billion in appropriations from fiscal year 2008 through fiscal year 
2010 to be spent by four federal agencies: 

* the Department of Education, 

* the Department of Energy (DOE), 

* the National Institute of Standards and Technology (NIST) within the 
Department of Commerce, and: 

* the National Science Foundation (NSF). 

Within these four agencies, the act authorized funding for 24 new 
programs and the expansion of 20 existing programs to increase federal 
investment in basic scientific research and STEM education in the 
United States. The act also authorized the establishment of a new 
agency--the Advanced Research Projects Agency-Energy (ARPA-E)--within 
DOE to support transformational energy technology research projects to 
enhance the country's economic and energy security. 

In addition, the act established specific goals for some of the 
individual programs and included a number of reporting provisions. 
Section 1008 of the act expresses the sense of Congress that each 
executive agency conducting research in STEM fields should strive to 
support and promote innovation by setting a goal of allocating an 
appropriate percentage of its basic research budget toward funding 
high-risk, high-reward research. The act describes high-risk, high- 
reward research projects as those that (1) meet 
fundamental scientific or technical challenges, (2) involve 
multidisciplinary work, and (3) involve a high degree of novelty. 
[Footnote 2] With respect to agencies conducting basic STEM research, 
the COMPETES Act provides for the following actions: 

* Goal setting--Agencies are annually required to report whether they 
have set a percentage funding goal for high-risk, high-reward research. 

* Spending toward goal--Agencies that set such a goal must report 
whether the goal is being met by the agencies and describe the 
activities funded. 

* Manner of reporting--Agencies are required to report this 
information to Congress along with documents supporting their annual 
budget. 

The COMPETES Act requires GAO to evaluate, within 3 years following 
its enactment, the effectiveness of authorized programs. To satisfy 
this reporting requirement, we briefed your staffs on the results of 
our work on August 5, 2010, and this report provides additional 
details. Our reporting objectives for this review were to examine (1) 
the extent to which the four agencies that received funding have 
obligated and reported funding for new or expanded programs and 
activities and (2) the effectiveness of the new or expanded programs 
and activities in meeting the goals of the act.[Footnote 3] 

To examine the extent to which agencies have obligated funds and 
implemented programs, we reviewed the relevant provisions of the act, 
program documents, and budget information, and we interviewed agency 
officials. To evaluate the effectiveness of new or expanded programs, 
we reviewed a nongeneralizable, nonprobability sample of seven 
scientific research and education projects that illustrate authorized 
programs within each of the four agencies that received funding. We 
reviewed the mechanisms the agencies are using to measure the 
projects' effectiveness, and interviewed officials at research 
universities and a private company to learn how they evaluate research 
and education efforts. Enclosure I contains a more detailed 
description of our scope and methodology. 

We conducted this performance audit from March 2010 through October 
2010 in accordance with generally accepted government auditing 
standards. Those standards require that we plan and perform the audit 
to obtain sufficient, appropriate evidence to provide a reasonable 
basis for our findings and conclusions based on our audit objectives. 
We believe that the evidence obtained provides a reasonable basis for 
our findings and conclusions based on our audit objectives. 

Results in Brief: 

The four agencies that received funds authorized under the COMPETES 
Act obligated about $30 billion for new and expanded research and 
education programs and activities from fiscal year 2008 to fiscal year 
2010; scientific research obligations totaled about $27 billion and 
STEM education obligations totaled about $3 billion. Three of the four 
agencies we reviewed--DOE, NSF, and NIST--conducted basic scientific 
research. However, they did not consistently set a percentage funding 
goal to support high-risk, high-reward research--something that 
Congress provided they should do. In addition, two of these three 
agencies did not report this information with their annual budget 
submissions, as the law provides. Agency officials provided us with 
information indicating that they faced challenges in defining such 
research, and as a result, each agency applied the criteria in the 
act differently. In addition, the directors of the Office of 
Management and Budget (OMB) and the Office of Science and Technology 
Policy (OSTP) jointly issued a memo in August 2009 specifying that 
agencies conducting science research should explain in their budget 
submissions how they will support such activities and should 
cooperate with OMB and OSTP to develop datasets on federal science 
and technology investments. However, this memorandum did not direct 
agencies on how to do so or direct them to set a percentage goal. 

Because the new programs authorized and funded under the COMPETES Act 
have only recently received and obligated funding, and because of the 
difficulties we and others have reported as being inherent in 
measuring outcomes of research and educational programs, it is too 
early to assess the effectiveness of these programs. However, all four 
of the agencies we reviewed are taking steps to oversee the 
implementation of various projects and monitor their progress. For 
example, the agencies are collecting various types of project data to 
monitor progress toward cost, schedule, and program outputs. 

We are recommending that DOE, NSF, and NIST each set a goal for 
funding high-risk, high-reward research and that the agencies 
coordinate in doing so. We are also recommending that the agencies 
include information on high-risk, high-reward research with their 
annual budget requests, which are available to the public. 

In commenting on our draft report, Commerce agreed with both 
recommendations and NSF agreed with the recommendation to report to 
Congress. Additionally, OSTP, DOE, and NSF expressed concerns about 
aspects of our recommendation that agencies funded under the COMPETES 
Act identify a percentage goal for funding high-risk, high-reward 
research. Despite these concerns, we believe that agencies should seek 
to provide the information, as expressed in the sense of the Congress 
as provided in the act, and therefore believe this recommendation 
remains valid. 

Agencies Obligated about $30 Billion for Research and Education 
Programs but Have Not Consistently Reported about Their High-Risk, 
High-Reward Research Activities: 

The four agencies obligated funds and implemented new and expanded 
research and education programs and activities. However, the three 
agencies that conduct basic STEM research did not consistently report 
about their high-risk, high-reward research activities to Congress as 
the act provides they should. In addition, the directors of OMB and 
OSTP jointly issued a memo specifying that agencies conducting science 
research should explain in their budget submissions how they will 
support such activities. 

Research Obligations Totaled about $27 Billion, and Education 
Obligations Totaled about $3 Billion: 

We found that the four agencies obligated about $30 billion under the 
act--scientific research obligations totaled about $27 billion, and 
STEM education obligations totaled about $3 billion. Table 1 shows the 
distribution of these funds for the four agencies. 

Table 1: Obligations by Agency of Appropriations Authorized by the 
COMPETES Act: 

(Millions of current-year dollars): 

Scientific Research Programs Total: 
FY 2008 obligated: $5,753.2; 
FY 2009 obligated: $8,560.2; 
FY 2010 obligated[A]: $13,133.8; 
Total: $27,447.2. 

NSF: 
FY 2008 obligated: $5,020.1; 
FY 2009 obligated: $7,629.8; 
FY 2010 obligated[A]: $6,366.9; 
Total: $19,016.8. 

DOE[B]: 
FY 2008 obligated: 0; 
FY 2009 obligated: $4.2; 
FY 2010 obligated[A]: $5,350.8; 
Total: $5,355.0. 

NIST: 
FY 2008 obligated: $733.1; 
FY 2009 obligated: $926.2; 
FY 2010 obligated[A]: $1,416.1; 
Total: $3,075.4. 

STEM Education Programs Total: 
FY 2008 obligated: $768.3; 
FY 2009 obligated: $932.7; 
FY 2010 obligated[A]: $890.0; 
Total: $2,591.0. 

NSF: 
FY 2008 obligated: $766.3; 
FY 2009 obligated: $930.5; 
FY 2010 obligated[A]: $887.8; 
Total: $2,584.6. 

Department of Education: 
FY 2008 obligated: $2.0; 
FY 2009 obligated: $2.2; 
FY 2010 obligated[A]: $2.2; 
Total: $6.4. 

Total: $30,038.1. 

Source: GAO analysis of these agencies' data. 

[A] Fiscal year 2010 obligations include estimated obligations until 
the end of the fiscal year. Estimates were provided by DOE, NIST, and 
the Department of Education in July 2010, and by NSF in April 2010. 

[B] ARPA-E did not receive appropriations until fiscal year 2009. In 
addition, authorization of funding in the COMPETES Act for DOE's 
Office of Science applied only to fiscal year 2010; funding for the 
Office of Science was authorized for prior fiscal years including 
fiscal year 2008 and fiscal year 2009 in the Energy Policy Act of 2005 
(P.L. 109-58). 

[End of table] 
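
The totals in table 1 follow from simple addition of the agency rows. 
As an illustrative check that is not part of the original report, the 
short Python sketch below reproduces the research, education, and 
overall totals from the per-agency figures; summing the rounded rows 
yields $30,038.2 million, so the printed overall total of $30,038.1 
million reflects a small rounding difference in the underlying data. 

# Illustrative check of the table 1 totals (millions of current-year
# dollars); figures are the per-agency totals reported in the table.
research = {"NSF": 19016.8, "DOE": 5355.0, "NIST": 3075.4}
education = {"NSF": 2584.6, "Education": 6.4}

research_total = sum(research.values())    # 27447.2
education_total = sum(education.values())  # 2591.0
overall_total = research_total + education_total
print(round(research_total, 1), round(education_total, 1),
      round(overall_total, 1))             # 27447.2 2591.0 30038.2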

We found that the agencies funded a broad range of research from 
conceptual basic science to near-market efforts. For example, basic 
scientific research included NIST's research in precise measurement, 
which may result in a range of applications for advances in computing 
power and accurate timekeeping--advances that may in turn have broad 
economic benefits, such as improving electric power delivery or 
radiation detection. Near-market research included ARPA-E funding for 
projects designed for more specific applications, such as advancing 
residential energy efficiency or reducing the costs of generating 
renewable energy. 

The agencies' obligations for STEM education programs totaled about $3 
billion, and were primarily funded through NSF. These programs 
included the Robert Noyce Teacher Scholarship Program, which seeks to 
encourage STEM majors and professionals to become Kindergarten through 
12th grade (K-12) mathematics and science teachers. In addition, the 
Department of Education obligated about $6 million for the Teachers 
for a Competitive Tomorrow program, which is designed to help student 
teachers in STEM fields and critical foreign languages, and to help 
professionals earn master's degrees in teaching, which would in turn 
provide certified teachers to educate K-12 students.[Footnote 4] 

Agencies Have Not Consistently Reported Spending on High-Risk, High- 
Reward Research to Congress and Have Defined This Research Differently: 

The three agencies we reviewed that conducted basic scientific 
research in STEM fields have not consistently reported to Congress on 
spending supporting high-risk, high-reward research.[Footnote 5] 
Furthermore, we found that the agencies defined this research 
differently, which led to their using different approaches for 
identifying and reporting it. Specifically, for these 
three agencies, we found the following: 

DOE. DOE's Office of Science did not set a goal to fund high-risk, 
high-reward research, and reported this in its annual budget 
submissions. DOE officials noted that the agency has a long history of 
supporting basic research, which has resulted in significant 
scientific and other accomplishments. With respect to goal setting 
under the act, DOE's Office of Science reported to Congress that it 
did not set a percentage funding goal for high-risk, high-reward 
research. By doing so, DOE's Office of Science met the reporting 
requirement. However, information provided by the Office of Science 
during our review indicated that it considers a significant proportion 
of its research to be high-risk, high-reward. In our discussions with 
officials from the Office of Science, they expressed concern that high-
risk, high-reward research was difficult to define. Furthermore, 
officials stated that individual research projects included both high-
risk, high-reward elements, along with other elements, which made it 
difficult to identify how much funding supported the high-risk 
components. DOE officials also stated that defining high-risk, high-
reward research depends largely on the particular scientific and 
technical field and the stage of research. With regard to reporting 
spending toward its goal, because DOE did not set a percentage funding 
goal, it was not required to report such spending. With respect to the 
manner of reporting, DOE reported this information with its annual 
budget documents.[Footnote 6] 

DOE's ARPA-E does not fund basic scientific research and, as such, did 
not set a percentage funding goal for high-risk, high-reward basic 
research. Instead, 
ARPA-E officials told us that the agency predominantly funds research 
and development that has moved beyond basic scientific research, but 
is not yet commercially viable. These types of projects have not 
routinely received funding from public or private sources. ARPA-E 
officials told us that they focus on research projects that have 
potentially high rewards and some high-risk elements, such as a high 
risk that there may not be a market for the resulting technologies and 
a high risk that certain technological hurdles will not be easily 
overcome. 

NSF. NSF did not set a goal to fund high-risk, high-reward research, 
and did not report this in its annual budget submissions. With respect 
to goal setting, NSF met this aspect of the reporting requirement by 
reporting that it had not set a percentage funding goal to support 
high-risk, high-reward research. NSF officials noted that high-risk, 
high-reward research was not easily defined. NSF did not report what 
percentage of its budget for basic research was being allocated toward 
this type of research, as the act provides that it should, because NSF 
maintains that there is no formula that could establish an appropriate 
percentage of basic research that should be high-risk, high-reward. 
Instead, NSF referred to what it calls "potentially transformative 
research"--which it defines as high-reward research that may or may 
not be high-risk. NSF explained that potentially transformative 
research is similar, but not synonymous with, high-risk, high-reward 
research. NSF reported that it plans to spend at least $94 million in 
fiscal year 2010 on this research--less than 2 percent of its fiscal 
year 2010 research budget. In addition, NSF officials reported that 
they believe it is most effective to foster a research climate 
conducive to potentially transformative research. With regard to 
spending toward its goal, because NSF did not set a goal for high-
risk, high-reward research, it was not required to report spending. 
With respect to the manner of reporting, NSF officials told us that 
NSF did not provide its report with its annual budget submission but 
instead submitted it in separate letters to congressional leaders, 
which are not readily available to the public.[Footnote 7] NSF 
included similar information in one of its recent publicly available 
budget submissions regarding high-risk, high-reward research, but NSF 
did not report whether it had set a percentage funding goal. 

NIST. NIST set a goal to fund high-risk, high-reward research, but did 
not report this in its annual budget submissions. With respect to 
setting a goal, NIST reported that it had set a percentage funding 
goal to fund high-risk, high-reward research. NIST determined that 
several programs in its research portfolio represented high-risk, high-
reward research and aggregated these programs' proposed budgets to 
develop its percentage funding goal. With respect to spending toward 
its goal, NIST has not reported its prior year spending but reported 
planned spending in each fiscal year in which it reported. With 
respect to the manner of reporting, NIST did not report this 
information along with its other budget documents but instead 
submitted it in separate letters to authorizing congressional 
committees, without publicly releasing them.[Footnote 8] For fiscal 
years 2009 and 2010, NIST did not mention high-risk, high-reward 
research in updates to its 3-year programmatic plans. 

Definition of high-risk, high-reward research. The COMPETES Act 
provided some elements that the agencies could use to determine what 
basic research constituted high-risk, high-reward research; however, 
agency officials told us that high-risk, high-reward research was 
difficult to define and that they were not certain how these criteria 
applied to their programs. Because each agency perceived ambiguities 
in how the high-risk, high-reward criteria applied to the research it 
oversees, each agency made its own determination on how to apply 
these criteria. 
Consequently, the agencies have not reported their funding for this 
research consistently, and Congress has not received the information 
it sought regarding this research. As a result, it is more difficult 
for Congress to monitor spending for high-risk, high-reward research 
by individual agencies or track the effectiveness of these 
investments. In addition to the act, the directors of OMB and OSTP 
specified in an August 2009 memo that agencies conducting science 
research should explain, in their fiscal year 2011 budget submissions, 
how they will support high-risk, high-reward research, although this 
memorandum did not specify that agencies should set a percentage goal. 
[Footnote 9] In this memo, the directors of OMB and OSTP further 
stated that to explain how federal science and technology investments 
contribute to increased economic productivity and progress, new energy 
technologies, improved health outcomes, and other national goals, 
federal agencies should cooperate with them to develop datasets better 
documenting federal science and technology investments. The memo also 
states that these data should be open to the public in accessible, 
useful formats. Officials from DOE, NSF, and NIST told us that they 
consulted with OMB concerning high-risk, high-reward research. 
However, although each agency has expertise in evaluating basic 
research, the agencies have not consulted with one another or with OMB 
and OSTP to develop a more consistent definition of this research. As 
we have 
previously reported, coordination among agencies can be difficult, as 
it requires staff working across agency lines to define and articulate 
the common federal outcome or purpose they are seeking to achieve that 
is consistent with their respective agency goals and mission.[Footnote 
10] We have also reported that collaboration among federal agencies 
can be enhanced by establishing mechanisms to operate across 
organizational boundaries, such as by developing a common definition of 
what constitutes high-risk, high-reward research. Furthermore, we have 
reported that collaboration can provide more public value than 
agencies could otherwise provide alone.[Footnote 11] 

While It Is Too Early to Evaluate Programs' Effectiveness, Agencies 
Are Taking Steps to Oversee Project Implementation and Using Different 
Approaches to Assess Progress toward Long-Term Outcomes: 

It is too early to evaluate the effectiveness of the four agencies' 
new or expanded programs and activities in meeting the goals of the 
act. All four of 
the agencies are taking steps to oversee the implementation of various 
projects and using different approaches to collect data for assessing 
progress toward achieving long-term outcomes. 

Evaluating Effectiveness of Federal Research and STEM Education 
Programs in General Can Be Difficult: 

Because the new programs authorized under the COMPETES Act have only 
recently obligated the money provided to them, it is too early to 
assess their effectiveness. Four of the newly authorized programs 
received appropriated funds and were implemented--DOE's ARPA-E, NIST's 
Technology Innovation Program, the Department of Education's Teachers 
for a Competitive Tomorrow, and NSF's Science Master's Degree Program; 
[Footnote 12] the other 20 newly authorized programs were not funded. 
DOE's ARPA-E began receiving funding in fiscal year 2009 through the 
American Recovery and Reinvestment Act of 2009 (ARRA).[Footnote 13] 
ARPA-E makes up the largest portion of new programs funded under the 
COMPETES Act. From April 2009 to July 2010, it completed three rounds 
of funding, awarding a total of $349 million to 117 transformative 
energy projects--research into technologies with the potential to 
change how the U.S. generates, stores, and utilizes energy--in 18 
states. NIST's Technology Innovation Program began 
receiving funding in fiscal year 2008. In the program's 2008 Annual 
Report, NIST reported that it spent the initial year staffing the 
program, identifying critical needs areas, and issuing implementing 
regulations required by the act.[Footnote 14] NIST announced the first 
project awards in January 2009, and to date has awarded a total of 
$113.5 million to 29 projects in civil infrastructure and 
manufacturing.[Footnote 15] The Department of Education's Teachers for 
a Competitive Tomorrow program began obligating funding in fiscal year 
2008. The program has awarded $6 million to 8 grant recipients since 
it began. NSF's Science Master's Program received its initial 
appropriations in fiscal year 2009 through ARRA. The program awarded 
21 grants in fiscal year 2010, totaling $14.6 million. Collectively, 
these new programs have awarded about $483 million in funding to a 
total of 174 projects. 

For programs expanded by the act, it is too early to tell how 
effective these programs have been, and agency officials told us that 
it is also difficult to distinguish the incremental activities funded 
under the COMPETES Act. For example, DOE Office of Science budget 
officials told us that the office could not easily identify the 
effectiveness of the projects expanded across its six research 
programs as a result of the incremental increases in program funding 
authorized by the act. These officials noted 
that in some cases, projects were expanded from their original 
research focus but not in ways that would allow the officials to 
attribute the specific results to the incremental funding. 

Moreover, we and others have also found that evaluating the 
effectiveness of federal basic research and STEM education programs 
such as those authorized by the act can be inherently difficult. 
[Footnote 16] We have long recognized that it is difficult to develop 
useful results-oriented performance measures for federal research 
programs and that the uncertain nature of research outcomes over time 
can make it challenging to set specific and measurable goals that 
demonstrate the results of these programs.[Footnote 17] Some of the 
challenges we and others have identified in evaluating the 
effectiveness of basic research programs include: 

* results may take a long time to materialize; 

* the unpredictable pace of research makes it hard to measure 
outcomes annually; and: 

* research may not achieve its intended results but can lead to 
unexpected discoveries that provide potentially more interesting and 
valuable results. 

Challenges we and others have identified in evaluating the 
effectiveness of STEM education programs include: 

* linking results from an individual program with agency-wide or 
government-wide goals; 

* limited evidence collected by agencies that provides a basis for 
drawing conclusions about the effectiveness of these programs; and: 

* ambiguities in identifying careers that are not traditionally 
classified as STEM, which can create challenges in tracking long-term 
career outcomes.[Footnote 18] 

Agencies Are Taking Steps to Oversee Project Implementation and Using 
Different Approaches to Assess Progress Toward Long-Term Outcomes: 

The agencies are collecting various types of project data to monitor 
progress toward cost, schedule, and program outputs. For example, for 
the construction projects we visited, agency officials told us they 
are using the earned value management system,[Footnote 19] which 
tracks progress toward cost and schedule milestones. These projects 
include NSF's Ocean Observatories Initiative and NIST's construction 
of the new Precision Measurement Laboratory at its Boulder, CO, 
facility. (For more information regarding these projects, see 
Enclosure I). ARPA-E officials reported that they are overseeing the 
performance of their research projects in a number of ways, including 
requiring award recipients to submit periodic progress reports, 
regularly visiting project sites, and conducting annual reviews of 
project performance. These officials told us that the award agreement 
for each project includes a set of negotiated technical milestones and 
that each project will be assessed annually to determine whether it 
should proceed or be modified or terminated. 
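
The report describes earned value management only briefly. As a 
minimal illustration under standard assumptions (the figures below are 
hypothetical and do not come from the projects discussed), the common 
earned value indices can be computed from planned value (PV), earned 
value (EV), and actual cost (AC): 

# Minimal earned value management (EVM) sketch; all figures are
# hypothetical and are not drawn from the GAO report.
def evm_indices(pv: float, ev: float, ac: float) -> dict:
    """Return the standard EVM variances and performance indices."""
    return {
        "cost_variance": ev - ac,      # CV > 0 means under budget
        "schedule_variance": ev - pv,  # SV > 0 means ahead of schedule
        "cpi": ev / ac,                # cost performance index
        "spi": ev / pv,                # schedule performance index
    }

# Example: $40 million of work planned to date, $35 million of work
# actually completed (earned), and $38 million actually spent.
print(evm_indices(pv=40.0, ev=35.0, ac=38.0))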

The agencies are also collecting various output data as indicators of 
program performance. For example, according to NIST documents, NIST's 
Technology Innovation Program plans to evaluate project performance 
through several short-term output metrics, such as the number of 
patent applications and journal publications and the amount of additional 
follow-on investment. However, NIST documents indicated that these 
measures are time lagged, and that the agency does not expect results 
to be generated until at least 3 years of project research are 
complete. 

For the STEM education programs, the Department of Education's 
Teachers for a Competitive Tomorrow program is collecting performance 
information and annual reports from each of its grant recipients to 
evaluate the extent to which they are succeeding in meeting the 
purposes of the program. Data to be collected include the number of 
student teachers participating in the program; their majors and 
demographics; data on employment placement; and the number of 
graduates continuing to teach in the STEM fields, particularly in 
schools determined to have the highest need. 

Also, the agencies are using different approaches to evaluate 
effectiveness and progress towards long-term outcomes. The research 
agencies we reviewed--DOE's Office of Science, NSF, and NIST--each 
used peer review to various degrees to qualitatively assess the 
effectiveness of their research programs and evaluate progress towards 
long-term outcomes. For example, DOE's Office of Science, which 
generally funds basic scientific research, reviews each of its six 
research programs once every 3 years using panels composed of expert 
reviewers from academia, DOE's national laboratories, other 
federal agencies, and the private sector. DOE also uses other peer- 
review mechanisms to manage its research portfolio. NSF conducts 
similar activities, such as panels composed of expert reviewers, 
which it uses to assess the quality of research and its effectiveness 
in meeting NSF's goals. These reviews assess the quality of the 
processes used to solicit and review project proposals and the 
resulting quality of the program's research portfolio. In evaluating 
progress toward long-term outcomes, the experts review a range of 
program information to qualitatively assess progress on a scale from 
poor to excellent. Also, ARPA-E is currently developing its strategic 
plan, which will include long-term goals and measures that it will use 
to evaluate its program outcomes. 

For STEM education programs, we also found that NSF and Education are 
taking steps to evaluate the long-term effectiveness of their funded 
projects. As part of its broader initiative to pilot and review new 
approaches to the evaluation of 
its programs, NSF developed goals and metrics for activities in its 
education portfolio to reflect its increased expectations for 
evaluation of its funded projects. NSF documents represent that these 
metrics will be used to assess the programs and provide information 
for improving the programs and opportunities to move in new 
directions. For example, for its Robert Noyce Teacher Scholarship 
Program, NSF is negotiating a contract to conduct longitudinal studies 
of program graduates as their careers progress, studies of the 
program's effect on recruitment to teacher preparation programs, and 
comparative studies to examine practices that are most related to 
keeping teachers in high-need areas. NSF is also collaborating with 
the President's Office of 
Science and Technology Policy and other federal agencies on the STAR 
METRICS project.[Footnote 20] This project is working to improve 
collaboration between federal agencies and those in the research 
community to better document the evidence needed to describe and 
assess the impacts of the federal investments in science research and 
education. To evaluate the effectiveness of Education's Teachers for a 
Competitive Tomorrow program in meeting long-term program outcomes, 
program documents indicate that the agency will evaluate recipients' 
annual reports and data collected on a range of performance measures, 
including data on teacher placement and retention rates, to assess the 
extent to which the program is succeeding in increasing the percentage 
of highly qualified STEM teachers in high-need areas and increasing 
the number of students enrolled in STEM programs, among other aspects. 

Conclusions: 

The COMPETES Act seeks to address many factors contributing to 
scientific and educational achievements in the United States, such as 
sustained investments in scientific research and education. While it 
is too soon to tell how effective the research and educational 
investments authorized by the act will be in improving the science and 
technology outcomes laid out in the act, agencies have made progress 
collecting data and monitoring the outputs of the programs they 
oversee to prepare for such an evaluation in the future. 

While it is difficult to precisely define high-risk, high-reward 
research, agencies could improve their reporting of these activities, 
which would aid in improving congressional oversight. Toward this end, 
the law provided a sense of the Congress that agencies should provide 
key information, which not all agencies provided. In particular, none 
of the agencies reported a percentage funding goal for high-risk, high-
reward research with their annual budget requests. Although the 
COMPETES Act allows agencies to report that they have not set a goal-- 
as DOE chose to do to comply with the reporting requirement--Congress 
provided that agencies should set a goal. Such information could be 
useful in evaluating whether agencies aim to pursue the appropriate 
balance of such research as part of their overall research budget. 
However, because agencies did not provide it, Congress did not have 
this information readily available for review during its consideration 
of the overall budget. Although OMB and OSTP suggested that the 
agencies should cooperate with them to develop datasets better 
documenting federal science and technology investments, such 
cooperation or coordination among the agencies has not taken place to 
date to consistently define this research. Officials with each of the 
agencies we reviewed also voiced difficulty regarding defining high- 
risk, high-reward research meaningfully and consistently. As a result, 
agencies used differing methods to define high-risk, high-reward 
research--with one agency, NIST, identifying the budgets of entire 
programs, and other agencies, such as NSF and DOE, focusing on 
specific research proposals. Congress needs consistent information to 
effectively oversee the degree to which high-risk, high-reward 
research is being conducted within the programs and investments it 
authorized with the America COMPETES Act. We recognize that 
coordination can be difficult, but if agencies work together to refine 
their approaches and provide this information to Congress, Congress 
could, in turn, better determine whether this approach meets its 
needs or whether further clarification is needed. While it may have 
been difficult to 
set goals for high-risk, high-reward research immediately after 
enactment of the COMPETES Act, now that the act has been fully 
implemented, agencies' goal setting and complete reporting are 
important for congressional monitoring and oversight. 

Recommendations for Executive Action: 

To better inform Congress regarding spending priorities for high-risk, 
high-reward basic research, we recommend that the Secretary of 
Commerce (by directing the Director of the National Institute of 
Standards and Technology), the Secretary of Energy, and the Acting 
Director of the National Science Foundation each take the following 
two actions: 

* establish a percentage goal to fund high-risk, high-reward research, 
and in setting a goal, cooperate and coordinate with other agencies 
funded under the COMPETES Act that perform basic scientific research-- 
as well as OMB and OSTP--to more clearly define and identify these 
research activities, and: 

* report this information as part of their annual budget submissions 
to Congress--which are available to the public--as provided by the act. 

Agency Comments and Our Evaluation: 

We provided a copy of our draft report to the Director of the Office 
of Science and Technology Policy; the Secretaries of Commerce, 
Education, and Energy; and the Acting Director of the National Science 
Foundation. OSTP, the Department of Commerce, DOE, and NSF provided 
written comments, which are reprinted in Enclosures II, III, IV, and V 
of this report, respectively.[Footnote 21] 

OSTP provided written comments noting that it found the report to be 
accurate, concise, and complete in its assessment of the America 
COMPETES Act and that it supports high-risk, high-reward research but 
is concerned about aspects of the act's reporting provisions, 
particularly regarding setting numerical targets for high-risk, high- 
reward research. We understand OSTP's concern, but we continue to 
believe that, unless agencies attempt to fulfill the sense of the 
Congress and the act's reporting provision, Congress cannot receive 
the views of agencies regarding this concern; consequently, we did not 
change our recommendation. OSTP's comments and our evaluation of them 
are attached as Enclosure II. Commerce provided written comments 
concurring with our findings and recommendations. Commerce's comments 
are attached as Enclosure III. DOE provided written comments stating 
that the agency disagreed with some of our findings, conclusions, and 
recommendations. In particular, the agency disagreed with our 
characterization of some activities of its Office of Science and with 
our recommendation that DOE establish a percentage goal for high-risk, 
high-reward research, as Congress provided it should. We 
incorporated DOE's comments as appropriate by changing the text to 
clarify our findings, such as including more information about DOE's 
efforts in promoting high-risk, high-reward research. However, we 
continue to believe that each agency charged with the reporting 
provision should attempt to fulfill it by using its own definition; 
consequently, we did not change our recommendations. DOE's comments 
and our evaluation of them are attached as Enclosure IV. NSF provided 
written comments agreeing with the second recommendation to report to 
Congress, but expressing concerns about some elements in our draft 
report and the first recommendation. In particular, NSF expressed 
concern about our findings on its reporting of high-risk, high-
reward research, and about our recommendation that NSF set a 
percentage goal for funding high-risk, high-reward research. We 
changed the text to clarify our findings regarding NSF's reporting of 
its high-risk, high-reward research, but we continue to recommend that 
the agencies, which are the most informed about the research they 
fund, fulfill the sense of the Congress and the reporting provision. 
NSF's comments and our evaluation of them are attached as Enclosure V. 
Education and DOE also provided technical comments, which we 
incorporated where appropriate. 

We are sending copies of this report to the appropriate congressional 
committees; the Secretaries of Commerce, Education, and Energy; the 
Director of the National Institute of Standards and Technology; the 
Acting Director of the National Science Foundation; and other 
interested parties. In addition, this report will be available at no 
charge on the GAO Web site at [hyperlink, http://www.gao.gov]. 

If you or your staff members have any questions about this report, 
please contact me at (202) 512-3841 or ruscof@gao.gov. Contact points 
for our Offices of Congressional Relations and Public Affairs may be 
found on the last page of this report. Key contributors to this report 
were Jon Ludwigson (Assistant Director), Lee Carroll, Jonathan 
Kucskar, Michael Meleady, Alison O'Neill, and Laina Poon. In addition, 
Casey Brown and Virginia Vanderlinde made important contributions. 

Signed by: 

Frank W. Rusco: 
Director, Natural Resources and Environment: 

[End of section] 

Enclosure I: Objectives, Scope, and Methodology: 

The COMPETES Act required GAO to evaluate, within 3 years 
following its enactment, the effectiveness of authorized programs. In 
response to this requirement, our reporting objectives for this review 
were to examine (1) the extent to which the agencies have obligated 
funds for new or expanded programs and activities, and (2) the 
effectiveness of the new or expanded programs and activities in 
meeting the goals of the act. 

To assess the extent to which agencies have obligated funds for new 
and expanded programs under the act, we reviewed the relevant 
provisions of the act, program documents, and budget information, and 
we interviewed agency officials. We defined new programs as those 
programs authorized by the act to receive their initial appropriations 
beginning in fiscal year 2008. We defined expanded programs to mean 
existing programs that the act authorized to receive increased 
appropriations from fiscal year 2008 to fiscal year 2010. We evaluated 
the reliability of the data provided by agencies on their budgetary 
obligations by corroborating this data with other published sources. 
Because financial obligations for fiscal year 2010 were not final, we 
relied on the agencies' estimates for that fiscal year. 

To evaluate the effectiveness of these programs, we reviewed a 
judgmental sample of seven scientific research and education projects 
that illustrate authorized programs within the four agencies that 
received funding. We selected the sample to include projects within 
both new and existing programs. We focused on those that were 
implemented in 2008 or 2009 because they were more likely to be 
established enough for us to evaluate their effectiveness. We also 
looked for a range of project characteristics, such as award size, 
project scale, location, focus (scientific research or STEM 
education), and agency. See Table 2 for a summary of the projects we 
reviewed. To review the mechanisms agencies are using to measure 
these projects' effectiveness, we analyzed documents, interviewed 
officials, and visited these projects' sites.[Footnote 22] In 
addition, to expand our understanding of methods for evaluating the 
effectiveness of scientific research and STEM education programs, we 
interviewed officials responsible for research at Stanford, Harvard, 
the University of Washington, and Google to learn how they evaluate 
research and education efforts. 

Table 2: Summary Information for Projects We Reviewed under the 
COMPETES Act: 

Project: NSF, Robert Noyce Teacher Scholarship Program, San Jose 
State University; 
Approximate cost: $0.5 million; 
Project Duration: Spring 2004 to Fall 2009. 

Project: NSF, Ocean Observatories Initiative (OOI); 
Approximate cost: $126 million; 
Project Duration: September 2009 to September 2014. 

Project: DOE, ARPA-E, Foro Energy; 
Thermodynamic drilling; 
Approximate cost: $18 million; 
Project Duration: Early 2009 to mid-2012. 

Project: DOE, ARPA-E, Stanford Large Energy Reductions; 
Approximate cost: $6 million; 
Project Duration: April 2010 to April 2012. 

Project: NIST, Bldg 1 Expansion (Precision Measurement Laboratory), 
Boulder, CO; 
Approximate cost: $102 million; 
Project Duration: FY 2007-FY 2012. 

Project: NIST, Scientific and Technical Research Services, Boulder, CO 
Labs; 
Approximate cost: $100 million; 
Project Duration: ongoing. 

Project: Department of Education, Teachers for a Competitive Tomorrow, 
William Paterson University, NJ; 
Approximate cost: $1 million; 
Project Duration: FY 2008-FY 2013. 

Source: GAO analysis. 

[End of table] 

We conducted this performance audit from March 2010 through October 2010, 
in accordance with generally accepted government auditing standards. 
Those standards require that we plan and perform the audit to obtain 
sufficient, appropriate evidence to provide a reasonable basis for our 
findings and conclusions based on our audit objectives. We believe 
that the evidence obtained provides a reasonable basis for our 
findings and conclusions based on our audit objectives. 

[End of section] 

Enclosure II: Comments from the Office of Science and Technology 
Policy: 

Note: GAO comments supplementing those in the report text appear at 
the end of this enclosure. Agency comments refer to the draft report 
as GAO-10-1040R. That report number has since changed to GAO-11-127R. 

Executive Office Of The President: 
Office Of Science And Technology Policy: 
Washington, D.C. 20502: 

Response to GAO Report GAO-10-1040R: 

September 23, 2010: 

Dear Sir or Madam, 

Thank you for the opportunity to review the draft report, "America 
COMPETES Act: It is Too Early to Evaluate Programs' Long-Term 
Effectiveness, but Agencies Could Improve Reporting of Research 
Priorities." (GAO-10-1040R). 

Overall, we find the report to be accurate, concise, and complete in 
its assessment of the America COMPETES Act of 2007. We agree with the 
GAO's assessment of the new or expanded programs and activities 
authorized in the Act, namely that it is too early to assess the 
effectiveness of these programs but that agencies are taking steps to 
evaluate these programs' ongoing impacts using different approaches. 

We find the assessment of agencies' responses to Section 1008 of the 
Act to be accurate. Our concerns are mostly with the Act's 
requirements rather than how agencies are meeting them. We note that 
the America COMPETES Act defines 'basic research' by reference to OMB 
Circular A-11, but does not define either the terms 'high risk' or 
'high reward.' It is our experience that there is enough background 
understanding of these concepts for OMB and OSTP to encourage agencies 
to support more HRHR research (as in the August 2009 memo referenced 
in the report), but without an explicit definition it is extremely 
difficult for agencies to set specific numerical targets for such 
research and to identify annually the percentages of agencies' basic 
research portfolios devoted to such research. [See comment 1] We do 
not see the need for a government-wide definition; we believe it is 
appropriate for agencies to use differing methods to define HRHR 
research according to agency missions, time horizons, and program 
designs. Therefore, while OSTP is supportive of encouraging agencies 
to support HRHR research (as noted in the report), OSTP does not 
support specific numerical targets for HRHR research, nor does OSTP 
support establishing an explicit, government-wide definition for HRHR 
research. OSTP is also concerned that setting a numerical target for 
HRHR research may have the unintended consequence of segregating risk 
and reward in a small segment of an agency's research portfolio, 
making the non-HRHR portion of an agency's research portfolio less 
transformative, less risky, and potentially less rewarding. [See 
comment 2] 

We are pleased with the report's recognition of OSTP, OMB, and agency 
efforts to develop datasets on federal science and technology 
investments. We wish to make clear that these datasets are not 
intended to document high-risk, high-reward research as a separate 
category, but instead to document federal science and technology 
investments and their contributions to achieving key national goals 
such as economic growth and improved health outcomes. We are pleased 
with the report's recognition of STAR METRICS as one example of these 
efforts. 

Again, thank you for the opportunity to provide comments on the draft 
report and for your continued interest in scientific and technological 
innovation. 

Sincerely, 

Signed by: 

Tom Kalil: 
Deputy Director for Policy: 
Office of Science and Technology Policy: 

The following are GAO's comments on the Office of Science and 
Technology Policy's (OSTP) letter dated September 23, 2010. 

GAO Comments: 

1. We agree with OSTP that there can be meaningful distinctions in how 
agencies and programs interpret the definition of high-risk, high- 
reward. However, we believe that, given that Congress provided that 
the agencies funded under the America COMPETES Act should provide 
information on high-risk, high-reward research, it is important for 
each agency charged with the reporting requirement to attempt to 
fulfill it, at least initially, by using its own definition. In this 
way, Congress can both receive the views of agencies regarding the 
definition and consider whether to alter the reporting requirement or 
provide further direction to the agencies regarding the definition. We 
did not revise our report in response to this comment. 

2. While we acknowledge these concerns, unless agencies attempt to 
fulfill the sense of the Congress, Congress will not receive the views 
of agencies regarding this concern. We did not revise the report in 
response to this comment. 

[End of section] 

Enclosure III: Comments from the Department of Commerce: 

Note: Agency comments refer to the draft report as GAO-10-1040R. That 
report number has since changed to GAO-11-127R. 

United States Department Of Commerce: 
The Secretary of Commerce: 
Washington, D.C. 20230: 

September 27, 2010: 

Mr. Franklin Rusco: 
Director: 
Natural Resources and Environment: 
United States Government Accountability Office: 
Washington, D.C. 20548: 

Dear Mr. Rusco: 

Thank you for the opportunity to comment on the draft report from the 
U.S. Government Accountability Office (GAO) entitled America COMPETES 
Act: It Is Too Early to Evaluate Programs' Long-Term Effectiveness, but 
Agencies Could Improve Reporting of Research Priorities (GAO-10-1040R). 

We concur with the report's recommendations that the National 
Institute of Standards and Technology (NIST) sets a goal for funding 
high-risk, high-reward research and that NIST includes information on 
high-risk, high-reward research in its annual budget requests to 
Congress, as the law requires. The Department of Commerce has no 
comments to the report. 

We look forward to receiving your final report. Should you have any 
questions regarding this response, please contact Rachel Kinney at 
(301) 957-8707. 

Sincerely, 

Signed by: 

Gary Locke: 

[End of section] 

Enclosure IV: Comments from the Department of Energy: 

Note: GAO comments supplementing those in the report text appear at 
the end of this enclosure. Agency comments refer to the draft report 
as GAO-10-1040R. That report number has since changed to GAO-11-127R. 

Department of Energy: 
Office of Science: 
Washington, DC 20585: 

September 30, 2010: 

Mr. Franklin Rusco: 
Director, Natural Resources and Environment: 
Government Accountability Office: 
441 G Street, NW: 
Washington, DC 20548: 

Dear Mr. Rusco, 

Thank you for the opportunity to comment on the draft Government 
Accountability Office (GAO) report entitled "America COMPETES Act: It 
Is Too Early to Evaluate Programs' Long-Term Effectiveness, but Agencies 
Could Improve Reporting of Research Priorities" (GAO-10-1040R). We have 
reviewed the draft report and provide general comments below. The 
comments provided here have been coordinated with other relevant 
offices of the Department of Energy (DOE). 

The GAO was charged by the America COMPETES Act to assess and evaluate 
the effectiveness of a representative sample of the new or expanded 
programs and activities, and report on those findings. We recognize 
that conducting a review of the agencies and activities authorized in 
the America COMPETES Act is an enormous undertaking, and in particular 
we appreciate the time the GAO took to review one of the Department's 
new programs, the Advanced Research Projects Agency—Energy (ARPA-E). 
Given that ARPA-E does not fund basic scientific research, DOE's 
comments on this report do not apply to the ARPA-E sections of this 
report. 

We find, however, that the draft report is not well balanced between 
reporting on the assessment of the additional programs that have been 
established or expanded as a result of the America COMPETES Act and 
the evaluation of the effectiveness of those programs, and 
implementation of other specific activities. In particular, the 
report's focus on the agencies' implementation of the Section 1008 
Sense of Congress regarding support for high-risk, high-reward basic 
research appears at the expense of providing meaningful details and 
discussion on agencies' programs and evaluation efforts. [See comment 
1] We would also like to note for the record, that while the GAO 
reviewed two of the ARPA-E programs, it decided to not assess a single 
DOE Office of Science program in depth as part of their review. [See 
comment 2] 

The report makes several generalized statements about the agencies' 
actions that suggest much broader deficiencies in the agencies' 
responsiveness to the requirements of the America COMPETES Act than 
the content of the GAO review and report substantively addresses. [See 
comment 3] For example, the report summary states, "however, unless 
the agencies improve their reporting of their research activities, it 
will be difficult for Congress to conduct meaningful oversight." This 
generalized statement, without the appropriate context, appears to 
apply to all of the research activities of the agencies covered under 
the America COMPETES Act. If the intention is really referring only to 
the reporting requirements specific to the Sense of Congress in 
Section 1008 of the Act, the language in the report needs to be much 
more specific so that this statement is not taken out of context. This 
is also an issue with the title of the report. The only reporting 
issue directly addressed in the report refers to Section 1008, but the 
title implies the issue is much broader. Several statements in the 
report also attempt to aggregate information that a subset of the 
reviewed agencies provided (e.g. statements that use phrases such as 
"one agency..." or "three of the agencies..."). We believe this 
approach does not provide the appropriate transparency about the 
information that the Department actually provided and those statements 
should be further clarified. 

With regard to Section 1008, the draft report's discussion of 
agencies' support of high-risk, high-reward basic research, including 
establishing a common definition and how such research is identified, 
does not acknowledge or take into consideration the associated 
challenges that have been well identified in studies conducted by the 
scientific community over the past decade through Federal advisory 
committees[Footnote 1], the National Academy of Sciences,[Footnote 2] 
and the American Academy of Arts and Sciences.[Footnote 3] These 
studies have described the many challenges with defining what is "high-
risk" and what constitutes "high-reward." recognizing that different 
organizations will associate different operational meaning to these 
phrases. These studies also recognized that agencies' support for high-
risk, high-reward basic research is not merely a funding issue, but is
also a cultural issue—both inside agencies and within the scientific 
communities. In that context, the draft report over-simplifies the 
Department's (and other agencies') challenges with establishing a 
common definition and with quantifying such research, and these 
complexities should be considered in the report's recommendations. 
[See comment 4] 

The draft report's discussion on the agencies' implementation of 
Section 1008 does not acknowledge the Department's significant efforts 
to address the intent of Section 1008, which is to "strive to support 
and promote innovation in the United States through high-risk, high-
reward basic research projects." The Department has a long history of 
supporting high-risk, high-reward basic research, particularly in the 
Office of Science. The need for fundamental scientific and
technological breakthroughs to accomplish DOE mission goals requires 
that the Office of Science support high-risk, high-reward research 
ideas that challenge current thinking yet are scientifically sound. 
The Office of Science considers a significant proportion of its 
supported research as high-risk, high-reward. Because this basic 
research is integrated within program portfolios and projects, it is 
not possible to quantitatively separate the funding contributions of 
particular experiments or theoretical studies that are high-risk, high-
reward in a manner that is credible and auditable. The report's 
discussion of whether agencies have established a percentage funding 
goal should be placed in the context of what agencies are doing 
to support and promote high-risk, high-reward basic research. [See 
comment 5] 

The GAO's recommendation for the development of a "consistent" 
definition needs further clarification. What is considered "high-risk" 
and "high-reward" depends largely on the scientific or technical field 
and the stage of research (whether basic or applied research, or 
technology development and deployment). A definition that is 
"consistent" within an agency or among the agencies implies the 
establishment of a one-size-fits-all definition, which may limit the 
Department's ability to promote high-risk, high-reward research in a 
way that is most impactful to its diverse mission areas. The Office of 
Science and ARPA-E both provided the definition each office applies to 
its respective programs (basic research vs. applied research and 
technology development) to explain how a one-size-fits-all definition 
would not be practical within the Department. [See comment 6] 

Thank you, again, for the opportunity to provide comment on this 
draft. We look forward to receiving your final report. 

Sincerely, 

Signed by: 

Patricia M. Dehmer: 
Deputy Director for Science Programs: 
DOE Office of Science: 

Enclosure: 

Footnotes: 

[1] (a) Noonan, N. E., Report of the Advisory Committee for the GPRA 
Performance Assessment: FY 2004 (Washington, DC: NSF, 2004); (b) 
National Science Board, Report on Enhancing Support of Transformative 
Research at the National Science Foundation (Washington, DC: NSB, 
2007), and references therein. 

[2] Committee on Prospering in the Global Economy of the 21st Century: 
An Agenda for American Science and Technology, Committee on Science, 
Engineering, and Public Policy, Rising Above the Gathering Storm: 
Energizing and Employing America for a Brighter Economic Future 
(Washington, DC: National Academies Press, 2006). 

[3] Committee on Alternative Models for the Federal Funding of 
Science, Advancing Research in Science and Engineering: Investing in 
Early-Career Scientists and High-Risk, High-Reward Research 
(Cambridge, MA: American Academy of Arts and Sciences, 2008). 

The following are GAO's comments on the Department of Energy's (DOE) 
letter dated September 30, 2010. 

GAO Comments: 

1. As noted in our report, we found that it is too early to judge the 
effectiveness of spending under the act and that the areas funded 
under the COMPETES Act, namely R&D and STEM education programs, take 
significant time to produce outcomes. This view was shared by the DOE 
staff and funding recipients we met with during our work. We also 
determined that agency reporting of high-risk, high-reward research 
could be useful for evaluating the effectiveness of basic scientific 
research programs authorized under the act. We did not revise the 
report in response to DOE's comment. 

2. As we noted in the draft report, although we defined expanded 
programs to mean existing programs that the act authorized to receive 
increased appropriations from fiscal year 2008 to fiscal year 2010, 
our review of specific projects focused on those that were implemented 
in 2008 or 2009 because they were more likely to be established enough 
for us to evaluate their effectiveness. DOE Office of Science projects 
were only funded under the authorization of the act for fiscal year 
2010; prior years had been authorized under prior legislation. We made 
no change in response to DOE's comment. 

3. We agree that the summary language could have been misconstrued to 
refer to research activities more broadly when we were specifically 
referring to high-risk, high-reward research. We revised the report 
title and summary statements to better clarify this distinction in 
response to DOE's comment. 

4. We acknowledge the challenges faced by agencies in defining high- 
risk, high-reward research and have noted this in the report. Further, 
we acknowledge that there can be different opinions on what these 
terms can mean in a basic research context. In response to this 
comment, we added language to make this point clearer. However, 
because Congress is instrumental in funding such research, and because 
Congress provided that agencies should report this information, we 
believe it is essential that agencies work directly with Congress to 
resolve these issues. DOE made an effort to begin this dialogue, and 
complied with the reporting requirement, by reporting that it would 
not set the goal that the sense of the Congress provided it 
should set. We believe that unless Congress repeals the reporting 
requirement, agencies should make an effort to provide the information 
on the goal, together with their definition of high-risk, high-reward 
research. In this way, agencies could constructively engage in an 
important dialog with the relevant congressional committees and 
research agencies, along with OSTP and OMB--each of which has sought 
to have agencies work to develop such information. Given each party's 
expertise, such a dialog and exchange of views could, over time, 
facilitate policy decisions, including the appropriate level of 
funding for such research at the individual agencies and across 
government. We maintain that reporting this information across the 
agencies funded under the COMPETES Act could result in better 
oversight and consequently have not revised the report in response to 
DOE's comment. 

5. We agree that DOE's Office of Science, and its predecessors, have a 
long history of supporting basic research that has contributed to 
significant scientific and other accomplishments. It is because of 
this history and experience that we believe DOE could provide more 
specific information on its funding for high-risk, high-reward 
research. In particular, DOE could serve as an example by drawing on 
its best resources to lay out what it considers to be the most 
appropriate basis for determining high-risk, high-reward research; set 
a goal that fits the needs of the agency and the scientific community; 
and constructively engage in a dialog with Congress over those 
matters. We believe that reporting this information to Congress is an 
iterative process that can be improved over time; for this process to 
take place, agencies need to take the first steps to report such 
information. We revised the report to include more information about 
DOE's efforts in the area of high-risk, high-reward research. 

6. We agree that there can be meaningful distinctions in how agencies 
and programs interpret the definition of high-risk, high-reward 
research. However, we believe that it is important for each agency 
covered under the reporting requirement to attempt to report by using 
its own definition. In this way, Congress can both receive the views 
of agencies regarding their definitions and consider whether to alter 
the reporting requirement or provide further direction to the agencies 
regarding the definition. We did not revise the report in response to 
DOE's comment. 

[End of section] 

Enclosure V: Comments from the National Science Foundation: 

Note: GAO comments supplementing those in the report text appear at 
the end of this enclosure. Agency comments refer to the draft report 
as GAO-10-1040R. That report number has since changed to GAO-11-127R. 

National Science Foundation: 
Office Of The Director: 
4201 Wilson Boulevard: 
Arlington, Virginia 22230: 

September 27, 2010: 

Mr. Franklin Rusco: 	
Director, Natural Resources and Environment: 
United States Government Accountability Office: 
Washington, DC 20548: 

Dear Mr. Rusco:	 

The National Science Foundation (NSF) appreciates the opportunity to 
comment on the draft report America COMPETES Act: It is Too Early to 
Evaluate Programs Long-term Effectiveness, but Agencies Could Improve 
Reporting of Research Priorities (GAO-10-1040R). 

NSF has a long history of supporting research with far-reaching 
impacts on the U.S. economy and the well-being of Americans. NSF 
endorses the underlying goal of Section 1008 of the America COMPETES 
Act (ACA) that agencies fund groundbreaking research that accelerates 
innovation. NSF is strongly committed and continually strives to 
promote innovation by supporting transformative research. "Promoting 
transformational, multidisciplinary research" is in fact the first 
investment priority listed under NSF's strategic goal "Discovery." 
Funding research that will lead to new discoveries and scientific and 
engineering breakthroughs is intrinsic to NSF's mission. NSF does 
endorse the draft report's recommendation to inform Congress on an 
annual basis about activities that are potentially transformative. 

While NSF appreciates the effort and interest of GAO staff in 
understanding NSF's research activities--especially those associated 
with potentially transformative research--several aspects of the 
report warrant further tuning. 

NSF reporting on high-risk/high-reward research to Congress. This 
section (starting on page 8) contains seemingly contradictory 
information and does not fully capture NSF reporting activities. 
During the past three years, NSF has submitted reports to Congress. 
The GAO draft correctly notes that letters were sent to Congressional 
leaders in 2008 and 2009. However, the draft report does not note that 
NSF also included in its 2010 Congressional budget request a report on 
high-risk/high-reward activities that captured information similar to 
that included in the earlier reports to Congress; the report in the 
NSF FY 2010 budget submission is available to the public. The section's 
first sentence misstates NSF reporting actions; beginning the section 
with the second sentence would avoid confusion caused by the first 
sentence. [See comment 1] 

NSF and setting a percentage goal for high-risk/high-reward research. 
The same section of the draft report states that "NSF maintains 
that there is no formula that could establish an appropriate 
percentage..." As noted in the exit conference and in subsequently 
provided documentation, NSF had considered the concept of a dedicated 
allocation for high-risk/high-reward research. NSF asked the
Advisory Committee on Government Performance Assessment (AC/GPA) in 
two separate years to address this concept. The conclusions reached in 
both years were similar. The 2004 AC/GPA report stated that, "No 
obvious formula exists to guide NSF as to the fraction of the 
portfolio that should be 'high risk.'"[Footnote 1] NSF's position to 
avoid creating an arbitrary metric draws from the committees' findings 
and is shared by external experts and advisors. 

Rather than allocate an arbitrary percentage of the research budget to 
high-risk/high-reward research projects, NSF believes that the most 
effective way to advance transformative research is to create an 
environment that is conducive to funding potentially transformative 
research. [See comment 2] 

During the past decade, NSF has undertaken specific steps to build 
that environment in the context of the structures, programs and 
policies to foster transformative research and innovation: 

* Formed an agency-wide working group to recommend policies, 
mechanisms, and practices to advance transformative research; 

* Revised NSF's merit review criteria to include language that 
explicitly alerts reviewers to consider whether proposals are 
potentially transformative; 

* Established a new funding mechanism focused on early-stage, 
untested, potentially transformative research (EAGER); 

* Created training materials on the subject for incoming NSF program 
officers; 

* Surveyed investigators about how welcoming NSF is to potentially 
transformative research proposals; 

* Developed a dedicated section on transformative research on the NSF 
website; 

* Requested outside advisors' feedback about NSF's potentially 
transformative research activities; and 

* Conducts numerous outreach activities annually which include 
information for the research community about potentially 
transformative research and associated award opportunities. 

NSF funding of potentially transformative research. The draft report 
states that in FY 2010 NSF planned to spend $94 million on potentially 
transformative research, but then incorrectly treats that amount as 
NSF's total funding of potentially transformative research for the 
fiscal year and calculates NSF funding of transformative research to 
be less than 2% of the NSF budget. In several submissions to GAO in 
support of its ACA evaluation, NSF noted that the $94 million was 
devoted to efforts to study NSF processes and test new ones that might 
aid in discovering and funding potentially transformative research 
(e.g., creating "shadow panels" which have the primary purpose to 
identify potentially transformative research proposals, "sandpits," 
etc.). NSF also noted that its funding for potentially transformative 
research is not limited to the $94 million but rather is supported 
across NSF's core research funding programs. [See comment 3] 

High-risk/high-reward vis-à-vis Potentially Transformative Research. 
Within NSF, the terms "high-risk/high-reward" and "potentially 
transformative" research are not synonymous—although they do overlap. 
While the term high-risk can mean a high degree of technical 
difficulty or novelty, high risk associated with research activities 
could also include the risk of not completing the research due to 
external factors (weather, access to critical instrumentation, 
dangerous working conditions) as well as the degree of experience a 
researcher possesses compared to the complexity of the research. For 
NSF, "transformative research involves ideas, discoveries or tools 
that radically change our understanding of an important, existing 
scientific or engineering concept or educational practice or leads to 
the creation of a new paradigm or field of science, engineering, or 
education." NSF can and does identify proposals that contain 
potentially transformative research ideas or concepts--across the 
entire spectrum of NSF-supported disciplines. [See comment 4] 

NSF and Program Evaluation. The draft report indicates that DOE's 
Office of Science convenes experts every three years to review its 
programs. The draft report does not mention that NSF undertakes 
similar activities through Committees of Visitors (COVs). COVs (1) 
assess the quality and integrity of program operations and program-
level technical and managerial matters pertaining to proposal 
decisions; and (2) comment on how the outputs and outcomes generated 
by awardees have contributed to the attainment of NSF's mission and 
strategic outcome goals. COV reviews are conducted at regular 
intervals of approximately every three years for programs and offices 
that recommend or award grants, cooperative agreements, and/or 
contracts and whose main focus is the conduct or support of NSF 
research and education in science and engineering. NSF relies on the 
judgment of external experts to maintain high standards of program 
management, to provide advice for continuous improvement of NSF 
performance, and to ensure openness to the research and education 
community served by NSF. [See comment 5] 

OMB/OSTP Memorandum. The August 2009 OMB/OSTP memorandum encourages 
agencies to pursue transformational solutions, and to describe funding 
for high-risk, high-payoff research and how to evaluate the success of 
techniques supporting high-risk research. The memo separately calls 
upon federal agencies to develop datasets to better document Federal 
science and technology investments that increase economic productivity 
and progress toward other national goals. The GAO draft report links 
the two points in a manner not found in the OMB/OSTP memo. [See 
comment 6] 

GAO proposal to establish a common high-risk/high-reward research 
funding percentage goal for federal agencies. After considering the 
GAO recommendation regarding high-risk/high-reward research, NSF does 
not support setting a percentage goal for high-risk/high-reward at 
either an agency level or government-wide. Identifying a priori, 
during the review stage, which proposals will lead to transformative 
results--before the research is conducted and before the scientific 
community can assimilate the findings--is challenging and, in most 
cases, impossible. In addition, as noted by advisory 
committees to NSF, there is no basis to determine an appropriate set-
aside for high-risk/high-reward research funding within the NSF 
context. Moreover, a common percentage goal for research-funding 
agencies ignores the requirements and missions unique to the 
individual agencies and assumes a one-size-fits-all approach. [See 
comment 7] 

NSF is committed to supporting highly innovative research projects 
that have the potential to transform the frontiers of science and 
engineering and spur innovation. If you have any questions regarding 
this response, please contact Kathryn Sullivan at 703-292-7375. We 
look forward to receiving your final report. 

Sincerely, 

Signed by: 

Cora B. Marrett: 
Acting Director: 

Footnote: 

[1] A similar statement was made by the 2005 AC/GPA when it noted 
"there is still no empirical way to determine what fraction of the 
portfolio should be the farthest out on the frontier." 

The following are GAO's comments on the National Science Foundation's 
(NSF) letter dated September 27, 2010. 

GAO Comments: 

1. We agree that NSF reported similar information about high-risk, 
high-reward research in its fiscal year 2010 budget submission as it 
had in its letters to Congressional leaders; however, NSF's fiscal 
year 2010 budget submission did not report whether the agency had set 
a percentage goal for such research--something Congress provided it 
should do in the COMPETES Act. In addition, NSF did not 
include information about high-risk, high-reward research in its 
fiscal year 2009 or 2011 budget submissions. We revised the report to 
clarify NSF's reporting. 

2. We are not recommending that agencies allocate an arbitrary 
percentage to high-risk, high-reward research. Rather, we are 
recommending that the agencies, which are the most informed about the 
state of the research communities they fund, fulfill the sense of the 
Congress and the reporting requirement--namely that they each 
establish a goal, perhaps as a range, and report funding toward the 
goal on an annual basis. The efforts to build a supportive research 
climate that are under way at NSF appear to be compatible with this 
approach. We revised the text to reflect the information on NSF's 
efforts to explore how to establish a percentage goal. 

3. We revised the report to more accurately reflect NSF funding of 
potentially transformative research in response to NSF's comment. 

4. Although our draft report reflected that potentially transformative 
research is not synonymous with high-risk, high-reward research, to 
further represent NSF's views, we revised the text to better clarify 
the distinctions. 

5. We added an explanation of the role of the Committee of Visitors, 
which is a panel of expert reviewers at NSF, in response to NSF's 
comment. 

6. We agree that the memorandum did not explicitly link these 
concepts. We found that the spirit of the memorandum encouraged 
agencies to improve reporting of scientific information, and this 
reporting could include Congress' provision that agencies should 
report funding on high-risk, high-reward research. We revised the 
report to better clarify these facts in response to NSF's comment. 

7. While we acknowledge these concerns, unless agencies attempt to 
fulfill the sense of the Congress, Congress cannot receive the views 
of agencies regarding these concerns. We believe that it is important 
for each agency charged with the reporting requirement to attempt to 
fulfill the sense of Congress and the reporting provision by using its 
own definition. In this way, Congress can both receive the views of 
agencies regarding the definition and consider whether to alter the 
reporting requirement or provide further direction. NSF also commented 
that a common percentage goal does not take into account the missions 
and requirements of each individual agency. We modified the wording of 
our recommendation to make clear that we are recommending that each 
agency establish its own goal, as Congress provided. 

[End of section] 

Footnotes: 

[1] Pub. L. No. 110-69, 121 Stat. 572 (Aug. 9, 2007). 

[2] Also, Congress is currently considering proposed legislation that 
would establish additional requirements for prioritizing high-risk, 
high-reward research. See H.R.5116, §§ 221, 228(c)(2), 246(d)(2), 
referred to Senate Commerce, Science, and Transportation Committee 
(June 29, 2010). 

[3] For purposes of this review, we defined new programs as those 
programs authorized by the act to receive their initial appropriations 
beginning in fiscal year 2008. We defined expanded programs to be 
existing programs that the act authorized to receive increased 
appropriations from fiscal year 2008 to fiscal year 2010. 

[4] The Department of Education's Teachers for a Competitive Tomorrow 
program provides funding for both undergraduate- and graduate-level 
students. 

[5] This reporting requirement does not apply to the Department of 
Education as it does not conduct or fund basic scientific research in 
STEM fields. 

[6] Department of Energy, FY 2009 Congressional Budget Request, Volume 
4, Science, DOE/CF-027 (Washington, D.C., February 2008), 15; and 
Department of Energy, FY 2010 Congressional Budget Request, Volume 4, 
Science, DOE/CF-038 (Washington, D.C., May 2009), 12. 

[7] NSF submitted letters to the Speaker of the House and the House 
Minority Leader and to the Majority and Minority Leaders of the Senate 
in fiscal years 2009 and 2010. 

[8] NIST submitted these letters to the Chair and the Minority Leader 
of the House Committee on Science and Technology and the Senate 
Committee on Commerce, Science, and Transportation. NIST officials 
told us that, although the information was not publicly released on 
its Web site or with its budget documents, it was publicly available 
if specifically requested. 

[9] See Executive Office of the President, Office of Management and 
Budget, Office of Science and Technology Policy, Science and 
Technology Priorities for the FY 2011 Budget (Washington, D.C., Aug. 
4, 2009). 

[10] GAO, Managing for Results: Barriers to Interagency Coordination, 
[hyperlink, http://www.gao.gov/products/GAO/GGD-00-106] (Washington, 
D.C.: Mar. 29, 2000). 

[11] GAO, Results-Oriented Government: Practices That Can Help Enhance 
and Sustain Collaboration among Federal Agencies, [hyperlink, 
http://www.gao.gov/products/GAO-06-15] (Washington, D.C.: Oct. 21, 
2005). 

[12] This program was authorized by section 7034 of the act as the 
"Professional Science Master's Degree Program." In addition to 
changing the name of the program, while the program was originally 
authorized to be funded through NSF's research and related activities 
account, NSF funded the program through its education and human 
resource funding beginning in fiscal year 2010, according to 
information from NSF. 

[13] Pub. L. No. 111-5, 123 Stat. 115 (Feb. 17, 2009). 

[14] See section 3012(b), amending the National Institute of Standards 
and Technology Act (15 U.S.C. 271 et seq.). 

[15] NIST issued a third funding solicitation in April 2010, through 
which it plans to award up to an additional $25 million in funding to 
projects anticipated to begin in January 2011. 

[16] For example, see GAO, Performance Budgeting: PART Focuses 
Attention on Program Performance, but More Can Be Done to Engage 
Congress, [hyperlink, http://www.gao.gov/products/GAO-06-28] 
(Washington, D.C.: Oct. 28, 2005); GAO, Higher Education: Federal 
Science, Technology, Engineering, and Mathematics Programs and Related 
Trends, [hyperlink, http://www.gao.gov/products/GAO-06-114] 
(Washington, D.C.: Oct. 12, 2005); U.S. Department of Education, 
Report of the Academic Competitiveness Council (Washington, D.C., 
2007); Committee on Science, Engineering, and Public Policy, 
Evaluating Federal Research Programs: Research and the Government 
Performance and Results Act (Washington, D.C.: Feb. 1999); and Office 
of Management and Budget, "Program Assessment Rating Tool Guidance No. 
2008-01, Appendix C: Research and Development Program Investment 
Criteria," (January 2008) 

[17] See, e.g., GAO, Performance Budgeting: PART Focuses Attention on 
Program Performance, but More Can Be Done to Engage Congress, 
[hyperlink, http://www.gao.gov/products/GAO-06-28] (Washington, D.C.: 
Oct. 28, 2005); GAO, Pipeline Safety: Systematic Process Needed to 
Evaluate Outcomes of Research and Development Program, [hyperlink, 
http://www.gao.gov/products/GAO-03-746] (Washington, D.C.: June 2003); 
and GAO, Highway Research: Systematic Selection and Evaluation 
Processes Needed for Research Program, [hyperlink, 
http://www.gao.gov/products/GAO-02-573] (Washington, D.C.: May 2002). 

[18] Traditional classifications of commonly tracked STEM-related 
careers may not include graduates who use their degrees to pursue 
other STEM-related careers, such as managers at technology companies 
or patent lawyers. 

[19] An earned value management system has the ability to combine 
measurements of scope, schedule, and cost in a single integrated 
system. If implemented appropriately, this system can provide 
objective reports of project status, produce early warning signs of 
impending schedule slippages and cost overruns, and provide unbiased 
estimates of anticipated costs at completion. 
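
As a brief illustration of the measurements such a system combines, 
the following Python sketch computes the standard earned value 
indicators (schedule variance, cost variance, cost performance index, 
and estimate at completion); the project figures and function name are 
hypothetical and are not drawn from the report or from any agency's 
data. 

# Minimal sketch of common earned value management (EVM) measures. 
# All figures below are hypothetical and for illustration only. 

def evm_summary(planned_value, earned_value, actual_cost, 
                budget_at_completion): 
    """Compute common EVM indicators from the three core measurements.""" 
    schedule_variance = earned_value - planned_value   # negative = behind schedule 
    cost_variance = earned_value - actual_cost          # negative = over cost 
    cost_performance_index = earned_value / actual_cost 
    # Estimate at completion, assuming current cost efficiency continues. 
    estimate_at_completion = budget_at_completion / cost_performance_index 
    return { 
        "schedule_variance": schedule_variance, 
        "cost_variance": cost_variance, 
        "cost_performance_index": cost_performance_index, 
        "estimate_at_completion": estimate_at_completion, 
    } 

# Example: $10M project, 40% of work planned, 35% performed, $4.2M spent. 
print(evm_summary(planned_value=4.0, earned_value=3.5, 
                  actual_cost=4.2, budget_at_completion=10.0)) 

With these hypothetical figures, the sketch reports negative schedule 
and cost variances and an estimated cost at completion of $12 million 
against a $10 million budget--an example of the early warning 
described above. 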

[20] STAR METRICS stands for Science and Technology in America's 
Reinvestment-Measuring the Effect of Research on Innovation, 
Competitiveness and Science. The project is intended to monitor the 
impact of federal science investments on employment, knowledge 
generation, and health outcomes. 

[21] Agency comments refer to the draft report as GAO-10-1040R. That 
report number has since changed to GAO-11-127R. 

[22] We conducted site visits at all of the projects we reviewed with 
the exception of one project, representing the Department of 
Education's Teachers for a Competitive Tomorrow program. For this 
program, we reviewed documents and communicated with program officials 
remotely. 

[End of section] 

GAO's Mission: 

The Government Accountability Office, the audit, evaluation and 
investigative arm of Congress, exists to support Congress in meeting 
its constitutional responsibilities and to help improve the performance 
and accountability of the federal government for the American people. 
GAO examines the use of public funds; evaluates federal programs and 
policies; and provides analyses, recommendations, and other assistance 
to help Congress make informed oversight, policy, and funding 
decisions. GAO's commitment to good government is reflected in its core 
values of accountability, integrity, and reliability. 

Obtaining Copies of GAO Reports and Testimony: 

The fastest and easiest way to obtain copies of GAO documents at no 
cost is through GAO's Web site [hyperlink, http://www.gao.gov]. Each 
weekday, GAO posts newly released reports, testimony, and 
correspondence on its Web site. To have GAO e-mail you a list of newly 
posted products every afternoon, go to [hyperlink, http://www.gao.gov] 
and select "E-mail Updates." 

Order by Phone: 

The price of each GAO publication reflects GAO’s actual cost of
production and distribution and depends on the number of pages in the
publication and whether the publication is printed in color or black and
white. Pricing and ordering information is posted on GAO’s Web site, 
[hyperlink, http://www.gao.gov/ordering.htm]. 

Place orders by calling (202) 512-6000, toll free (866) 801-7077, or
TDD (202) 512-2537. 

Orders may be paid for using American Express, Discover Card,
MasterCard, Visa, check, or money order. Call for additional 
information. 

To Report Fraud, Waste, and Abuse in Federal Programs: 

Contact: 

Web site: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm]: 
E-mail: fraudnet@gao.gov: 
Automated answering system: (800) 424-5454 or (202) 512-7470: 

Congressional Relations: 

Ralph Dawn, Managing Director, dawnr@gao.gov: 
(202) 512-4400: 
U.S. Government Accountability Office: 
441 G Street NW, Room 7125: 
Washington, D.C. 20548: 

Public Affairs: 

Chuck Young, Managing Director, youngc1@gao.gov: 
(202) 512-4800: 
U.S. Government Accountability Office: 
441 G Street NW, Room 7149: 
Washington, D.C. 20548: