This is the accessible text file for GAO report number GAO-10-394 
entitled 'Streamlining Government: Opportunities Exist to Strengthen 
OMB's Approach to Improving Efficiency' which was released on June 7, 
2010. 

This text file was formatted by the U.S. Government Accountability 
Office (GAO) to be accessible to users with visual impairments, as 
part of a longer term project to improve GAO products' accessibility. 
Every attempt has been made to maintain the structural and data 
integrity of the original printed product. Accessibility features, 
such as text descriptions of tables, consecutively numbered footnotes 
placed at the end of the file, and the text of agency comment letters, 
are provided but may not exactly duplicate the presentation or format 
of the printed version. The portable document format (PDF) file is an 
exact electronic replica of the printed version. We welcome your 
feedback. Please E-mail your comments regarding the contents or 
accessibility features of this document to Webmaster@gao.gov. 

This is a work of the U.S. government and is not subject to copyright 
protection in the United States. It may be reproduced and distributed 
in its entirety without further permission from GAO. Because this work 
may contain copyrighted images or other material, permission from the 
copyright holder may be necessary if you wish to reproduce this 
material separately. 

Report to Congressional Requesters: 

United States Government Accountability Office: 
GAO: 

May 2010: 

Streamlining Government: 

Opportunities Exist to Strengthen OMB's Approach to Improving 
Efficiency: 

GAO-10-394: 

GAO Highlights: 

Highlights of GAO-10-394, a report to congressional requesters. 

Why GAO Did This Study: 

Given record budget deficits and continuing fiscal pressures, the 
federal government must seek to deliver results more efficiently. The 
prior Administration sought to improve efficiency under the Program 
Assessment Rating Tool (PART) by requiring programs to have at least 
one efficiency measure and procedures for improving efficiency, and to 
show annual efficiency gains. The current Administration has also 
emphasized efficiency in some initiatives. GAO was asked to examine 
(1) the types of PART efficiency measures and the extent to which they 
included typical elements of an efficiency measure; (2) the extent to 
which selected programs showed gains and how they used efficiency 
measures for decision making; (3) the challenges selected programs 
faced in developing and using efficiency measures; and (4) other 
strategies that can be used to improve efficiency. GAO analyzed the 36 
efficiency measures in 21 selected programs in 5 agencies and a 
generalizable sample from the other 1,355 measures governmentwide, 
reviewed documents and interviewed officials from selected programs, 
reviewed literature on efficiency, and interviewed experts. 

What GAO Found: 

Under PART, most programs developed an efficiency measure. However, 
according to GAO's analysis, 26 percent did not include both typical 
efficiency measure elements--an input (e.g., labor hours or costs) as 
well as an output or outcome (e.g., the product, service, or result 
produced). Most frequently missing was the input (69 percent). For 
example, a measure developed by the National Nuclear Security 
Administration considered the number of information assets reviewed 
for certification without considering costs of review. This could 
result in measures that do not capture efficiency. GAO has previously 
recommended agencies improve cost information for decision making, but 
they are in various stages of implementation. However, alternative 
forms of measurement, such as reducing costly error rates, could still 
be useful. 

Of the efficiency measures GAO reviewed that had both typical 
elements, a similar number reported gains and losses. Officials for 
some programs stated that the efficiency measures reported for PART 
were useful, and described ways in which they used the data, such as 
to evaluate proposals from field units, lower the cost of a contract, 
or make decisions to shift production. Others did not find the 
efficiency measures useful because, for example, the program lacked 
control over key cost drivers, such as contractually required staffing 
levels, or because of concern that raising output could lower quality. 

Officials for all of the programs reviewed described challenges to 
developing and using program-level efficiency measures and performance 
measures in general. Challenges included interpreting outcome-level 
efficiency information, such as the cost of improving or maintaining 
the condition of watershed acres, when factors other than program 
funding, such as past impacts from mining, affected conditions as 
well; achieving required annual efficiency gains in cases where a 
program intervention takes years to implement; and inconsistent or 
limited guidance and technical assistance from the Office of 
Management and Budget (OMB) to agencies on how to measure efficiency. 

A variety of approaches have been used to improve efficiency, 
including governmentwide reviews, agency restructurings, process and 
technology improvements, and strategic spending approaches. The 
Administration has some initiatives along these lines, such as 
information technology and procurement reforms. The Government 
Performance and Results Act (GPRA) provides a framework for planning 
future efficiency gains while maintaining or improving effectiveness 
and quality of outputs or outcomes. OMB, as the focal point for 
management in the executive branch, provides guidance and supports 
information-sharing mechanisms, such as the Performance Improvement 
Council, which could also be used to create a more strategic and 
crosscutting focus on agency efforts to improve efficiency. OMB has 
not clearly indicated whether programs should continue measuring 
efficiency nor has it emphasized efficiency in its GPRA guidance to 
agencies. 

What GAO Recommends: 

GAO recommends that OMB evolve toward a broader approach in its 
guidance and support for improving efficiency at the governmentwide, 
agency, and program levels. OMB concurred with our recommendations. 

View [hyperlink, http://www.gao.gov/products/GAO-10-394] or key 
components. For more information, contact Bernice Steinhardt at (202) 
512-6543 or steinhardtb@gao.gov. 

[End of section] 

Contents: 

Letter: 

Background: 

Most Programs Developed an Efficiency Measure for PART, but Only about 
Half Clearly Included Typical Elements of an Efficiency Measure: 

Programs Showed Mixed Results in Terms of Improvements in Efficiency 
and Use of Efficiency Measures for Decision Making: 

Program Officials Reported Challenges to Developing and Using 
Efficiency Measures: 

Using GPRA as a Framework, a Broader Array of Strategies Can Be Used 
to Seek Improvements in Efficiency: 

Conclusions: 

Recommendations for Executive Action: 

Agency Comments: 

Appendix I: Objectives, Scope, and Methodology: 

Appendix II: Departments, Selected Program Assessment Rating Tool 
Program (PART) Names, and Summary of Programs: 

Appendix III: Department, PART Program Name, and Number of Efficiency 
Measures, Fiscal Year 2009 Funding Level, PART Program Type, and 
Efficiency Measure(s) for Selected Programs: 

Appendix IV: Comments from the Department of the Interior: 

Appendix V: GAO Contact and Staff Acknowledgments: 

Tables: 

Table 1: Examples of Efficiency Measures and Whether They Capture 
Efficiency: 

Table 2: Gains/Losses and Reported Use for Selected Programs' 
Efficiency Measures: 

Figures: 

Figure 1: Extent to Which 36 Efficiency Measures from Selected 
Programs Contained the Two Typical Elements of an Efficiency Measure 
and Other Attributes: 

Figure 2: Estimated Percentage of Efficiency Measures That Contained 
the Two Typical Elements of an Efficiency Measure and Other Attributes: 

Figure 3: Number of Individual Returns and IRS Staff Years for 
Individual Paper and Electronic Processing, Fiscal Years 1999-2010: 

Abbreviations: 

APQC: American Productivity and Quality Center: 

ATO: Air Traffic Organization: 

BPR: Business Process Reengineering: 

BRAC: Base Realignment and Closure: 

CFO: Chief Financial Officer: 

DOD: U.S. Department of Defense: 

EMDS: Ecosystem Management Decision Support: 

FAA: Federal Aviation Administration: 

FSA: Federal Student Aid: 

FTE: full-time equivalent: 

GPRA: Government Performance and Results Act of 1993: 

HM Treasury: Her Majesty's Treasury: 

IRS: Internal Revenue Service: 

IT: information technology: 

MCA: managerial cost accounting: 

NAO: National Audit Office: 

NHTSA: National Highway Traffic Safety Administration: 

NSLP: National School Lunch Program: 

OMB: Office of Management and Budget: 

OSHA: Occupational Safety and Health Administration: 

PART: Program Assessment Rating Tool: 

PMA: President's Management Agenda: 

SAVE: Securing Americans Value and Efficiency: 

SEA: State Education Agency: 

SFFAS: Statement of Federal Financial Accounting Standards: 

UK: United Kingdom: 

USDA: U.S. Department of Agriculture: 

VA: U.S. Department of Veterans Affairs: 

VERA: Veterans Equitable Resource Allocation: 

[End of section] 

United States Government Accountability Office:
Washington, DC 20548: 

May 7, 2010: 

The Honorable Thomas R. Carper: 
Chairman: 
The Honorable John McCain: 
Ranking Member: 
Subcommittee on Federal Financial Management, Government Information, 
Federal Services, and International Security: 
Committee on Homeland Security and Governmental Affairs: 
United States Senate: 

The Honorable Tom Coburn: 
United States Senate: 

Weaknesses in the economy and financial markets--and the government's 
response to them--have contributed to recent increases in federal 
deficits, which reached a record level in fiscal year 2009. While much 
attention has been given to the recent fiscal deterioration, 
the federal government faces even larger fiscal challenges, driven by 
certain factors, such as health care cost growth and demographic 
trends, which will persist long after the return of financial 
stability and economic growth. Given the magnitude of these 
challenges, the federal government must identify ways to operate and 
deliver results more efficiently as well as more effectively. 

In response to these fiscal challenges, the current Administration has 
emphasized the importance of reducing spending and improving 
government efficiency in recent initiatives. These initiatives have 
included: the Office of Management and Budget's (OMB) requirement for 
agencies to submit alternative targets for discretionary funding 
levels for fiscal year 2011 budget submissions that involved freeze 
and reduction scenarios, including the identification of 126 program 
terminations, reductions, and other areas of savings that, 
if enacted or implemented, could save approximately $23 billion; 
[Footnote 1] contracting and workforce reforms designed to save at 
least $40 billion a year; information technology management 
improvements designed to improve efficiency; and holding a contest to 
seek ideas from federal employees on how to increase efficiency and 
savings.[Footnote 2] Recently, the President also established a 
management advisory board to provide advice and recommendations on, 
among other things, improving the productivity of federal operations. 
[Footnote 3] 

At the same time, several broader government reform efforts over the 
past 17 years have also included a focus on improving efficiency. The 
Government Performance and Results Act of 1993 (GPRA),[Footnote 4] 
which Congress enacted in part to improve federal program 
effectiveness and accountability and to enhance congressional decision 
making, was also intended to address waste and inefficiency in 
federal programs.[Footnote 5] The President's Management Agenda (PMA) 
[Footnote 6] and Program Assessment Rating Tool (PART)[Footnote 7] 
initiatives of the previous presidential administration emphasized 
improving government efficiency with specific requirements for 
agencies to develop program-level efficiency measures and show annual 
improvements in efficiency. Analysis of the experiences of federal 
agencies in developing and using efficiency measures under the PMA and 
PART initiatives, as well as identification of additional strategic 
and crosscutting approaches used by government, nongovernment, and 
business organizations to seek improvements in efficiency, could be 
helpful to agencies as they attempt to improve efficiency of programs. 

In response to your request, this report examines (1) the types of 
efficiency measures reported through PART for agency programs overall, 
and particularly for selected programs in five selected agencies, 
focusing on the extent to which they included typical elements of an 
efficiency measure, (2) for selected programs, the extent to which 
programs reporting efficiency measures through PART have shown 
efficiency gains and how programs have used efficiency measures for 
decision making, (3) for selected programs, the types of challenges to 
developing and using efficiency measures they have faced, and (4) 
other strategies that can be used to improve efficiency. 

Based on our review of the literature,[Footnote 8] an efficiency 
measure is typically defined as the ratio of two elements: a program's 
inputs (such as costs or hours worked by employees) to its outputs or 
outcomes. Outputs can be defined as the amount of products or services 
delivered by a program. Outcomes can be defined as the desired results 
of a program, such as events, occurrences, or changes in conditions, 
behaviors, or attitudes. In some literature, the inverse ratio of 
outcomes or outputs to inputs is referred to as a "productivity" 
measure,[Footnote 9] but for purposes of this report, we refer to 
either form of the ratio as an efficiency measure. It should be noted 
that an improvement in efficiency can be achieved by maintaining 
quantity or quality of outputs or outcomes while reducing costs, as 
well as by improving the quantity or quality of outputs or outcomes 
while maintaining (or reducing) costs. Thus an improvement in 
efficiency need not involve a reduction of costs. 
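
To make the two forms of the ratio concrete, the following sketch 
computes both an efficiency measure (input per unit of output) and its 
inverse productivity form. It is illustrative only; the program and 
figures are hypothetical, not drawn from any PART assessment.

    def efficiency_measure(inputs, outputs):
        # Input per unit of output or outcome (e.g., cost per job created).
        return inputs / outputs

    def productivity_measure(inputs, outputs):
        # Inverse ratio: output or outcome per unit of input.
        return outputs / inputs

    # Hypothetical job-training program: $2.4 million spent, 800 jobs created.
    cost = 2_400_000
    jobs = 800

    print(f"Cost per job created: ${efficiency_measure(cost, jobs):,.0f}")  # $3,000
    print(f"Jobs created per dollar: {productivity_measure(cost, jobs):.6f}")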

OMB initially described an efficiency measure as the ratio of a 
program's outcomes or outputs to inputs in the 2004 PART guidance. In 
the December 2007 PART guidance, OMB termed this type of ratio an 
"input productivity measure," and indicated that such measures could 
provide a useful approach for identifying efficiency measures. In the 
guidance, OMB also identified erroneous conclusions that can result 
from the use of simple output-input ratios to track changes over time 
in efficiency for programs that do not produce the same or similar 
outputs repetitively. OMB also identified challenges facing efforts to 
measure efficiency in research and development programs and 
construction of special purpose infrastructure projects. OMB broadened 
the discussion of efficiency measures in the revised guidance and 
proposed alternative approaches to tracking efficiency changes for 
such programs, such as meeting project cost, schedule, and performance 
goals. 

To address our objectives, we analyzed all 1,396 PART efficiency 
measures associated with 937 programs in a database provided by OMB. 
We conducted more detailed analysis of the 36 efficiency measures for 
21 selected programs,[Footnote 10] as well as a random sample of 100 
efficiency measures from all remaining programs. This sample was 
designed to enable us to generalize our analysis to the remaining 
efficiency measures for PART.[Footnote 11] We selected the 21 specific 
programs for review from five departments--the U.S. Departments of 
Agriculture, Education, the Interior, Labor, and Transportation. These 
departments were selected to represent variety in the extent to which 
they had developed managerial cost accounting systems as identified by 
our prior work, based on an assumption that the status of a 
department's cost accounting systems could affect the availability of 
cost information and thus the development of efficiency measures. 
[Footnote 12] We selected the 21 specific programs to represent a 
diverse array of functions and operations within the federal 
government, primarily focusing on the PART program type.[Footnote 13] 
Additional criteria were that the selected programs had relatively 
large fiscal year 2009 funding levels,[Footnote 14] and variety in the 
number of efficiency measures associated with the programs. In 
addition, we reviewed program documents, OMB documents, including PART 
assessments, and agency Web sites. We conducted a literature review as 
well as expert interviews to identify the elements of a typical 
efficiency measure, and to identify alternative approaches to 
improving efficiency. We interviewed officials from OMB and from the 
21 selected programs, as well as officials from the five departments 
who were knowledgeable about performance measurement and financial 
systems for the departments. See appendix I for a more detailed 
discussion of our scope and methodology. 

We conducted the major portion of this performance audit from 
September 2008 to May 2010 in accordance with generally accepted 
government auditing standards.[Footnote 15] Those standards require 
that we plan and perform the audit to obtain sufficient, appropriate 
evidence to provide a reasonable basis for our findings and 
conclusions based on our audit objectives. We believe that the 
evidence obtained provides a reasonable basis for our findings and 
conclusions based on our audit objectives. 

Background: 

Congress enacted GPRA in part to inform congressional decision making 
by providing objective information on the relative effectiveness and 
efficiency of federal programs and spending. In addition to requiring 
executive agencies to develop strategic and annual performance plans, 
and measure and report on progress toward goals, GPRA also emphasized 
efficiency. According to the statute, GPRA was intended, among other 
things, to address problems of waste and inefficiency in federal 
programs, and to improve congressional decision making by providing 
objective information on the relative efficiency and effectiveness of 
federal programs and spending.[Footnote 16] 

OMB plays an important role in the management of the federal 
government's performance, and specifically GPRA implementation. Part 
of OMB's overall mission is to ensure that agency plans and reports 
are consistent with the President's budget and administration 
policies. OMB is responsible for receiving and reviewing agencies' 
strategic plans, annual performance plans, and annual performance 
reports. To improve the quality and consistency of these documents, 
OMB issues annual guidance to agencies for their preparation, 
including guidelines on format, required elements, and submission 
deadlines.[Footnote 17] In addition, GPRA requires OMB to prepare the 
overall governmentwide performance plan, based on agencies' annual 
performance plan submissions. 

The PMA and PART of the prior administration also included an emphasis 
on improving government efficiency, with requirements for agencies to 
develop program-level efficiency measures and show annual improvements 
in efficiency. In August 2001, the Bush Administration launched the 
PMA with the stated purpose of ensuring that resources entrusted to 
the federal government were well managed and wisely used. OMB 
developed criteria called "standards of success" to measure progress 
in five management initiatives under the PMA, as well as a scorecard 
to track agency progress under each initiative. Criteria to receive 
and maintain the highest rating score (green status) for the 
performance improvement initiative included that an agency's annual 
budget and performance documents include at least one efficiency 
measure for each program and that program performance and efficiency 
improvements be identified each year.[Footnote 18] 

PART, which was launched in 2002 as a component of the PMA, included 
assessment of the extent to which programs were tracking progress 
toward and achieving efficiency improvements. PART consisted of a set 
of questions developed to assess various types of federal executive 
branch programs, and addressed four aspects of a program: purpose and 
design, strategic planning, program management, and program results/ 
accountability. While there were references to efficiency in several 
different sections of the 2007 and 2008 PART guidance, two PART 
questions focused specifically on development of program-level 
efficiency measures with annual targets for improvement:[Footnote 19] 

* "Does the program have procedures (e.g., competitive sourcing/cost 
comparisons, information technology (IT) improvements, appropriate 
incentives) to measure and achieve efficiencies and cost effectiveness 
in program execution?" 

- In order to receive a "yes" response for this question, a program 
was to have regular procedures in place to achieve efficiencies and 
cost effectiveness, and had to have at least one efficiency measure 
with baseline and targets. Evidence could include efficiency measures, 
competitive sourcing plans, IT improvement plans designed to produce 
tangible productivity and efficiency gains, or IT business cases that 
documented how particular projects improved efficiency. 

* "Does the program demonstrate improved efficiencies or cost 
effectiveness in achieving program goals each year?" 

- In order to receive a "yes" response for this question, a program 
had to demonstrate improved efficiency or cost effectiveness over the 
prior year, including meeting its efficiency target(s) in the question 
above. 

Most Programs Developed an Efficiency Measure for PART, but Only about 
Half Clearly Included Typical Elements of an Efficiency Measure: 

About 90 percent of all programs that received a PART assessment, 
including the programs we selected for review, developed at least one 
performance measure designated as an efficiency measure.[Footnote 20] 
However, we found that about half of the approved measures either did 
not contain typical elements of an efficiency measure or were unclear. 
As table 1 below indicates, we analyzed a sample of the efficiency 
measures that were developed for PART and, to the extent possible, 
placed them into one of the three categories shown in the table; a 
sketch of this categorization logic follows the table. (In some cases, 
the available information on the measure was insufficient for us to 
place it into one of the three categories, so we labeled these 
measures as "unclear.") 

Table 1: Examples of Efficiency Measures and Whether They Capture 
Efficiency: 

Type of measure: Input ÷ Output/outcome; 
Example: Cost per job created[A]; 
Does measure capture efficiency?: Yes. 

Type of measure: (Missing input) ÷ Output/outcome; 
Example: Annual number of information assets reviewed for 
certification and accreditation[B]; 
Does measure capture efficiency?: No; 
Measure indicates whether more or less is being produced, but not 
whether more or fewer resources are being used. 

Type of measure: Input ÷ (Missing output/outcome); 
Example: Administrative cost as a percentage of total program costs[C]; 
Does measure capture efficiency?: No; 
Measure indicates whether administrative costs change relative to 
total cost, but not whether more or fewer outputs or outcomes are 
being produced. 

Source: GAO analysis of OMB PART efficiency measures. 

[A] This efficiency measure was identified in response to the PART 
assessment for the Delta Regional Authority. 

[B] This efficiency measure was identified in response to the PART 
assessment for the Department of Energy National Nuclear Security 
Administration: Safeguards and Security program. 

[C] This efficiency measure was identified in response to the PART 
assessment for the Department of Energy Building Technologies program. 

[End of table] 
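
As an illustration of the categorization logic in table 1, the 
following hypothetical sketch classifies a measure by whether its 
definition identifies an input and an output or outcome; it is for 
exposition only and is not the analysis procedure used for this report.

    def classify_measure(has_input, has_output_or_outcome):
        # Place a measure into one of the categories shown in table 1.
        if has_input and has_output_or_outcome:
            return "Input / output-outcome ratio: captures efficiency"
        if has_output_or_outcome:
            return "Missing input: shows what is produced, not resources used"
        if has_input:
            return "Missing output/outcome: shows cost shifts, not production"
        return "Unclear"

    print(classify_measure(True, True))   # e.g., cost per job created
    print(classify_measure(False, True))  # e.g., number of assets reviewed
    print(classify_measure(True, False))  # e.g., admin cost share of total cost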

As figures 1 and 2 below illustrate, our analysis of the 36 efficiency 
measures from our selected programs and a random sample of the 
remaining efficiency measures indicates that about half of the 
efficiency measures contained typical elements by including both an 
input and an output or outcome. As illustrated in figure 1, for the 21 
selected programs (listed in appendix II), we determined that 58 
percent of the efficiency measures included both elements and 42 
percent did not. In its guidance to programs, OMB stated that, 
although both output and outcome-oriented efficiency measures were 
acceptable, outcome efficiency measures were preferred. Because we 
obtained more in-depth information on the selected programs' measures, 
we further analyzed whether those that included both elements were 
output- or outcome-oriented and found most to be output-oriented. 

Figure 1: Extent to Which 36 Efficiency Measures from Selected 
Programs Contained the Two Typical Elements of an Efficiency Measure 
and Other Attributes: 

[Refer to PDF for image: series of pie-charts] 

Percentages of efficiency measures containing typical elements and 
other attributes (n=36): 

Included both typical elements (21): 58%; 
* Percentage with output or outcome (n=21): 
Output (13): 62%; 
Outcome (8): 38%; 
* Percentage with cost or time as the input (n=21): 
Cost (18): 86%; 
Time (3): 14%. 

Did not include a typical element (15): 42%; 
* Percentage missing a typical element (n=15): 
Input missing (13): 87%; 
Output or outcome missing (2): 13%. 

Source: GAO analysis of OMB PART data. 

Note: The typical elements of an efficiency measure include (1) an 
input and (2) an output or outcome. 

[End of figure] 

Figure 2 summarizes estimates for the remaining 1,355 efficiency 
measures, based on a random sample of 100 of those measures. We 
estimate that 48 percent of the measures included both elements, 
[Footnote 21] 26 percent did not, and the remaining 26 percent were 
unclear.[Footnote 22] Of those that did not contain both elements, the 
missing element was most often an input. 

Figure 2: Estimated Percentage of Efficiency Measures That Contained 
the Two Typical Elements of an Efficiency Measure and Other Attributes: 

[Refer to PDF for image: series of pie-charts] 

Percentages of efficiency measures containing typical elements and 
other attributes (n=100): 

Included both typical elements (48): 48%; 
* Percentage with cost or time as the input (n=48): 
Cost (45): 94%; 
Time (2): 4%; 
Time and cost (1): 2%. 

Did not include a typical element (26): 26%; 
* Percentage missing a typical element (n=26): 
Input missing (18): 69%; 
Output or outcome missing (8): 31%; 

Unclear (26): 26%. 

Source: GAO analysis of sample of 100 efficiency measures taken from 
OMB PART data. 

Note: The two typical elements of an efficiency measure include (1) an 
input and (2) an output or outcome. Estimates based on all 100 sampled 
efficiency measures have a 95 percent confidence interval of +/-10 
percentage points. Estimates based on smaller samples of 48 and 26 
above have 95 percent confidence intervals of +/-12 and +/-22 
percentage points, respectively. (The arithmetic behind these 
intervals is sketched after the figure.) 

[End of figure] 
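
The confidence intervals reported in the figure note are consistent 
with standard formulas for sampling error in an estimated proportion. 
The sketch below shows the approximate arithmetic for the full sample 
of 100; it is illustrative only and does not reproduce the exact 
sample design underlying the published intervals.

    import math

    def margin_of_error(p, n, z=1.96):
        # Approximate 95 percent margin of error for a sample proportion.
        return z * math.sqrt(p * (1 - p) / n)

    # For n=100 and an estimate near 50 percent, the margin is roughly
    # +/-10 percentage points, matching the figure note.
    print(f"+/-{margin_of_error(0.48, 100):.1%}")
    # Smaller subsamples (n=48, n=26) yield wider intervals, which is
    # why the note reports larger margins for those estimates.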

In general, as indicated in table 1, the absence of these typical 
elements can result in measures that do not truly capture efficiency. 
Nevertheless, some of the information captured in these measures could 
still be of value to program officials for helping improve efficiency. 
For example, one measure from our selected programs--average time to 
correct/mitigate higher priority operations and maintenance 
deficiencies at certain facilities in the Bureau of Reclamation--did 
not contain an input element.[Footnote 23] However, program officials 
told us this was an important measure because it helped them 
prioritize which maintenance deficiencies to repair first by 
categorizing needed repairs according to the likely costs of delaying 
them. 
should normally be repaired immediately (within 3 to 6 months) to 
avoid escalating the cost of repair; a category 2 deficiency should be 
repaired in a few years. In contrast, a category 3 deficiency is 
normally repaired only if there is time and funding remaining after 
repairing category 1 and 2 deficiencies. 

In another example, the National School Lunch Program (NSLP) used a 
measure which was labeled an efficiency measure, but which did not 
have the typical ratio of inputs to outputs or outcomes. Instead, the 
measure focused on reducing the error rate in making program payments. 
Program officials characterized the measure as a process measure, 
rather than an output- or outcome-based efficiency measure. An 
official said that out of $7 billion in total program payments, about 
$2 billion in underpayments and overpayments occur, for a net cost to 
the program of $1 billion. The official added that reducing overall 
overpayments due to various types of error could save millions of 
dollars. 
important in helping them take corrective actions to reduce the number 
of payments made in error. 
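
The distinction officials drew between gross errors and their net cost 
can be shown with simple arithmetic. In the sketch below, the split 
between overpayments and underpayments is hypothetical; the report 
cites only the $2 billion gross and $1 billion net figures.

    # Hypothetical split consistent with the officials' figures.
    overpayments = 1.5e9   # assumed dollars overpaid in error
    underpayments = 0.5e9  # assumed dollars underpaid in error

    gross_errors = overpayments + underpayments  # $2.0 billion in total errors
    net_cost = overpayments - underpayments      # $1.0 billion net cost

    print(f"Gross errors: ${gross_errors / 1e9:.1f} billion")
    print(f"Net cost to program: ${net_cost / 1e9:.1f} billion")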

Among the selected programs, for the efficiency measures that 
contained an input, the type of information used to express the input 
varied in terms of both availability for use and completeness. Most of 
the efficiency measures we reviewed captured inputs in terms of cost, 
but a few used the amount of staff resources or time spent to produce 
an output or outcome as a proxy for cost. For example, the Department 
of Labor Energy Employees Occupational Illness Compensation program's 
efficiency measure was the average number of decisions per full-time 
equivalent (FTE), which we determined used information on work hours 
as estimated by FTEs as the input.[Footnote 24] While FTE information 
is often readily available and can be a useful proxy for cost, it does 
not necessarily reflect total cost because, for example, it would 
neither distinguish between higher and lower cost FTEs, nor would it 
include other costs, such as contractors, training, equipment, or 
facilities. 
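
A brief sketch with hypothetical staffing figures shows why decisions 
per FTE can diverge from a fully loaded cost per decision; none of the 
numbers below are actual program data.

    decisions = 12_000
    ftes = 100
    avg_fte_cost = 120_000   # assumed average salary and benefits per FTE
    other_costs = 3_000_000  # assumed contractors, training, equipment, facilities

    decisions_per_fte = decisions / ftes  # the form used by the PART measure
    cost_per_decision = (ftes * avg_fte_cost + other_costs) / decisions

    print(f"Decisions per FTE: {decisions_per_fte:.0f}")                 # 120
    print(f"Fully loaded cost per decision: ${cost_per_decision:,.0f}")  # $1,250
    # If other_costs rise while staffing stays flat, decisions per FTE is
    # unchanged even though the true cost per decision has increased.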

In addition, dollar cost information can vary in how completely it 
captures the cost of producing outputs or outcomes. "Cost" generally 
can be thought of as the value of resources that have been, or must 
be, used or sacrificed to attain a particular objective,[Footnote 25] 
which, in the case of an efficiency measure, would be a unit of output 
or outcome. "Full cost" is generally viewed as including both direct 
costs (costs that can be specifically identified with a cost object, 
such as an output) and indirect costs (costs of resources that are 
jointly or commonly used to produce two or more types of outputs but 
are not specifically identifiable with any of the outputs).[Footnote 
26] Managerial cost accounting (MCA) information can provide a more 
complete picture of the cost involved in producing program outputs or 
outcomes by recognizing resources when they are used and determining 
the full cost of producing government goods and services, including 
both direct and indirect costs. According to the Statement of Federal 
Financial Accounting Standards No. 4 (SFFAS 4), Managerial Cost 
Accounting Concepts and Standards for the Federal Government, which 
sets forth the fundamental elements for MCA in government agencies, 
[Footnote 27] costs may be measured, analyzed, and reported in many 
ways and can vary depending upon the circumstances and purpose for 
which the measurement is to be used. Our analysis of the cost 
information used by the selected programs showed that most of the 
measures used budgetary information, such as appropriations or 
obligations, for the cost element.[Footnote 28] Of the 18 efficiency 
measures from our selected programs that had both typical elements 
and had cost as the input, 14 measures (78 percent) used a form of 
budgetary information. 

We have previously reported that using budgetary information, such as 
appropriations or obligations, may not completely capture the full 
cost of producing program outputs or outcomes because of differing 
time frames and account structures.[Footnote 29] With regard to 
timing, appropriations provide agencies legal authority to obligate 
funds for a given fiscal year or beyond. Consequently, agency outlays 
(payments against obligations for goods and services received) 
representing the resources used to produce a program's outputs or 
outcomes in a given year may flow from obligations made in a prior 
year's appropriation. Therefore, a given year's appropriations or 
obligations may not represent the resources actually used to produce a 
program's outputs or outcomes in that year. With regard to account 
structures, appropriations accounts developed over the last 200 years 
were oriented in different ways in response to specific needs. For 
example, some appropriations accounts reflect items of expense, such 
as salaries or construction, while others reflect organizations, 
processes, or programs. Further, program-oriented account structures 
may cover multiple programs or may exclude some indirect resources 
used by the programs. 

Though budgetary information may not completely cover the cost of 
producing program outputs or outcomes, several program officials said 
it was the most complete information available to them and best met 
the needs of Congress. For example, the Department of Labor Job Corps 
program, which used budgetary information in its efficiency measure, 
divided its request in the fiscal year 2010 Job Corps Congressional 
Budget Justification into three categories: operations, construction, 
and administration. However, the program's efficiency measure--cost 
per participant in the Job Corps program--was based entirely on the 
operations category, which encompassed 92 percent of the program's 
fiscal year 2010 request, meaning the measure did not capture the 
remaining 8 percent of construction- or administration-related costs 
that were also associated with program participation. A study 
commissioned by the Job Corps recommended that all direct costs 
associated with Job Corps appropriations be included in the measure if 
full costs were to be determined. This would include actual 
expenditures (i.e., outlays rather than appropriations or obligations) 
for Job Corps appropriations provided for operations, construction, 
[Footnote 30] and direct administrative costs.[Footnote 31] Program 
officials indicated they did not believe including the additional 
costs would provide useful information because there were relatively 
few opportunities to find efficiencies in the construction or 
administration categories. Additionally, a Department of the Interior 
Wildland Fire Management budget official told us that while they had 
access to more complete cost data, this information was not 
necessarily accurate or easy to obtain because it had to be collected 
from five different entities with different cost accounting 
systems.[Footnote 32] They also preferred to use budgetary information 
because it helped to justify their appropriations request to Congress. 
Program officials noted that each of their three efficiency measures 
was based on obligations data.[Footnote 33] 
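
The coverage gap in the Job Corps measure can be illustrated with 
hypothetical amounts; the report gives only the 92 percent operations 
share, not dollar figures or participant counts.

    # Hypothetical figures chosen so operations equal 92 percent of the request.
    operations = 920_000_000     # assumed
    construction = 60_000_000    # assumed
    administration = 20_000_000  # assumed
    participants = 60_000        # assumed

    measured = operations / participants  # the basis of the PART measure
    full_cost = (operations + construction + administration) / participants

    print(f"Operations-only cost per participant: ${measured:,.0f}")
    print(f"Full direct cost per participant: ${full_cost:,.0f}")
    # The measured figure omits the roughly 8 percent of costs outside
    # the operations category.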

Relative to time or budgetary information, some agencies have sought 
to develop more complete cost information by using MCA systems capable 
of accumulating and analyzing both financial and nonfinancial data in 
order to determine, among other things, the unit cost of producing 
program outputs or outcomes. Such systems are also capable of 
recognizing resources when they are used and determining the full cost 
of producing government goods and services, including both direct and 
indirect costs. 

However, in earlier work we found that only 3 of the 10 Chief 
Financial Officer (CFO) Act agencies we reviewed had implemented MCA 
systems entitywide: Interior, the Social Security Administration, and 
Labor.[Footnote 34] Transportation had made significant progress in 
implementing MCA entitywide and three agencies--Agriculture, Health 
and Human Services, and Housing and Urban Development--planned to 
implement MCA systems when upgrading their overall financial 
management systems. The three remaining agencies we reviewed--
Education, the Treasury, and Veterans Affairs--had no plans to 
implement MCA departmentwide,[Footnote 35] although Veterans Affairs 
was initiating a review to explore opportunities to do so. 
Consequently, we recommended that individual agencies commence or 
improve the development of entity-wide MCA systems as a fundamental 
component of their financial management system, as required by SFFAS 4 
and the Federal Financial Management Improvement Act of 1996.[Footnote 
36] 

For this report, of the five agencies we reviewed, we selected three-- 
Interior, Labor, and Transportation--because we previously reported 
they had either implemented MCA systems entitywide or were planning 
to do so. Nevertheless, we did not find widespread use of MCA system 
data for the efficiency measures we reviewed either in these agencies 
or in the other two agencies--Education and Agriculture--that did not 
have entitywide MCA systems. 

Of the 18 efficiency measures from our selected programs that included 
typical elements, four measures (22 percent) used a distinct MCA 
system to determine costs. Those programs that relied on MCA data 
produced outputs, such as the Student Aid Administration program 
(student aid disbursements), the Federal Aviation Administration's 
(FAA) Air Traffic Organization Terminal (takeoff and landing 
operations) and Technical (maintenance and modernization of equipment 
needed to provide air traffic services) programs, and the Department 
of the Interior's Fisheries program (pounds of trout per dollar). In 
addition, legislation was enacted in the 1990s, which resulted in both 
Federal Student Aid (FSA) and FAA developing MCA systems to improve 
performance.[Footnote 37],[Footnote 38] Of the remaining 14 efficiency 
measures, officials from several of those programs told us they used 
budgetary information because they either did not have access to an 
MCA system, the system they could access produced poor data, or the 
information would not be useful for congressional decision making. For 
example, the Department of Education did not have a departmentwide MCA 
system, though it is now considering creating such a system in 
response to a prior recommendation we made.[Footnote 39] Also, 
officials with the Department of Transportation CFO office told us 
that the department had taken a decentralized approach in which some 
of their operating administrations--such as the FAA and Federal 
Transit Administration--had developed and were using their own MCA 
system. In addition, although the Department of Labor's CFO had 
developed an MCA system and made it available to its agencies and 
programs, officials from the five Department of Labor programs we 
reviewed indicated that they did not use it for their efficiency 
measures because, in their opinions, the system was not useful or not 
sufficiently developed for their needs, did not capture all the 
program's costs, or captured a different type of funding than was used 
for the efficiency measure. Finally, as indicated previously, a 
Department of the Interior Wildland Fire Management budget official 
told us that cost information for their program was neither easy to 
access nor as useful as budgetary information for budget justification 
purposes. 

Programs Showed Mixed Results in Terms of Improvements in Efficiency 
and Use of Efficiency Measures for Decision Making: 

The selected programs that had measures with both elements of a 
typical efficiency measure reported mixed results under PART in terms 
of gains and losses in efficiency. As previously indicated in figure 
1, 21 of the 36 efficiency measures developed by the programs selected 
for our review had both of the elements of a typical efficiency 
measure. As can be seen in table 2, 8 of the 21 efficiency measures 
(representing seven different programs) showed an improvement in 
efficiency between the baseline and most current year. Ten of the 
efficiency measures (representing seven programs) showed a decrease in 
efficiency over the reported periods. Three measures (representing two 
programs) had only baseline data. 

Table 2: Gains/Losses and Reported Use for Selected Programs' 
Efficiency Measures: 

Department: Agriculture; 
Program: Plant & Animal Health Monitoring; 
Reported use of efficiency measure(s): Used; 
Efficiency measures: Value of damage prevented or mitigated by the 
monitoring and surveillance programs per dollar spent; 
Efficiency: Gain. 

Department: Education; 
Program: Smaller Learning Communities; 
Reported use of efficiency measure(s): Did not use[A]; 
Efficiency measures: FY 03 Cohort: Cost (in dollars) per student 
demonstrating proficiency or advanced skills in reading; 
Efficiency: Gain. 

Department: Education; 
Program: Smaller Learning Communities; 
Reported use of efficiency measure(s): Did not use[A]; 
Efficiency measures: FY 03 Cohort: Cost (in dollars) per student 
demonstrating proficiency or advanced skills in mathematics; 
Efficiency: Gain. 

Department: Education; 
Program: Smaller Learning Communities; 
Reported use of efficiency measure(s): Did not use[A]; 
Efficiency measures: FY 04 Cohort: Cost (in dollars) per student 
demonstrating proficiency or advanced skills in reading; 
Efficiency: Loss. 

Department: Education; 
Program: Smaller Learning Communities; 
Reported use of efficiency measure(s): Did not use[A]; 
Efficiency measures: FY 04 Cohort: Cost (in dollars) per student 
demonstrating proficiency or advanced skills in mathematics; 
Efficiency: Loss. 

Department: Education; 
Program: Smaller Learning Communities; 
Reported use of efficiency measure(s): Did not use[A]; 
Efficiency measures: FY 05 Cohort: Cost (in dollars) per student 
demonstrating proficiency or advanced skills in reading; 
Efficiency: Baseline data only. 

Department: Education; 
Program: Smaller Learning Communities; 
Reported use of efficiency measure(s): Did not use[A]; 
Efficiency measures: FY 05 Cohort: Cost (in dollars) per student 
demonstrating proficiency or advanced skills in mathematics; 
Efficiency: Baseline data only. 

Department: Education; 
Program: Student Aid Administration; 
Reported use of efficiency measure(s): Used; 
Efficiency measures: Direct administrative unit costs for origination 
and disbursement of student aid; 
Efficiency: Gain. 

Department: Interior; 
Program: Fish and Wildlife Services Fisheries; 
Reported use of efficiency measure(s): Used; 
Efficiency measures: Pounds/dollar of healthy rainbow trout produced 
for recreation; 
Efficiency: Loss. 

Department: Interior; 
Program: Wildland Fire Management; 
Reported use of efficiency measure(s): Used; 
Efficiency measures: Number of acres treated in the wildland-urban 
interface per million dollars gross investment; 
Efficiency: Gain. 

Department: Interior; 
Program: Wildland Fire Management; 
Reported use of efficiency measure(s): [Empty]; 
Efficiency measures: Number of acres treated outside the wildland-
urban interface per million dollars gross investment; 
Efficiency: Loss. 

Department: Interior; 
Program: Wildland Fire Management; 
Reported use of efficiency measure(s): [Empty]; 
Efficiency measures: Number of acres in fire regimes 1, 2, or 3 moved 
to a better condition class per million dollars of gross investment; 
Efficiency: Loss. 

Department: Labor; 
Program: Energy Employees Occupational Illness Compensation; 
Reported use of efficiency measure(s): Did not use; 
Efficiency measures: Average number of decisions per full-time 
equivalent; 
Efficiency: Gain. 

Department: Labor; 
Program: Job Corps; 
Reported use of efficiency measure(s): Did not use; 
Efficiency measures: Cost per participant; 
Efficiency: Loss. 

Department: Labor; 
Program: Occupational Safety & Health Administration; 
Reported use of efficiency measure(s): Did not use; 
Efficiency measures: Inspections per Compliance Safety & Health 
Officer; 
Efficiency: Loss. 

Department: Labor; 
Program: Unemployment Insurance Administration State Grants; 
Reported use of efficiency measure(s): Did not use; 
Efficiency measures: Number of timely and accurate initial benefit 
claims per $1,000 of inflation-adjusted base grant funds; 
Efficiency: Gain. 

Department: Labor; 
Program: Workforce Investment Act-Migrant & Seasonal Farmworkers; 
Reported use of efficiency measure(s): Did not use; 
Efficiency measures: Cost per participant; 
Efficiency: Loss. 

Department: Transportation; 
Program: FAA Air Traffic Organization-Technical Operations; 
Reported use of efficiency measure(s): Did not use; 
Efficiency measures: Unit cost for providing ATO-technical operations 
services; 
Efficiency: Gain. 

Department: Transportation; 
Program: FAA Air Traffic Organization-Terminal Programs; 
Reported use of efficiency measure(s): Used; 
Efficiency measures: Unit cost for providing terminal services; 
Efficiency: Loss. 

Department: Transportation; 
Program: FAA Air Traffic Organization-Terminal Programs; 
Reported use of efficiency measure(s): Used; 
Efficiency measures: Productivity rate at service delivery points; 
Efficiency: Loss. 

Department: Transportation; 
Program: National Highway Traffic Safety Administration-Operations & 
Research; 
Reported use of efficiency measure(s): Did not use; 
Efficiency measures: Average costs incurred to complete a defect 
investigation; 
Efficiency: Baseline data only. 

Department: Total number of efficiency measures; 
Efficiency measures: 21; 
Net gain: 8; 
Net loss: 10; 
Baseline data only: 3. 

Source: GAO analysis of OMB and agency data and agency officials. 

Notes: Table excludes measures missing typical elements of an 
efficiency measure. We determined the net change in efficiency over 
time by comparing the latest year's actual data to the baseline. Some 
programs had only one year of reported data for making comparisons, 
while other programs had multiple years of reported data. Reporting 
the net change over a several year period may obscure interim annual 
gains or losses in reported efficiency. 

[A] Agency officials indicated they initially used the efficiency data 
collected to explore whether there might be some relationship between 
costs per student and either uses of funds or number of grade levels 
served, and determined that the data were not of sufficient quality to 
permit that analysis. We concluded the information was therefore not 
useful for decision making. 

[End of table] 

We have previously reported that agencies can use performance 
information to make various types of management decisions to improve 
programs and results.[Footnote 40] The same is true for performance 
measures that track efficiency--managers need to use the information 
to help them identify actions needed to bring about improved 
efficiency. Our review of selected programs that had measures with 
both elements of a typical efficiency measure found variety in terms 
of whether officials reported using efficiency measures. We also found 
no clear relationship between efficiency gains or losses and whether 
program officials reported using or not using efficiency measures. 
Officials from three of the seven programs that reported efficiency 
gains described using their efficiency measures, while officials for 
three additional programs with efficiency gains said they did not use 
the efficiency measures. Officials for the other program with 
efficiency gains reported a mixed picture, saying they did not use the 
efficiency measure but found some value in the measure or its 
components. A similar mix was found among programs that reported net 
losses in efficiency, with officials for three programs using the 
efficiency measures and officials for four programs not using them. 

One example of a program that showed a net gain over time for its 
efficiency measure and for which officials reported using the data was 
the Department of Education's Student Aid Administration program. 
Reducing costs was one of the primary objectives of the program. Their 
efficiency measure--direct administrative unit costs to originate and 
disburse student loans and Pell Grants--showed a gain in efficiency 
from 2006 to 2008. The agency provides federal assistance to eligible 
students by partnering with postsecondary schools, financial 
institutions, and guaranty agencies (state and nonprofit agencies that 
guarantee loans against default). Program officials told us they used 
information from this measure to establish targets for reduced unit 
costs for their lending transactions. For example, they reported using 
the data to negotiate a lower cost for the origination of direct 
student loans by a sole-source contractor.[Footnote 41] FSA used a 
contractor to originate the loans made directly to students. The 
contract allowed for a certain quantity of loan originations for a set 
price, up to a maximum number of loans each year. According to program 
officials, the sharp reduction in credit availability due to the 
financial crisis beginning in 2008 led to an increase in demand for 
FSA direct loans. FSA had projected that demand for direct student 
loans in the 4th quarter of fiscal year 2009 would exceed the contract 
maximum by 3 million loans. The contractor proposed a price of $8.9 
million for the additional loans, arguing that the added volume would 
require higher infrastructure costs associated with greater call 
center capacity. FSA officials told us they analyzed historical data 
for their efficiency measure and found that the unit cost to originate 
loans decreased as volume increased. They used this analysis to 
challenge the contractor's bid and succeeded in lowering the agreed 
price to $4.9 million. Officials reported that legislation, federal 
cost accounting standards, and our previous recommendations all 
contributed to pressure to track unit costs and try to lower 
administrative costs. Consequently, the agency had developed a number 
of cost models, which facilitated their developing the efficiency 
measure for PART. 
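
The effect of FSA's negotiation on the implied unit cost can be seen 
directly from the figures above. The sketch below is simple 
arithmetic, not FSA's actual cost model.

    additional_loans = 3_000_000  # projected loans above the contract maximum
    proposed_price = 8_900_000    # contractor's initial proposal
    agreed_price = 4_900_000      # price after FSA's unit cost analysis

    print(f"Proposed unit cost: ${proposed_price / additional_loans:.2f} per loan")
    print(f"Agreed unit cost: ${agreed_price / additional_loans:.2f} per loan")
    print(f"Savings: ${(proposed_price - agreed_price) / 1e6:.1f} million")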

The Department of the Interior's Fisheries program provides an example 
in which the efficiency measure showed a net loss but officials said 
they used the efficiency measure data to make management decisions. 
The efficiency measure tracked the efficiency (pounds per dollar) of 
producing healthy rainbow trout for recreation. For the first 4 years 
examined, fiscal years 2004 through 2007, the efficiency measure 
varied slightly, indicating that overall efficiency was relatively 
stable. For fiscal year 2008, however, the measure fell, indicating a 
significant drop in efficiency. Officials attributed this drop to a 31 
percent increase in feed, energy, and utility costs that was 
experienced throughout the country in 2008 and was beyond their 
control. Several fishery stations reported 40 percent increases in 
feed costs in just 1 year. Officials told us that having information 
about the decline in efficiency was valuable because it led individual 
stations to look for opportunities to lower other costs of production 
that were within their control. For example, program managers said 
they used their efficiency measure data to help them decide to phase 
out the production of inefficient (more costly) strains of trout. In 
addition, they said they used the measure to help manage the losses 
resulting from diseased trout that could not be sold by shifting 
production from one fishery to another that did not have a problem 
with disease. Officials said they thought it was easier for programs 
that directly produced products or provided services to develop and 
use efficiency measures. They said they had a relatively easy time 
developing their own efficiency measure because they directly produce 
a product (i.e., rainbow trout). 

The Department of the Interior's Wildland Fire Management Program 
reported mixed efficiency results. Of their three efficiency measures, 
two showed a net loss and one showed a net gain. Even though the 
results were mixed, officials said they used the data to establish 
ranges of acceptable cost estimates for contract or grant proposals 
and to identify outliers. Officials said their efficiency measures, 
which tracked numbers of wildland acres treated or moved to a better 
condition class (to reduce the likelihood of wildland fires) per 
million dollars, enabled them to identify unusually high or low costs 
when evaluating proposals from field units for funding treatments. 
They could identify a proposal that did not fall within the normal 
range of prior projects in terms of costs, do further analysis, and 
ask for explanations from field staff to better understand why the 
proposal was outside the norm. Program officials also said they used a 
tool called Ecosystem Management Decision Support (EMDS) to help 
prioritize projects and allocate funding for future years. They said 
EMDS takes into account various factors, including past performance 
and efficiency. For example, fuel treatments that demonstrated greater 
efficiency would be given higher priority for funding under EMDS, 
other factors being equal. 

While FAA's Air Traffic Organization Technical Operations program's 
efficiency measure showed a net gain, officials said they did not use 
it to make major decisions. ATO Technical Operations is responsible 
for maintaining and modernizing equipment needed in the national 
airspace system to deliver air traffic services. It fields, repairs, 
and maintains a huge network of complex equipment, including radars, 
instrument landing systems, radio beacons, runway lighting, and 
computer systems. The efficiency measure, unit cost for providing ATO 
Technical Operations services, is the "total labor obligations for the 
Technical Operations' Service Unit" divided by the total hours of 
operational availability (or equipment "uptime"). Officials said the 
measure was used for a baselining effort, and no decisions have been 
made as a result. Officials explained that they cannot significantly 
influence labor costs because of a labor agreement that requires ATO 
to maintain 6,100 direct employees. Officials said they have used data 
for the denominator of the efficiency measure, on the hours of 
operational availability. Equipment needs to be available 
continuously, and currently is about 99.7 percent of the time. 
Officials said they have not done the marginal cost analysis to 
determine whether it would be cost-effective to try to increase 
equipment uptime, but they have broken the data down by location and 
looked for outliers and tried to address impediments to operational 
availability at certain locations. They also said that while they have 
not used the efficiency measure to make any management decisions, it 
has been valuable in helping to orient staff to think about costs of 
operations and how to go about looking for efficiency improvements. 
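
As described by officials, the ATO Technical Operations measure is a 
simple ratio of labor obligations to uptime hours. The sketch below 
uses hypothetical figures; FAA's actual obligations and hours are not 
published in this report, and only the roughly 99.7 percent 
availability rate comes from the text above.

    labor_obligations = 750_000_000  # assumed total labor obligations, dollars
    scheduled_hours = 8_760_000      # assumed, e.g., 1,000 systems x 8,760 hours
    availability_rate = 0.997        # roughly 99.7 percent uptime, per officials

    uptime_hours = scheduled_hours * availability_rate
    unit_cost = labor_obligations / uptime_hours

    print(f"Unit cost per hour of operational availability: ${unit_cost:.2f}")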

Lastly, the Department of Labor's Occupational Safety and Health 
Administration (OSHA) program reported a net loss for the efficiency 
measure and told us they did not use the data. Officials said the 
current efficiency measure--inspections per Compliance Safety and 
Health Officer--was only a "back room calculation" and was not 
something they promoted or used to make decisions within the 
organization. They said they did not evaluate the performance of staff 
based on the number of inspections they conducted, because doing so 
could lead to a perverse effect of rushing through inspections in 
order to complete them more quickly, resulting in poorer quality 
inspections. In addition, officials said they did not believe anyone 
used the OSHA efficiency measure other than for reporting purposes. 

Program Officials Reported Challenges to Developing and Using 
Efficiency Measures: 

Officials from all of the selected programs we reviewed identified one 
or more challenges related to developing or using efficiency measures. 
The challenges cited were not new; we have reported on similar types 
of challenges in our prior work on PART and performance measurement 
issues in general.[Footnote 42] Challenges related to OMB's guidance 
and technical assistance for efficiency measures specifically 
included: a program definition that did not correspond well to program 
operations; an emphasis on developing outcome-oriented efficiency 
measures; achieving required annual improvement targets for 
efficiency; and inconsistencies and limitations in OMB's guidance and 
technical assistance. In addition, officials described the difficulty 
of trying to compare the relative efficiency of programs (or units 
within programs) that have significantly different objectives, 
activities, or cost data. 

Developing Efficiency Measures Based on a Program Definition That Did 
Not Correspond Well to Operations: 

We previously reported that determining the appropriate program or 
unit of analysis for a PART assessment was not always obvious, and 
what OMB determined was useful did not necessarily match agency 
organization or planning elements.[Footnote 43] We found that OMB 
sometimes aggregated separate programs into one for the purposes of a 
PART assessment, and in other cases disaggregated programs. 
Aggregating programs sometimes made it difficult to create a limited, 
but comprehensive, set of performance measures for programs with 
multiple missions, and agency officials noted that difficulties could 
arise when unrelated programs and programs with uneven success levels 
were combined for PART. At the same time, disaggregating a program too 
narrowly could distort its relationship to other programs involved in 
achieving a common goal, and sometimes ignored the interdependence of 
programs by artificially isolating programs from the larger contexts 
in which they operated. While OMB, in response to one of our 
recommendations, expanded PART guidance on how a unit of analysis was 
to be determined, problems related to defining programs for PART 
remained. An OMB staff member acknowledged to us that OMB often 
combined what agencies considered and managed as separate programs in 
order to identify a program for PART. According to some program 
officials, the way in which OMB grouped their activities into a 
program for the PART assessment was not useful, and so the resulting 
program-level efficiency measure developed for PART was not useful. 

Officials from the National Highway Traffic Safety Administration 
(NHTSA) within the Department of Transportation told us that the way 
OMB and the department defined their program for the PART assessment 
was a key challenge to developing a useful efficiency measure. 
Officials said that NHTSA's mission and operations are organized along 
two major programmatic lines: highway and motor vehicle safety. In 
contrast, for purposes of PART and development of the required 
efficiency measures, NHTSA was organized into two programs that 
received separate PART assessments: Operations and Research, and Grant 
Management. As a consequence, officials said the efficiency measure 
developed for the Operations and Research program was not meaningful. 
They said they were revising their efficiency measures and planned to 
develop one for each of the programmatic areas. 

Emphasis on Developing Outcome-Oriented Efficiency Measures: 

In previous work, we identified challenges involved in developing 
useful results- or outcome-oriented performance measures for some 
programs, such as those geared toward long-term health outcomes and 
research and development.[Footnote 44] We reported that many of the 
outcomes for which federal programs are responsible are part of a 
broader effort involving federal, state, local, nonprofit, and private 
partners, and that it is often difficult to isolate a particular 
program's contribution to an outcome.[Footnote 45] However, we also 
reported on how selected agencies that had limited control over the 
achievement of their intended objectives addressed the challenge by 
employing various strategies, such as including intermediate outcomes 
within their direct control along with far-reaching or end outcomes. 
[Footnote 46] In a previous review of PART, we reported that OMB had 
taken steps to clarify PART guidance on using outcome and output 
performance measures, and had accepted administrative efficiency 
measures instead of outcome-level efficiency measures for some 
programs.[Footnote 47] However, we also reported that agencies had 
mixed success in reaching agreement with OMB in these areas. 

As mentioned above, of the 21 measures from selected programs that had 
typical elements of an efficiency measure, 13 contained outputs, and 8 
contained outcomes. While OMB's PART guidance described efficiency 
measures as including both outcome- and output-level impacts, it stated 
that the best efficiency measures captured outcomes. Further, program 
officials told us that OMB pressed some programs to have efficiency 
measures that captured outcomes instead of outputs. 

Similar to findings from our prior work, some officials we interviewed 
for this review said it was difficult for their programs to interpret 
outcome-level efficiency measure information, because factors other 
than program funding affected the outcome of the program. For example, 
the purpose of the Forest Service's Watershed program is to restore, 
enhance, and maintain watershed conditions, including soil, water, 
air, and forest and rangeland vegetation within the national forests 
and grasslands. Management of these physical and biological resources 
provides a foundation for healthy, viable ecosystems.[Footnote 48] The 
Watershed program received a "Results Not Demonstrated" rating in the 
2006 PART assessment because it lacked long-term, outcome-based 
performance and efficiency measures to track the performance of land 
management activities on national forest and nonfederal watersheds or 
to demonstrate water quality improvement over time. Basically, the 
Forest Service was unable to track how watershed 
projects were prioritized, identify the benefits associated with 
restoration projects, and determine whether those projects improved 
watershed condition. Officials said they had previously proposed the 
unit cost of watershed improvement projects as an efficiency measure 
under PART, but OMB rejected it partly because it was an output- rather 
than an outcome-level measure. According to Forest Service documents, 
factors beyond its control affect watershed conditions, and it is 
difficult to demonstrate the impact of program activities on 
watersheds and to determine the most cost-effective way to improve the 
outcome. The agency's ability to improve the condition of watersheds 
depends on many factors, including what percentage of the land 
affecting the watershed is privately owned as opposed to owned by the 
Forest Service, as well as past impacts on the land--for example, an 
official said that lands that were previously mined may be more 
difficult to restore. Officials said that the cost of trying to 
improve some watersheds would exceed available funds, and in some 
cases passive restoration--doing nothing and letting natural processes 
take their course--could improve conditions as rapidly as any program 
interventions could. Forest Service officials said they reached 
agreement with OMB 
to develop an outcome-oriented efficiency measure based on the cost of 
improving or maintaining the condition of watershed acres. According 
to a 2008 report prepared by the Forest Service[Footnote 49] and to 
program officials, in order to relate costs to outcomes, the agency 
will need to develop a consistent approach for assessing watershed 
condition and a system that would enable it to track changes in 
watershed conditions and relate those changes to Forest Service 
management activities. Following implementation of this approach, the 
agency would be able to track improvements in program outcomes and 
relate changes to cost. 

Achieving Required Annual Improvement Targets for Efficiency: 

OMB's PMA and PART guidance required programs to set annual 
improvement targets for their efficiency measures. We previously 
reported that in some programs, long-term outcomes are expected to 
occur over time through multiple steps, and that it can take years to 
observe program results. For these programs, it can be difficult to 
identify performance measures that will provide information on annual 
progress toward program results.[Footnote 50] 

Along these lines, some program officials we interviewed told us it 
was not reasonable to expect annual improvements in efficiency for 
some programs because it might take several years for an increase in 
efficiency to be realized as a result of some intervention or 
investment, or because a technological advance might result in a one- 
time cost savings that would not continue to be achieved over time. 
For example, the Plant and Animal Health Monitoring and Surveillance 
programs of Agriculture's Animal and Plant Health Inspection Service, 
which protect the health and value of agriculture and natural 
resources through early detection of pest and disease outbreaks, had 
an efficiency measure that tracked the value of damage prevented or 
mitigated by the program per dollar spent. Program officials told us 
that it was difficult to show improvements in efficiency every year. 
They said that as a science-based program, it took time to develop new 
technologies that improved efficiency, and the effect might be a one- 
time improvement in efficiency that would not result in continued 
additional efficiency gains over time. Similarly, officials from the 
Department of the Interior's Endangered Species program stated that 
the timeframe needed to achieve results in terms of conservation and 
recovery of an endangered species is longer than an annual or even 5- 
year timeframe. They said it is difficult to associate additional 
funding with a defined outcome in a given year. Officials from the 
Department of Labor's Center for Program Planning and Results 
acknowledged that their office and OMB strongly encouraged agencies 
and programs to show annual improvements for efficiency measures, 
which led to some friction in setting targets for out-years for some 
programs. They said that pressure to show annual improvements in 
efficiency resulted in some programs revising targets for the 
efficiency measures every year because they could not achieve the 
annual targets. An official said that there was considerable focus on 
numerical annual targets for efficiency measures, and that because 
some programs cannot realistically achieve improvements in efficiency 
within a 1-year period, monitoring trends would be better. 

Inconsistent or Limited OMB Guidance and Technical Assistance: 

As we previously reported, OMB staff had to exercise judgment in 
interpreting and applying the PART tool to complex federal programs, 
and were not fully consistent in interpreting the guidance.[Footnote 
51] In prior reviews of PART, we identified instances in which OMB 
staff inconsistently defined appropriate measures, in terms of 
outcomes versus outputs, for programs. We reported that some program 
officials said that OMB staff used different standards to define 
measures as outcome oriented. We also reported that OMB took steps to 
try to encourage consistent application of PART in evaluating 
government programs, including pilot testing the assessment 
instrument, clarifying guidance, conducting consistency reviews, and 
making improvements to guidance based on experience.[Footnote 52] OMB 
also issued examples of efficiency measures it identified as exemplary 
[Footnote 53] and expanded the guidance on efficiency measures. 
[Footnote 54] 

While officials for some programs we interviewed told us that OMB 
assistance and feedback under PART were valuable in developing useful 
efficiency measures, officials for other programs cited 
inconsistencies and limitations in OMB's PART guidance and technical 
assistance that made the development of acceptable and useful 
efficiency measures more challenging. For example, officials for 
Agriculture's Plant and Animal Health Monitoring programs said they 
worked with the department and OMB representatives to discuss 
efficiency measures and obtain feedback on proposed measures. 
Officials said feedback obtained was useful and allowed them to 
consider options they had not previously identified, and in some cases 
they incorporated the advice. Officials said that the efficiency 
measure tracking the value of damage prevented and mitigated per 
program dollar spent was a direct result of an OMB recommendation. 
[Footnote 55] 

However, officials for other programs said that PART guidance and OMB 
technical assistance and feedback provided to programs on efficiency 
measures were insufficient or inconsistent. For example, officials for 
the Department of the Interior's Endangered Species program, which 
lacked an efficiency measure that had been approved by OMB, said they 
believed that OMB's review of proposed efficiency measures was 
inconsistent. Officials said that OMB rejected a proposed output-level 
efficiency measure for the Endangered Species program and pushed for 
an outcome-level measure, but approved a similar measure for another 
program in a different federal department. Similarly, officials for 
the Forest Service Watershed program in Agriculture, which did not 
have any of its proposed efficiency measures accepted by OMB for the 
PART assessment, stated that lack of consistency on OMB's part in 
defining acceptable efficiency measures complicated the process for 
them. They said OMB rejected a measure they proposed, but approved a 
similar measure for another agency. Further, officials for OSHA in 
the Department of Labor indicated that they worked with two OMB 
analysts who were not as familiar with their agency as the current 
analyst, which created rework. Overall, they did not believe the 
process they undertook with OMB to develop an efficiency measure was 
fruitful. 

Comparing Efficiency across or within Programs When Program 
Objectives, Activities, or Cost Data Differ: 

Officials we interviewed from the Department of Education's Office of 
Federal Student Aid indicated that they eventually wanted to use data 
for the Student Aid Administration program's efficiency measure 
(direct administrative unit costs for origination and disbursement of 
student aid) to compare the costs of similar activities performed by 
different contractors. However, we previously reported that challenges 
can result from the difficult but potentially useful process of 
comparing the costs of programs related to similar goals.[Footnote 56] 
We have also reported that in order to effectively compare a program 
to alternative strategies for achieving the same goals, comprehensive 
data on the program and comparable data on alternatives need to be 
available.[Footnote 57] In our prior work on human services programs, 
we reported that OMB officials recognized that programs are different 
and it may not be possible to compare costs across programs, 
especially when costs are defined differently due to programmatic 
differences.[Footnote 58] 

Officials from some selected programs we reviewed questioned whether 
it was reasonable to use efficiency measures for comparative analysis 
of performance across programs when the objectives, activities, or 
costs of the programs differed significantly. For example, an official 
from the Department of Labor's Job Corps program said it was not 
appropriate to compare their program's performance to that of other 
department employment and training programs in terms of the efficiency 
measure, which tracked cost (appropriations) per participant. 
According to the program's PART assessment, the program's purpose is 
to assist eligible disadvantaged youth (ages 16-24) who need and can 
benefit from intensive education and training services to become more 
employable, responsible, and productive citizens. Participants have 
characteristics, such as being a school dropout, homeless, or in need 
of intensive counseling to help them participate successfully in 
school or hold a job, that are barriers to employment. Program 
officials said that Job Corps is quite different from other employment 
and training programs run by the department because it involves 
removing participants from a negative environment and placing them in 
a totally different, primarily residential, environment. Such a model 
involves higher operating costs associated with providing participants 
intensive services in a residential setting for up to 2 years, which 
would make it appear less efficient when compared to nonresidential 
programs.[Footnote 59],[Footnote 60] 

As another example, officials for the Endangered Species program at 
the Department of the Interior questioned whether it made sense to try 
to compare the efficiency of efforts to protect different species. The 
program works with states, tribes, other federal agencies, 
nongovernmental organizations, academia, and private landowners to 
promote the conservation and prevent extinction of over 1,300 
endangered or threatened species. As noted in the program's strategic 
plan,[Footnote 61] each species has inherent biological constraints 
that create challenges to its recovery. Officials told us that they 
work with vastly different species in different regions, many factors 
affect the complexity of their work, and each case is unique. We 
previously reported that species are ranked by priority, but rankings 
do not reflect how much funding is needed to protect a species. 
[Footnote 62] Officials told us that the cost of an intervention, such 
as building a fence, could be much lower for one species in a 
particular region than for another species in a different location. 
The head of the department's Office of Planning and Performance 
Management in the Office of the Secretary said that because the effort 
to save some species is so much more complicated and expensive than 
for others, it is not meaningful to simply compare the "cost per unit" 
or efficiency of saving different species without considering other 
factors such as the time frame involved, and the scope and level of 
treatment needed. For example, he suggested that it was not reasonable 
to try to compare the cost of saving the polar bear to the cost of 
saving a species of plant.[Footnote 63] 

Using GPRA as a Framework, a Broader Array of Strategies Can Be Used 
to Seek Improvements in Efficiency: 

As stated above, OMB's approach to improving the efficiency of federal 
programs under PMA and PART focused on requiring individual programs 
to develop efficiency measures, identify procedures to achieve 
efficiencies, and achieve annual gains in efficiency. In prior 
reports, we concluded that PART's focus on program-level assessments 
could not substitute for GPRA's focus on thematic goals and department-
and governmentwide crosscutting comparisons.[Footnote 64] Through our 
review of literature, we identified a variety of strategic and 
crosscutting approaches that government, nongovernment, and business 
organizations have used in their efforts to improve efficiency. For 
example, the United Kingdom and some state governments provide some 
important insights into such governmentwide efficiency efforts. These 
approaches share a common theme that performance can be maintained or 
even improved while reducing unnecessary costs associated with 
outmoded or wasteful operations, processes, and purchases. These 
approaches to efficiency improvement differ from OMB's approach under 
PMA/PART in that they can be applied at government- or agencywide 
levels in addition to being applied within specific programs. 
Officials from some selected programs provided examples of additional 
efforts they were undertaking to improve efficiency, some of which can 
be aligned with these broader approaches we identified in the 
literature. Broadening the application of these approaches beyond the 
program level could help to identify even greater opportunities for 
improvements in the efficiency of federal government operations. 
GPRA's planning and reporting requirements can provide a framework for 
agencies to take a more strategic approach to improving federal 
government efficiency. 

Governmentwide Reviews Can Help Identify and Develop Strategies to 
Improve Efficiency: 

Governmentwide reviews have been conducted in the United Kingdom (UK) 
and by some state governments in the U.S. to help identify and 
implement strategic approaches to improve efficiency. Such reviews 
have been ordered by executive leadership to address a wide range of 
government activity. Reviews have been broad in scope, and initiatives 
undertaken to improve efficiency have been crosscutting and could be 
applied across processes, services, and organizations rather than just 
at the program level as required for federal agencies under OMB's PART 
approach. 

In the UK in 2004, Her Majesty's (HM) Treasury published a 
first-of-its-kind, governmentwide efficiency review that examined 
government 
processes, identified opportunities for cutting costs and improving 
services, and developed proposals to deliver sustainable efficiencies 
in the use of resources within both central and local government. The 
review focused on improving government efficiency in areas such as 
procurement, funding, regulation, citizen services, and 
administration. The efficiency review proposed strategies to improve 
efficiency that were adopted by HM Treasury in the UK's 2004 budget. 

HM Treasury actively supported departments in their individual 
efficiency programs. HM Treasury negotiated efficiency goals with each 
department and created a centralized efficiency team managed by the 
Office of Government Commerce to help departments achieve efficiency 
gains. HM Treasury brought in outside expertise, including senior 
figures from the private and public sectors, to support and work with 
departments. Additional specialist change agents were employed to 
assist departments with trying to achieve efficiency improvements in 
areas such as e-government, human resources, IT, finance, 
construction, and commodity procurement. Change agents addressed 
problems created by highly fragmented markets that crossed 
departmental boundaries. 

To assist departments in financing efficiency improvement programs, HM 
Treasury created a £300 million Efficiency Challenge Fund that 
provided departments with matching funds for efficiency improvement 
programs. Funds were approved based on objective criteria such as the 
ratio of expected savings to matching funds, probability of achieving 
savings, evidence that alternative funds were not available, and 
progress in delivering efficiency gains. 

In a final review of the completed efficiency program in November 
2008, HM Treasury reported that the program led to £26.5 billion in 
annual efficiency gains (60 percent of which were direct cost savings 
while the remainder represented increased levels of public service 
rather than immediate cash savings). These final results have not been 
audited, although portions of earlier reported efficiency gains were 
reviewed by the UK National Audit Office (NAO) with mixed results. In 
2007, more than halfway through implementing the efficiency program, 
the NAO reviewed a sample of the reported efficiency gains and found 
that some had a significant risk of inaccuracy. Nevertheless, NAO 
concluded at the time that of the £13.3 billion ($21.2 billion) 
reported gains, 26 percent (£3.5 billion ($5.6 billion)) fairly 
represented efficiencies achieved, 51 percent (£6.7 billion ($10.7 
billion)) appeared to represent improvements in efficiency but had 
associated measurement issues and uncertainty, and 23 percent (£3.1 
billion ($4.9 billion)) had potential to represent improvements in 
efficiency, but the measures used either had not yet demonstrated 
efficiency or the reported gains could be substantially incorrect. NAO 
cited measurement problems arising from longstanding weaknesses in 
departments' data systems and from trying to measure savings in areas 
with complex relationships between inputs and outputs. Despite the 
caveats identified by NAO in trying to verify the reported efficiency 
gains, NAO reported that "the efficiency program made important 
contributions and there is now a greater focus on efficiency among 
senior staff." 

In the U.S., several state governments initiated a variety of 
governmentwide reviews. For example, Arizona initiated an efficiency 
review in 2003 to try to find ways to improve customer service, reduce 
cost, and eliminate duplication while drawing heavily on internal 
state resources and experts in state government to manage the effort. 
The Arizona review investigated potential savings in 12 statewide, or 
crosscutting, issues that affected multiple agencies and offered the 
greatest potential for efficiency savings. In 2004, California 
initiated an ongoing review, the California Performance Review, with 
four major components: executive branch reorganization, program 
performance assessment and budgeting, improved services and 
productivity, and acquisition reform. Iowa Excellence is another 
governmentwide effort designed to improve customer service and cut 
costs in state government. Iowa agencies examined their performance 
using Malcolm Baldrige National Quality Program criteria. These state 
governmentwide review efforts share beneficial features: they serve as 
an effective method of cost-saving analysis, help prioritize services 
to citizens, and provide targeted goals for the administration of 
state governments that may contribute to improved government 
efficiency and effectiveness. 

Restructuring Outmoded Government Organizations and Operations Can 
Contribute to Improvements in Efficiency: 

Solving the daunting fiscal challenges facing the nation will require 
rethinking the base of existing federal spending and tax programs, 
policies, and activities by reviewing their results and testing their 
continued relevance and relative priority for a changing society. Such 
a reexamination offers the prospect of addressing emerging needs by 
weeding out programs and policies that are outdated or ineffective. 
Those programs and policies that remain relevant could be updated and 
modernized by improving their targeting and efficiency through such 
actions as redesigning allocation and cost-sharing provisions, 
consolidating facilities and programs, and streamlining and 
reengineering operations and processes.[Footnote 65] While significant 
efficiency gains can be achieved by restructuring outmoded government 
organizations and operations to better meet current needs, we have 
reported that such restructurings can be immensely complex and 
politically charged.[Footnote 66] All key players must be involved in 
the process--Congress, the President, affected executive branch 
agencies, their employees and unions, and other interested parties, 
including the public. The fundamental restructuring of the health care 
system for veterans in the mid-1990s and the Department of Defense 
(DOD) Base Realignment and Closure (BRAC) process demonstrate the 
significant efficiencies that can result from reexamining the base of 
federal programs. 

U.S. Department of Veterans Affairs (VA) Health Care: 

In the mid-1990s, the U.S. Department of Veterans Affairs (VA), 
recognizing that its health care system was inefficient and in need of 
reform, followed the lead of private sector health care providers and 
began reorganizing its system to improve efficiency and 
access.[Footnote 67] In 1995, VA introduced substantial operational 
and structural changes in its health care system to improve the 
quality of, efficiency of, and access to care by reducing its 
historical reliance on inpatient care. VA shifted its focus from a 
bed-based, 
inpatient system emphasizing specialty care to one emphasizing primary 
care provided on an outpatient basis. To support VA's restructuring 
efforts, Congress enacted legislation in October 1996 that eliminated 
several restrictions on veterans' eligibility for VA outpatient care, 
which allowed VA to serve more patients. 

VA also phased in a new national resource allocation method, the 
Veterans Equitable Resource Allocation (VERA) system, as part of a 
broader effort to provide incentives for networks and medical centers 
to improve efficiency and serve more veterans. Networks that increased 
their patient workload compared with other networks gained resources 
under VERA; those whose patient workloads decreased compared with 
other networks lost resources. As we reported, VA recognized that VERA 
networks were responsible for fostering change, eliminating 
duplicative services, and encouraging cooperation among medical 
facilities. 
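
A highly simplified sketch of the workload-based idea behind VERA 
follows (in Python). It merely distributes a fixed budget in 
proportion to each network's share of patient workload, so a network 
whose workload grows relative to others gains resources; the networks 
and numbers are hypothetical, and the actual VERA system used more 
refined patient categories and adjustments: 

# Simplified illustration of workload-proportional resource
# allocation. Networks and workloads below are hypothetical.
budget = 1_000_000_000  # dollars to allocate

workload = {"Network 1": 210_000, "Network 2": 180_000,
            "Network 3": 260_000}  # patients served
total = sum(workload.values())

for network, patients in workload.items():
    share = patients / total
    print(f"{network}: {share:.1%} of workload -> ${share * budget:,.0f}")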

We reported that efficiency gains from increased outpatient care, 
staff reductions and reassignments, and integrations at the medical 
centers resulted in savings. For example, from fiscal year 1996 to 
1998, VA reduced staff by approximately 16,114 (8 percent), resulting 
in estimated annual savings of $897 million. In 
some cases, however, improvements in efficiency did not save money 
because hospitals reinvested funds to enhance or offer new services. 

Base Realignment and Closures: 

The military base realignment and closure experience provides another 
example of the efficiencies that can be gained by reexamining outmoded 
government structures and operations to meet current operating needs. 
In the late 1980s, changes in the national security environment 
resulted in a defense infrastructure with more bases than DOD needed. 
To enable DOD to close unneeded bases and realign other bases, 
Congress enacted legislation that instituted BRAC rounds in 1988, 
1991, 1993, 1995, and 2005. A special commission established for the 
1988 round made realignment and closure recommendations to the Senate 
and House Committees on the Armed Services. For the succeeding rounds, 
special BRAC Commissions were set up, as required by legislation, to 
make specific recommendations to the President, who in turn sent the 
commissions' recommendations to Congress. While the statutory 
requirements vary across the BRAC rounds, those in the 2005 round 
stipulate that closure and realignment decisions must be based upon 
selection criteria, a current force structure plan, and infrastructure 
inventory developed by the Secretary of Defense. Further, the 
selection criteria were required to be published in the Federal 
Register to solicit public comments before they were finalized. 
Congress thus mandated a clear authorization process involving both 
the executive and legislative branches of government while recognizing 
and involving those affected by the government's actions. 
With the completion of the recommended actions for the first four BRAC 
rounds by 2001, DOD had significantly reduced its domestic 
infrastructure through the realignment and closure of hundreds of 
bases and had reportedly generated billions in net savings or cost 
avoidances during the process. 

While DOD's focus for the four BRAC rounds through 1995 was largely on 
eliminating excess capacity, the Secretary of Defense at the outset of 
the BRAC 2005 round--the fifth such round taken on by the department--
indicated the department's intent to reshape DOD's installations and 
realign DOD forces to meet defense needs for the next 20 years and 
eliminate 
excess physical capacity--the operation, sustainment, and 
recapitalization of which diverts resources from defense capability. 
Both DOD and the BRAC Commission reported that their primary 
consideration in making recommendations for the BRAC 2005 round was 
military value, which includes considerations such as an 
installation's current and future mission capabilities. As such, many 
of the BRAC 2005 recommendations involve complex realignments that 
reshape operational capacity to maximize warfighting capability and 
efficiency. 

We have reported that the fifth round, BRAC 2005, will be the biggest, 
most complex, and costliest BRAC round ever, in part because, unlike 
previous rounds, the Secretary of Defense viewed the 2005 round as an 
opportunity not only to achieve savings but also to assist in 
transforming the department. For example, DOD is consolidating 
facilities and programs through a BRAC action to relocate five 
training centers from across the United States into a single medical 
education and training center at one installation. Although 
anticipated savings resulting from implementing BRAC 2005 
recommendations, which the department could use for other defense 
programs, remain an important consideration in justifying the need for 
this round, our calculations using DOD's fiscal year 2010 BRAC budget 
estimates have shown that estimated savings DOD expects to generate 
over the 20-year period ending in 2025 have declined from the BRAC 
Commission's estimate of $36 billion to $10.9 billion in constant 
fiscal year 2005 dollars.[Footnote 68] 

Process Improvement Methods and Technology Improvements Can Increase 
Efficiency: 

Process improvement methods can increase product quality and decrease 
costs, resulting in improved efficiency.[Footnote 69] Such methods can 
involve examining processes and systems to identify and correct costly 
errors, bottlenecks, or duplicative processes while maintaining or 
improving the quality of outputs. 

There are numerous process improvement methods that use different 
tools and techniques. For example, Six Sigma is a data-driven approach 
based on the idea of eliminating defects and errors that contribute to 
losses of time, money, opportunities, or business. The main idea 
behind Six Sigma is to measure the defects in a process and then 
devise solutions to eliminate them, helping an organization approach a 
high level of quality. Another method is Business Process 
Reengineering (BPR), which 
redesigns the way work is done to better support the organization's 
mission and reduce costs. Reengineering starts with a high-level 
assessment of the organization's mission, strategic goals, and 
customers. As a result of the strategic assessment, BPR identifies, 
analyzes, and redesigns an organization's core business processes with 
the aim of achieving dramatic improvements in critical performance 
measures, such as cost, quality, service, and speed. 
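
As an illustration of the measurement idea underlying Six Sigma, 
defect rates are conventionally expressed as defects per million 
opportunities (DPMO). The sketch below (in Python, with hypothetical 
counts) computes DPMO from inspection results: 

# Defects per million opportunities (DPMO), a standard Six Sigma
# metric. The counts below are hypothetical.
units_processed = 12_000
opportunities_per_unit = 5  # ways each unit could be defective
defects_found = 42

dpmo = defects_found / (units_processed * opportunities_per_unit) * 1_000_000
print(f"DPMO: {dpmo:,.0f}")  # 700 defects per million opportunities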

A 2009 study conducted by the American Productivity and Quality Center 
(APQC)[Footnote 70] identified a variety of methods, including Six 
Sigma and Business Process Reengineering, that have been used by 
organizations to focus on process improvement.[Footnote 71] The study 
included a survey of 281 small- to large-sized enterprises with 
combined annual gross revenue of $4.2 trillion to identify current 
process-focused practices and learn about process effectiveness. 
Survey respondents identified various efficiency-related improvements 
resulting from 
their process improvement approaches, such as streamlined processes, 
improved customer satisfaction, quality improvements, and improved 
decision making. 

In relation to process improvement, modernizing processes through 
investments in technology can generate efficiency gains. Our prior 
work indicates that the federal government can help streamline 
processes and potentially reduce long-term costs by facilitating 
technology enhancements.[Footnote 72] For example, as shown in figure 
3, growth in electronic filing has allowed the Internal Revenue 
Service (IRS) to reduce staff years used to process paper tax returns. 
As electronic filing increased between fiscal years 1999 and 2006, IRS 
reduced the number of staff years devoted to total tax return 
processing by 34 percent.[Footnote 73] We have also reported that 
processing is more accurate and costs to IRS are lower as a result of 
electronic filing--IRS saves $2.71 for every return that is filed 
electronically instead of on paper. 

Figure 3: Number of Individual Returns and IRS Staff Years for 
Individual Paper and Electronic Processing, Fiscal Years 1999-2010: 

[Refer to PDF for image: combined stacked vertical bar and multiple 
line graph] 

Fiscal year: 1999; 
Staff years devoted to paper filing: 4,384; 
Staff years devoted to electronic filing: 280; 
Paper returns processed: 96.6 million; 
Electronic returns processed: 29.3 million. 

Fiscal year: 2000; 
Staff years devoted to paper filing: 4,108; 
Staff years devoted to electronic filing: 274; 
Paper returns processed: 92.9 million; 
Electronic returns processed: 35.4 million. 

Fiscal year: 2001; 
Staff years devoted to paper filing: 4,290; 
Staff years devoted to electronic filing: 302; 
Paper returns processed: 90.2 million; 
Electronic returns processed: 40.2 million. 

Fiscal year: 2002; 
Staff years devoted to paper filing: 4,207; 
Staff years devoted to electronic filing: 265; 
Paper returns processed: 84.6 million; 
Electronic returns processed: 46.8 million. 

Fiscal year: 2003; 
Staff years devoted to paper filing: 3,613; 
Staff years devoted to electronic filing: 226; 
Paper returns processed: 78.3 million; 
Electronic returns processed: 54.6 million. 

Fiscal year: 2004; 
Staff years devoted to paper filing: 3,281; 
Staff years devoted to electronic filing: 253; 
Paper returns processed: 70.2 million; 
Electronic returns processed: 61.3 million. 

Fiscal year: 2005; 
Staff years devoted to paper filing: 3,060; 
Staff years devoted to electronic filing: 208; 
Paper returns processed: 65.3 million; 
Electronic returns processed: 68.3 million. 

Fiscal year: 2006; 
Staff years devoted to paper filing: 2,815; 
Staff years devoted to electronic filing: 155; 
Paper returns processed: 61.9 million; 
Electronic returns processed: 72.8 million. 

Fiscal year: 2007; 
Staff years devoted to paper filing: 2,701; 
Staff years devoted to electronic filing: 152; 
Paper returns processed: 59.4 million; 
Electronic returns processed: 79.6 million. 

Fiscal year: 2008; 
Staff years devoted to paper filing: 2,509; 
Staff years devoted to electronic filing: 185; 
Paper returns processed: 51.6 million; 
Electronic returns processed: 89.3 million. 

Fiscal year: 2009[A]; 
Staff years devoted to paper filing: 2,183; 
Staff years devoted to electronic filing: 187; 
Paper returns processed: 46.9 million; 
Electronic returns processed: 93.3 million. 

Fiscal year: 2010[A]; 
Staff years devoted to paper filing: 1,981; 
Staff years devoted to electronic filing: 197; 
Paper returns processed: 42.6 million; 
Electronic returns processed: 98.3 million. 

Source: GAO analysis of IRS data. 

[A] Fiscal years 2009 and 2010 are IRS projections. 

[End of figure] 
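
One way to read the figure's data as an efficiency measure is staff 
years per million returns processed. The rough sketch below (in 
Python, using the figure 3 values for fiscal years 1999 and 2006) 
derives that unit measure; it is an illustration only and ignores 
differences in return complexity and other cost drivers: 

# Staff years per million returns, derived from the figure 3 data.
data = {
    # fiscal year: (paper staff years, e-file staff years,
    #               paper returns (millions), e-file returns (millions))
    1999: (4384, 280, 96.6, 29.3),
    2006: (2815, 155, 61.9, 72.8),
}

for year, (sy_paper, sy_efile, ret_paper, ret_efile) in data.items():
    print(f"FY{year}: paper {sy_paper / ret_paper:.1f}, "
          f"e-file {sy_efile / ret_efile:.1f} staff years per million returns")

By this rough measure, electronic processing took about 9.6 staff 
years per million returns in fiscal year 1999 and about 2.1 in fiscal 
year 2006, compared with roughly 45 for paper returns in both years. 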

The President's 2011 Budget described a variety of initiatives the 
administration intends to undertake to streamline existing IT 
infrastructure, improve the management of IT investments, and leverage 
new IT to improve the efficiency and effectiveness of federal 
government operations.[Footnote 74] In June 2009, the U.S. Chief 
Information Officer (CIO) launched the IT Dashboard, which allows the 
American people to monitor IT investments across the federal 
government. The IT Dashboard displays performance data on nearly 800 
investments that agencies classify as major. The performance data used 
to track the 800 major IT investments include schedule, cost, and the 
agency CIO's assessment of the risk that the investment will not 
accomplish its goals. Beginning in January 2010, the U.S. CIO began 
holding TechStat Accountability Sessions--face-to-face, evidence-based 
reviews of IT programs, undertaken with OMB and agency leadership, to 
improve overall performance. According to the U.S. CIO's Web site on 
TechStat, in some cases this review process is leading to projects 
being eliminated. The administration has also indicated it intends to: 

* consolidate data centers to reduce costs and increase efficiency; 

* pursue "cloud computing," which will enable agencies to share 
information technology services and software rather than purchase or 
develop their own; 

* continue to pursue various "e-government" initiatives, which are 
expected to deliver services more efficiently both within and across 
agency lines; and: 

* employ federal enterprise architectures and supporting segment 
architectures to streamline processes and modernize services, in many 
cases across agency lines. 

In addition to these IT initiatives, the Administration has also 
placed emphasis on reducing errors in payments. Executive Order 13520, 
signed in November 2009,[Footnote 75] requires, among other things, 
publishing information about improper payments on the Internet, 
including targets for reduction and recovery, and assigning a senior 
official to be accountable for reducing and recovering improper 
payments at relevant agencies. The executive order also lays out steps 
intended to lead to enhanced accountability of contractors and 
incentives and accountability provisions for state and local 
governments for reducing improper payments. 

Consistent with OMB's PART guidance for programs to identify 
procedures to improve efficiency, officials from several of the 
selected programs we reviewed said they had modernized information 
technology to reduce costs and improve services.[Footnote 76] 
Officials from the Department of Labor's Job Corps program said they 
reduced federal telecommunication costs through the use of voice over 
Internet protocol and other improvements in technology, while 
expanding the use of video conferencing and e-learning to improve 
customer service. As a result of these efforts, officials reported 
cutting communication costs by $1 million. Officials for the 
Department of the Interior's Endangered Species program said they used 
information technology to eliminate manual entry of data, which 
reduced errors, resulted in more accurate information, and increased 
efficiency. 

Such methods are consistent with PART guidance to identify procedures, 
such as information technology improvements, to improve efficiency. 
However, the program-level focus of the PART process would not 
necessarily lead to an examination of efficiency improvements to be 
gained by improving the processes and systems outside a program's 
purview. Government processes and systems can involve multiple 
programs within and across federal agencies. For example, we 
previously reviewed the cost of administering seven key human services 
programs and found that the federal government may help balance 
administrative cost savings with program effectiveness and integrity 
by simplifying policies and facilitating technology 
improvements.[Footnote 77] Simplifying policies--especially those 
related to eligibility determination processes and federal funding 
structures--could save resources, improve productivity, and help staff 
focus more time on performing essential program activities. By helping 
states facilitate technology enhancements across programs, the federal 
government can help streamline processes and potentially reduce long-
term costs. 

As another example, we have reported that the federal agencies that 
share responsibility for detecting and preventing seafood fraud 
[Footnote 78]--the Department of Homeland Security's Customs and 
Border Protection, the Department of Commerce's National Marine 
Fisheries Service, and the Department of Health and Human Services' 
Food and Drug Administration--have not taken advantage of 
opportunities to share information that could benefit each agency's 
efforts to detect and prevent seafood fraud, nor have they identified 
similar and sometimes overlapping activities that could be better 
coordinated to use limited resources more efficiently. For example, 
each agency has its own laboratory capability for determining seafood 
species and uses different methodologies for creating standards for 
species identification. The result is that neither the laboratories 
nor the data developed in them are shared. 

A Strategic Approach to Spending Can Be Used to Reduce Input Costs and 
Improve Efficiency: 

We have recommended that agencies take a strategic approach to 
spending that involves a range of activities--from using "spend 
analysis" to develop a better picture of what an agency is spending on 
goods and services, to taking an organization-wide approach for 
procuring goods and services.[Footnote 79] We found that private 
sector companies have adopted these activities to help leverage their 
buying power, reduce costs, and better manage suppliers of goods and 
services. By strategically managing costs, government can improve 
efficiency in the same way as private sector organizations examined in 
our prior work.[Footnote 80] 

"Spend analysis" is a tool that provides information about how much is 
being spent for goods and services, identifies buyers and suppliers, 
and helps identify opportunities to leverage buying power to save 
money and improve performance. To obtain this information, 
organizations use a number of practices involving automating, 
extracting, supplementing, organizing, and analyzing procurement data. 
Organizations then use these data to institute a series of structural, 
process, and role changes aimed at moving away from a fragmented 
procurement process to a more efficient and effective process in which 
managers make decisions on an organizationwide basis. 
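
A minimal sketch of the core spend analysis step--aggregating 
procurement transactions by category and supplier to spot candidates 
for consolidation--might look like the following (in Python; the 
records and the three-supplier threshold are hypothetical): 

# Minimal spend analysis: aggregate spending by category and count
# suppliers used, highlighting fragmented categories that might be
# candidates for consolidated contracts. Records are hypothetical.
from collections import defaultdict

transactions = [
    # (category, supplier, amount in dollars)
    ("office supplies", "Supplier A", 120_000),
    ("office supplies", "Supplier B", 95_000),
    ("office supplies", "Supplier C", 88_000),
    ("IT hardware", "Supplier D", 410_000),
]

spend = defaultdict(float)
suppliers = defaultdict(set)
for category, supplier, amount in transactions:
    spend[category] += amount
    suppliers[category].add(supplier)

for category in spend:
    n = len(suppliers[category])
    note = " <-- fragmented; consider consolidating" if n >= 3 else ""
    print(f"{category}: ${spend[category]:,.0f} across {n} supplier(s){note}")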

Spend analysis allows for the creation of lower-cost consolidated 
contracts at the local, regional, or global level. As part of a 
strategic procurement effort, spend analysis also allows companies to 
monitor trends in small and minority-owned business supplier 
participation, helping them balance small and minority business 
utilization with equally important corporate financial savings goals 
for strategic sourcing. 

Spend analysis is an important component of the administration's plans 
to improve government procurement. Along these lines, OMB issued 
memoranda in July and October of 2009 instructing agencies to increase 
competition for new contracts.[Footnote 81] The administration also 
set a net savings target of $40 billion to be achieved by agencies 
through improved contracting practices in fiscal years 2010 and 2011. 
The October memorandum provided agencies guidelines for increasing 
competition for contracts and structuring contracts to achieve the 
best results at the least cost to the taxpayer. Specifically, the 
memorandum recommends the use of spend analysis to identify the 
agency's largest spending categories, analyze and compare levels of 
competition achieved by different organizations within the agency, and 
determine whether more successful practices exist for obtaining 
greater marketplace competition for a given spending category. 

Among the programs we reviewed, officials from the Job Corps program 
reported that they achieved improvements in efficiency by using some 
elements of a strategic spending approach. For example, Job Corps 
officials indicated that the program has avoided approximately $1 
million in utility costs by purchasing energy from utilities using 
competitive bids in deregulated markets. When an area of the country 
became deregulated, the program would analyze the utility prices and 
quantities of electricity or natural gas used by the Job Corps centers 
in the area. If prices in the deregulated market looked favorable, the 
energy contracts for the centers would be placed out for bid to all 
eligible energy suppliers. Job Corps would select the bid with the 
best price and terms and set up a contract to purchase energy from the 
winning supplier for a fixed period of time (usually 1 or 2 years). 
When the contracts came to an end, the process would be repeated. If 
the prices on the deregulated market were not favorable at that time, 
then the centers could revert to the local utilities for their energy. 
Job 
Corps also conducted energy audits to identify problem areas and 
propose solutions to reduce energy costs at facilities where energy 
usage was above the benchmark. Job Corps reportedly reduced energy 
costs through investments in energy saving projects, training of staff 
and students to control energy use, and using an online system to 
review and analyze billing and procurement of energy in deregulated 
markets.[Footnote 82] 
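
The procurement decision officials described reduces to a simple 
comparison: solicit bids, take the best price and terms, and revert to 
the local utility if no bid is favorable. A minimal sketch follows (in 
Python; the rates and supplier names are hypothetical): 

# Sketch of the energy procurement decision described above. Rates
# (dollars per kilowatt-hour) and supplier names are hypothetical.
local_utility_rate = 0.118

bids = {"Supplier X": 0.104, "Supplier Y": 0.112, "Supplier Z": 0.121}

best_supplier = min(bids, key=bids.get)
if bids[best_supplier] < local_utility_rate:
    savings = local_utility_rate - bids[best_supplier]
    print(f"Award a 1- to 2-year contract to {best_supplier}; "
          f"saves ${savings:.3f} per kWh versus the local utility")
else:
    print("No favorable bids; revert to the local utility")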

GPRA Could Provide a Framework for Structuring a More Strategic 
Approach to Improving Government Efficiency: 

The administration has not clearly indicated whether it will continue 
to emphasize measuring efficiency at the program level as was done 
under PART. Rather, in describing its approach to performance and 
management 
in the President's budget,[Footnote 83] the Administration stated that 
GPRA and PART increased the production of measurements in many 
agencies, resulting in the availability of better measures than 
previously existed; however, these initial successes have not led to 
increased use. To encourage senior leaders to deliver results against 
the most important priorities, the administration tasked agencies with 
identifying and committing to a limited number of priority goals, 
generally three to eight, with high value to the public. The goals 
were to have ambitious, but realistic, targets to achieve within 18 to 
24 months without need for new resources or legislation, and well-
defined, outcome-based measures of progress. Further, in the coming 
year, the Administration will ask agency leaders to carry out a 
similar priority-setting exercise with top managers of their bureaus 
to set bureau-level goals and align those goals, as appropriate, with 
agencywide priority goals. These efforts are not distinct from the 
goal-setting and measurement expectations set forth in GPRA, but 
rather reflect an intention to translate GPRA from a reporting 
exercise to a performance improving practice across the federal 
government. By making agencies' top leaders responsible for specific 
goals that they themselves have named as most important, the 
Administration has stated that it hopes to dramatically improve 
accountability and the chances that government will deliver results on 
what matters most. 

To complement the renewed focus on achieving priority outcomes, the 
Administration has also proposed increased funding to conduct program 
evaluations to determine whether and how selected programs are 
contributing to desired outcomes. The Administration intends to take a 
three-tiered approach to funding new program initiatives. First, more 
money is proposed for promoting the adoption of programs and practices 
that generate results backed up by strong evidence. Second, for an 
additional group of programs with some supportive evidence but not as 
much, additional resources are allocated on the condition that the 
programs will be rigorously evaluated going forward. Third, the 
approach encourages agencies to innovate and to test ideas with strong 
potential--ideas supported by preliminary research findings or 
reasonable hypotheses. We have previously reported on how program 
evaluations can contribute to more useful and informative performance 
reports through assisting program managers in developing valid and 
reliable performance reporting and filling gaps in needed program 
information, such as establishing program impact and reasons for 
observed performance and addressing policy questions that extend 
beyond or across program borders.[Footnote 84] 

In addition to program evaluations that determine program impact or 
outcomes, we have identified cost-effectiveness analysis as a means to 
assess the cost of meeting a single goal or objective, which can be 
used to identify the least costly alternative for meeting that goal. 
In addition, cost-benefit analysis aims to identify all relevant costs 
and benefits, usually expressed in dollar terms.[Footnote 85] Given 
the challenges program managers we interviewed cited in developing and 
using outcome-based efficiency measures, such evaluations might fill 
gaps in understanding the cost of achieving outcomes and allow for 
cost comparisons across alternative program strategies intended to 
produce the same results. 
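
As a simple illustration of cost-effectiveness analysis, the sketch 
below (in Python, with hypothetical strategies and figures) compares 
alternative strategies aimed at the same outcome by cost per unit of 
outcome and identifies the least costly: 

# Cost-effectiveness comparison: cost per unit of outcome for several
# alternative strategies pursuing the same goal. Figures hypothetical.
alternatives = {
    # strategy: (total cost in dollars, outcome units achieved)
    "Strategy A": (2_400_000, 1_600),
    "Strategy B": (1_900_000, 1_150),
    "Strategy C": (3_100_000, 2_300),
}

cost_per_unit = {name: cost / units
                 for name, (cost, units) in alternatives.items()}
for name, cpu in sorted(cost_per_unit.items(), key=lambda kv: kv[1]):
    print(f"{name}: ${cpu:,.0f} per outcome unit")

best = min(cost_per_unit, key=cost_per_unit.get)
print(f"Least costly per unit of outcome: {best}")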

GPRA's focus on strategic planning, development of long-term goals, 
and accountability for results provides a framework that Congress, 
OMB, and executive branch agencies could use to promote and apply 
various approaches to achieving efficiency gains in federal agencies. 
Congress enacted GPRA in part to address waste and inefficiency in 
federal programs. Agencies could use strategic plans as a vehicle for 
identifying longer-term efficiency improvement goals and strategies 
for achieving them. They could use annual performance plans to 
describe performance goals designed to contribute to longer-term 
efficiency goals, and annual performance and accountability reports to 
monitor progress toward achieving annual or longer-term efficiency 
goals. 

GPRA could provide a framework that would balance efforts to improve 
efficiency with overall improvements in outcomes. GPRA was intended to 
provide a balanced picture of performance that focused on 
effectiveness as well as efficiency. Officials from some selected 
programs identified a risk that focusing on reducing costs to improve 
efficiency could potentially have negative effects on the quantity or 
quality of outputs or outcomes. For example, officials for the Smaller 
Learning Communities program at the Department of Education said their 
outcome-level efficiency measures, which tracked the cost per student 
demonstrating proficiency or advanced skills in math or reading, could 
result in unintended negative consequences, such as motivating 
grantees to cut costs by lowering teacher salaries, to lower 
proficiency standards so that more students would be classified as 
proficient, or to engage in "creaming" (focusing only on those 
students most likely to achieve gains). OMB's PART guidance included 
recognition that efforts to improve efficiency can involve risk to 
quality, outcomes, or other factors such as customer satisfaction. The 
PART guidance included as an example how reducing processing time to 
be more efficient could result in increased error rates. OMB 
recommended that programs assess risks associated with efficiency 
improvement efforts and develop risk management plans if needed. 
Similarly, in the United Kingdom's governmentwide efficiency program, 
departments could only report improvements in efficiency if they could 
also demonstrate that the quality of public services was not adversely 
affected by the reforms.[Footnote 86] Under GPRA, agencies' plans and 
performance measures are expected to strike difficult balances among 
competing demands, including program outcomes, cost, service quality, 
customer satisfaction, and other stakeholder concerns. Therefore, 
agencies could mitigate the risk to program outcomes and quality 
associated with taking a narrow cost-cutting approach by developing 
GPRA goals, strategies, and performance measures that clearly balance 
these competing demands. 

We have previously reported that OMB could use the provision of GPRA 
that calls for OMB to develop a governmentwide performance plan to 
address critical federal performance and management issues, including 
redundancy and other inefficiencies in how we do business. It could 
also provide a framework for any restructuring efforts.[Footnote 87] 
This provision has not been fully implemented, however. OMB issued the 
first and only such plan in February 1998 for fiscal year 1999. 

Further, as the focal point for overall management in the executive 
branch, OMB could provide guidance and management and reporting tools 
to increase federal agencies' focus on efficiency improvements. OMB's 
main vehicle for providing guidance on the development of agency 
strategic plans and performance plans and reports, OMB Circular A-11, 
Section 6 (Preparation and Submission of Strategic Plans, Annual 
Performance Plans, and Annual Program Performance Reports), makes no 
reference to establishing long-term goals for efficiency gains or 
describing strategies for how performance outcomes can be achieved 
more efficiently. References to efficiency in the guidance primarily 
pertain to the inclusion of program-level efficiency measures in 
agency budget justifications. 

OMB could also support mechanisms to share information and encourage 
agency efforts to improve efficiency. OMB has previously developed or 
contributed to mechanisms for sharing information and encouraging 
improvements to federal programs, such as Web sites to 
share information, highlight success, and identify best practices for 
initiatives.[Footnote 88] For example, www.results.gov had information 
on best practices related to PMA initiatives, and www.expectmore.gov 
provided information on PART assessments and improvement plans. OMB's 
own Web site contained information and examples of what it considered 
to be high-quality PART performance measures; discussion papers on 
measurement topics, such as how to effectively measure what a program 
is trying to prevent; and strategies to address some of the challenges 
of measuring the results of research and development programs. OMB 
recently launched a collaborative wiki page, which is intended to 
provide an online forum for federal managers to share lessons learned 
and leading practices for using performance information to drive 
decisionmaking.[Footnote 89] OMB has sponsored various management 
councils, such as the President's Management Council and the 
Performance Improvement Council, which include representatives of 
agencies and serve as forums for information sharing among agencies 
and with OMB. We have also reported that OMB has hosted standing 
working groups and committees composed of agency and OMB staff, and 
has hosted workshops to address important issues and identify and 
share best practices. For example, OMB helped form a subgroup among 
agency officials responsible for the PMA budget and performance 
integration initiative to share lessons learned and discuss strategies 
to address challenges of developing efficiency measures in the grant 
context. 

Conclusions: 

The prior Administration's approach to improving efficiency under PMA 
and PART focused on measuring and achieving efficiency gains at the 
program level. The approach involved requiring each program to develop 
at least one efficiency measure and demonstrate annual gains in 
efficiency, as well as to have regular procedures in place for 
achieving improvements in efficiency. Although most programs that 
received a PART assessment developed an efficiency measure, not all of 
these measures included both elements of a typical efficiency measure-
-an input as well as an output or outcome. The absence of these 
typical elements can result in measures that do not truly capture 
efficiency. Nevertheless, other forms of measures intended to improve 
efficiency, such as those focused on reducing costly error rates, 
could still provide useful information. 

Officials for some selected programs we reviewed indicated that the 
efficiency measures reported for PART were useful and described ways 
in which they used data for efficiency measures, such as to evaluate 
proposals from field units, lower the cost of a contract, or make 
decisions to shift production. Other officials we interviewed did not 
find the measures useful for decision making. Officials for all of the 
programs described challenges to developing and using efficiency 
measures that were similar to challenges we reported in prior work on 
PART and performance measures in general. For example, 
in one case the way OMB defined the program boundaries did not line up 
well with how managers ran the activities, which resulted in measures 
that were not useful for decision making. Some program officials 
indicated it was not always feasible to meet the requirement to 
demonstrate annual gains in efficiency, given that improvement could 
take multiple years to achieve. Some officials cited inconsistencies 
and limitations in the guidance and technical support from OMB on how 
to develop and use efficiency measures. 

OMB has not clarified whether programs should continue to collect and 
use efficiency measure data established for PART. Such clarification 
is necessary to help guide any needed refinements to the current 
process, as well as to address broader issues. While tracking 
efficiency at the 
program level can be useful, this approach can miss opportunities to 
seek efficiencies on a larger scale, such as efforts that cross 
traditional program and agency boundaries. The experiences of private 
and public sector entities in implementing strategic and crosscutting 
approaches to improving efficiency can provide insights for federal 
agencies. For example, process improvement and modernization of 
systems can be undertaken both within and across organizational 
boundaries to increase quality, reduce waste, and lower costs. 
Analyzing spending and procurement strategies to leverage buying power 
and improve performance can identify opportunities to reduce the cost 
of producing agency outputs and outcomes. Broader, governmentwide 
reviews and analysis of restructuring opportunities that involve a 
wider scope of government activity can be used to identify strategic, 
crosscutting approaches to improving efficiency that emphasize the 
need to maintain or improve other key dimensions of performance. Such 
approaches have the potential to yield significant gains in efficiency 
that would be difficult to achieve by individual programs working in 
isolation. 

The current Administration has begun to identify some important 
opportunities for crosscutting efficiencies in its proposed 
information technology initiatives and procurement reforms and has 
tasked agencies with establishing agency cost reduction goals and 
asked federal employees to submit their suggestions for cost savings. 
Efforts to improve efficiency can take multiple years to accomplish 
and can require changes in strategy and collaboration within and 
across organizational lines. Furthermore, efficiency can only be 
improved if other performance dimensions, such as the quality or 
quantity of agency outputs and outcomes, are maintained or improved as 
resources are reduced; or conversely, if quality and quantity of 
outputs and outcomes are improved with a given level of resources. The 
Administration has signaled its intent to make greater use of program 
evaluation to determine which programs are producing desired results. 
Program evaluations can also be used to determine the cost of 
achieving outcomes, an approach that could aid in identifying the most 
cost-effective program designs. 
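
To make this relationship concrete, efficiency can be expressed as a 
ratio of results to resources. The following formulation is an 
illustrative sketch only; it is not a formula prescribed by OMB 
guidance or GPRA: 

\[
\text{Efficiency} = \frac{\text{outputs or outcomes produced}}
{\text{inputs consumed (e.g., dollars or labor hours)}}
\]

Under this formulation, efficiency improves only when the ratio 
rises: when outputs or outcomes are maintained or increased while 
inputs fall, or when outputs or outcomes increase at a given level of 
inputs. 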

Continuing to build on the experiences and lessons learned from prior 
initiatives, with a concerted focus on specific levels of government--
governmentwide, agency, and program--could help to identify, 
introduce, and sustain additional efficiency gains on a more 
systematic and systemic basis at these same levels. The planning and 
reporting requirements of GPRA could serve as a framework for 
developing agency or across-agency strategies for improving efficiency 
and tracking results. By implementing the governmentwide performance 
plan provision of GPRA, OMB could provide further impetus to 
identifying efficiency goals to be achieved by consolidating 
operations or restructuring programs on a governmentwide basis. 
Further, OMB's A-11 guidance on preparing agency strategic and 
performance plans could place greater emphasis on improvements in 
efficiency. OMB has multiple management groups and information-sharing 
mechanisms, including a new wiki, which could be used to identify and 
share successful approaches to improving efficiency, whether applied 
at the program or other levels of government. 

Recommendations for Executive Action: 

We recommend that the Director of OMB take the following four actions: 

* Evolve toward a broader approach that emphasizes identifying and 
pursuing strategies and opportunities to improve efficiency at each of 
the governmentwide, agency, and program levels. 

- At the governmentwide level, OMB should look for additional 
opportunities to consolidate or restructure duplicative or inefficient 
operations that cut across agency lines. One vehicle for doing this is 
the GPRA-required governmentwide performance plan. 

- At the agency level, OMB should clarify its A-11 guidance to 
agencies on establishing efficiency goals and strategies in their 
agency-level GPRA strategic and performance plans, and reporting on 
the results achieved in performance reports. Guidance should stress 
the importance of looking for efficiencies across as well as within 
components and programs and maintaining or improving key dimensions of 
performance such as effectiveness, quality, or customer satisfaction, 
while also striving for efficiency gains. 

- At the program level, OMB should clarify whether agencies are to 
continue developing and using program-level efficiency measures. If 
so, OMB should provide enhanced guidance and technical support to 
agencies that addresses how to develop and use efficiency measures to 
improve efficiency and mitigate the challenges we identified. 

* Collect and disseminate information on strategies and lessons 
learned from successful efforts to improve efficiency by federal 
agencies, other governments, and the private sector. Possible vehicles 
for collection and dissemination of this information include good 
practices guides, workshops, Web sites, wikis, and management 
councils, such as the President's Management Council and the 
Performance Improvement Council. 

Agency Comments: 

We provided a draft of this report for review to OMB and the 
Departments of Agriculture, Education, the Interior, Labor, and 
Transportation. In oral comments, OMB representatives indicated that 
OMB concurred with our recommendations, adding that they thought the 
report would be useful as they revise their guidance to agencies on 
how to address efficiency improvements. OMB also provided technical 
comments, which we incorporated where appropriate. 

In its written comments (see appendix IV), Interior also concurred 
with our recommendations, but urged caution with regard to the 
recommendation that OMB provide additional guidance on the use of 
efficiency measures by agencies and programs. In particular, Interior 
cautioned against inviting standardized direction that would have 
agencies comparing efficiency across and within programs, considering 
the inherent differences in scope, complexity, and quality of outputs 
and outcomes. Interior indicated it seeks maximum flexibility for 
federal managers in using efficiency measures when they make sense and 
can be used to drive to the desired goals for the program. 

The Departments of Education and Labor provided technical comments, 
which we incorporated where appropriate. The Departments of 
Agriculture and Transportation did not provide comments. 

As agreed with your offices, unless you publicly announce the contents 
of this report earlier, we plan no further distribution of it until 30 
days from the date of this letter. At that time, we will send copies 
of this report to the appropriate congressional committees; the 
Secretaries of Agriculture, Education, the Interior, Labor, and 
Transportation; the Director of OMB; and other interested parties. The 
report will also be available at no charge on the GAO Web site at 
[hyperlink, http://www.gao.gov]. 

If you or your staff have any questions regarding this report, please 
contact me at (202) 512-6543 or steinhardtb@gao.gov. Contact points 
for our Offices of Congressional Relations and Public Affairs may be 
found on the last page of this report. GAO staff who made major 
contributions to this report are listed in appendix V. 

Signed by: 

Bernice Steinhardt: 
Director, Strategic Issues: 

[End of section] 

Appendix I: Objectives, Scope, and Methodology: 

The objectives of our review were to examine: (1) the types of 
efficiency measures reported through the Program Assessment Rating 
Tool (PART) for agency programs overall, and particularly for selected 
programs in five selected agencies, focusing on the extent to which 
they included typical elements of an efficiency measure; (2) for 
selected programs, the extent to which programs reporting efficiency 
measures through PART have shown efficiency gains and how programs 
have used efficiency measures for decision making; (3) for selected 
programs, the types of challenges to developing and using efficiency 
measures they have faced; and (4) other strategies that can be used to 
improve efficiency. 

To address these objectives, we selected five departments from those 
on which we had reported in 2007 concerning implementation of a 
managerial cost accounting (MCA) system.[Footnote 90] Because we 
wanted to include agencies with variety in the types of cost data 
available, we selected some departments that had--and some that had 
not--developed an MCA system. The Departments of the Interior, Labor, 
and Transportation were selected because these were the only 
departments out of the 10 agencies we reviewed at the time that had 
implemented--or had made significant progress in implementing--MCA 
departmentwide.[Footnote 91] To compare and contrast findings from 
these departments, we selected two other departments that had not 
implemented an MCA system. The United States Department of Agriculture 
was selected because the department indicated in our 2007 report that 
it planned to implement an MCA system the next time it upgraded its 
financial management system. The Department of Education was selected 
because it indicated it had no plans to implement an entitywide MCA 
system. 

After choosing the departments, we selected 21 programs to review from 
the set of all programs that had received a PART assessment by the 
Office of Management and Budget (OMB).[Footnote 92] PART was developed 
to assess and improve program performance so that the federal 
government could achieve better results. According to OMB, a PART 
review helped identify a program's strengths and weaknesses to inform 
funding and management decisions aimed at making the program more 
effective. A PART review included program-level performance 
information and efficiency measures for the programs.[Footnote 93] The 
PART data we received from OMB contained 1,396 efficiency measures 
which were associated with 937 programs that received a PART 
assessment. Within the five departments, we selected the 21 specific 
programs for review to represent a diverse array of functions and 
operations within the federal government, as indicated by the PART 
program type.[Footnote 94] Of the seven PART program types, we 
selected five for inclusion in this study, excluding research and 
development and credit.[Footnote 95] Additional selection criteria 
were relatively large fiscal year 2008 funding levels and variety in 
the number of efficiency measures associated with the programs. 

For the first objective regarding the extent to which efficiency 
measures included typical elements, we first identified the 
elements and developed a definition by conducting a literature review 
as well as expert interviews. We then performed various degrees of 
analysis on (1) all efficiency measures for all programs represented 
in the PART database, (2) all of the measures for our selected 
programs, and (3) a random sample of 100 efficiency measures taken 
from the PART database. The following describes the analysis we 
conducted on each of these three populations: 

* Analysis on the complete PART database: The analysis we conducted on 
all PART efficiency measures resulted in a set of summary statistics, 
such as the fiscal year 2008 total funding by PART program type, the 
mean amount of funding each program received within the program types, 
the number of programs for each PART program type, the number of 
programs that had between zero and eight efficiency measures, and the 
number of programs in each selected department by PART program type. 

* Analysis of PART measures selected with certainty from 21 programs 
in five departments: For the 21 programs we selected, we conducted a 
more detailed analysis on the 36 associated efficiency measures. 
[Footnote 96] However, any findings based on this analysis cannot be 
generalized beyond these particular measures. We performed a content 
analysis of these measures based upon the PART efficiency measure 
data; our review of applicable documents concerning the measures and 
programs, such as the programs' PART assessments; and interviews with 
program officials to discuss the measures and programs. For each of 
these measures, we identified whether certain attributes were 
present; the documents we reviewed and the interviews we conducted 
aided this effort. The fields from the PART 
database we used to assess each efficiency measure were the agency and 
program name, the text for each efficiency measure and, when present, 
the more detailed efficiency measure explanation. Using this 
information, we determined whether each of the measures included the 
program's inputs (such as cost or hours worked by employees) as well 
as its outputs or outcomes. When we identified a measure as having an 
output or outcome element, we distinguished between the two. We also 
analyzed whether there was either a time or cost attribute to each 
measure. For each of these attributes, the potential answers were 
"Yes," "No," or "Unclear."[Footnote 97] To determine whether an 
efficiency measure had these attributes, we defined each term for this 
particular exercise. We defined an input as a resource, such as cost 
or employee time, used to produce outputs or outcomes. We defined 
outputs as the amount of products and services delivered by a program. 
We defined outcomes as the desired results of a program, such as 
events, occurrences or changes in conditions, behaviors or attitudes. 
We defined a measure to have an attribute of time or cost when the 
measure appeared to include some type of attribute of time (e.g., 
"hours worked by employees," "per month," "annually," or "within three 
months,") or cost, respectively. We conducted our coding by having 
three team members independently code each of the 36 efficiency 
measures without each knowing how the other two coders assessed each 
measure. Afterward, the three coders discussed and reconciled any 
differences and reached agreement in all instances. Finally, we 
determined whether the cost element was based on budgetary information 
or MCA information. 

* Analysis of a random sample from the PART database: This analysis 
involved selecting a random sample of 100 efficiency measures from the 
remaining 1,355 efficiency measures in the PART database.[Footnote 98] 
Estimates based on the sample can be generalized to estimate 
characteristics of the remaining population of 1,355 efficiency 
measures. Because we followed a probability procedure based on random 
selections, our sample is only one of a large number of samples that 
we might have drawn. Since each sample could have provided different 
estimates, we express our confidence in the precision of our 
particular sample's results as a 95 percent confidence interval (e.g., 
plus or minus 10 percentage points). This is the interval that would 
contain the actual population value for 95 percent of the samples we 
could have drawn. As a result, we are 95 percent confident that each 
of the confidence intervals in this report will include the true 
values in the study population. Unless otherwise noted, all percentage 
estimates have 95 percent confidence intervals of within plus or minus 
10 percentage points of the estimate itself. The analysis we 
conducted on these measures was similar to the analysis for the 
selected programs: we determined whether each measure had an input, 
an output or outcome, and a time or cost attribute, using the same 
definitions and coding procedures. However, because we did not 
have in-depth information from interviews or program documents 
concerning these measures, in some cases we were unable to conclude 
whether certain efficiency measures included necessary elements and, 
consequently, classified about a quarter of the sample as unclear. 
Also, because of the lack of detailed information on the measures, we 
could not distinguish between outputs and outcomes expressed for these 
measures. 
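
As a rough illustration of the precision described above, a 
conventional large-sample 95 percent confidence interval for a sample 
proportion from a simple random sample of size n is given by the 
textbook formula below; the report does not specify the exact 
estimator used, so this is an assumed approximation: 

\[
\hat{p} \pm 1.96 \sqrt{\frac{\hat{p}(1-\hat{p})}{n}}
\]

With n = 100 and the worst-case proportion of 0.5, the half-width is 
1.96 times the square root of 0.25/100, or approximately 0.098 (about 
10 percentage points), consistent with the precision stated above. 
Because the 100 measures were drawn from a population of 1,355, a 
finite population correction would narrow this interval slightly. 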

Simultaneously with the content analysis of the efficiency measures, 
and to address the second and third objectives--how selected agencies 
and programs used efficiency measures and the extent to which they 
reported efficiency gains, and what challenges or constraints they 
faced in developing and using efficiency measures--we reviewed 
program Web sites, PART assessments, and other documents provided by 
program officials, and we interviewed program officials identified by 
the departments as knowledgeable about the particular program and its 
efficiency measure(s). These interviews consisted of asking agency 
officials a similar set of questions on topics such as how the 
efficiency measure(s) was developed and used, associated challenges, 
and alternative methods for evaluating efficiency. For the two programs 
that did not have any efficiency measures in PART, we asked questions 
such as whether they had other efficiency-related measures they 
tracked internally which were unrelated to PART, whether there had 
been prior attempts to develop an efficiency measure, and whether they 
had experienced specific challenges to developing and using efficiency 
measures. In addition to interviewing program officials, we 
interviewed at least one official in each of the five departments who 
was responsible for performance measurement at the departmentwide 
level. These interviews followed a similar set of questions specific 
to departmentwide performance measurement issues, such as 
whether the department had its own guidelines or guidance pertaining 
to developing and using efficiency measures, how results for program-
level efficiency measures were reported within the agency, and how 
program efficiency measures were used. Also at the department level, 
we interviewed officials associated with each of the five departments' 
Chief Financial Officer (CFO) offices, asking questions about the role 
the CFO's office played, if any, in developing efficiency measures for 
programs and inquiring about the development and use of a managerial 
cost accounting system. In addition to interviewing department and 
program officials, we interviewed OMB officials on several occasions 
about the approach to efficiency under PART and discussed, among other 
topics, the training and guidance OMB provided, and any lessons 
learned from the agencies' efforts to develop and use efficiency 
measures. OMB also provided us with documents detailing the history of 
the PART program. 

Finally, to determine whether a selected program's efficiency measure 
indicated a gain or loss, we reviewed the efficiency measure data that 
were reported in the program's PART assessment and subtracted the 
initial year's value from the latest available year's value. To verify the 
accuracy of the data, we asked program officials to confirm the data 
and when necessary, to provide us with the most recent data. 
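
The arithmetic of this comparison is simple, but interpreting the 
difference as a gain or loss depends on the measure's direction of 
improvement (for example, cost per participant should fall, while 
inspections per officer should rise). The sketch below illustrates 
this logic; the function, example values, and higher_is_better flag 
are hypothetical and are not drawn from the report: 

# Illustrative sketch of the gain/loss determination described above.
# Values and the higher_is_better flag are hypothetical examples.

def efficiency_change(initial, latest, higher_is_better):
    """Classify latest minus initial as a gain, loss, or no change."""
    change = latest - initial
    if change == 0:
        return "no change"
    improved = (change > 0) == higher_is_better
    return "gain" if improved else "loss"

# A cost-per-participant measure improves when it falls.
print(efficiency_change(1200.0, 1100.0, higher_is_better=False))  # gain

# An inspections-per-officer measure improves when it rises.
print(efficiency_change(30.0, 28.0, higher_is_better=True))  # loss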

To address the fourth objective regarding the approaches agencies can 
employ to improve efficiency, we interviewed program officials for the 
selected programs to learn about the approaches they use to evaluate 
efficiency and also conducted a two-stage literature review to 
determine alternative approaches. The first stage of the literature 
review consisted of examining GAO publications, Congressional Research 
Service reports, the Internet, and various databases for general 
information on strategic approaches to efficiency. We also 
participated in a business process management research report with the 
American Productivity and Quality Center (APQC),[Footnote 99] studying 
how organizations maintain quality across processes and products as 
well as meet customer requirements in the face of pressure to cut 
costs. Using information derived from the first literature review and 
the APQC report, we identified the broad set of approaches to 
improving efficiency. In our literature search, we looked for 
examples and ideas that used a broad array of strategies to seek 
improvements or affect efficiency, drawing from prior reports we have 
published and from the work of other institutions on the subject. For 
this 
objective, we refer to 18 different pieces of literature from our 
comprehensive literature search. In conducting the literature review, 
we did not attempt to identify all potential alternative approaches 
that could lead to efficiency improvements but focused on approaches 
that appeared consistent with the broad definition of efficiency 
improvement that was used in this report. Furthermore, in addition to 
the interviews with program officials and the literature review, we 
interviewed experts on performance and efficiency measures, who 
discussed definitions, uses, and insights concerning efficiency measures. 
Among the experts, we interviewed officials in the United Kingdom's 
National Audit Office, which assessed the reliability of the 
efficiency gains reported by United Kingdom agencies as part of the 
United Kingdom's 2004 governmentwide efficiency review. We also 
interviewed officials with the Office of the Auditor General of 
Canada, which is conducting a study on ways to improve the efficiency 
of that country's tax administration system. 

[End of section] 

Appendix II: Departments, Selected Program Assessment Rating Tool 
(PART) Program Names, and Summary of Programs: 

Department and PART program name: Department of Agriculture: Forest 
Service: Watershed; 
Program summary: Restore, enhance, and maintain watershed conditions 
including soil, water, air, and forest and rangeland vegetation within 
the national forests and grasslands. Management of these physical and 
biological resources provides a foundation for healthy, viable 
ecosystems. 

Department and PART program name: Department of Agriculture: National 
School Lunch Program; 
Program summary: Provides nutritionally balanced, low-cost or free 
lunches for public and nonprofit private schools. The program seeks to 
safeguard the health and well-being of the nation's children and 
support domestic agricultural production. 

Department and PART program name: Department of Agriculture: Plant 
and Animal Health Monitoring Programs; 
Program summary: Assists in protecting plant and animal resources from 
pests and diseases through ongoing monitoring and surveillance. 
Provides rapid detection, analysis, and reporting of pests and 
diseases to minimize potential losses. 

Department and PART program name: Department of Education: 21st 
Century Community Learning Centers; 
Program summary: Awards formula grants to state education agencies 
which, in turn, manage statewide competitions and award subgrants to 
local education agencies and community-based organizations. These 
grants support the creation of community learning centers that provide 
academic enrichment opportunities during nonschool hours for children, 
particularly students who attend high-poverty and low-performing 
schools. This program focuses on enrichment in core academic subjects, 
extracurricular enrichment, as well as literacy and other educational 
services to the families of participating children. 

Department and PART program name: Department of Education: Smaller 
Learning Communities; 
Program summary: Provides competitive grants to local education 
agencies to increase academic achievement in large high schools 
through the creation of smaller, more personalized learning 
environments. 

Department and PART program name: Department of Education: Student Aid 
Administration; 
Program summary: Provides financial assistance to postsecondary 
students and their families through administering federal student aid 
grants and loans. 

Department and PART program name: Department of the Interior: Bureau 
of Reclamation Water Management--Operation and Maintenance; 
Program summary: Ensures the operation and maintenance of reclamation 
facilities, delivers water to irrigators and municipal users, and 
provides storage to help mitigate flooding. The program also addresses 
issues such as water conservation, runoff from irrigated fields, and 
project financial management. 

Department and PART program name: Department of the Interior: Wildland 
Fire Management; 
Program summary: Manages and extinguishes fires on Department of the 
Interior lands and on other lands under fire protection agreements. 
The three largest program activities are fire preparedness, fire 
suppression, and hazardous fuels reduction (i.e., removal of small 
trees and brush that exacerbate fire risks). 

Department and PART program name: Department of the Interior: Fish and 
Wildlife Service--Endangered Species; 
Program summary: Protects threatened or endangered species and 
conserves their habitats. Lists species needing protection, consults 
on federal projects, awards grants, and works with partners on 
recovery actions. 

Department and PART program name: Department of the Interior: Fish and 
Wildlife Service--Fisheries; 
Program summary: Works to conserve and restore native aquatic species 
populations and their habitat and support recreational fishing. 

Department and PART program name: Department of the Interior: Office 
of Surface Mining--State Managed Abandoned Coal Mine Land Reclamation; 
Program summary: Reclaims and restores land and water degraded by coal 
mining activities conducted before 1977. Reclamation fees on current 
coal production fund the program, which has expanded to provide 
oversight of the 23 states and three Indian Tribes that carry out 
the program. 

Department and PART program name: Department of Labor: Energy 
Employees Occupational Illness Compensation Program; 
Program summary: Serves those who have contracted illness due to 
exposure to toxic substances or radiation while working at nuclear 
weapons and related covered facilities. Provides lump-sum compensation 
and health benefits to eligible Department of Energy nuclear weapons 
workers, or the survivors of such workers. 

Department and PART program name: Department of Labor: Job Corps; 
Program summary: Provides intensive education and training services to 
disadvantaged youth ages 16-24. These services are intended to help 
eligible youth obtain jobs, seek further education, or enter the 
military. The program serves approximately 60,000 youth nationwide 
through 122 centers, most of which are residential. 

Department and PART program name: Department of Labor: Occupational 
Safety and Health Administration; 
Program summary: Works to ensure, for every working person in the 
nation, safe and healthful working conditions. Implements the 
Occupational Safety and Health Act of 1970 by setting and enforcing 
standards and by providing outreach and education, cooperative 
programs, and compliance assistance. 

Department and PART program name: Department of Labor: Unemployment 
Insurance Administration State Grants; 
Program summary: Assists states in operating their unemployment 
insurance programs, which provide temporary income support to 
unemployed workers. States determine eligibility for benefits, which 
are financed through state-levied taxes. The Department of Labor funds 
the administrative expenses of these state programs. 

Department and PART program name: Department of Labor: Workforce 
Investment Act--Migrant and Seasonal Farmworkers; 
Program summary: Provides competitive grants to fund training, 
employment, and other services to help economically disadvantaged 
farmworkers and their families. Through these services, the program 
seeks to help them achieve economic self-sufficiency by strengthening 
their ability to gain stable employment. 

Department and PART program name: Department of Transportation: 
Federal Aviation Administration (FAA) Air Traffic Organization--
Terminal Programs; 
Program summary: Provides air traffic control services to guide 
aircraft in and out of airports across the country. 

Department and PART program name: Department of Transportation: FAA 
Air Traffic Organization--Technical Operations; 
Program summary: Maintains and modernizes equipment needed in the 
national airspace system to deliver air traffic services. It fields, 
repairs, and maintains a network of complex equipment, including 
radars, instrument landing systems, radio beacons, runway lighting, 
and computer systems. 

Department and PART program name: Department of Transportation: 
Federal Transit Administration New Starts; 
Program summary: Provides financial support for locally planned and 
operated public transit through competitive, discretionary capital 
investment grants for transit projects, including commuter rail, 
light rail, heavy rail, bus rapid transit, trolleys, and ferries. 

Department and PART program name: Department of Transportation: 
Highway Infrastructure; 
Program summary: Provides financial grants and technical assistance to 
states to construct, maintain, and improve the performance of the 
nation's highway system in accordance with federal policy goals. 

Department and PART program name: Department of Transportation: 
National Highway Traffic Safety Administration--Operations and 
Research; 
Program summary: Advances highway safety through research and 
regulations concerning vehicle technologies and human behavior. 
Focuses on researching vehicle and behavioral safety countermeasures, 
issuing vehicle safety regulations, and investigating vehicle defects. 

Source: GAO analysis of selected PART assessments. 

[End of table] 

[End of section] 

Appendix III: Department, PART Program Name and Number of Efficiency 
Measures, Fiscal Year 2009 Funding Level, PART Program Type, and 
Efficiency Measure(s) for Selected Programs: 

Department: Agriculture; 
PART program name and number of efficiency measures: Forest Service: 
Watershed (0); 
Fiscal year 2009 funding level (dollars in millions): $812; 
PART program type: Direct federal; 
Efficiency measure: None. 

Department: Agriculture; 
PART program name and number of efficiency measures: National School 
Lunch Program (NSLP) (3); 
Fiscal year 2009 funding level (dollars in millions): $8,517; 
PART program type: Block/formula grant; 
Efficiency measure: Dollars lost to error in the National School Lunch 
Program.
Efficiency measure: Rate of verified applications not supported by 
adequate income documentation.
Efficiency measure: Rate of administrative error in NSLP eligibility 
determination. 

Department: Agriculture; 
PART program name and number of efficiency measures: Plant and Animal 
Health Monitoring Programs (2); 
Fiscal year 2009 funding level (dollars in millions): $330; 
PART program type: Regulatory; 
Efficiency measure: Value of damage prevented or mitigated by the 
monitoring and surveillance programs per dollar spent.
Efficiency measure: Improved efficiency through the use of targeted 
samplings versus the use of random sampling. 

Department: Education; 
PART program name and number of efficiency measures: 21st Century 
Community Learning Centers (3); 
Fiscal year 2009 funding level (dollars in millions): $1,081; 
PART program type: Block/formula grant; 
Efficiency measure: The average number of days it takes the department 
to submit the final monitoring report to a State Education Agency 
(SEA) after the conclusion of a site visit.
Efficiency measure: The average number of weeks a state takes to 
resolve compliance findings in a monitoring visit report.
Efficiency measure: The percentage of SEAs that submit complete data 
on 21st century program performance measures by the deadline. 

Department: Education; 
PART program name and number of efficiency measures: Smaller Learning 
Communities (6); 
Fiscal year 2009 funding level (dollars in millions): $80; 
PART program type: Competitive grant; 
Efficiency measure: Fiscal year 2003 cohort: Cost (in dollars) per 
student demonstrating proficiency or advanced skills in reading.
Efficiency measure: Fiscal year 2003 cohort: Cost (in dollars) per 
student demonstrating proficiency or advanced skills in mathematics.
Efficiency measure: Fiscal year 2004 cohort: Cost (in dollars) per 
student demonstrating proficiency or advanced skills in reading.
Efficiency measure: Fiscal year 2004 cohort: Cost (in dollars) per 
student demonstrating proficiency or advanced skills in mathematics.
Efficiency measure: Fiscal year 2005 cohort: Cost (in dollars) per 
student demonstrating proficiency or advanced skills in reading.
Efficiency measure: Fiscal year 2005 cohort: Cost (in dollars) per 
student demonstrating proficiency or advanced skills in mathematics. 

Department: Education; 
PART program name and number of efficiency measures: Student Aid 
Administration (1); 
Fiscal year 2009 funding level (dollars in millions): $753; 
PART program type: Capital and service acquisition; 
Efficiency measure: Direct administrative unit costs for origination 
and disbursement of student aid. 

Department: Interior; 
PART program name and number of efficiency measures: Bureau of 
Reclamation Water Management--Operation and Maintenance (1); 
Fiscal year 2009 funding level (dollars in millions): $308; 
PART program type: Capital and service acquisition; 
Efficiency measure: Average time to correct/mitigate higher priority 
operations and maintenance deficiencies of reserved works facilities. 

Department: Interior; 
PART program name and number of efficiency measures: Fish and Wildlife 
Service--Endangered Species (0); 
Fiscal year 2009 funding level (dollars in millions): $277; 
PART program type: Regulatory; 
Efficiency measure: None. 

Department: Interior; 
PART program name and number of efficiency measures: Fish and Wildlife 
Service--Fisheries (1); 
Fiscal year 2009 funding level (dollars in millions): $126; 
PART program type: Competitive grant; 
Efficiency measure: Pounds/dollar of healthy rainbow trout produced 
for recreation. 

Department: Interior; 
PART program name and number of efficiency measures: Office of Surface 
Mining--State Managed Abandoned Coal Mine Land Reclamation (2); 
Fiscal year 2009 funding level (dollars in millions): $477; 
PART program type: Block/formula grant; 
Efficiency measure: Percentage of declared emergencies abated within 6 
months.
Efficiency measure: Provide appropriate grant funding within 60 days 
of a complete grant application. 

Department: Interior; 
PART program name and number of efficiency measures: Wildland Fire 
Management (3); 
Fiscal year 2009 funding level (dollars in millions): $859; 
PART program type: Direct federal; 
Efficiency measure: Number of acres treated in the wildland-urban 
interface per million dollars of gross investment.
Efficiency measure: Number of acres treated outside the wildland-urban 
interface per million dollars gross investment.
Efficiency measure: Number of acres in fire regimes 1, 2, or 3 moved 
to a better condition class per million dollars of gross investment. 

Department: Labor; 
PART program name and number of efficiency measures: Energy Employees 
Occupational Illness Compensation Program (1); 
Fiscal year 2009 funding level (dollars in millions): $1,161; 
PART program type: Direct federal; 
Efficiency measure: Average number of decisions per full-time 
equivalent. 

Department: Labor; 
PART program name and number of efficiency measures: Job Corps (1); 
Fiscal year 2009 funding level (dollars in millions): $1,611; 
PART program type: Capital and service acquisition; 
Efficiency measure: Cost per participant. 

Department: Labor; 
PART program name and number of efficiency measures: Occupational 
Safety and Health Administration (1); 
Fiscal year 2009 funding level (dollars in millions): $503; 
PART program type: Regulatory; 
Efficiency measure: Inspections per Compliance Safety and Health 
Officer. 

Department: Labor; 
PART program name and number of efficiency measures: Unemployment 
Insurance Administration State Grants (1); 
Fiscal year 2009 funding level (dollars in millions): $3,498; 
PART program type: Block/formula grant; 
Efficiency measure: Number of timely and accurate initial benefit 
payments claims per $1,000 of inflation-adjusted base grant funds. 

Department: Labor; 
PART program name and number of efficiency measures: Workforce 
Investment Act--Migrant and Seasonal Farmworkers (1); 
Fiscal year 2009 funding level (dollars in millions): $83; 
PART program type: Competitive grant; 
Efficiency measure: Cost per participant. 

Department: Transportation; 
PART program name and number of efficiency measures: Federal Aviation 
Administration (FAA) Air Traffic Organization--Technical Operations 
(2); 
Fiscal year 2009 funding level (dollars in millions): $2,650; 
PART program type: Direct federal; 
Efficiency measure: ATO-Technical Operations staffing ratio.
Efficiency measure: Unit cost for providing ATO-Technical Operations 
services. 

Department: Transportation; 
PART program name and number of efficiency measures: FAA Air Traffic 
Organization--Terminal Programs (2)[A]; 
Fiscal year 2009 funding level (dollars in millions): $2,199; 
PART program type: Direct federal; 
Efficiency measure: Unit cost for providing terminal services.
Efficiency measure: Productivity rate at service delivery points. 

Department: Transportation; 
PART program name and number of efficiency measures: Federal Transit 
Administration New Starts (1); 
Fiscal year 2009 funding level (dollars in millions): $1,569; 
PART program type: Competitive grant; 
Efficiency measure: Percent of projects under full funding grant 
agreements that have current total cost estimates that do not exceed 
baseline cost by more than 5 percent. 

Department: Transportation; 
PART program name and number of efficiency measures: Highway 
Infrastructure (3); 
Fiscal year 2009 funding level (dollars in millions): $41,325; 
PART program type: Block/formula grant; 
Efficiency measure: Percent of major federally funded transportation 
infrastructure projects with less than 2 percent annual growth in the 
project completion milestone.
Efficiency measure: Median time to complete an Environmental Impact 
Statement.
Efficiency measure: Percent of major federally funded transportation 
infrastructure projects with less than 2 percent annual growth in cost 
estimates. 

Department: Transportation; 
PART program name and number of efficiency measures: National Highway 
Traffic Safety Administration--Operations and Research (1); 
Fiscal year 2009 funding level (dollars in millions): $232; 
PART program type: Regulatory; 
Efficiency measure: Average costs incurred to complete a defect 
investigation. 

Source: GAO analysis of OMB's Program Assessment Rating Tool. 

[A] During the course of our review, FAA Air Traffic Organization--
Terminal Programs changed the status of one of its PART measures (ATO 
Terminal staffing ratio) from an "output" measure to an "efficiency" 
measure. Therefore, we did not include this measure in our review. 

[End of table] 

[End of section] 

Appendix IV: Comments from the Department of the Interior: 

United States Department of the Interior: 
Office Of The Secretary: 
Washington, DC 20240: 

April 20, 2010: 

Ms. Bernice Steinhardt: 
Director, Strategic Issues: 
U.S. Government Accountability Office: 
441 G Street, N.W. 
Washington, D.C. 20548: 

Dear Ms. Steinhardt: 

Thank you for providing the Department of the Interior the opportunity 
to review and comment on the draft Government Accountability Office 
Report entitled, Streamlining Government: Opportunities Exist to 
Strengthen OMB's Approach to Improving Efficiency (GAO-10-394). 

The Department does concur with many aspects of the recommendations 
made in the report. First, we agree that OMB is in a position to 
determine government-wide opportunities for efficiency that involve 
and could benefit multiple agencies. Using input from agencies on what 
has worked and how that can be applied across the Federal community 
would be value added. The councils and work groups that OMB hosts are 
very useful in that regard and we are finding the PIC to be a source 
of good guidance and best practices. 

With regard to additional guidance and direction on the use of 
efficiency measures in agencies and programs, we would urge caution. 
GAO observed that efficiency measurements should be balanced with 
considerations of quality, outcomes, and other factors such as 
customer satisfaction. Rather than ask for additional guidance on 
efficiency measures that may not be sufficiently focused on outcomes, 
we believe that OMB's initiative to strengthen and infuse a program 
evaluation capability in Federal agencies is just what is needed. We 
caution inviting standardized direction that would have us comparing 
efficiency across and within programs, considering the inherent 
differences in scope, complexity, and quality of outputs and outcomes. 
Rather we seek maximum flexibility for Federal managers in using 
efficiency measures when they make sense and can be used to drive to 
the desired goals for the program. As demonstrated by your study, not 
all programs have the same capacity for improved efficiency and some 
are more challenging to evaluate. 

We appreciate having the opportunity to comment. If you have questions 
or need additional information, please contact Dr. Richard Beck, 
Director, Office of Planning and Performance Management, at (202) 208-
1818. 

Sincerely, 

Signed by: 

Rhea Suh: 
Assistant Secretary: 
Policy, Management and Budget: 

[End of section] 

Appendix V: GAO Contact and Staff Acknowledgments: 

GAO Contact: 

Bernice Steinhardt, (202) 512-6543 or steinhardtb@gao.gov: 

Staff Acknowledgments: 

In addition to the individual named above, Elizabeth Curda, Assistant 
Director; Charlesetta Bailey; James Cook; Anne Inserra; Eric Knudson; 
Ricardo Sanchez; and Jeremy Williams made key contributions to the 
report. Cynthia Grant; Peter Grinnell; Carol Henn; Donna Miller; A.J. 
Stephens; Jay Smale; Jessica Thomsen; and John Warner also provided 
significant assistance. 

[End of section] 

Footnotes: 

[1] OMB, Memorandum for the Heads of Departments and Agencies (M-09-
20) on Planning for the President's Fiscal Year 2011 Budget and 
Performance Plans (Washington, D.C.: June 11, 2009). 

[2] According to OMB, federal employees submitted over 38,000 ideas to 
the President's SAVE (Securing Americans Value and Efficiency) Award 
contest, which was launched in September 2009. The winner's idea is 
supposed to be included in the 2011 budget, and the employee who 
submitted it will be invited to meet the President. 

[3] Executive Order 13538, Establishing the President's Management 
Advisory Board, 75 Fed. Reg. 20,895 (April 19, 2010). 

[4] Pub. L. No. 103-62, 107 Stat. 285 (Aug. 3, 1993). 

[5] In addition to GPRA, executive agencies are subject to other 
general requirements related to efficiency. For example, agencies are 
required to implement and maintain systems of internal controls which 
are, in part, to assure effective and efficient operations. 31 U.S.C. 
§ 3512(c); GAO, Standards for Internal Control in the Federal 
Government, [hyperlink, http://www.gao.gov/products/AIMD-00-21.3.1] 
(Washington, D.C.: November 1999). In addition, federal agencies must 
develop and maintain accounting and financial management systems that, 
consistent with OMB policies, provide for the systematic measurement 
of agency performance, among other things. 31 U.S.C. §§ 503(b), 
902(a)(3)(D)(iv). 

[6] The PMA, which was first announced in 2001, consisted of five 
governmentwide management priorities, including budget and performance 
integration, strategic management of human capital, expanded 
electronic government, improved financial performance, and competitive 
sourcing. 

[7] OMB described PART, which was created in 2002, as a diagnostic 
tool meant to provide a consistent approach to evaluating federal 
programs as part of the executive budget formulation process. 

[8] See, for example, Harry P. Hatry, Performance Measurement: Getting 
Results, Second Edition (Baltimore, MD: The Urban Institute Press, 
2007). 

[9] See, for example, GAO, Tax Administration: IRS Can Improve Its 
Productivity Measures by Using Alternative Methods, [hyperlink, 
http://www.gao.gov/products/GAO-05-671] (Washington, D.C.: July 11, 
2005). 

[10] In addition to these 36 efficiency measures, there were a total 
of five additional efficiency measures included in the PART data we 
received from OMB for three of our selected programs. However, 
officials from each of these programs told us these five efficiency 
measures were no longer associated with PART, so we excluded them from 
our analysis. Further, one of the selected programs from the 
Department of Transportation, the Federal Aviation Administration 
(FAA) Air Traffic Organization (ATO)--Terminal Programs, changed the 
status of one of its PART measures (ATO Terminal Staffing Ratio) from 
an "output" measure to an "efficiency" measure. We did not include 
this measure in our review of efficiency measures for the selected 
programs. 

[11] Percentage estimates based on this sample have 95 percent 
confidence intervals of within +/-10 percentage points of the estimate 
itself, unless otherwise noted. See Appendix I for more information on 
sampling methodology. 

[12] GAO, Managerial Cost Accounting Practices: Implementation and Use 
Vary Widely across 10 Federal Agencies, [hyperlink, 
http://www.gao.gov/products/GAO-07-679] (Washington, D.C.: July 20, 
2007). 

[13] PART classified programs as one of seven types: direct federal, 
competitive grant, block/formula grant, research and development, 
capital assets and service acquisition, credit, and regulatory. We excluded 
research and development programs from our sample of selected programs 
based on the findings of a 2008 study by The National Academies which 
raised questions about the feasibility of developing valid outcome- 
based efficiency measures for federal research programs (Evaluating 
Research Efficiency in the U.S. Environmental Protection Agency, 
Committee on Evaluating the Efficiency of Research and Development 
Programs at the U.S. Environmental Protection Agency, The National 
Academies). We excluded credit programs from our sample of selected 
programs because of the relatively small number of these programs in 
the selected departments. 

[14] Fiscal year 2009 funding for the selected programs ranged from 
approximately $80 million to over $41 billion. 

[15] Work on the engagement was originally started in October 2006, 
but subsequently suspended before resuming in September 2008. 

[16] GPRA, §§ 2(a)(1), 2(b)(5). 

[17] GAO, Results-Oriented Government: GPRA Has Established a Solid 
Foundation for Achieving Greater Results, [hyperlink, 
http://www.gao.gov/products/GAO-04-38] (Washington, D.C.: Mar. 10, 
2004). 

[18] OMB's PMA standards included references to additional approaches 
to improving efficiency, such as competitive sourcing and business 
process reengineering for commercial services management, developing 
business cases for major systems investments, and using earned value 
management to plan, execute, and manage major information technology 
(IT) investments. 

[19] Additional references to efficiency in OMB's PART assessment tool 
included language in the section on program purpose and design, which 
asked if the program design was free of flaws that would limit 
efficiency, with a requirement for "there … to be no strong evidence" 
that another approach or mechanism would be more efficient. For 
capital assets and service acquisition programs, PART questions (in 
the strategic planning section) included assessing whether credible 
analysis of alternatives had been conducted, to determine whether the 
agency was investing in something that provided the best value to the 
government. For regulatory programs, there was a specific question in 
the program results section asking whether the goals were achieved at 
the least incremental societal cost and whether the program maximized 
net benefits, to determine whether the program met its goals in the 
most efficient way possible. 

[20] Two of our selected programs--U.S. Department of Agriculture's 
Forest Service Watershed and Department of the Interior's Endangered 
Species--did not have any efficiency measures in PART, but officials 
from both of these programs told us they had proposed efficiency 
measures to OMB that had been rejected, and that they were developing 
new efficiency measures and had been in consultation with OMB seeking 
approval. 

[21] This sample enables us to generalize our analysis to the 
remaining efficiency measures for PART. These percentage estimates 
have 95 percent confidence intervals of within +/-10 percentage points 
of the estimate itself. Appendix I contains additional information on 
the sampling methodology. 

[22] We characterized a measure as "unclear" when it was ambiguous as 
to whether or not both elements (input plus output or outcome) were 
present, based on our analysis of how the measure was written and the 
accompanying explanation. 

[23] We did not consider "average time" as expressed in this measure 
to be an input because it tracked the number of calendar years that 
have passed, not the amount of work hours needed to correct/mitigate 
higher priority operations and maintenance deficiencies (which are 
outputs). 

[24] Full-time equivalent employment is the basic measure of levels of 
employment used in the budget. It is the total number of hours worked 
divided by the total number of compensable hours in a fiscal year. For 
example, in fiscal year 2009 an FTE represented 2,088 hours (8 hours 
per day for 261 days). 

[25] GAO, Performance Budgeting: Efforts to Restructure Budgets to 
Better Align Resources with Performance, [hyperlink, 
http://www.gao.gov/products/GAO-05-117SP] (Washington, D.C.: February 
2005). 

[26] According to Statement of Federal Financial Accounting Standards 
No. 4, Managerial Cost Accounting Concepts and Standards for the 
Federal Government, examples of direct costs include: salaries and 
other benefits for employees who work directly on the output, 
materials and supplies used in the work, office space, and equipment 
and facilities that are used exclusively to produce the output; 
examples of indirect costs include: general administrative services; 
general research and technical support; security; rent; and operations 
and maintenance costs for building, equipment, and utilities. 

[27] The five standards in SFFAS 4 require government agencies to (1) 
accumulate and report the costs of activities on a regular basis for 
management information purposes; (2) establish responsibility 
segments, and measure and report the costs of each segment's outputs 
and calculate the unit cost of each output; (3) determine and report 
the full costs of government goods and services, including direct and 
indirect costs; (4) recognize the costs of goods and services provided 
by other federal entities; and (5) use and consistently follow costing 
methodologies or cost finding techniques most appropriate to the 
segment's operating environment to accumulate and assign costs to 
outputs. 

[28] Appropriations are a form of budget authority to incur 
obligations and to make payments from the Treasury for specified 
purposes. Obligations are a definite commitment that creates a legal 
liability of the government for the payment of goods and services 
ordered or received, or a legal duty on the part of the United States 
that could mature into a legal liability by virtue of actions on the 
part of the other party beyond the control of the United States. 

[29] [hyperlink, http://www.gao.gov/products/GAO-05-117SP]. 

[30] The study recommended including the annual depreciation amount 
for its property, plant, and equipment rather than the funds 
appropriated for construction for a given year. 

[31] Hei Tech Services, Inc., Job Corps Cost Measure: Selecting a Cost 
Measure to Assess Program Results (Dec. 1, 2008). 

[32] According to this official, the Department of the Interior is in 
the process of transitioning to a common business platform financial 
system, the Financial and Business Management System, but not all entities 
within the department have adopted the common system yet. 

[33] Two of the three measures concern the number of acres treated 
inside and outside the wildland-urban interface per million dollar 
gross investment. The third measure concerns the number of acres in 
fire regimes 1, 2, or 3 moved to a better condition class per million 
dollars of gross investment. 

[34] [hyperlink, http://www.gao.gov/products/GAO-07-679]. 

[35] Although Education did not have a departmentwide MCA system, as 
indicated below, Federal Student Aid (FSA) within Education had its 
own MCA system. 

[36] 31 U.S.C. § 3512 note. 

[37] The Higher Education Amendments of 1998, which amended the Higher 
Education Act of 1965, established a performance-based organization (PBO) 
for the delivery of federal student financial assistance, after which 
Federal Student Aid, the one Department of Education program office 
with an operational MCA system, independently developed its MCA 
system. Pub. L. No. 105-244, title I, § 101(a), 112 Stat. 1581, 1604-
610 (Oct. 7, 1998), codified at 20 U.S.C. § 1018. PBOs are discrete 
units, led by a Chief Operating Officer, that commit to clear 
objectives, specific measurable goals, customer service standards, and 
targets for improved performance, see GAO-06-653T. 

[38] The Federal Aviation Reauthorization Act of 1996 required that 
FAA develop a cost accounting system that accurately reflects the 
investment, operating and overhead costs, revenues, and other 
financial measurement and reporting aspects of its operations. Pub. L. 
No. 104-264, § 276(a)(2), 110 Stat. 3213, 3248 (Oct. 9, 1996), 
codified at 49 U.S.C. § 45303(e). In addition, in 1997, the National 
Civil Aviation Review Commission (the "Mineta Commission") recommended 
that FAA establish a cost accounting system to support the objective 
of FAA operating in a more performance-based, business-like manner. 

[39] GAO, Managerial Cost Accounting Practices: Departments of 
Education, Transportation, and the Treasury, [hyperlink, 
http://www.gao.gov/products/GAO-06-301R] (Washington, D.C.: Dec. 19, 
2005). 

[40] GAO, Managing for Results: Enhancing Agency Use of Performance 
Information for Management Decision Making, [hyperlink, 
http://www.gao.gov/products/GAO-05-927] (Washington, D.C.: Sept. 9, 
2005). 

[41] A sole-source contract is a contract award without competition 
from other companies. Such contracts are used in instances in which 
only one source is deemed able to provide the service or product 
needed at the time. Because such contracts lack the pressure of 
competing bids to keep prices in check, information on costs is 
critical to negotiating their terms. 

[42] GAO, The Government Performance and Results Act: 1997 
Governmentwide Implementation Will Be Uneven, [hyperlink, 
http://www.gao.gov/products/GAO/GGD-97-109] (Washington, D.C.: June 2, 
1997); Managing for Results: Efforts to Strengthen the Link Between 
Resources and Results at the Administration for Children and Families, 
[hyperlink, http://www.gao.gov/products/GAO-03-9] (Washington, D.C.: 
Dec. 10, 2002); Performance Budgeting: Observations on the Use of 
OMB's Program Assessment Rating Tool for the Fiscal Year 2004 Budget, 
[hyperlink, http://www.gao.gov/products/GAO-04-174] (Washington, D.C.: 
Jan. 30, 2004); and Performance Budgeting: PART Focuses Attention on 
Program Performance, but More Can Be Done to Engage Congress, 
[hyperlink, http://www.gao.gov/products/GAO-06-28] (Washington, D.C.: 
Oct. 28, 2005). 

[43] [hyperlink, http://www.gao.gov/products/GAO-04-174], [hyperlink, 
http://www.gao.gov/products/GAO-06-28]. 

[44] [hyperlink, http://www.gao.gov/products/GAO-06-28], [hyperlink, 
http://www.gao.gov/products/GAO/GGD-97-109]. 

[45] [hyperlink, http://www.gao.gov/products/GAO-04-174], [hyperlink, 
http://www.gao.gov/products/GAO-03-9]. 

[46] GAO, Managing for Results: Measuring Program Results That Are 
Under Limited Federal Control, [hyperlink, 
http://www.gao.gov/products/GAO/GGD-99-16] (Washington, D.C.: Dec. 11, 
1998). 

[47] [hyperlink, http://www.gao.gov/products/GAO-06-28]. 

[48] The Forest Service has clear authority to manage a broad spectrum 
of watershed activities on the national forests and to encourage the 
long-term stewardship of non-industrial private forestlands, which 
contribute significantly to the health and productivity of the 
nation's watersheds. The Watershed program as delineated for the PART 
assessment encompassed the functional watershed program in the 
Watershed, Fish, Wildlife, Air, and Rare Plants Staff (WFW) and all 
Forest Service activities that contributed to improved watershed 
condition (e.g., vegetation management, reforestation, range 
management, wildlife and fisheries improvements, and road 
decommissioning). It included at least 17 specific budget line items 
linked to 
meeting the goal of improving watershed condition from the Forest 
Service's Strategic Plan for Fiscal Years 2004-2008. 

[49] USDA Forest Service, Conceptual Framework for Determining and 
Tracking Changes in Watershed Condition on Lands Managed, revised 
February 13, 2008. 

[50] [hyperlink, http://www.gao.gov/products/GAO-06-28]. 

[51] [hyperlink, http://www.gao.gov/products/GAO-04-174]. 

[52] [hyperlink, http://www.gao.gov/products/GAO-04-174], [hyperlink, 
http://www.gao.gov/products/GAO-06-28]. 

[53] OMB, Examples of Performance Measures. 

[54] OMB, Program Assessment Rating Tool Guidance No. 2007-07: 
Guidance to Improve the Quality of PART Performance and Efficiency 
Goal (Dec. 12, 2007); and OMB, Program Assessment Rating Tool Guidance 
No. 2007-03: Guidance to Improve the Consistency of 2007 PART 
Assessments (May 15, 2007). 

[55] The measure showed an improvement in efficiency between 2007 and 
2008, the only 2 years for which data were available. 

[56] [hyperlink, http://www.gao.gov/products/GAO-06-28]. 

[57] GAO, Program Evaluation: Improving the Flow of Information to the 
Congress, [hyperlink, http://www.gao.gov/products/GAO/PEMD-95-1] 
(Washington, D.C.: Jan. 30, 1995). 

[58] GAO, Human Service Programs: Demonstration Projects Could 
Identify Ways to Simplify Policies and Facilitate Technology 
Enhancements to Reduce Administrative Costs, [hyperlink, 
http://www.gao.gov/products/GAO-06-942] (Washington, D.C.: Sept. 19, 
2006). 

[59] The Job Corps program hired a contractor to propose an 
alternative efficiency measure to try to capture the unique outcomes 
of the program. The contractor study proposed an outcome-level 
efficiency measure ("cost per successful program outcome"), but 
cautioned against comparison with other programs because estimates for 
other programs might not reflect full costs, and because comparisons 
could be misleading if program objectives were not identical. Hei Tech 
Services, Inc., Job Corps Cost Measure: Selecting a Cost Measure to 
Assess Program Results (Dec. 1, 2008). 
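In formula form, the proposed measure reduces to the following (our
restatement; the study's underlying figures are not reproduced here):

\[ \text{cost per successful program outcome} = \frac{\text{total
program cost}}{\text{number of successful program outcomes}} \]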

[60] In a prior review of PART, Labor officials told us that 
participants could remain in the Job Corps program for up to 2 years, 
which they considered adequate time to complete education or 
vocational training, and which generally resulted in higher wages, 
according to studies. However, they said that since costs per 
participant increased the longer a student remained in the program, 
Job Corps appeared less efficient compared with other job training 
programs. [hyperlink, http://www.gao.gov/products/GAO-06-28]. 

[61] U.S. Fish and Wildlife Service, The Endangered Species 
Program's Strategic Plan, Draft (Sept. 19, 2008). 

[62] GAO, Endangered Species: Fish and Wildlife Service Generally 
Focuses Recovery Funding on High Priority Species, but Needs to 
Periodically Assess Its Funding Decisions, [hyperlink, 
http://www.gao.gov/products/GAO-05-211] (Washington, D.C.: Apr. 6, 
2005). 

[63] As noted above, the Endangered Species program did not have an 
efficiency measure that was approved by OMB for PART. However, program 
officials said they used an efficiency measure internally: the average 
time to complete a 5-year review. (A 5-year review is a periodic 
analysis of a species' status conducted to ensure that the listing 
classification of a species as threatened or endangered is accurate.) 

[64] [hyperlink, http://www.gao.gov/products/GAO-06-28]. 

[65] GAO, 21st Century Challenges: Reexamining the Base of the Federal 
Government, [hyperlink, http://www.gao.gov/products/GAO-05-325SP] 
(Washington, D.C.: Feb. 1, 2005). 

[66] [hyperlink, http://www.gao.gov/products/GAO-03-1168T]. 

[67] GAO, VA Health Care: Status of Efforts to Improve Efficiency and 
Access, [hyperlink, http://www.gao.gov/products/GAO/HEHS-98-48] 
(Washington, D.C.: Feb. 6, 1998). 

[68] GAO, Military Base Realignments and Closures: Estimated Costs 
Have Increased While Savings Estimates Have Decreased Since Fiscal 
Year 2009, [hyperlink, http://www.gao.gov/products/GAO-10-98R] 
(Washington, D.C.: Nov. 13, 2009). 

[69] GAO, DOD Information Technology: Software and Systems Process 
Improvement Programs Vary in Use of Best Practices, [hyperlink, 
http://www.gao.gov/products/GAO-01-116] (Washington, D.C.: Mar. 30, 
2001). 

[70] APQC is a nonprofit organization focused on process and 
performance improvement, with members from government, nongovernment, 
and business organizations. 

[71] APQC, Operating Tactics in Tough Times: Reduce Costs and Retain 
Customers - Business Process Management Research (Houston, TX: Aug. 
11, 2009). Some of the other methodologies covered in the report 
include Baldrige National Quality Program, Kaizen, ISO 9001, and LEAN. 

[72] GAO, Human Service Programs: Demonstration Projects Could 
Identify Ways to Simplify Policies and Facilitate Technology 
Enhancements to Reduce Administrative Costs, [hyperlink, 
http://www.gao.gov/products/GAO-06-942] (Washington, D.C.: Sept. 19, 
2006). 

[73] GAO, Tax Administration: Most Filing Season Services Continue to 
Improve, but Opportunities Exist for Additional Savings, [hyperlink, 
http://www.gao.gov/products/GAO-07-27] (Washington, D.C.: Nov. 15, 
2006). 

[74] OMB, Analytical Perspectives, Budget of the United States 
Government, Fiscal Year 2011 (Washington, D.C.: February 2010). 

[75] Executive Order 13520, Reducing Improper Payments, 74 Fed. Reg. 
62,201 (Nov. 25, 2009). 

[76] Some of the programs' modernization efforts were launched before 
PART. 

[77] The seven programs were Adoption Assistance, Child Care and 
Development Fund, Child Support Enforcement, Food Stamps, Foster Care, 
Temporary Assistance for Needy Families, and Unemployment Insurance. 
GAO, Human Service Programs: Demonstration Projects Could Identify 
Ways to Simplify Policies and Facilitate Technology Enhancements to 
Reduce Administrative Costs, [hyperlink, 
http://www.gao.gov/products/GAO-06-942] (Washington, D.C.: Sept. 19, 
2006). 

[78] Seafood fraud occurs when seafood products are mislabeled for 
financial gain. See GAO, Seafood Fraud: FDA Program Changes and Better 
Collaboration among Key Federal Agencies Could Improve Detection and 
Prevention, [hyperlink, http://www.gao.gov/products/GAO-09-258] 
(Washington, D.C.: Feb. 19, 2009). 

[79] GAO, Best Practices: Using Spend Analysis to Help Agencies Take a 
More Strategic Approach to Procurement, [hyperlink, 
http://www.gao.gov/products/GAO-04-870] (Washington, D.C.: Sept. 16, 
2004). 

[80] Between 2000 and 2003, prior GAO work studied procurement best 
practices of 11 companies--Bausch & Lomb; Brunswick Corporation; 
ChevronTexaco; Delta Air Lines; Dell; Dun & Bradstreet Corporation; 
Electronic Data Systems Corporation; Exxon Mobil Corporation; Hasbro, 
Inc.; International Business Machines; and Merrill Lynch & Co., Inc. 
See [hyperlink, http://www.gao.gov/products/GAO-04-870]. 

[81] Memorandum from Peter R. Orszag, Director, OMB, for the Heads of 
Departments and Agencies, Subject: Improving Government Acquisition 
(July 29, 2009). Memorandum from Lesley A. Field, Deputy 
Administrator, OMB, for Chief Acquisition Officers, Senior Procurement 
Executives, Subject: Increasing Competition and Structuring Contracts 
for the Best Results (Oct. 27, 2009). 

[82] Officials also reported using energywatchdog.com to receive a 
rebate of approximately $520,000 in fiscal year 2006 for overcharged 
utility costs at Job Corps centers. 

[83] OMB, Analytical Perspectives, Budget of the United States 
Government, Fiscal Year 2011 (Washington, D.C.: February 2010). 

[84] GAO, Program Evaluation: Studies Helped Agencies Measure or 
Explain Program Performance, [hyperlink, 
http://www.gao.gov/products/GAO/GGD-00-204] (Washington, D.C.: Sept. 
29, 2000). 

[85] GAO, Performance Measurement and Evaluation: Definitions and 
Relationships, [hyperlink, http://www.gao.gov/products/GAO-05-739SP] 
(Washington, D.C.: May 2005). 

[86] NAO, The Efficiency Programme: A Second Review of Progress 
(London, U.K.: Feb. 8, 2007). 

[87] [hyperlink, http://www.gao.gov/products/GAO-04-38]. 

[88] GAO, Grants Management: Enhancing Performance Accountability 
Provisions Could Lead to Better Results, [hyperlink, 
http://www.gao.gov/products/GAO-06-1046] (Washington, D.C.: Sept. 29, 
2006). 

[89] [hyperlink, http://www.gao.gov/products/GAO-09-1011T]. 

[90] GAO, Managerial Cost Accounting Practices: Implementation and Use 
Vary Widely across 10 Federal Agencies, [hyperlink, 
http://www.gao.gov/products/GAO-07-679] (Washington, D.C.: July 20, 
2007). 

[91] Alternatively, we could have selected the Social Security 
Administration, but chose to limit our review to cabinet-level 
departments. 

[92] The PART assessment years for the programs we selected ranged 
from 2003 to 2008. 

[93] OMB provided us with a database containing information on all 
programs that had received PART assessments and said the data were 
current as of January 14, 2009. We assessed the reliability of the OMB 
data and found that they were sufficiently reliable for purposes of 
this engagement. 

[94] All programs were considered to be direct federal and were 
assessed using 25 basic questions that comprised the direct federal 
PART. If a program delivered goods and services using one of the 
mechanisms captured in the other six PART types (competitive grant, 
block/formula grant, research and development, capital assets and 
acquisition, credit, or regulatory), it was assessed with additional 
specific questions tailored to the program type. 

[95] We excluded research and development programs from our sample of 
selected programs based on the findings of a 2008 study by The 
National Academies, which raised questions about the feasibility of 
developing valid outcome-based efficiency measures for federal 
research programs (Evaluating Research Efficiency in the U.S. 
Environmental Protection Agency, Committee on Evaluating the 
Efficiency of Research and Development Programs at the U.S. 
Environmental Protection Agency, The National Academies). We excluded 
credit programs from our sample of selected programs because of the 
difficulty in making generalizations about such programs due to the 
relatively small number of these programs in the selected departments. 

[96] In addition to these 36 efficiency measures, there were a total 
of 5 additional efficiency measures included in the PART data we 
received from OMB for three of our selected programs. However, 
officials from each of these programs told us these 5 efficiency 
measures were no longer associated with PART, so we excluded them from 
our analysis. Further, one of the selected programs, the Department of 
Transportation's Federal Aviation Administration Air Traffic 
Organization (ATO) Terminal, changed one of its measures in PART, the 
ATO-Terminal staffing ratio, from an "output" to an "efficiency" 
measure after our initial interview. As a result, we did not include 
this measure in our review. 

[97] When a measure was coded "No" for output/outcome, we coded the 
output or outcome type "N/A." 

[98] We excluded from this population the 36 efficiency measures for 
our selected programs, as well as the 5 efficiency measures that 
appeared in the PART data but that program officials said were no 
longer associated with PART. 

[99] APQC, Operating Tactics in Tough Times: Reduce Costs and Retain 
Customers (Houston, TX: Aug. 11, 2009). 

[End of section] 

GAO's Mission: 

The Government Accountability Office, the audit, evaluation and 
investigative arm of Congress, exists to support Congress in meeting 
its constitutional responsibilities and to help improve the performance 
and accountability of the federal government for the American people. 
GAO examines the use of public funds; evaluates federal programs and 
policies; and provides analyses, recommendations, and other assistance 
to help Congress make informed oversight, policy, and funding 
decisions. GAO's commitment to good government is reflected in its core 
values of accountability, integrity, and reliability. 

Obtaining Copies of GAO Reports and Testimony: 

The fastest and easiest way to obtain copies of GAO documents at no 
cost is through GAO's Web site [hyperlink, http://www.gao.gov]. Each 
weekday, GAO posts newly released reports, testimony, and 
correspondence on its Web site. To have GAO e-mail you a list of newly 
posted products every afternoon, go to [hyperlink, http://www.gao.gov] 
and select "E-mail Updates." 

Order by Phone: 

The price of each GAO publication reflects GAO's actual cost of 
production and distribution and depends on the number of pages in the 
publication and whether the publication is printed in color or black 
and white. Pricing and ordering information is posted on GAO's Web 
site, [hyperlink, http://www.gao.gov/ordering.htm]. 

Place orders by calling (202) 512-6000, toll free (866) 801-7077, or
TDD (202) 512-2537. 

Orders may be paid for using American Express, Discover Card,
MasterCard, Visa, check, or money order. Call for additional 
information. 

To Report Fraud, Waste, and Abuse in Federal Programs: 

Contact: 

Web site: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm]: 
E-mail: fraudnet@gao.gov: 
Automated answering system: (800) 424-5454 or (202) 512-7470: 

Congressional Relations: 

Ralph Dawn, Managing Director, dawnr@gao.gov: 
(202) 512-4400: 
U.S. Government Accountability Office: 
441 G Street NW, Room 7125: 
Washington, D.C. 20548: 

Public Affairs: 

Chuck Young, Managing Director, youngc1@gao.gov: 
(202) 512-4800: 
U.S. Government Accountability Office: 
441 G Street NW, Room 7149: 
Washington, D.C. 20548: