This is the accessible text file for GAO report number GAO-14-747 
entitled 'Managing For Results: Agencies' Trends in the Use of 
Performance Information to Make Decisions' which was released on 
September 26, 2014. 

This text file was formatted by the U.S. Government Accountability 
Office (GAO) to be accessible to users with visual impairments, as 
part of a longer term project to improve GAO products' accessibility. 
Every attempt has been made to maintain the structural and data 
integrity of the original printed product. Accessibility features, 
such as text descriptions of tables, consecutively numbered footnotes 
placed at the end of the file, and the text of agency comment letters, 
are provided but may not exactly duplicate the presentation or format 
of the printed version. The portable document format (PDF) file is an 
exact electronic replica of the printed version. We welcome your 
feedback. Please E-mail your comments regarding the contents or 
accessibility features of this document to Webmaster@gao.gov. 

This is a work of the U.S. government and is not subject to copyright 
protection in the United States. It may be reproduced and distributed 
in its entirety without further permission from GAO. Because this work 
may contain copyrighted images or other material, permission from the 
copyright holder may be necessary if you wish to reproduce this 
material separately. 

United States Government Accountability Office: 
GAO: 

Report to Congressional Addressees: 

September 2014: 

Managing For Results: 

Agencies' Trends in the Use of Performance Information to Make 
Decisions: 

GAO-14-747: 

GAO Highlights: 

Highlights of GAO-14-747, a report to congressional addressees. 

Why GAO Did This Study: 

GAO has long reported that agencies are better equipped to address 
management and performance challenges when managers effectively use 
performance information for decision making. However, GAO's periodic 
surveys of federal managers indicate that use of performance 
information has not changed significantly. 

GAO was mandated to evaluate the implementation of the GPRA 
Modernization Act of 2010. GAO assessed agencies' use of performance 
information from responses to GAO's surveys of federal managers at 24 
agencies. To address this objective, GAO created an index to measure 
agency use of performance information, derived from a set of questions 
from the most recent surveys, conducted in 2007 and 2013, and used 
statistical analysis to identify the practices most significantly 
related to scores on the use of performance information index. 

What GAO Found: 

Agencies' reported use of performance information, as measured by 
GAO's use of performance information index, generally did not improve 
between 2007 and 2013. The index was derived from a set of survey 
questions in the 2007 and 2013 surveys that reflected the extent to 
which managers reported that their agencies used performance 
information for various management activities and decision making. 
GAO's analysis of the average index score among managers at each agency 
found that most agencies showed no statistically significant change in 
use during this period. As shown in the table below, only two agencies 
experienced a statistically significant improvement in the use of 
performance information. During the same time period, four agencies 
experienced a statistically significant decline in the use of 
performance information. 

Table: Federal Agencies' Average Scores on Use of Performance 
Information Index--2007 and 2013: 

Agency: Government-wide; 
2007 average score: 3.46; 
2013 average score: 3.41; 
Statistically significant increase or decrease between 2007 and 2013: 
decrease. 

Agency: Office of Personnel Management; 
2007 average score: 3.38; 
2013 average score: 3.66; 
Statistically significant increase or decrease between 2007 and 2013: 
increase. 

Agency: Department of Labor; 
2007 average score: 3.37; 
2013 average score: 3.58; 
Statistically significant increase or decrease between 2007 and 2013: 
increase. 

Agency: Department of Veterans Affairs; 
2007 average score: 3.71; 
2013 average score: 3.49; 
Statistically significant increase or decrease between 2007 and 2013: 
decrease. 

Agency: National Aeronautics and Space Administration; 
2007 average score: 3.71; 
2013 average score: 3.49; 
Statistically significant increase or decrease between 2007 and 2013: 
decrease. 

Agency: Department of Energy; 
2007 average score: 3.52; 
2013 average score: 3.34; 
Statistically significant increase or decrease between 2007 and 2013: 
decrease. 

Agency: Nuclear Regulatory Commission; 
2007 average score: 3.70; 
2013 average score: 3.32; 
Statistically significant increase or decrease between 2007 and 2013: 
decrease. 

Source: GAO-08-1036SP and GAO-13-519SP. GAO-14-747. 

Note: The other 18 federal agencies did not experience either a 
statistically significant increase or decrease between 2007 and 2013. 

[End of table] 

GAO has previously found that there are five leading practices that 
can enhance or facilitate the use of performance information: (1) 
aligning agency-wide goals, objectives, and measures; (2) improving 
the usefulness of performance information; (3) developing agency 
capacity to use performance information; (4) demonstrating management 
commitment; and (5) communicating performance information frequently 
and effectively. GAO tested whether additional survey questions 
related to the five practices were significantly related to the use of 
performance information as measured by the index. GAO found that the 
average use of performance information index for agencies increased 
when managers reported their agencies engaged to a greater extent in 
these practices as reflected in the survey questions. For example, the 
Office of Personnel Management (OPM) was one of the two agencies that 
experienced an increase in use of performance information from 2007 to 
2013, as measured by the GAO index. In 2013, OPM managers responded 
more favorably than the government-wide average on several of the 
survey questions related to these practices. 

What GAO Recommends: 

GAO is not making recommendations in this report. Office of Management 
and Budget staff generally agreed with the report. Four agencies (the 
Departments of Commerce and the Treasury, the General Services 
Administration (GSA), and the National Aeronautics and Space 
Administration (NASA)) provided comments that are addressed in the 
report. Commerce 
and GSA agreed with the report. Treasury and NASA raised concerns 
about the findings and conclusions in this report, including the 
design of the surveys. GAO continues to believe its findings and 
conclusions are valid as discussed in the report. Twenty other 
agencies did not have comments. 

View [hyperlink, http://www.gao.gov/products/GAO-14-747]. For more 
information, contact J. Christopher Mihm at (202) 512-6806, or 
mihmj@gao.gov. 

[End of section] 

Contents: 

Letter: 

Background: 

Agencies' Reported Use of Performance Information Generally Has Not 
Improved Since 2007: 

Concluding Observations: 

Agency Comments: 

Appendix I: Regression Analyses of 2013 Federal Managers Survey 
Results to Identify Predictors of Use of Performance Information: 

Sensitivity and Specification Testing: 

Appendix II: Comments from the Social Security Administration: 

Appendix III: GAO Contacts and Staff Acknowledgments: 

Tables: 

Table 1: Comparison of 2007 and 2013 Federal Agencies' Average Scores 
on the Use of Performance Information Index: 

Table 2: Final Regression Analysis Based on the 2013 Federal Managers 
Survey (Dependent Variable: Performance Information Use Index): 

Figures: 

Figure 1: Questions from GAO's 2013 Managers Survey Used to Develop 
the Use of Performance Information Index: 

Figure 2: Leading Practices That Can Enhance or Facilitate the Use of 
Performance Information for Management Decision Making: 

Figure 3: Questions from the 2013 Managers Survey Associated with 
Leading Practices to Enhance and Facilitate the Use of Performance 
Information: 

Figure 4: Difference in Use of Performance Information between SES 
Managers and Non-SES Managers Significant in Most Agencies in 2013: 

Figure 5: Practices and Related Managers Survey Questions 
Statistically and Positively Related to the Use of Performance 
Information Index: 

Abbreviations: 

CFO: Chief Financial Officer: 

DHS: Department of Homeland Security: 

EPA: Environmental Protection Agency: 

FEVS: Federal Employee Viewpoint Survey: 

GPRA: Government Performance and Results Act of 1993: 

GPRAMA: GPRA Modernization Act of 2010: 

NASA: National Aeronautics and Space Administration: 

NRC: Nuclear Regulatory Commission: 

OMB: Office of Management and Budget: 

OPM: Office of Personnel Management: 

PIC: Performance Improvement Council: 

SES: Senior Executive Service: 

SSA: Social Security Administration: 

Treasury: Department of the Treasury: 

USAID: U.S. Agency for International Development: 

USDA: U.S. Department of Agriculture: 

VA: Department of Veterans Affairs: 

[End of section] 

United States Government Accountability Office: 
GAO:
441 G St. N.W. 
Washington, DC 20548: 

September 26, 2014: 

Congressional Addressees: 

The federal government is one of the world's largest and most complex 
entities, with about $3.5 trillion in outlays in fiscal year 2013, 
funding a vast array of programs and operations. It faces a number of 
significant fiscal, management, and performance challenges in 
responding to the diverse and increasingly complex issues it seeks to 
address. Addressing these challenges will require actions on multiple 
fronts. Our prior work on results-oriented management has found that 
data-driven decision making leads to better results.[Footnote 1] If 
agencies do not effectively use performance measures and performance 
information to track progress toward their goals, they increase the 
risk of failing to achieve them. 

In that regard, we have previously reported that the performance 
planning and reporting framework originally put into place by the 
Government Performance and Results Act of 1993 (GPRA)[Footnote 2], and 
significantly enhanced by the GPRA Modernization Act of 2010 (GPRAMA) 
[Footnote 3], provides important tools that can help inform 
congressional and executive branch decision making to address 
challenges the federal government faces.[Footnote 4] This report is 
part of a series of reports under our mandate to examine the 
implementation of GPRAMA. This report compares the agency-level 
results from our 2013 survey of federal managers at the 24 agencies 
covered by the Chief Financial Officers (CFO) Act of 1990, as amended, 
with our 2007 managers survey.[Footnote 5] The 2007 survey is the most 
recent survey conducted before GPRAMA was enacted in 2011 and includes 
results from the Department of Homeland Security (DHS).[Footnote 6] 
Our specific objective for this report was to assess agencies' use of 
performance information from responses to our federal managers surveys. 

To address this objective, we analyzed survey data that we had 
previously collected and publicly reported in 2007 and 2013. In 2007, 
we surveyed a stratified random sample of mid-level and upper-level 
managers and supervisors (General Schedule levels comparable to 13 
through 15 and career Senior Executive Service (SES) or equivalent) at 
the 24 CFO Act agencies (4,412 persons from a population of 
approximately 107,326 mid-level and upper-level civilian managers and 
supervisors). For the 2007 survey results, the average response rate 
across all agencies was about 70 percent. In 2013, we again surveyed a 
stratified random sample of mid-level and upper-level managers and 
supervisors at the 24 CFO Act agencies (4,391 persons from a 
population of approximately 148,300 mid-level and upper-level civilian 
managers and supervisors). For the 2013 survey, the average response 
rate across all agencies was 69 percent. Similar to the previous 
surveys, the sample was stratified by agency and by whether the 
manager or supervisor was a member of the SES or non-SES. The overall 
survey results are generalizable to the population of managers as 
described above at each of the 24 agencies and government-wide. The 
responses of each eligible sample member who provided a usable 
questionnaire were weighted in the analyses to account statistically 
for all members of the population.[Footnote 7] For additional 
information on survey development and implementation, please see our 
prior reports.[Footnote 8] We did not interview officials at the 
agencies that participated in our 2013 managers survey to obtain 
additional information. 
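
To illustrate how such stratified survey weights translate sample 
responses into population estimates, the following is a minimal sketch, 
not GAO's actual code: the agencies, responses, and weights are 
hypothetical, and each respondent's weight is assumed to equal the 
stratum population size divided by the usable responses in that stratum.

```python
import pandas as pd

# Hypothetical respondent-level records: agency, SES status, a response on
# the survey's 5-point scale, and a design weight (stratum population size
# divided by the number of usable responses in that stratum).
responses = pd.DataFrame({
    "agency":   ["OPM", "OPM", "DOL", "DOL"],
    "ses":      [True, False, True, False],
    "response": [4, 3, 5, 4],
    "weight":   [12.0, 350.0, 15.0, 410.0],
})

# The weighted mean estimates the average response for the full population
# of managers rather than for the sample alone.
weighted_mean = (
    (responses["response"] * responses["weight"]).sum()
    / responses["weight"].sum()
)
print(round(weighted_mean, 2))
```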

The use of performance information index was based primarily on an 
index that we developed and reported on for the 2007 managers survey. 
both the 2007 and 2013 surveys, we defined the terms "performance 
information" and "performance measures" in the broadest sense. In our 
2013 survey, we defined performance information as the data collected 
to measure progress toward achieving an agency's established mission 
or program-related goals or objectives. We further stated that 
performance information can focus on performance measures, such as 
quality, timeliness, customer satisfaction, or efficiency. It can 
inform key management decisions such as setting program priorities, 
allocating resources, or identifying program problems and taking 
corrective actions to solve those problems. After identifying a core 
set of items from the original index, we tested the impact of 
including and excluding several additional questions related to 
performance management use to ensure the cohesiveness and strength of 
our revised index that we used for the 2013 survey.[Footnote 9] 

Figure 1 shows the set of 2013 managers survey questions that make up 
the use index. 

Figure 1: Questions from GAO's 2013 Managers Survey Used to Develop 
the Use of Performance Information Index: 

[Refer to PDF for image: illustration] 

For the program, operation or project that you are involved with, to 
what extent, if at all, do you use the information obtained from 
performance measurements when participating in the following 
activities? 
* Developing program strategy; 
* Allocating resources; 
* Identifying program problems to be addressed; 
* Taking corrective action to solve program problems; 
* Adopting new program approaches or changing work processes; 
* Identifying and sharing effective program approaches with others. 

To what extent, if at all, do you agree with the following statements? 
* My agency's top leadership demonstrates a strong commitment to using 
performance information to guide decisionmaking; 
* Agency managers/supervisors at my level use performance information 
to share effective program approaches with others; 
* Changes by management above my level to the 
program(s)/operation(s)/project(s) I am responsible for are based on 
results or outcome-oriented performance information. 

To what extent, if at all, do you believe that the following persons 
pay attention to your agency's use of performance information in 
management decision making? 
* The individual I report to; 
* Employees that report to me. 

Source: GAO-13-519SP. GAO-14-747. 

[End of figure] 

After developing the use indices for 2007 and 2013, we analyzed the 
managers' responses, grouped them by agency, and compared the 
agencies' scores on each index. 
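
As a concrete illustration of this computation, the sketch below uses 
hypothetical question columns and responses: it averages each manager's 
answers to the index items on the survey's 5-point scale, assigns the 
scale midpoint to missing and "no basis to judge" responses (as 
described in the notes to table 1), and then averages managers' index 
scores by agency. It omits the survey weighting discussed earlier.

```python
import numpy as np
import pandas as pd

# A few of the index items, using hypothetical column names; the full
# index draws on 11 questions from the 2013 survey.
ITEM_COLUMNS = ["q8a", "q8c", "q8d"]

managers = pd.DataFrame({
    "agency": ["OPM", "OPM", "DOL"],
    "q8a": [5, 4, np.nan],  # NaN stands in for missing / "no basis to judge"
    "q8c": [4, 3, 4],
    "q8d": [5, np.nan, 3],
})

# Assign the scale midpoint (an "average opinion") to missing responses,
# then average the items to get each manager's index score.
managers["use_index"] = managers[ITEM_COLUMNS].fillna(3).mean(axis=1)

# Agency score: the average index score across that agency's managers.
print(managers.groupby("agency")["use_index"].mean().round(2))
```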

We then reviewed our prior work where we identified leading practices 
that can help agencies enhance or facilitate the use of performance 
information for management decision making, as shown in figure 2 
below.[Footnote 10] 

Figure 2: Leading Practices That Can Enhance or Facilitate the Use of 
Performance Information for Management Decision Making: 

[Refer to PDF for image: illustration] 

Practices lead to use of performance information, which leads to 
improved results. 

Practices: 
* Demonstrating management commitment; 
* Aligning agencywide goals, objectives, and measures; 
* Improving the usefulness of performance information; 
* Developing the capacity to use performance information; 
* Communicating performance information frequently and effectively. 

Source: GAO-05-927. GAO-14-747. 

[End of figure] 

To determine if there were any additional factors related to these leading 
practices that could influence how an agency scored on our use index, 
we looked at the remaining questions from the 2013 managers survey and 
identified those additional questions that were associated with these 
leading practices. We used statistical testing to determine if the 
relationship between these additional questions and an agency's use of 
performance information was statistically significant.[Footnote 11] 
See figure 3 below for the survey questions we tested related to the 
five leading practices. 

Figure 3: Questions from the 2013 Managers Survey Associated with 
Leading Practices to Enhance and Facilitate the Use of Performance 
Information: 

[Refer to PDF for image: illustration] 

Practice that enhances or facilitates use of performance information: 
Aligning agencywide goals, objectives, and measures; 
Tested survey questions: 
* Agency managers/supervisors at my level take steps to align program 
performance measures with agencywide goals and objectives. 

Practice that enhances or facilitates use of performance information: 
Improving the usefulness of performance information; 
Tested survey questions: 
* I have sufficient information on the validity of the performance 
data I use to make decisions; 
* My agency's performance information is available in a format that is 
easy to use; 
* Performance information is available in time to manage the 
program(s)/operation(s)/project(s) that I am involved with. 

Practice that enhances or facilitates use of performance information: 
Developing agency capacity to use performance information; 
Tested survey questions: 
* During the past 3 years, your agency has provided, arranged, or paid 
for training that would help you to use program performance 
information to make decisions; 
* My agency has sufficient analytical tools for managers at my level 
to collect, analyze, and use performance information; 
* During the past 3 years, your agency has provided, arranged, or paid 
for training that would help you to set program performance goals; 
* During the past 3 years, your agency has provided, arranged, or paid 
for training that would help you to develop program performance 
measures; 
* During the past 3 years, your agency has provided, arranged, or paid 
for training that would help you to assess the quality of performance 
data; 
* During the past 3 years, your agency has provided, arranged, or paid 
for training that would help you to link the 
program(s)/operation(s)/project(s) to the achievement of agency 
strategic goals. 

Practice that enhances or facilitates use of performance information: 
Demonstrating management commitment; 
Tested survey questions: 
* My agency's top leadership demonstrates a strong commitment to 
achieving results; 
* My agency is investing the resources needed to ensure that its 
performance data is of sufficient quality; 
* My agency is investing in resources to improve the agency's capacity 
to use performance information. 

Practice that enhances or facilitates use of performance information: 
Communicating performance information frequently and effectively; 
Tested survey questions: 
* Agency managers/supervisors at my level effectively communicate 
performance information on a routine basis; 
* Employees in my agency receive positive recognition for helping the 
agency accomplish its strategic goals; 
* My agency's performance information is easily accessible to managers 
at my level; 
* My agency's performance information is easily accessible to 
employees; 
* My agency's performance information is easily accessible to the 
public, as appropriate. 

Source: GAO-13-519SP. GAO-14-747. 

[End of figure] 

We conducted regression analyses to assess the relationship between 
these additional questions, related to leading practices that enhance 
and facilitate the use of performance information, and an agency's 
score and ranking on the use index. Regression analyses allowed us to 
assess the unique association between our outcome variable and a given 
predictor variable, while controlling for multiple other predictor 
variables.[Footnote 12] Using our survey data, these analyses were 
intended to reflect the strength of the relationship between our 
previously identified practices and our 2013 use index. Our analyses 
did not seek to identify or assess other survey items addressing 
additional practices that we had not previously identified for 
improving the use of performance information for management decision 
making. More information on our regression analysis can be found in 
appendix I. 
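
A minimal sketch of such a regression appears below. The variable 
names and simulated data are hypothetical, and the sketch omits the 
survey weights and the full set of predictors in our model; it shows 
only the general form, in which each coefficient reflects a question's 
association with the use index while holding the other predictors 
constant.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated manager-level records standing in for survey data: coded
# responses to practice-related questions plus a computed use index.
rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "data_validity": rng.integers(1, 6, n),          # 5-point agreement scale
    "decision_training": rng.integers(0, 2, n),      # received training (0/1)
    "leadership_commitment": rng.integers(1, 6, n),  # 5-point agreement scale
})
df["use_index"] = (
    2.0
    + 0.2 * df["data_validity"]
    + 0.1 * df["leadership_commitment"]
    + rng.normal(0, 0.3, n)
)

# OLS regression: each coefficient estimates a question's unique
# association with the use index, controlling for the other predictors.
model = smf.ols(
    "use_index ~ data_validity + decision_training + leadership_commitment",
    data=df,
).fit()
print(model.summary())
```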

To ensure reliability of the specific survey questions we used in our 
analyses, we conducted electronic testing of the data and reviewed 
prior information on the design and implementation of our 2007 and 
2013 managers surveys.[Footnote 13] We believe the data are 
sufficiently reliable for the purpose of this report. 

We conducted this performance audit from August 2013 to September 2014 
in accordance with generally accepted government auditing standards. 
Those standards require that we plan and perform the audit to obtain 
sufficient, appropriate evidence to provide a reasonable basis for our 
findings and conclusions based on our audit objectives. We believe 
that the evidence obtained provides a reasonable basis for our 
findings and conclusions based on our audit objectives. 

Background: 

Our prior work on managing for results has found that federal agencies 
may use performance information across a range of management functions 
to improve results, from setting program priorities and allocating 
resources, to taking corrective action to solve performance problems. 
In June 2013, we highlighted the government-wide results of our 2013 
federal managers survey and found little improvement since 1997 in 
managers' reported use of performance information, or in the 
management activities that could facilitate or enhance the use of 
performance information. We also reported a decline in the percentage 
of managers who agreed that their agencies' top leadership 
demonstrates a strong commitment to achieving results. Overall, our 
periodic surveys of federal managers since 1997 indicate that, with 
few exceptions, the use of performance information has not changed 
significantly over time government-wide. 

These survey results are consistent with trends identified in other 
federal employee surveys government-wide. For example, the Office of 
Personnel Management (OPM) surveys federal workers with the Federal 
Employee Viewpoint Survey (FEVS). FEVS is a tool that measures 
employees' perceptions of whether, and to what extent, conditions 
characterizing successful organizations are present in their agencies. 
OPM creates an index using a smaller subset of FEVS survey questions 
related to agencies' results-oriented performance culture. OPM also 
creates additional indices using 
different subsets of FEVS survey questions related to: (1) leadership 
and knowledge management; (2) talent management; and (3) job 
satisfaction. On the results-oriented performance culture index, 27 of 
the 37 agencies OPM surveyed experienced a decline between 2008 and 
2013. Only seven agencies improved during this time period--OPM, the 
U.S. Departments of Education and Transportation, the Federal 
Communications Commission, the National Labor Relations Board, the 
Railroad Retirement Board, and the Broadcasting Board of Governors. 

The Office of Management and Budget and the Performance Improvement 
Council (PIC) work with federal agencies to improve performance 
across the federal government. Among the PIC's responsibilities is the 
charge to facilitate the exchange of useful performance improvement 
practices and work among the federal agencies to resolve government-
wide or crosscutting performance issues. 

Agencies' Reported Use of Performance Information Generally Has Not 
Improved Since 2007: 

Few federal agencies showed improvement in managers' use of 
performance information for decision making between 2007 and 2013, as 
measured by our use index. Specifically, our analysis of the average 
use index score at each agency found that most agencies showed no 
statistically significant change in use during this period. Only two 
agencies--OPM and the Department of Labor--experienced a statistically 
significant improvement in managers' use of performance information. 
During the same time period, four agencies--the Departments of Energy 
and Veterans Affairs (VA), the National Aeronautics and Space 
Administration, and the Nuclear Regulatory Commission--experienced a 
statistically significant decline in managers' use of performance 
information as measured by our index. See table 1 below for agency 
scores on the use of performance information index. 

Table 1: Comparison of 2007 and 2013 Federal Agencies' Average Scores 
on the Use of Performance Information Index: 

Agency: Government-wide; 
2007 Average score: 3.46; 
2013 Average score: 3.41; 
Statistically significant increase, decrease, or no significant change 
between 2007 and 2013: decrease. 

Ranking: 1; 
Agency: Office of Personnel Management; 
2007 Average score: 3.38; 
2013 Average score: 3.66; 
Statistically significant increase, decrease, or no significant change 
between 2007 and 2013: increase. 

Ranking: 2; 
Agency: Social Security Administration; 
2007 Average score: 3.70; 
2013 Average score: 3.65; 
Statistically significant increase, decrease, or no significant change 
between 2007 and 2013: no significant change. 

Ranking: 3; 
Agency: Department of Labor; 
2007 Average score: 3.37; 
2013 Average score: 3.58; 
Statistically significant increase, decrease, or no significant change 
between 2007 and 2013: increase. 

Ranking: 4; 
Agency: General Services Administration; 
2007 Average score: 3.62; 
2013 Average score: 3.54; 
Statistically significant increase, decrease, or no significant change 
between 2007 and 2013: no significant change. 

Ranking: 5; 
Agency: United States Agency for International Development; 
2007 Average score: 3.35; 
2013 Average score: 3.52; 
Statistically significant increase, decrease, or no significant change 
between 2007 and 2013: no significant change. 

Ranking: 6; 
Agency: Department of Veterans Affairs; 
2007 Average score: 3.71; 
2013 Average score: 3.49; 
Statistically significant increase, decrease, or no significant change 
between 2007 and 2013: decrease. 

Ranking: 7; 
Agency: National Aeronautics and Space Administration; 
2007 Average score: 3.71; 
2013 Average score: 3.49; 
Statistically significant increase, decrease, or no significant change 
between 2007 and 2013: decrease. 

Ranking: 8; 
Agency: Small Business Administration; 
2007 Average score: 3.51; 
2013 Average score: 3.47; 
Statistically significant increase, decrease, or no significant change 
between 2007 and 2013: no significant change. 

Ranking: 9; 
Agency: Department of Defense; 
2007 Average score: 3.35; 
2013 Average score: 3.44; 
Statistically significant increase, decrease, or no significant change 
between 2007 and 2013: no significant change. 

Ranking: 10; 
Agency: Department of Education; 
2007 Average score: 3.49; 
2013 Average score: 3.43; 
Statistically significant increase, decrease, or no significant change 
between 2007 and 2013: no significant change. 

Ranking: 11; 
Agency: Environmental Protection Agency; 
2007 Average score: 3.54; 
2013 Average score: 3.42; 
Statistically significant increase, decrease, or no significant change 
between 2007 and 2013: no significant change. 

Ranking: 12; 
Agency: Department of the Treasury; 
2007 Average score: 3.54; 
2013 Average score: 3.41; 
Statistically significant increase, decrease, or no significant change 
between 2007 and 2013: no significant change. 

Ranking: 13; 
Agency: Department of Housing and Urban Development; 
2007 Average score: 3.57; 
2013 Average score: 3.38; 
Statistically significant increase, decrease, or no significant change 
between 2007 and 2013: no significant change. 

Ranking: 14; 
Agency: National Science Foundation; 
2007 Average score: 3.61; 
2013 Average score: 3.37; 
Statistically significant increase, decrease, or no significant change 
between 2007 and 2013: no significant change. 

Ranking: 15; 
Agency: Department of Health and Human Services; 
2007 Average score: 3.29; 
2013 Average score: 3.37; 
Statistically significant increase, decrease, or no significant change 
between 2007 and 2013: no significant change. 

Ranking: 16; 
Agency: Department of State; 
2007 Average score: 3.36; 
2013 Average score: 3.36; 
Statistically significant increase, decrease, or no significant change 
between 2007 and 2013: no significant change. 

Ranking: 17; 
Agency: Department of Commerce; 
2007 Average score: 3.44; 
2013 Average score: 3.35; 
Statistically significant increase, decrease, or no significant change 
between 2007 and 2013: no significant change. 

Ranking: 18; 
Agency: Department of Energy; 
2007 Average score: 3.52; 
2013 Average score: 3.34; 
Statistically significant increase, decrease, or no significant change 
between 2007 and 2013: decrease. 

Ranking: 19; 
Agency: Nuclear Regulatory Commission; 
2007 Average score: 3.70; 
2013 Average score: 3.32; 
Statistically significant increase, decrease, or no significant change 
between 2007 and 2013: decrease. 

Ranking: 20; 
Agency: Department of Transportation; 
2007 Average score: 3.35; 
2013 Average score: 3.31; 
Statistically significant increase, decrease, or no significant change 
between 2007 and 2013: no significant change. 

Ranking: 21; 
Agency: Department of Justice; 
2007 Average score: 3.30; 
2013 Average score: 3.31; 
Statistically significant increase, decrease, or no significant change 
between 2007 and 2013: no significant change. 

Ranking: 22; 
Agency: Department of Homeland Security; 
2007 Average score: 3.33; 
2013 Average score: 3.22; 
Statistically significant increase, decrease, or no significant change 
between 2007 and 2013: no significant change. 

Ranking: 23; 
Agency: Department of the Interior; 
2007 Average score: 3.16; 
2013 Average score: 3.22; 
Statistically significant increase, decrease, or no significant change 
between 2007 and 2013: no significant change. 

Ranking: 24; 
Agency: United States Department of Agriculture; 
2007 Average score: 3.16; 
2013 Average score: 3.13; 
Statistically significant increase, decrease, or no significant change 
between 2007 and 2013: no significant change. 

Source: GAO analysis of federal managers surveys for 2007 and 2013. 
GAO-14-747. 

Legend: 
no significant change; 
statistically significant decrease; 
statistically significant increase. 

Notes: The Agency Use of Performance Information Mean Scale Index from 
2007 and 2013 assumes an average opinion for missing and "no basis to 
judge" responses and includes questions 8a, 8c, 8d, 8e, 8f, 8m, 10h, 
10m, 11c, 12c, and 12d on the 2013 survey. 

The maximum margin of error for our use index was 0.20 or less in both 
2007 and 2013. We measured statistical significance at the p<.05 
value, which indicates that there is less than a 5 percent chance that 
we would observe a change, difference, or association as large as we 
observed if such a change, difference, or association did not exist. 

[End of table] 
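
The significance tests behind the table can be illustrated with a 
simplified sketch. The scores below are hypothetical, and the sketch 
assumes independent simple random samples, whereas our actual tests 
accounted for the stratified survey design; it shows only the logic of 
comparing an agency's 2007 and 2013 means at the p < .05 level.

```python
from scipy import stats

# Hypothetical manager-level index scores for one agency in each year.
scores_2007 = [3.2, 3.6, 3.4, 3.5, 3.8, 3.7, 3.3]
scores_2013 = [3.0, 3.1, 3.3, 3.2, 3.4, 3.1, 3.2]

# Welch's t-test for a difference in means between the two survey years.
t_stat, p_value = stats.ttest_ind(scores_2007, scores_2013, equal_var=False)
if p_value < 0.05:
    print(f"statistically significant change (p = {p_value:.3f})")
else:
    print(f"no significant change (p = {p_value:.3f})")
```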

In addition, figure 4 illustrates that SES managers used performance 
information, as measured by our index, more than non-SES managers 
government-wide and in nearly every agency. Government-wide and at 
nine agencies, SES managers scored statistically significantly higher 
than non-SES managers. As shown in figure 4 below, DHS and VA had the 
largest gaps in use of performance information between their SES and 
non-SES managers. In one agency--the National Science Foundation--the 
trend was reversed, with non-SES managers reporting more favorably 
than SES managers; however, this difference was not statistically 
significant. 

Figure 4: Difference in Use of Performance Information between SES 
Managers and Non-SES Managers Significant in Most Agencies in 2013: 

[Refer to PDF for image: vertical bar graph] 

Difference in score on 5-point index: 

Agencies: 

Homeland Security[A]: 0.49; 
Veterans Affairs[A]: 0.48; 
Treasury[A]: 0.47; 
Agriculture[A]: 0.42; 
Energy[A]: 0.42; 
Housing and Urban Development[A]: 0.36; 
National Aeronautics and Space Administration: 0.36; 
Nuclear Regulatory Commission[A]: 0.34; 
Agency for International Development[A]: 0.34; 
General Services Administration: 0.31; 
Defense[A]: 0.31; 
Social Security Administration: 0.25; 
Office of Personnel Management: 0.24; 
Health and Human Services: 0.22; 
Justice: 0.18; 
Transportation: 0.15; 
Interior: 0.13; 
Labor: 0.1; 
Environmental Protection Agency: 0.02; 
State: 0.02; 
Commerce: 0.01; 
Education: 0.004; 
Small Business Administration: -0.01; 
National Science Foundation: -0.12; 
Government-wide[A]: 0.24. 

Source: GAO analysis of 2013 federal managers survey. GAO-14-747. 

[A] Denotes agencies where the SES managers' score is statistically 
significantly greater than the non-SES managers' score. 

[End of figure] 

Survey Questions Addressing Key Practices Significantly Related to the 
Use of Performance Information: 

Using the data from our 2013 survey of federal managers, we found that 
specific practices identified in our previous work as enhancing or 
facilitating the use of performance information for decision making 
were significantly related to the use of performance information as 
measured by our use index.[Footnote 14] Figure 5 shows the questions 
that we tested based on 
each of the practices. We have highlighted those questions and 
responses that we found to have a statistically significant and 
positive relationship with the use of performance information index. 
[Footnote 15] We found that the average use of performance information 
index for agencies increased when managers reported that their 
agencies engaged to a greater extent in these practices as reflected 
in the survey questions. For example, in 2013, OPM managers responded 
more favorably than the government-wide average on several of the 
survey questions related to these practices. OPM was one of the two 
agencies that experienced an increase in use of performance 
information from 2007 to 2013, as measured by our index. 

Figure 5: Practices and Related Managers Survey Questions 
Statistically and Positively Related to the Use of Performance 
Information Index: 

[Refer to PDF for image: illustration] 

Practice that enhances or facilitates use of performance information: 
Aligning agencywide goals, objectives, and measures; 
Questions found to be statistically and positively related to the use 
of performance information index: 
* Agency managers/supervisors at my level take steps to align program 
performance measures with agencywide goals and objectives. 

Practice that enhances or facilitates use of performance information: 
Improving the usefulness of performance information; 
Questions found to be statistically and positively related to the use 
of performance information index: 
* I have sufficient information on the validity of the performance 
data I use to make decisions; 
Other tested survey questions: 
* My agency's performance information is available in a format that is 
easy to use; 
* Performance information is available in time to manage the 
program(s)/operation(s)/project(s) that I am involved with. 

Practice that enhances or facilitates use of performance information: 
Developing agency capacity to use performance information; 
Questions found to be statistically and positively related to the use 
of performance information index: 
* During the past 3 years, your agency has provided, arranged, or paid 
for training that would help you to use program performance 
information to make decisions; 
Other tested survey questions: 
* My agency has sufficient analytical tools for managers at my level 
to collect, analyze, and use performance information; 
* During the past 3 years, your agency has provided, arranged, or paid 
for training that would help you to set program performance goals; 
* During the past 3 years, your agency has provided, arranged, or paid 
for training that would help you to develop program performance 
measures[A]; 
* During the past 3 years, your agency has provided, arranged, or paid 
for training that would help you to assess the quality of performance 
data; 
* During the past 3 years, your agency has provided, arranged, or paid 
for training that would help you to link the 
program(s)/operation(s)/project(s) to the achievement of agency 
strategic goals. 

Practice that enhances or facilitates use of performance information: 
Demonstrating management commitment; 
Questions found to be statistically and positively related to the use 
of performance information index: 
* My agency's top leadership demonstrates a strong commitment to 
achieving results; 
Other tested survey questions: 
* My agency is investing the resources needed to ensure that its 
performance data is of sufficient quality; 
* My agency is investing in resources to improve the agency's capacity 
to use performance information. 

Practice that enhances or facilitates use of performance information: 
Communicating performance information frequently and effectively; 
Questions found to be statistically and positively related to the use 
of performance information index: 
* Agency managers/supervisors at my level effectively communicate 
performance information on a routine basis; 
* Employees in my agency receive positive recognition for helping the 
agency accomplish its strategic goals; 
Other tested survey questions: 
* My agency's performance information is easily accessible to managers 
at my level; 
* My agency's performance information is easily accessible to 
employees; 
* My agency's performance information is easily accessible to the 
public, as appropriate. 

Source: GAO-13-519SP. GAO-14-747. 

[A] This question was significantly, but negatively, related to the 
use of performance information index, meaning that agencies that 
offered this training tended to score lower on the use index. 

[End of figure] 

Aligning Agencywide Goals, Objectives, and Measures: 

Side bar: 

Aligning agencywide goals, objectives, and measures: 
Agency managers/supervisors at my level take steps to align program 
performance measures with agencywide goals and objectives. 

Source: GAO analysis of 2013 federal managers survey. GAO-14-747. 

[End of side bar] 

Leading practices state that aligning an agency's goals, objectives, 
and measures increases the usefulness of the performance information 
collected to decision makers at each level, and reinforces the 
connection between strategic goals and the day-to-day activities of 
managers and staff.[Footnote 16] In analyzing the 2013 survey results, 
we found that managers' responses to a related survey question were 
significantly related to the use of performance information, 
controlling for other factors. Specifically, increases in the extent 
to which individuals agreed that managers aligned performance measures 
with agency-wide goals and objectives were associated with increases 
on the five-point scale we used for our use index.[Footnote 17] 

Government-wide, an estimated 46 percent of managers at federal 
agencies reported that managers at their levels took steps to align 
program performance measures with agency-wide goals and objectives. 
The Social Security Administration (SSA) and OPM led the 24 agencies 
with approximately 65 percent of managers reporting that they aligned 
program performance measures with agency-wide goals and objectives. 
DHS trailed the other agencies with only 34 percent of their managers 
reporting similarly.[Footnote 18] 

Improving the Usefulness of Performance Information: 

Side bar: 

Improving the usefulness of performance information: 
I have sufficient information on the validity of the performance data 
I use to make decisions. 

Source: GAO analysis of 2013 federal managers survey. GAO-14-747. 

[End of side bar] 

Leading practices state that to facilitate the use of performance 
information, agencies should ensure that information meets various 
users' needs for completeness, accuracy, consistency, timeliness, 
validity, and ease of use.[Footnote 19] When analyzing the results of 
our 2013 survey, we found that managers' responses to the statement, 
"I have sufficient information on the validity of the performance data 
I use to make decisions," related to their use of performance 
information. Specifically, individuals who reported a greater extent 
of sufficient information on the validity of performance data for 
decision making tended to rate their agencies higher on the 
performance use scale than individuals who reported a lesser extent, 
controlling for other factors. Of the questions included in our model, 
having sufficient information on the validity of performance data for 
decision making had the largest potential effect; it was the strongest 
predictor in our regression analysis. 

Government-wide, the percentage of managers responding favorably about 
having sufficient information on the validity of performance data was 
particularly low, at about 36 percent. The National Aeronautics and 
Space Administration (NASA) and OPM led the agencies, with more than 
50 percent of their managers responding that they have sufficient 
information about the validity of performance data for decision making 
(58 percent and 54 percent, respectively). The U.S. 
Department of Agriculture (USDA) and DHS trailed the other agencies 
with less than 30 percent of their managers responding similarly (28 
percent and 21 percent, respectively). 

Developing Agency Capacity to Use Performance Information: 

Side bar: 

Developing agency capacity to use performance information: 
During the past 3 years, your agency has provided, arranged, or paid 
for training that would help you to use program performance 
information to make decisions. 

Source: GAO analysis of 2013 federal managers survey. GAO-14-747. 

[End of side bar] 

Leading practices state that building the capacity for managers to use 
performance information is critical to using performance information 
in a meaningful fashion, and that inadequate staff expertise, among 
other factors, can hinder agencies from using performance 
information.[Footnote 20] When we analyzed the results of our 2013 
survey, we found that managers who said that their agencies had 
provided training that would help them use performance information to 
make decisions rated their agencies more positively on our use index. 
Compared to managers who said their agencies had not trained them on 
using performance information in decision making, those who said their 
agencies had done so rated them higher on the use scale, controlling 
for other factors. 

Government-wide, an estimated 44 percent of the managers who responded 
to our survey reported that their agencies have provided training that 
would help them to use performance information in decision making. The 
U.S. Agency for International Development (USAID) led the agencies in 
this area, with 62 percent of USAID managers responding that their 
agencies had provided training that would help them use performance 
information in decision making in the last 3 years.[Footnote 21] The 
U.S. Department of the Treasury (Treasury), DHS, the Nuclear 
Regulatory Commission (NRC), and the Environmental Protection Agency 
(EPA) trailed the other agencies, with less than 35 percent of their 
managers responding that they had received training on the use of 
performance information in the last 3 years (Treasury and DHS with 34 
percent, NRC with 33 percent, and EPA with 32 percent).[Footnote 22] 

Other types of training did not appear to be positively related to use 
of performance information. Specifically, training on developing 
performance measures was significantly--but negatively--related to use 
of performance information. Training on (1) setting program 
performance goals; (2) assessing the quality of performance data; and 
(3) linking program performance to agency strategic plans was not 
found to relate to managers' use of performance information after 
controlling for other factors. 

Demonstrating Management Commitment: 

Side bar: 

Demonstrating management commitment: 
My agency's top leadership demonstrates a strong commitment to 
achieving results. 

Source: GAO analysis of 2013 federal managers survey. GAO-14-747. 

[End of side bar] 

Leading practices state that the demonstrated commitment of leadership 
and management to achieving results and using performance information 
can encourage others to embrace a model that uses performance 
information to make decisions.[Footnote 23] When we analyzed the 
results of our 2013 survey, we found that managers' responses to the 
statement, "My agency's top leadership demonstrates a strong 
commitment to achieving results," were significantly and positively 
related to the use of performance information. Specifically, on 
average, increases in a manager's rating of the strength of their 
agency's top leadership's commitment to achieving results were 
associated with increased ratings of their agencies on the use scale, 
controlling for other factors. 

Government-wide, the percentage of federal managers responding 
favorably about their agencies' top leadership demonstrating a strong 
commitment to achieving results was an estimated 60 percent. Managers 
from NRC (78 percent) and SSA (74 percent) had significantly higher 
scores on this question than the government-wide average, while 
managers from DHS (44 percent) and USDA (42 percent) had lower scores 
than the government-wide average. 

Communicating Performance Information Frequently and Effectively: 

Side bar: 

Communicating performance information frequently and effectively: 
Agency managers/supervisors at my level effectively communicate 
performance information on a routine basis; 
Employees in my agency receive positive recognition for helping the 
agency accomplish its strategic goals. 

Source: GAO analysis of 2013 federal managers survey. GAO-14-747. 

[End of side bar] 

Leading practices state that communicating performance information 
frequently and effectively throughout an agency can help managers to 
inform staff and other stakeholders of their commitment to achieve 
agency goals and to keep these goals in mind as they pursue their day-
to-day activities.[Footnote 24] When analyzing the results of our 2013 
survey, we found that two related questions were significantly and 
positively related to an agency's use of performance information: 

* Agency managers/supervisors at my level effectively communicate 
performance information routinely. 

* Employees in my agency receive positive recognition for helping the 
agency accomplish its strategic goals. 

Specifically, those who reported favorably that agency 
managers/supervisors at their levels effectively communicated 
performance information routinely tended to rate their agencies 
somewhat higher on the use index, controlling for other factors. 
Similarly, those who reported favorably that employees in their agency 
receive positive recognition for helping the agency accomplish its 
strategic goals rated their agencies somewhat higher on the use scale, 
controlling for other factors. 

An estimated 41 percent of managers government-wide who responded to 
our survey reported that agency managers/supervisors at their level 
effectively communicated performance information routinely. About 60 
percent of managers at the Small Business Administration, the 
Department of Labor, and OPM responded positively when asked about 
effectively communicating performance information routinely (62 
percent, 61 percent, and 60 percent, respectively). DHS trailed the 
other agencies 
with only 34 percent of its managers reporting similarly. 

Government-wide, an estimated 42 percent of the managers responded 
favorably when asked about employees in their respective agencies 
receiving positive recognition for helping the agencies accomplish 
their strategic goals. While managers at NRC and the U.S. Department 
of Commerce scored at 50 percent or higher when asked about positive 
recognition (58 percent and 50 percent, respectively), DHS trailed the 
other agencies with only 34 percent of its managers 
reporting similarly. 

Concluding Observations: 

Our analyses of agency-level results from our periodic surveys of 
federal managers in 2007 and 2013 reinforce that several leading 
practices, as reflected in related survey questions, significantly 
influenced agencies' use of performance information for management 
decision making. However, our surveys show that such use generally 
has not improved over time. This information can be helpful to the 
Office of Management and Budget (OMB) and the Performance Improvement 
Council as they work with federal agencies to identify and implement 
stronger performance management practices to help improve agency use 
of performance information. Moreover, the use of performance 
information will remain a challenge unless agencies can narrow the gap 
in use between Senior Executive Service (SES) and non-SES managers. 

Agency Comments: 

We provided a draft of this report to the Director of OMB and to the 
24 agencies that responded to our 2007 and 2013 federal managers 
surveys. On September 4, 2014, OMB staff provided us with oral 
comments and generally agreed with our report. OMB staff also stated 
that they would continue to work with agencies to address the use of 
performance information through agencies' annual strategic reviews of 
progress toward agencies' strategic objectives, which began in 2014. 

We also received comments from the U.S. Departments of Commerce 
(Commerce) and the Treasury (Treasury), the General Services 
Administration (GSA), and the National Aeronautics and Space 
Administration (NASA). On August 27, 2014, the liaison from NASA e-
mailed us a summary of NASA officials' comments. On August 28, 2014, 
the liaison from GSA e-mailed us a summary of GSA officials' comments. 
On August 29, 2014, the liaisons from Commerce and Treasury e-mailed 
us summaries of their respective agency officials' comments. Commerce 
and GSA generally agreed with our report, and provided technical 
comments, which we incorporated as appropriate. NASA and Treasury 
raised concerns about the findings and conclusions in our report, 
including the design of the surveys. We discuss their comments and our 
evaluation of them below, which generally fell into the following four 
categories: 

* NASA and Treasury raised concerns about the underlying methodology 
for the 2007 and 2013 federal managers surveys. They said that it did 
not adequately provide agency-wide perspectives that fully represented 
the agencies' use of performance information. Specifically, NASA and 
Treasury expressed concerns about the lack of demographic information 
about the survey respondents (e.g., survey respondents by agency 
component and geographic location). Treasury also expressed concern as 
to whether we had included senior leadership in our report. To address 
this comment, we added some additional information to our report that 
discusses our survey design and administration, specifically that we 
did not collect demographic information beyond whether a federal 
manager was a member of the SES or not (non-SES). Moreover, our 
stratified random sample of federal managers ensured that we had a 
representative sample of federal managers both government-wide and 
within each of the 24 agencies we surveyed. It was not our objective 
to design the survey and draw a sample of managers that would allow us 
to report in a generalizable way at the geographic location or 
organizational level within an agency. Designing a sample to produce 
estimates at the geographic location and/or organizational level 
within an agency would result in a much larger sample than the 4,412 
managers selected in our 2007 survey and the 4,391 managers selected 
in our 2013 survey. 
Nevertheless, as previously discussed, our sample was sufficient for 
the purposes of this report. 

* NASA and Treasury also expressed concern that despite all the 
efforts their respective agencies have undertaken to implement the 
GPRA Modernization Act of 2010, our draft report did not provide 
information on the root causes for the lack of progress in the use of 
performance information in their agencies. For example, NASA cited 
some of its agency initiatives, including the development of an 
automated performance management data repository to assist in the 
agency's decision-making process. Treasury cited its Quarterly 
Performance Review process as an example of the agency's commitment to 
using performance information in decision making. We recognize the 
activities that the agencies have underway to improve staff engagement 
on the use of performance information for decision making, and have 
previously reported on some of these initiatives.[Footnote 25] 
However, despite the efforts discussed above, our survey results 
showed that the use of performance information, as reported by 
managers at the agencies, has not improved within agencies between 
2007 and 2013. Our report analyzed the results from specific questions 
in both the 2007 and 2013 surveys. We agree that our report does not 
provide information on the root causes for the trends we found in the 
use of performance information. However, the results of the regression 
analysis in this report point to some specific practices that can 
enhance the use of performance information, areas where federal 
agencies may want to focus further analysis and efforts. Both NASA and 
Treasury requested their respective agencies' 2007 and 2013 survey 
data sets to perform additional analyses that might provide further 
insights into root causes underlying the trends in the use of 
performance information within their agencies. 

* Treasury also commented that the rankings we report based on the 
average scores on the 2013 use of performance information index might 
imply that agencies with a higher ranking are theoretically better at 
using performance information, and therefore, have superior 
performance management practices. Treasury also raised concerns about 
our use of the index to score agencies. It asked if it should view the 
higher-ranking agencies as examples of what agencies should do to 
improve the use of performance information. There is not a huge 
difference in scores between those agencies that scored higher on the 
use index than others at the lower end. But, we believe our 
methodology is useful for generally distinguishing between agencies' 
levels of use of performance information, and for assessing change in 
use of performance information over time. However, we revised our 
report to focus on agencies' scores rather than on rank ordering. We 
also conducted additional statistical testing to determine whether the 
changes between the 2007 and 2013 use indexes were statistically 
different among agencies. As for the implication of the rankings on 
the quality of management practices in particular agencies, in 2007, 
we did employ a use index to identify agencies for further case study 
analysis. We selected an agency that had significantly improved on the 
use index along with agencies that scored lower on the index to assess 
whether there were any promising practices or challenges facing those 
agencies.[Footnote 26] 

* NASA, Treasury, and Commerce all commented that it was difficult to 
tell how managers may have interpreted the term "performance 
information" when responding to our surveys. Treasury further 
commented that it was unclear what information managers were using to 
make management decisions if they were not using performance 
information. In both the 2007 and 2013 surveys, we defined the terms 
"performance information" and "performance measures" in the broadest 
sense. To clarify this point, we added the definition of performance 
information from the 2013 managers survey to the report. Moreover, as 
discussed above, additional agency analysis of the root causes 
underlying the use of performance information could provide additional 
context on the types of information agencies are using for decision 
making. 

The following 20 agencies had no comments on the draft report: the 
U.S. Departments of Agriculture, Defense, Education, Energy, Health 
and Human Services, Homeland Security, Housing and Urban Development, 
the Interior, Justice, Labor, State, Transportation, and Veterans 
Affairs, the Environmental Protection Agency, Nuclear Regulatory 
Commission, Office of Personnel Management, National Science 
Foundation, Small Business Administration, Social Security 
Administration, and the United States Agency for International 
Development. The written response from the Social Security 
Administration is reproduced in appendix II. 

We are sending copies of this report to the agencies that participated 
in our 2013 managers survey, the Director of OMB, as well as 
appropriate congressional committees and other interested parties. In 
addition, this report is available at no charge on the GAO website at 
[hyperlink, http://www.gao.gov]. 

If you or your staff members have any questions about this report, 
please contact me at (202) 512-6806 or mihmj@gao.gov. Contact points 
for our Offices of Congressional Relations and Public Affairs may be 
found on the last page of this report. GAO staff who made key 
contributions to this report are listed in appendix III. 

Signed by: 

J. Christopher Mihm: 
Managing Director, Strategic Issues: 

List of Congressional Addressees: 

The Honorable Thomas R. Carper: 
Chairman: 
The Honorable Tom Coburn, M.D.: 
Ranking Member: 
Committee on Homeland Security and Governmental Affairs: 
United States Senate: 

The Honorable Mark R. Warner: 
Chairman: 
Task Force on Government Performance: 
Committee on the Budget: 
United States Senate: 

The Honorable Elijah E. Cummings: 
Ranking Member: 
Committee on Oversight and Government Reform: 
House of Representatives: 

[End of section] 

Appendix I: Regression Analyses of 2013 Federal Managers Survey 
Results to Identify Predictors of Use of Performance Information: 

In analyzing the results of our 2013 survey, we explored whether 
federal managers' responses to certain survey questions could help 
explain differences in how managers in agencies reported using 
performance information. To examine which factors were related to 
agency use of performance information, as measured by the use of 
performance information index, we conducted a regression analysis. The 
regression 
analysis allowed us to assess the unique association between our 
outcome variable--the performance information index--and a given 
predictor variable, while controlling for multiple other predictor 
variables.[Footnote 27] 

To create the use of performance information index, we identified 
survey questions that reflected managers' use of performance 
information for key management activities and decision making. The 
2013 index included most of the questions from our 2007 index, as well 
as additional questions from the 2013 managers survey that we 
determined reflected the concept of use of performance information 
(see figure 1 for the specific questions included in our 
index).[Footnote 28] After identifying a core set of items from the 
original index, we tested the impact of including and excluding 
several additional questions related to performance information use to 
ensure the cohesiveness and strength of our revised index.[Footnote 
29] Our revised index is an average of the questions used for the 
index and runs from 1 to 5, where a 1 reflects that managers feel the 
agency engages "to no extent" and a 5 reflects that managers feel the 
agency engages "to a very great extent" in the use of performance 
information activities.[Footnote 30] 
We found the index met generally accepted standards for scale 
reliability.[Footnote 31] 
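
As a rough illustration of these two steps, the sketch below computes 
per-manager index scores and Cronbach's alpha for a small matrix of 
item responses. It is a minimal example in Python rather than a 
reproduction of our actual analysis: the function names and sample 
responses are hypothetical, responses are assumed to be coded 1 
through 5 with missing items already imputed, and the survey weighting 
used in our analysis is omitted. 

import numpy as np

def use_index_scores(items):
    # items: respondents x questions array of 1-5 extent-scale responses.
    # The index is the simple average of each respondent's item responses.
    return items.mean(axis=1)

def cronbach_alpha(items):
    # Cronbach's alpha: (k / (k - 1)) * (1 - sum of item variances /
    # variance of the total score), where k is the number of items.
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Made-up responses from four managers on three index questions:
responses = np.array([[4, 5, 4], [2, 2, 3], [3, 4, 3], [5, 5, 5]])
print(use_index_scores(responses))  # per-respondent scores on the 1-5 scale
print(cronbach_alpha(responses))    # we reported an alpha of .90 or higher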

To develop our regression model examining predictors of performance 
use as measured by our index, we first identified a series of 
variables that were related to one of the five practices we have 
previously found to enhance or facilitate use of performance 
information. These practices include: aligning agencywide goals, 
objectives, and measures; improving the usefulness of performance 
information; developing the capacity to use performance information; 
demonstrating management commitment; and communicating performance 
information frequently and effectively. See figure 3 for the specific 
questions related to these five practices that we included in the 
regression. Although we identified other questions also related to the 
five practices of effective performance management, many of these 
questions were already accounted for in our use of performance 
information index, and we excluded them from consideration in our 
regression. 

Overall, our results demonstrate that some types of management 
practices and training are more positively correlated than others with 
managers' perceptions of performance information use, as measured by 
the use index, even when controlling for other factors. Further, these 
results suggest that certain specific efforts to increase agency use 
of performance information--such as increasing the timeliness of 
performance information and providing information on the validity of 
performance measures--may have a higher return than others. 

To execute our analysis, we began with a base model that treated 
differences in managers' views of agency performance management use as 
a function of the agency where they worked. We found that despite 
statistically significant differences on average among managers at 
different agencies, a regression model based on agency alone had very 
poor predictive power (R-squared of .03).[Footnote 32] 

We next examined whether managers' responses to other items reflecting 
the practices of effective performance management related to their 
perceptions of agency use of performance information, independent of 
agency. We found that several items consistently predicted increases 
in individuals' ratings of their agencies' use of performance 
information, including whether managers align program performance 
measures with agency goals and objectives, whether managers have 
information on the validity of performance measures, and whether 
managers receive training on how to use performance information in 
decision making. We 
also tested this model controlling for whether a respondent was a 
member of the Senior Executive Service (SES), and found similar 
results. 

We also tested our model with a variable to control for agency size in 
five categories. We found that, relative to the largest agencies 
(100,000 or more employees), managers at smaller agencies tended to 
rate their agency's use of performance information slightly lower. The 
significance and magnitude of other significant variables were similar 
whether we controlled for agency size or used intercepts to control 
for individual agencies. 

Our final model had an R-squared of .65, suggesting that the 
independent variables in the model predicted approximately 65 percent 
of the variance in the use index. Specific results are presented in 
table 2 below. Each coefficient reflects the average increase in the 
dependent variable, our five-point use scale, associated with a one-
unit increase in the value of the independent variable.[Footnote 33] 
Note that in our discussion, we highlight the maximum potential impact 
of each variable rather than the increase in the use score associated 
with each one-unit increase in an independent variable. 
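
To make the structure of the model concrete, the following sketch fits 
an ordinary least squares model of this general form. It is 
illustrative only: the data are synthetic, the column names are 
hypothetical, only a few of the predictors from table 2 are included, 
and it does not reproduce the survey-design-based variance estimation 
we performed in Stata. 

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
# Synthetic stand-in for the survey data: one row per respondent.
df = pd.DataFrame({
    "agency": rng.integers(0, 24, n),           # 24 agencies
    "align_measures": rng.integers(1, 6, n),    # 1-5 extent scale (10j)
    "validity_info": rng.integers(1, 6, n),     # 1-5 extent scale (11d)
    "timely_info": rng.integers(1, 6, n),       # 1-5 extent scale (7g)
    "training_use": rng.integers(0, 2, n),      # training received, yes/no (13e)
    "ses": rng.integers(0, 2, n),               # SES member, yes/no
})
# Generate an outcome loosely resembling the reported coefficients.
df["use_index"] = (1.0 + 0.13 * df["align_measures"]
                   + 0.16 * df["validity_info"] + 0.07 * df["timely_info"]
                   + 0.14 * df["training_use"] + rng.normal(0, 0.5, n))

# Regress the use index on the predictors, with agency intercepts and
# an SES control, as in the final model.
model = smf.ols(
    "use_index ~ align_measures + validity_info + timely_info"
    " + training_use + ses + C(agency)",
    data=df,
).fit()
print(model.rsquared)

# Maximum potential impact of a five-point predictor: its coefficient
# times the four-step range of the scale (e.g., 0.13 * 4 = 0.52).
print(model.params["align_measures"] * (5 - 1))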

Table 2: Final Regression Analysis Based on the 2013 Federal Managers 
Survey (Dependent Variable: Performance Information Use Index): 

Aligning agencywide goals, objectives, and measures: 

Performance Practice/Variable: Agency managers/supervisors at my level 
take steps to align program performance measures with agencywide goals 
and objectives (10j); 
Regression model coefficients/fit information: 0.13***. 

Improving the usefulness of performance information: 

Performance Practice/Variable: I have sufficient information on the 
validity of the performance data I use to make decisions (11d); 
Regression model coefficients/fit information: 0.16***. 

Performance Practice/Variable: My agency's performance information is 
available in a format that is easy to use (7d); 
Regression model coefficients/fit information: 0.02. 

Performance Practice/Variable: Performance information is available in 
time to manage the program(s)/operation(s)/project(s) that I am 
involved with (7g); 
Regression model coefficients/fit information: 0.07*. 

Developing agency capacity to use performance information: 

Performance Practice/Variable: During the past 3 years, your agency has 
provided, arranged, or paid for training that would help you to use 
performance information to make decisions (13e); 
Regression model coefficients/fit information: 0.14*. 

Performance Practice/Variable: My agency has sufficient analytical 
tools for managers at my level to collect, analyze, and use performance 
information (7e); 
Regression model coefficients/fit information: 0.02. 

Performance Practice/Variable: During the past 3 years, your agency has 
provided, arranged, or paid for training that would help you to set 
program performance goals (13b); 
Regression model coefficients/fit information: 0.08. 

Performance Practice/Variable: During the past 3 years, your agency has 
provided, arranged, or paid for training that would help you to develop 
program performance measures (13c); 
Regression model coefficients/fit information: -0.11*. 

Performance Practice/Variable: During the past 3 years, your agency has 
provided, arranged, or paid for training that would help you to assess 
the quality of performance data (13d); 
Regression model coefficients/fit information: 0.03. 

Performance Practice/Variable: During the past 3 years, your agency has 
provided, arranged, or paid for training that would help you to link 
the program(s)/operation(s)/project(s) to the achievement of agency 
strategic goals (13f); 
Regression model coefficients/fit information: -0.03. 

Demonstrating management commitment: 

Performance Practice/Variable: My agency's top leadership demonstrates 
a strong commitment to achieving results (10g); 
Regression model coefficients/fit information: 0.08***. 

Performance Practice/Variable: My agency is investing the resources 
needed to ensure that its performance data are of sufficient quality 
(10f); 
Regression model coefficients/fit information: -0.01. 

Performance Practice/Variable: My agency is investing in resources to 
improve the agency's capacity to use performance information (10i); 
Regression model coefficients/fit information: 0.04. 

Communicating performance information frequently and effectively: 

Performance Practice/Variable: Agency managers/supervisors at my level 
effectively communicate performance information on a routine basis 
(10k); 
Regression model coefficients/fit information: 0.08**. 

Performance Practice/Variable: Employees in my agency receive positive 
recognition for helping the agency accomplish its strategic goals 
(10e); 
Regression model coefficients/fit information: 0.06**. 

Performance Practice/Variable: My agency's performance information is 
easily accessible to managers at my level (7a); 
Regression model coefficients/fit information: -0.03. 

Performance Practice/Variable: My agency's performance information is 
easily accessible to employees (7b); 
Regression model coefficients/fit information: 0.05. 

Performance Practice/Variable: My agency's performance information is 
easily accessible to the public as appropriate (7c); 
Regression model coefficients/fit information: 0.04. 

Controls: 

Performance Practice/Variable: SES (compared to non-SES); 
Regression model coefficients/fit information: 0.01. 

Performance Practice/Variable: Agency as intercept (results omitted). 

Performance Practice/Variable: Constant; 
Regression model coefficients/fit information: 1.03***. 

Performance Practice/Variable: R-squared; 
Regression model coefficients/fit information: .65. 

Performance Practice/Variable: N (number of survey respondents); 
Regression model coefficients/fit information: 2,449. 

Source: GAO analysis of 2013 federal managers survey. GAO-14-747. 

Legend: * p<0.05; ** p<0.01; *** p<0.001. 

With the exception of intercepts for the set of survey questions on 
training, SES status, the constant, and each agency, each variable is 
a five-point scale that runs from "to no extent" to "to a very great 
extent." 

R-squared is a measure of how well the variation in the independent 
variables included in the model predicts the variability in the 
dependent variable, and runs from 0 to 1. 

The number in parentheses following each question corresponds to the 
question number on the 2013 federal managers survey. 

[End of table] 

As seen in table 2, at least one question related to each of the five 
practices to enhance agencies' use of performance information was 
significant. With respect to aligning agencywide goals, objectives, 
and measures, we found that each one-point increase in the extent to 
which individuals felt that managers aligned performance measures with 
agencywide goals and objectives was associated with a .13 increase in 
their score on the use scale, or approximately a .52 increase on the 5-
point use scale when comparing individuals in the lowest and highest 
categories. 

In terms of improving the usefulness of performance information, we 
found that having information on the validity of performance data for 
decision making was the strongest predictor in our model. Compared to 
individuals who said that they did not have sufficient information on 
the validity of performance data for decision making, on average, 
individuals who said they had a very great extent of information rated 
their agencies approximately 0.64 points higher on the performance use 
scale, controlling for other factors. In contrast, the timeliness of 
performance information, while a significant predictor, had a smaller 
potential impact on managers' perceptions of their agency's use of 
performance information. On average, managers who responded "to 
a very great extent" on whether their agency's performance information 
was available in time to manage programs or projects rated their 
agency about .28 points higher on the performance use scale than those 
who responded "to no extent." 

In terms of developing agency capacity to use performance information, 
we found that one type of training was positively related to use of 
performance information, though other types of training were either 
not related or were negatively related, after controlling for other 
factors. Compared to managers who said their agencies had not provided 
training on how to use performance information in decision making, 
those who said their agencies did provide such training rated their 
agencies an average of .14 points higher on the use scale, 
controlling for other factors. The potential effect of this type of 
training was relatively small compared to the potential effect of some 
of the other predictors in our model. In contrast, training in 
developing performance measures was negatively associated with 
managers' perceptions of performance information use. 

With respect to demonstrating management commitment, managers who 
rated their agency's leadership highly in terms of demonstrating a 
strong commitment to achieving results tended to rate their agencies 
higher on performance information use, as measured by our use index. 
Each one-point increase in the extent to which a manager felt their 
agency leadership was committed to results was associated with a .08 
increase in the performance use index, or up to a .32 increase in the 
five-point performance use index when comparing managers who reported 
"no extent" of leadership commitment to those who reported "a very 
great extent." 

Two questions related to communicating performance information 
frequently and effectively were significantly and positively 
associated with managers' perceptions of an agency's use of 
performance information, controlling for other factors. Managers who 
rated their agencies most highly on whether managers and supervisors 
effectively communicated performance information routinely averaged 
.32 points higher on the five-point performance use index than those 
who rated their agencies lowest. Similarly, managers who reported that 
employees in their agency received "a very great extent" of positive 
recognition for helping the agency to accomplish strategic goals rated 
their agencies an average of .24 points higher on performance 
information use, as measured by our use index. We did not find a 
statistically significant relationship between the accessibility of 
performance information (to managers, employees, or the public) and 
managers' perceptions of use of performance information. 

Sensitivity and Specification Testing: 

To conduct our analysis, we used Stata software to generate regression 
estimates that incorporated variance calculations appropriate for the 
complex design of the survey data.[Footnote 34] To prevent listwise 
deletion from discarding large amounts of data, we imputed values for 
individual questions when a respondent had missing or "no basis to 
judge" responses on three or fewer of the 23 variables initially 
tested in the regression, using the agency-level average as the 
imputed value. Individuals missing data on more than 3 of the 23 
potential variables were dropped from the analysis. We conducted a 
variety of sensitivity checks to ensure that our results were robust 
across different specifications and assumptions. We found generally 
similar patterns across models in the magnitude and significance of 
the variables related to the practices of effective performance 
management. 
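
The fragment below illustrates this imputation rule on a pandas data 
frame. It is a simplified stand-in for our actual procedure: the 
function and column names are hypothetical, and missing or "no basis 
to judge" responses are assumed to be coded as NaN. 

import pandas as pd

def impute_and_filter(df, item_cols, max_missing=3):
    # Count missing or "no basis to judge" responses per respondent.
    n_missing = df[item_cols].isna().sum(axis=1)
    # Drop respondents missing more than max_missing of the tested items.
    kept = df.loc[n_missing <= max_missing].copy()
    # Fill each remaining gap with the respondent's agency-level average.
    for col in item_cols:
        kept[col] = kept[col].fillna(
            kept.groupby("agency")[col].transform("mean"))
    return kept

# Usage (names hypothetical): cleaned = impute_and_filter(survey_df, items)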

In general, our models assume that the relationship between the 
independent and dependent variables is linear, and that changes in the 
dependent variable associated with a change in the independent 
variable are similar across each ordinal category. Under this 
specification, the change in the use index associated with a shift 
from "to no extent" to "to a small extent" is assumed to be similar to 
the change associated with an increase from "to a great extent" to "to 
a very great extent." To determine whether the linear specification was 
appropriate, or consistent with the observed data, we tested versions 
of our models that treated independent variables with a Likert-scale 
response as categorical.[Footnote 35] We found our results to be 
robust across a variety of specifications, including those that 
relaxed the assumption of linearity for responses based on a five-
point scale. 
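
To illustrate the two specifications, the fragment below fits one 
model that treats a five-point extent item as linear and one that 
treats it as categorical, then compares the fits with an F-test on the 
category terms; if the categorical model fits no better, the linear 
specification is consistent with the observed data. It reuses the 
hypothetical synthetic data frame from the earlier sketch. 

import statsmodels.formula.api as smf

# Linear specification: each one-point step on the extent scale is
# assumed to shift the use index by the same amount (a single slope).
linear = smf.ols("use_index ~ align_measures + C(agency)", data=df).fit()

# Categorical specification: each response category gets its own
# coefficient, relaxing the equal-step assumption.
categorical = smf.ols(
    "use_index ~ C(align_measures) + C(agency)", data=df).fit()

# F-test of the categorical (unrestricted) model against the linear
# (restricted) one: returns (F statistic, p-value, df difference).
print(categorical.compare_f_test(linear))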

[End of section] 

Appendix II: Comments from the Social Security Administration: 

Social Security: 
Office of the Commissioner: 
Social Security Administration: 
Baltimore, MD 21235-0001: 

August 27, 2014: 

Mr. J. Christopher Mihm: 
Managing Director, Strategic Issues: 
U. S. Government Accountability Office: 
441 G Street, NW: 
Washington, DC 20548: 

Dear Mr. Mihm: 

Thank you for the opportunity to review the draft report, "Managing 
For Results: Agencies' Trends in the Use of Performance Information to 
Make Decisions" (GAO-14-747). We have no comments. 

If you have any questions, please contact me at (410) 965-2850. Your 
staff may contact Gary S. Hatcher, Senior Advisor for Records 
Management and Audit Liaison Staff, at (410) 965-0680. 

Signed by: 

James A. Kissko: 
Chief of Staff: 

[End of section] 

Appendix III: GAO Contacts and Staff Acknowledgments: 

GAO Contact: 

J. Christopher Mihm, Managing Director, (202) 512-6806 or 
mihmj@gao.gov: 

Staff Acknowledgments: 

In addition to the contact named above, Sarah Veale (Assistant 
Director), Margaret McKenna Adams, Tom Beall, Mallory Barg Bulman, 
Chad Clady, Karin Fangman, Cynthia Jackson, Janice Latimer, Donna 
Miller, Anna Maria Ortiz, Kathleen Padulchick, Mark Ramage, Joseph 
Santiago, Albert Sim, and Megan Taylor made key contributions to this 
report. 

[End of section] 

Footnotes: 

[1] GAO, Managing for Results: Data-Driven Performance Reviews Show 
Promise But Agencies Should Explore How to Involve Other Relevant 
Agencies, [hyperlink, http://www.gao.gov/products/GAO-13-228] 
(Washington, D.C.: Feb. 27, 2013). 

[2] Pub. L. No. 103-62, 107 Stat. 285 (Aug. 3, 1993). 

[3] Pub. L. No. 111-352, 124 Stat. 3866 (Jan. 4, 2011). 

[4] GAO, Managing for Results: Executive Branch Should More Fully 
Implement the GPRA Modernization Act to Address Pressing Governance 
Challenges, [hyperlink, http://www.gao.gov/products/GAO-13-518] 
(Washington, D.C.: June 26, 2013). 

[5] 31 U.S.C. § 901(b). The 24 CFO Act agencies are the U.S. 
Departments of Agriculture, Commerce, Defense, Education, Energy, 
Health and Human Services, Homeland Security, Housing and Urban 
Development, the Interior, Justice, Labor, State, Transportation, the 
Treasury, and Veterans Affairs, as well as the U.S. Agency for 
International Development, Environmental Protection Agency, General 
Services Administration, National Aeronautics and Space 
Administration, National Science Foundation, Nuclear Regulatory 
Commission, Office of Personnel Management, Small Business 
Administration, and Social Security Administration. 

[6] Our surveys were completed in 1997, 2000, 2003, 2007, and 2013 and 
were designed to obtain the observations and perceptions of 
respondents on various aspects of results-oriented management topics. 
The surveys from 1997 and 2003 were government-wide surveys only. 
While the 2000 survey captured agency-level responses, DHS was not 
created until 2002, which makes the data difficult to compare over 
time at an agency level. 

[7] All results are subject to some uncertainty or sampling error as 
well as nonsampling error. Because we followed a probability procedure 
based on random selections, our sample is only one of a large number 
of samples that we might have drawn. Since each sample could have 
provided different estimates, we express our confidence in the 
precision of our particular sample's results as a 95 percent 
confidence interval. This is the interval that would contain the 
actual population value for 95 percent of the samples we could have 
drawn. 

[8] GAO, Managing for Results: 2013 Federal Managers Survey on 
Organizational Performance and Management Issues, an E-supplement to 
[hyperlink, http://www.gao.gov/products/GAO-13-518], [hyperlink, 
http://www.gao.gov/products/GAO-13-519SP] (Washington, D.C.: June 26, 
2013) and Government Performance: 2007 Federal Managers Survey on 
Performance and Management Issues, an E-supplement to [hyperlink, 
http://www.gao.gov/products/GAO-08-1026T], [hyperlink, 
http://www.gao.gov/products/GAO-08-1036SP] (Washington, D.C.: July 24, 
2008). 

[9] For the 2007 use of performance information index, see GAO, 
Government Performance: Lessons Learned for the Next Administration on 
Using Performance Information to Improve Results, GAO-08-1026T 
(Washington, D.C.: July 24, 2008). The questions we used and reported 
on in the 2007 survey were slightly different from the ones that we 
used for the 2013 survey. To ensure comparability across years, we 
also recalculated the 2007 index using the 2007 data for the updated 
set of questions. We used Cronbach's alpha, a measure of whether the 
variation in the scale captures the majority of the variation in the 
underlying items, to assess the cohesiveness of the questions we 
included in the index. For more information see appendix I. 

[10] From our review of the literature and interviews with experts and 
staff from five agencies (the U.S. Departments of Commerce, Labor, 
Transportation, and Veterans Affairs, and the Small Business 
Administration) in 2005, we developed a conceptual framework 
identifying five practices that contribute to using performance 
information. See GAO, Managing for Results: Enhancing Agency Use of 
Performance Information for Management Decision Making, GAO-05-927 
(Washington, D.C.: Sept. 9, 2005). 

[11] We measured statistical significance at the p<.05 value, which 
indicates that there is less than a 5 percent chance that we would 
observe a change, difference, or association as large as we observed 
if such a change, difference, or association did not exist. 

[12] The outcome variable is also referred to as the dependent 
variable. Predictor variables are also referred to as independent 
variables. 

[13] [hyperlink, http://www.gao.gov/products/GAO-08-1036SP] and 
[hyperlink, http://www.gao.gov/products/GAO-13-519SP]. 

[14] [hyperlink, http://www.gao.gov/products/GAO-05-927]. 

[15] Most of the items on our surveys asked respondents to rate the 
strength of their perception on a five-point extent scale ranging from 
"to no extent" at the low end of the scale to "to a very great extent" 
at the high end. 

[16] [hyperlink, http://www.gao.gov/products/GAO-05-927]. 

[17] See appendix I for additional information on our regression model 
and the magnitude of the coefficients. 

[18] Government-wide percentage estimates, based on the 2013 survey 
have 95 percent confidence intervals within +/-4 percentage points of 
the estimates themselves, unless otherwise noted. For agency-wide 
percentage estimates these confidence intervals are within +/-10 
percentage points of the estimates themselves. 

[19] [hyperlink, http://www.gao.gov/products/GAO-05-927]. 

[20] [hyperlink, http://www.gao.gov/products/GAO-05-927]. 

[21] USAID's estimate has a 95 percent confidence interval within +/- 
11.7 percentage points. 

[22] EPA's estimate has a 95 percent confidence interval within +/-
10.2 percentage points. 

[23] [hyperlink, http://www.gao.gov/products/GAO-05-927]. 

[24] [hyperlink, http://www.gao.gov/products/GAO-05-927]. 

[25] For example, we discussed the quarterly performance reviews 
currently underway at the Treasury in [hyperlink, 
http://www.gao.gov/products/GAO-13-518]. 

[26] GAO, Results-Oriented Management: Strengthening Key Practices at 
FEMA and Interior Could Promote Greater Use of Performance 
Information, [hyperlink, http://www.gao.gov/products/GAO-09-676] 
(Washington, D.C.: Aug. 17, 2009). 

[27] The outcome variable is also referred to as the dependent 
variable, and predictor variables are also referred to as independent 
variables. 

[28] For more information on the original index we created for the 
2007 federal managers survey, see [hyperlink, 
http://www.gao.gov/products/GAO-08-1026T]. 

[29] To ensure comparability across years, we also recalculated the 
2007 index using the 2007 data for the updated set of questions. 

[30] To ensure that we did not lose cases due to item non-response, we 
imputed the value for missing questions using the agency average. 

[31] We tested the reliability of our index using Cronbach's alpha, a 
measure of whether the variation in the scale captures the majority of 
the variation in the underlying items. Our use of performance 
information index had a Cronbach's alpha of .90 or higher in 2007 and 
2013. 

[32] R-squared is a measure of how well the variation in the 
independent variables included in the model predicts the variability 
in the dependent variable, and runs from 0 to 1. 

[33] With the exception of intercepts for training, SES status, the 
constant, and agency, questions in our model run from 1 to 5, with 1 
reflecting a response of "to no extent" and 5 reflecting a response of 
"to a very great extent." 

[34] For more information on the sample design of our 2013 survey, see 
[hyperlink, http://www.gao.gov/products/GAO-13-519SP]. 

[35] Response categories for the majority of items in our model were 
"to no extent," "to a small extent," "to a moderate extent," "to a 
great extent," and "to a very great extent." 

[End of section] 

GAO's Mission: 

The Government Accountability Office, the audit, evaluation, and 
investigative arm of Congress, exists to support Congress in meeting 
its constitutional responsibilities and to help improve the 
performance and accountability of the federal government for the 
American people. GAO examines the use of public funds; evaluates 
federal programs and policies; and provides analyses, recommendations, 
and other assistance to help Congress make informed oversight, policy, 
and funding decisions. GAO's commitment to good government is 
reflected in its core values of accountability, integrity, and 
reliability. 

Obtaining Copies of GAO Reports and Testimony: 

The fastest and easiest way to obtain copies of GAO documents at no 
cost is through GAO's website [hyperlink, http://www.gao.gov]. Each 
weekday afternoon, GAO posts on its website newly released reports, 
testimony, and correspondence. To have GAO e-mail you a list of newly 
posted products, go to [hyperlink, http://www.gao.gov] and select 
"E-mail Updates." 

Order by Phone: 

The price of each GAO publication reflects GAO's actual cost of 
production and distribution and depends on the number of pages in the 
publication and whether the publication is printed in color or black 
and white. Pricing and ordering information is posted on GAO's 
website, [hyperlink, http://www.gao.gov/ordering.htm]. 

Place orders by calling (202) 512-6000, toll free (866) 801-7077, or 
TDD (202) 512-2537. 

Orders may be paid for using American Express, Discover Card, 
MasterCard, Visa, check, or money order. Call for additional 
information. 

Connect with GAO: 

Connect with GAO on Facebook, Flickr, Twitter, and YouTube.
Subscribe to our RSS Feeds or E-mail Updates. Listen to our Podcasts.
Visit GAO on the web at [hyperlink, http://www.gao.gov]. 

To Report Fraud, Waste, and Abuse in Federal Programs: 

Contact: 
Website: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm]; 
E-mail: fraudnet@gao.gov; 
Automated answering system: (800) 424-5454 or (202) 512-7470. 

Congressional Relations: 

Katherine Siggerud, Managing Director, siggerudk@gao.gov: 
(202) 512-4400: 
U.S. Government Accountability Office: 
441 G Street NW, Room 7125: 
Washington, DC 20548. 

Public Affairs: 

Chuck Young, Managing Director, youngc1@gao.gov: 
(202) 512-4800: 
U.S. Government Accountability Office: 
441 G Street NW, Room 7149: 
Washington, DC 20548. 

[End of document]