This is the accessible text file for GAO report number GAO-13-518 entitled 'Managing for Results: Executive Branch Should More Fully Implement the GPRA Modernization Act to Address Pressing Governance Challenges,' which was released on June 26, 2013.

This text file was formatted by the U.S. Government Accountability Office (GAO) to be accessible to users with visual impairments, as part of a longer term project to improve GAO products' accessibility. Every attempt has been made to maintain the structural and data integrity of the original printed product. Accessibility features, such as text descriptions of tables, consecutively numbered footnotes placed at the end of the file, and the text of agency comment letters, are provided but may not exactly duplicate the presentation or format of the printed version. The portable document format (PDF) file is an exact electronic replica of the printed version.

We welcome your feedback. Please E-mail your comments regarding the contents or accessibility features of this document to Webmaster@gao.gov.

This is a work of the U.S. government and is not subject to copyright protection in the United States. It may be reproduced and distributed in its entirety without further permission from GAO. Because this work may contain copyrighted images or other material, permission from the copyright holder may be necessary if you wish to reproduce this material separately.

United States Government Accountability Office:
GAO:
Report to Congressional Committees:
June 2013:

Managing for Results: Executive Branch Should More Fully Implement the GPRA Modernization Act to Address Pressing Governance Challenges:

GAO-13-518:

GAO Highlights:
Highlights of GAO-13-518, a report to congressional committees.

Why GAO Did This Study:

The federal government faces significant and long-standing fiscal, management, and performance challenges. The GPRA Modernization Act's implementation offers opportunities for Congress and the executive branch to help address these challenges. This report is the latest in a series in which GAO, as required by the act, reviewed the act's initial implementation. GAO assessed the executive branch's (1) progress in implementing the act and (2) effectiveness in using tools provided by the act to address key governance challenges. To address these objectives, GAO reviewed the act, related OMB guidance, and past and recent GAO work related to federal performance management and the act; and interviewed OMB staff. In addition, to determine the extent to which agencies are using performance information and several of the act's requirements to improve agency results, GAO surveyed a stratified random sample of 4,391 federal managers from 24 agencies, with a 69 percent response rate, which allows GAO to generalize these results.

What GAO Found:

The executive branch has taken a number of steps to implement key provisions of the GPRA Modernization Act (the act). The Office of Management and Budget (OMB) developed interim cross-agency priority (CAP) goals, and agencies developed agency priority goals (APG). Agency officials reported that their agencies have assigned performance management leadership roles and responsibilities to officials who generally participate in performance management activities, including quarterly performance reviews (QPR) for APGs. Further, OMB developed Performance.gov, a government-wide website, which provides quarterly updates on the CAP goals and APGs.
However, the executive branch needs to do more to fully implement and leverage the act's provisions to address governance challenges. OMB and agencies have identified many programs and activities that contribute to goals, as required, but are missing additional opportunities to address crosscutting issues. For example, few have identified tax expenditures, which represented about $1 trillion in forgone revenue in fiscal year 2012, due to a lack of OMB guidance and oversight. Therefore, the contributions made by tax expenditures towards broader federal outcomes are unknown.

Ensuring performance information is useful and used by federal managers to improve results remains a weakness. GAO found little improvement in managers' reported use of performance information or practices that could help promote this use. There was a decline in the percentage of managers who agreed that their agencies' top leadership demonstrates a strong commitment to achieving results. However, agencies' QPRs show promise as a leadership strategy for improving the use of performance information in agencies.

Agencies have taken steps to align daily operations with agency results, but continue to face difficulties measuring performance. Agencies have established performance management systems to align individual performance with agency results. However, agencies continue to face long-standing issues with measuring performance across various programs and activities. The Performance Improvement Council (PIC) could do more to examine and address these issues, given its responsibilities for addressing crosscutting performance issues and sharing best practices. Without a comprehensive examination of these issues and an approach to address them, agencies will likely continue to experience difficulties in measuring program performance.

Communication of performance information could better meet users' needs. Federal managers and potential users of Performance.gov reported concerns about the accessibility, availability, understandability, and relevance of performance information to the public. Further outreach to key stakeholders could help improve how this information is communicated.

Agency performance information is not always useful for congressional decision making. Consultations with Congress are intended, in part, to ensure this information is useful for congressional decision making. However, GAO found little evidence that meaningful consultations occurred related to agency strategic plans and APGs. GAO also found that the performance information provided on Performance.gov may not be fully meeting congressional needs.

What GAO Recommends:

GAO recommends OMB improve implementation of the act and help address challenges by ensuring that the contributions of tax expenditures to crosscutting and agency goals are identified and assessed, and by developing a detailed approach for addressing long-standing performance measurement issues. OMB staff agreed with these recommendations. GAO also reports on the status of existing recommendations related to CAP goals, APGs, QPRs, the PIC, agency performance management training, and Performance.gov.

View [hyperlink, http://www.gao.gov/products/GAO-13-518]. For more information, contact J. Christopher Mihm at (202) 512-6806 or mihmj@gao.gov.
[End of section]

Contents:

Letter:
Background:
The Executive Branch Has Taken Important Steps to Implement Key GPRAMA Provisions:
The Executive Branch Needs to More Fully Use the GPRAMA Framework to Address Pressing Federal Governance Challenges:
Conclusions:
Recommendations for Executive Action:
Agency Comments:
Appendix I: Objectives, Scope, and Methodology:
Appendix II: Status of Key Recommendations Related to GPRAMA:
Appendix III: GAO Contact and Staff Acknowledgments:

Tables:
Table 1: CAP Goals, APGs, and QPRs Did Not Include All Relevant Participants:
Table 2: Illustrative Examples of Reported Difficulties Agencies Face in Measuring Performance by Program Type:

Figures:
Figure 1: Fragmentation, Overlap, and Duplication Definitions:
Figure 2: More Managers Report Collaborating Outside of Their Program When They View Their Program as Contributing to a Great or Very Great Extent in the Achievement of the CAP Goals:
Figure 3: More Managers Report Collaborating Outside of Their Program When They View Their Program as Contributing to a Great or Very Great Extent to the Achievement of the APGs:
Figure 4: Little Change in Percentage of Federal Managers Reporting That They Use Performance Information to a "Great" or "Very Great" Extent for Various Management Activities:
Figure 5: Less than Two-Thirds of Federal Managers Agreed in 2013 to a "Great" or "Very Great" Extent with Statements about Leadership and Supervisor Commitment and Attention to Performance Information:
Figure 6: Percentage of Federal Managers Characterizing Specific Factors as a "Great" or "Very Great" Hindrance to Using Performance Information Shows Some Improvement, though Many Continue to Report Hindrances:
Figure 7: Less than Half of Federal Managers Agree to a "Great" or "Very Great" Extent with Most Statements about the Usefulness of Performance Information:
Figure 8: After Increasing from 1997 Levels, Percentage of Federal Managers Responding "Yes" to Most Items on Whether Their Agencies Made Training Available in the Past 3 Years on Specific Performance Management Tasks Has Leveled Off or Declined:
Figure 9: SES Managers Reported Greater Familiarity than Non-SES Managers with QPRs at Their Agencies in 2013 Survey:
Figure 10: More Managers Agreed to a "Great" or "Very Great" Extent with Statements on QPR Uses When Their Programs Have Been the Subject of QPRs to a Greater Extent:
Figure 11: More Managers Agreed to a "Great" or "Very Great" Extent with Statements on QPR Practices When Their Programs Have Been the Subject of QPRs to a Greater Extent:
Figure 12: Gap Remains between Percent of SES Managers Reporting They Are Being Held Accountable and Percent Reporting They Have Decision-Making Authority to a "Great" or "Very Great" Extent:
Figure 13: Little Change over Time in Federal Managers Reporting Use of Performance Information in Employee Performance Management Issues to a "Great" or "Very Great" Extent:
Figure 14: Generally No Statistically Significant Increase since 2007 in the Reported Presence of Performance Measures Available to a "Great" or "Very Great" Extent:

Abbreviations:

APG: agency priority goal:
AQI: Agriculture Quarantine Inspection:
CAP goal: cross-agency priority goal:
CFO: Chief Financial Officer:
COO: Chief Operating Officer:
CPDF: Central Personnel Data File:
DHS: Department of Homeland Security:
DOD: Department of Defense:
DOE: Department of Energy:
DOJ: Department of Justice:
DOT: Department of Transportation:
EHRI-SDM: Enterprise Human Resources Integration-Statistical Data Mart:
FAA: Federal Aviation Administration:
FDA: Food and Drug Administration:
FEMA: Federal Emergency Management Agency:
FRA: Federal Railroad Administration:
FTA: Federal Transit Administration:
GEAR: Goals-Engagement-Accountability-Results:
GPRA: Government Performance and Results Act of 1993:
GPRAMA: GPRA Modernization Act of 2010:
GS: general schedule:
GSA: General Services Administration:
HUD: Department of Housing and Urban Development:
IRS: Internal Revenue Service:
OMB: Office of Management and Budget:
OPM: Office of Personnel Management:
PIC: Performance Improvement Council:
PIO: Performance Improvement Officer:
QPR: quarterly performance review:
SBA: Small Business Administration:
SES: Senior Executive Service:
Treasury: Department of the Treasury:
USDA: U.S. Department of Agriculture:
VA: Department of Veterans Affairs:
VHA: Veterans Health Administration:

[End of section]

GAO:
United States Government Accountability Office:
441 G St. N.W.
Washington, DC 20548:

June 26, 2013:

The Honorable Thomas R. Carper:
Chairman:
The Honorable Tom Coburn:
Ranking Member:
Committee on Homeland Security and Governmental Affairs:
United States Senate:

The Honorable Mark R. Warner:
Chairman:
Task Force on Government Performance:
Committee on the Budget:
United States Senate:

The Honorable Elijah Cummings:
Ranking Member:
Committee on Oversight and Government Reform:
House of Representatives:

The federal government is one of the world's largest and most complex entities, with about $3.5 trillion in outlays in fiscal year 2012 funding a vast array of programs and operations. It faces a number of significant fiscal, management, and performance challenges in responding to the diverse and increasingly complex issues it seeks to address. Addressing these challenges will require actions on multiple fronts. For example, program structures that are outmoded, fragmented, overlapping, or duplicative and not up to the challenges of the times must be reformed or restructured. Since 2011, our series of annual reports has identified 162 areas of potential duplication, overlap, or fragmentation as well as cost savings and revenue-enhancing opportunities.[Footnote 1] In addition, weaknesses in management capacity, both government-wide and in individual agencies, undermine efficient and effective government. The recent update to our high-risk list identified numerous opportunities to reduce costs and improve government performance.[Footnote 2] Moving forward, the federal government will need to make tough choices in setting priorities as well as reforming programs and management practices to better link resources to results.
In that regard, we have previously reported that the performance planning and reporting framework originally put into place by the Government Performance and Results Act of 1993 (GPRA),[Footnote 3] and significantly enhanced by the GPRA Modernization Act of 2010 (GPRAMA or the act),[Footnote 4] provides important tools that can help inform congressional and executive branch decision making to address challenges the federal government faces.[Footnote 5] For example, we recently reported on several issues that hinder the federal government's ability to address fragmentation, overlap, and duplication, including the need for improved and regular performance information, the absence of a comprehensive list of federal programs, and the lack of related funding information.[Footnote 6] If effectively implemented, GPRAMA could help address these issues as well as improve information sharing and coordination among federal agencies--both of which are needed to further address governance challenges related to fragmentation, overlap, and duplication.[Footnote 7]

GPRAMA lays out a schedule for gradual implementation of its provisions during a period of interim implementation--from its enactment in January 2011 to February 2014, when a new planning and reporting cycle for federal agencies begins. GPRAMA also includes provisions requiring us to review implementation of the act at several critical junctures and provide recommendations to improve its implementation. This report is the final report in a series responding to the mandate to assess initial implementation of the act by June 2013, and it pulls together findings from our recent work related to the act, the results of our periodic survey of federal managers, and our related recent work on federal governance, performance, and coordination issues.[Footnote 8]

Our specific objectives for this report were to assess the executive branch's (1) progress in implementing key provisions of the act and (2) effectiveness in using tools provided by the act to address key governance challenges the federal government faces. To address these objectives, we reviewed GPRAMA, related congressional documents and Office of Management and Budget (OMB) guidance, and our past and recent work related to managing for results and the act. We also interviewed OMB staff.

To determine the extent to which agencies are using performance information and several of the act's requirements to improve agency results, we surveyed a stratified random sample of 4,391 persons from a population of approximately 148,300 mid-level and upper-level civilian managers and supervisors (General Schedule levels 13 through 15 and career Senior Executive Service (SES), or equivalent) working in the 24 executive branch agencies covered by the Chief Financial Officers (CFO) Act of 1990, as amended.[Footnote 9] The web-based survey was administered between November 2012 and February 2013 and is comparable to surveys we conducted in 1997, 2000, 2003, and 2007.[Footnote 10] For this report, our focus is on comparing the 2013 survey results with those from the 1997 baseline survey and with the results of the 2007 survey, the most recent survey conducted before GPRAMA was enacted in 2011. We also note results from the other two surveys--2000 and 2003--where statistically significant trends relative to 2013 occurred. For the 2013 survey, we received usable questionnaires from about 69 percent of the eligible sample. The response rate across the 24 agencies ranged from 57 percent to 88 percent.
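Because the results discussed throughout this report are estimates from a probability sample, each reported percentage carries a sampling margin of error; the figure notes later in this report cite 95 percent confidence intervals (for example, within +/-5 percentage points of the estimate). As a rough, simplified illustration only--the report's actual estimates reflect the stratified design, weighting, and nonresponse adjustments, none of which are modeled here--the margin of error for a percentage from a simple random sample can be approximated as follows.

```python
import math

def moe_95(p: float, n: int) -> float:
    """Approximate 95% margin of error for an estimated proportion p
    based on n usable responses, assuming a simple random sample.
    Illustrative only: the report's estimates reflect a stratified
    design, weighting, and nonresponse adjustments not modeled here."""
    return 1.96 * math.sqrt(p * (1 - p) / n)

# About 69 percent of the 4,391 sampled managers returned usable questionnaires.
n = round(0.69 * 4391)   # roughly 3,030 respondents
p = 0.58                 # e.g., the 58% reporting familiarity with CAP goals
print(f"+/-{100 * moe_95(p, n):.1f} percentage points")   # about +/-1.8
```

The wider intervals cited in some figure notes (up to +/-5.7 percentage points) are consistent with the fact that many survey items apply only to a subset of respondents--shrinking the effective sample--and that a stratified design weights responses unequally.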
The overall survey results are generalizable to the population of managers as described above at each of the 24 agencies and government-wide. Concurrently with this report, we are issuing an electronic supplement that shows all of the aggregated responses to all survey items at the government-wide and individual agency levels.[Footnote 11] To help determine the reliability and accuracy of the database elements used to draw our sample of federal managers for the 2013 survey, we checked the data for reasonableness and the presence of any obvious or potential errors in accuracy and completeness and reviewed our past analyses of the reliability of this database.[Footnote 12] We believe the data used to draw our sample are sufficiently reliable for the purpose of this report. Appendix I provides additional information about our objectives, scope, and methodology.

We conducted this performance audit from August 2012 to June 2013 in accordance with generally accepted government auditing standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives.

Background:

GPRAMA is a significant enhancement of GPRA, which was the centerpiece of a statutory framework that Congress put in place during the 1990s to help resolve long-standing management problems in the federal government and provide greater accountability for results. GPRA sought to focus federal agencies on performance by requiring agencies to develop long-term and annual goals--contained in strategic and annual performance plans--and measure and report on progress towards those goals on an annual basis. In our past reviews of its implementation, we found that GPRA provided a solid foundation to achieve greater results in the federal government, but several key governance challenges remained--particularly related to:

* addressing crosscutting issues;

* ensuring performance information was useful and used by agency leadership and managers and the Congress;

* strengthening the alignment between individual performance and agency results as well as holding individuals and organizations responsible for achieving those results;

* measuring performance for certain types of programs; and:

* providing timely, useful information about the results achieved by agencies.[Footnote 13]

To help address these and other challenges, GPRAMA revises existing provisions and adds new requirements, including the following:

* Cross-agency priority (CAP) goals: OMB is required to coordinate with agencies to establish federal government priority goals--otherwise referred to as CAP goals--that include outcome-oriented goals covering a limited number of policy areas as well as goals for management improvements needed across the government. The act also requires that OMB--with agencies--develop annual federal government performance plans to, among other things, define the level of performance to be achieved toward the CAP goals.

* Agency priority goals (APGs): Certain agencies are required to develop a limited number of APGs every 2 years. Both the agencies required to develop these goals and the number of goals to be developed are determined by OMB.
These goals are to reflect the highest priorities of each selected agency, as identified by the head of the agency, and be informed by the CAP goals as well as input from relevant congressional committees.

* Leadership positions: Although most of these positions previously existed in government, they were created by executive orders, presidential memoranda, or OMB guidance.[Footnote 14] GPRAMA established these roles in law, provided responsibilities for various aspects of performance improvement, and elevated some of them.

- Chief operating officer (COO): The deputy agency head, or equivalent, is designated COO, with overall responsibility for improving agency management and performance.

- Performance improvement officer (PIO): Agencies are required to designate a senior executive within the agency as PIO, who reports directly to the COO and has responsibilities to assist the agency head and COO with performance management activities.

- Goal leader: For each CAP goal, OMB must identify a lead government official--referred to by OMB as a goal leader--responsible for coordinating efforts to achieve each of the goals. For agency performance goals, including APGs, agencies must also designate a goal leader, who is responsible for achieving the goal.

* Performance Improvement Council (PIC): Originally created by a 2007 executive order,[Footnote 15] the PIC was established in law by GPRAMA, which also gave it additional responsibilities. The PIC is charged with assisting OMB to improve the performance of the federal government and achieve the CAP goals. Among its other responsibilities, the PIC is to facilitate the exchange among agencies of useful performance improvement practices and work to resolve government-wide or crosscutting performance issues. The PIC is chaired by the Deputy Director for Management at OMB and includes agency PIOs from each of the 24 CFO Act agencies as well as other PIOs and individuals designated by the chair.

* Quarterly performance reviews (QPR): For each APG, agencies are required to conduct QPRs to review progress towards the goals and develop strategies to improve performance, as needed. These reviews are to be led by the agency head and COO and include the PIO, relevant goal leaders, and other relevant parties both within and outside the agency.

* Performance.gov: OMB is required to develop a single, government-wide performance website to communicate government-wide and agency performance information. The website--implemented by OMB as Performance.gov--is required to make available information on APGs and CAP goals, updated on a quarterly basis; agency strategic plans, annual performance plans, and annual performance reports; and an inventory of all federal programs.

* Performance management capacity: The Office of Personnel Management (OPM) is charged with three responsibilities under the act. OPM is to (1) in consultation with the PIC, identify key skills and competencies needed by federal employees to carry out a variety of performance management activities; (2) incorporate these skills and competencies into relevant position classifications; and (3) work with agencies to incorporate these key skills into agency training.

The Executive Branch Has Taken Important Steps to Implement Key GPRAMA Provisions:

Since GPRAMA's enactment in January 2011, OMB and agencies have taken a number of important steps to implement key provisions related to the act's planning and reporting requirements.
In February 2012, OMB identified 14 interim CAP goals concurrent with the submission of the President's Budget. Nine of the goals related to crosscutting policy areas and five covered management improvements.[Footnote 16] In addition, at the same time, 24 agencies selected by OMB developed 103 APGs for 2012 and 2013,[Footnote 17] and OMB published information about these goals as well as the CAP goals on Performance.gov, which OMB considers to comprise the federal government performance plan. In December 2012, OMB expanded the information available on the site by providing an update on fiscal year 2012 performance for both sets of goals, and in March 2013, quarterly updates of the site began. All 24 CFO Act agencies are conducting QPRs, according to our survey of PIOs at these agencies.[Footnote 18] Our 2013 survey indicates that approximately one-third (33 percent) of federal managers across the government are at least somewhat familiar with the QPRs.[Footnote 19] These and related efforts were based on OMB guidance on implementing the act issued in 2011 and 2012.[Footnote 20]

As another positive development, OMB and agencies have also put into place key aspects of the act's performance management leadership roles. We recently reported that, at the agency level, all 24 CFO Act agencies have assigned senior-level officials to the COO, PIO, and goal leader roles. Furthermore, OMB guidance directed agencies with PIOs who are political appointees or other officials with limited-term appointments to appoint a career senior executive to serve as deputy PIO. Nearly all (22) of the CFO Act agencies have assigned officials to the deputy PIO role, according to our PIO survey. PIOs we surveyed reported that most performance management officials (COOs, PIOs, deputy PIOs, and goal leaders) had large involvement in four primary tasks that summarize the performance management responsibilities required by GPRAMA: (1) strategic and performance planning and goal setting, (2) performance measurement and analysis, (3) communicating agency progress toward goals, and (4) agency quarterly performance reviews.

At the government-wide level, the PIC has taken steps to meet its requirement to facilitate the exchange of useful practices and tips and tools to strengthen agency performance management. For example, it established the Goal Setting Working Group to help agencies set their 2012 to 2013 APGs; the Internal Agency Reviews Working Group to share best practices for QPRs; and the Business Intelligence Working Group to share tools for data analytics. PIOs we surveyed reported that, in general, they found the PIC helpful and that there was strong agency participation in the PIC and its working groups. However, in April 2013 we reported that the PIC has not routinely assessed its performance and recommended that OMB work with the PIC to:

* conduct formal feedback on the PIC's performance from member agencies on an ongoing basis; and:

* update the PIC's strategic plan and review the PIC's goals, measures, and strategies for achieving performance, and revise them if appropriate.[Footnote 21]

OMB staff agreed with these recommendations.

In addition, OPM has completed its work identifying key skills and competencies needed by performance management staff and incorporating those skills and competencies into relevant position classifications. OPM identified 15 competencies for performance management staff and published them in a January 2012 memorandum from the OPM Director.
It also identified relevant position classifications that are related to the competencies for performance management staff and worked with a PIC working group to develop related guidance and tools for agencies. Furthermore, OPM has taken steps to work with agencies to incorporate the key competencies into agency training. However, we reported in April 2013 that these efforts have been broad-based and not informed by specific assessments of agency training needs.[Footnote 22] We recommended that, in coordination with the PIC and the Chief Learning Officers Council, OPM (1) identify competency areas needing improvement within agencies, (2) identify agency training that focuses on needed performance management competencies, and (3) share information about available agency training on competency areas needing improvement. OPM agreed with these recommendations and reported that it will take actions to implement them.

The Executive Branch Needs to More Fully Use the GPRAMA Framework to Address Pressing Federal Governance Challenges:

OMB and Agencies Have Made Some Progress Addressing Crosscutting Issues, but Are Missing Additional Opportunities:

Many of the meaningful results that the federal government seeks to achieve, such as those related to protecting food and agriculture and providing homeland security, require the coordinated efforts of more than one federal agency, level of government, or sector. However, agencies face a range of challenges and barriers when they attempt to work collaboratively.[Footnote 23] The need for improved collaboration has been highlighted throughout our work over many years, in particular in two bodies of work. First, our reports over the past 3 years identified more than 80 areas where opportunities exist for executive branch agencies or Congress to reduce fragmentation, overlap, and duplication.[Footnote 24] Figure 1 defines and illustrates these terms.

Figure 1: Fragmentation, Overlap, and Duplication Definitions:

[Refer to PDF for image: 3 illustrations]

Fragmentation refers to those circumstances in which more than one federal agency (or more than one organization within an agency) is involved in the same broad area of national need and opportunities exist to improve service delivery.

Overlap occurs when multiple agencies or programs have similar goals, engage in similar activities or strategies to achieve them, or target similar beneficiaries.

Duplication occurs when two or more agencies or programs are engaged in the same activities or provide the same services to the same beneficiaries.

Source: GAO.

[End of figure]

We found that resolving many of these issues requires better collaboration among agencies. Second, collaboration and improved working relationships across agencies are fundamental to many of the issues that we have designated as high risk due to their vulnerabilities to fraud, waste, abuse, and mismanagement, or most in need of transformation.[Footnote 25] For almost 2 decades we have reported on agencies' missed opportunities for improved collaboration through the effective implementation of GPRA.
In our 1997 assessment of the status of the implementation of GPRA, we reported that agencies faced challenges addressing crosscutting issues, which led to fragmentation and overlap.[Footnote 26] Again, we reported in 2004--10 years after the enactment of GPRA--that there was still an inadequate focus on addressing issues that cut across federal agencies.[Footnote 27] On a government-wide level, we reported that OMB did not fully implement a government-wide performance plan, as was required by GPRA. Additionally, few agency strategic and performance plans addressed crosscutting efforts and coordination. At that time, almost half of federal managers in our 2003 survey reported that they coordinated program efforts to a great or very great extent with other internal or external organizations. Now, almost 20 years since GPRA's passage, our work continues to demonstrate that the needed collaboration is not sufficiently widespread. Accordingly, in 2012 we developed a guide on key considerations for implementing collaborative mechanisms.[Footnote 28] The results of our 2013 survey of federal managers show that the percentage of managers reporting that they use information obtained from performance measurement when coordinating program efforts with other internal or external organizations to a great or very great extent has not increased since 1997. Based on this survey, an estimated 23 percent of the managers reported that they coordinated program efforts to a small extent or not at all.

The following three examples, among many, highlight the need for improved collaboration to help address crosscutting issues:

* Food safety: One area that has been identified in both bodies of work is the fragmented nature of federal food safety oversight. The U.S. food safety system is characterized by inconsistent oversight, ineffective coordination, and inefficient use of resources; these characteristics have placed the system on our high-risk list since 2007 and in all three of our annual reports on fragmentation, overlap, and duplication.[Footnote 29] We have reported that the U.S. Department of Agriculture (USDA) and the Food and Drug Administration (FDA), the two primary agencies responsible for food safety, have taken some steps to increase collaboration. However, agencies have not developed a government-wide performance plan for food safety that includes results-oriented goals and performance measures, as we recommended when we put federal oversight of food safety on the high-risk list in January 2007.[Footnote 30] In the absence of this plan, we have reported cases of fragmentation, overlap, and duplication. The 2010 nationwide recall of more than 500 million eggs because of Salmonella contamination highlights a negative consequence of this fragmentation. Several agencies have different roles and responsibilities in the egg production system. Through the Food Safety Working Group,[Footnote 31] federal agencies have taken steps designed to increase collaboration in some areas that cross regulatory jurisdictions. For example, both USDA and FDA set goals to reduce illness from Salmonella within their own areas of egg safety jurisdiction by the end of 2011 and developed a memorandum of understanding on information sharing regarding egg safety. While such actions are encouraging, without a government-wide performance plan for food safety, fragmentation, overlap, and duplication are likely to continue.
* Climate change: Climate change is a complex, crosscutting issue that poses risks to many environmental and economic systems--including agriculture, infrastructure, ecosystems, and human health--and presents a significant financial risk to the federal government. Among other impacts, climate change could threaten coastal areas with rising sea levels, alter agricultural productivity, and increase the intensity and frequency of severe weather events such as floods, drought, and hurricanes. Weather-related events have cost the nation tens of billions of dollars in damages over the past decade. For example, in 2012, the administration requested $60.4 billion for Superstorm Sandy recovery efforts. However, the federal government is not well positioned to address the fiscal exposure presented by climate change, partly because of the complex, crosscutting nature of the issue. Given these challenges and the nation's precarious fiscal condition, we added "Limiting the Federal Government's Fiscal Exposure to Climate Change" to our high-risk list in 2013.[Footnote 32] In adding climate change to this list, we reported that the federal government would be better positioned to respond to the risks posed by climate change if federal efforts were more coordinated and directed toward common goals. In October 2009, we recommended that the appropriate entities within the Executive Office of the President, in consultation with relevant federal agencies, state and local governments, and key congressional committees of jurisdiction, develop a strategic plan to guide the nation's efforts to adapt to climate change, including the establishment of clear roles, responsibilities, and working relationships among federal, state, and local governments.[Footnote 33] In written comments, the Council on Environmental Quality generally agreed with the report's recommendations,[Footnote 34] noting that leadership and coordination are necessary within the federal government to ensure an effective and appropriate adaptation response and that such coordination would help to catalyze regional, state, and local activities. Some actions have subsequently been taken to improve the coordination of federal adaptation efforts, including the development of an interagency climate change adaptation task force.

* Federal disability programs: In June 2012, we identified 45 programs in nine agencies that helped people with disabilities obtain or retain employment, reflecting a fragmented system of services and supports.[Footnote 35] Many of these programs overlapped in whom they served and the types of services they provided. Such fragmentation and overlap may frustrate and confuse program beneficiaries and limit the overall effectiveness of the federal effort. Having extensive coordination and overarching goals can help address program fragmentation. Although we identified promising coordination efforts among some programs, most reported not coordinating with each other, and some officials told us they lacked funding and staff time to pursue coordination. Coordination efforts can be enhanced when programs work toward a common goal; however, the number and type of outcome measures used by the 45 programs varied greatly. To improve coordination, efficiency, and effectiveness, we suggested that OMB consider establishing government-wide goals for employment of people with disabilities.
Consistent with this suggestion, OMB officials stated that the Domestic Policy Council began an internal review intended to improve the effectiveness of some disability programs through better coordination and alignment.[Footnote 36] However, as we noted in our 2013 high-risk update, OMB still needs to maintain and expand its role in improving coordination across programs--such as the 45 we identified--that support employment for those with disabilities, and ultimately work with all relevant agencies to develop measurable government-wide goals to spur further coordination and improved outcomes for those who are seeking to find and maintain employment.[Footnote 37]

On the other hand, we have recently highlighted progress that the executive branch and Congress have made in addressing areas that we previously identified as being at risk of fragmentation, overlap, and duplication.[Footnote 38] For example, the nation's surface transportation system is critical to the economy and affects the daily life of most Americans. However, in our 2011 annual report on fragmentation, overlap, and duplication, we reported that over the years federal surface transportation programs grew increasingly fragmented.[Footnote 39] At the core of this fragmentation was the fact that federal goals and roles for the programs were unclear or conflicted with other federal priorities, programs lacked links to the performance of the transportation system or of the grantees, and programs did not use the best tools to target investments in transportation to the areas of greatest benefit. Accordingly, since 2004, we have made several recommendations and matters for congressional consideration to address the need for a more goal-oriented approach to surface transportation, introduce greater performance and accountability for results, and break down modal stovepipes. As we reported in February 2013,[Footnote 40] the Moving Ahead for Progress in the 21st Century Act, enacted in July 2012,[Footnote 41] addressed fragmentation by eliminating or consolidating programs and made progress in clarifying federal goals and roles and linking federal programs to performance to better ensure accountability for results.

The challenge of collaboration has also been highlighted in our reviews of related GPRAMA requirements, such as those for CAP goals, APGs, and QPRs. While agencies have implemented some of these provisions, these efforts have not included all of the relevant agency, program, and other contributors. When agencies do not include all relevant contributors, they may miss important opportunities to work with others who are instrumental to achieving intended outcomes. Including all contributors is also a requirement of GPRAMA:

* At the government-wide level, OMB is required to list all of the agencies, organizations, program activities, regulations, tax expenditures, policies, and other activities that contribute to each CAP goal. With relevant stakeholders, OMB is required to review the progress of all contributors towards each goal on a quarterly basis.

* At the agency level, agencies are required to identify the various federal organizations, programs, and activities--both within and external to the agency--that contribute to each goal, and for APGs, review progress on a quarterly basis with relevant stakeholders.
However, as shown in table 1, we have found that agencies are not including all stakeholders as they implement GPRAMA.

Table 1: CAP Goals, APGs, and QPRs Did Not Include All Relevant Participants:

CAP goals:

What we found: In May 2012, we identified additional agencies that should be named as contributors for 10 of the 14 interim CAP goals[A];

Examples: To help achieve the National Export Initiative and crosscutting goal of doubling the value of U.S. exports by 2014, the Export Promotion Cabinet and the 20 agencies that are members of the Trade Promotion Coordinating Committee were directed to coordinate and align export promotion and other activities, including improving foreign market access. When OMB listed "Double U.S. exports by the end of 2014" as a CAP goal, 12 agencies that are members of the Trade Promotion Coordinating Committee were not included as contributors;

What we recommended: In May 2012, we recommended that OMB consider adding additional agencies as contributors. OMB staff agreed with this recommendation, and in December 2012 and March 2013, OMB updated information on Performance.gov on the CAP goals. OMB included some of the agencies we identified for select goals, but in other instances eliminated key contributors that were previously listed.

APGs:

What we found: In April 2013, we found that agencies identified contributors within the agency for each APG, but did not identify external contributors for 29 of the 102 APGs we reviewed. In some cases the goals seem to be internally focused, but in other cases our work has shown that there are external contributors that were not listed[B];

Examples: The National Science Foundation did not list any external contributors to its APG to develop a diverse and highly qualified science and technology workforce. Our past work has identified 209 programs across 13 federal agencies that are focused on science, technology, engineering, and mathematics education, some of which may have efforts related to those the National Science Foundation is undertaking for this goal;

What we recommended: In April 2013, we recommended that OMB work to ensure that agencies adhere to OMB's guidance for website updates by providing complete information about the organizations, program activities, regulations, policies, tax expenditures, and other activities--both within and external to the agency--that contribute to each goal. OMB staff agreed with this recommendation.

QPRs:

What we found: In February 2013, we reported that while QPRs have shown promise in improving internal agency coordination and collaboration, few agency performance improvement officers reported they were using the reviews to coordinate or collaborate with other agencies that have similar goals[C];

Examples: Our survey of PIOs indicated that there was little to no involvement in the reviews from other agencies that could help achieve agency goals. This was also true at the Departments of Energy and the Treasury, and the Small Business Administration, where officials expressed concerns about including outsiders in their reviews and described other means of coordinating with them. However, OMB guidance--along with a leading practice we identified--indicates that including key players from other agencies can lead to more effective collaboration and goal achievement;

What we recommended: We recommended that the Director of OMB identify and share promising practices for including other relevant entities that contribute to achieving their agency performance goals.
OMB staff agreed with our recommendation.

Source: GAO.

[A] [hyperlink, http://www.gao.gov/products/GAO-12-620R].
[B] [hyperlink, http://www.gao.gov/products/GAO-13-174].
[C] [hyperlink, http://www.gao.gov/products/GAO-13-228].

[End of table]

While we continue to see challenges to collaboration across federal agencies, as a positive development, our survey of federal managers shows that reported collaboration increases when individuals contribute to the CAP goals, APGs, or QPRs. Our 2013 survey data indicate that 58 percent of federal managers reported they were somewhat or very familiar with CAP goals. Among these individuals, federal managers who view their programs as contributing to CAP goals to a great or very great extent are more likely to report collaborating outside their program to a great or very great extent to help achieve CAP goals, as figure 2 shows.

Figure 2: More Managers Report Collaborating Outside of Their Program When They View Their Program as Contributing to a Great or Very Great Extent in the Achievement of the CAP Goals:

[Refer to PDF for image: combined vertical bar and line graph]

My program contributes to the achievement of CAP goals to a small or no extent: 34%;
I have collaborated outside my program to help achieve CAP goals to a great or very great extent: 0.

My program contributes to the achievement of CAP goals to a moderate extent: 32%;
I have collaborated outside my program to help achieve CAP goals to a great or very great extent: 3%.

My program contributes to the achievement of CAP goals to a great or very great extent: 34%;
I have collaborated outside my program to help achieve CAP goals to a great or very great extent: 18%.

Source: GAO.

Notes: The percentages shown in this figure are based on the 58 percent of managers who reported being somewhat or very familiar with CAP goals, and who answered on the extent scale. Percentage estimates have 95 percent confidence intervals within +/-5.7 percentage points of the estimated percentage. Some of the survey items were abbreviated. For the full text, see items 16b and 16c in the e-supplement to this report, [hyperlink, http://www.gao.gov/products/GAO-13-519SP]. Survey items were introduced in 2013.

[End of figure]

We saw a similar pattern in responses from managers who were familiar with the APGs and the extent to which their programs contributed to the APGs. Eighty-two percent of federal managers reported they were somewhat or very familiar with APGs. Among these individuals, those who view their programs as contributing to APGs to a great or very great extent are more likely to report collaborating outside their program to a great or very great extent to help achieve APGs, as shown in figure 3.

Figure 3: More Managers Report Collaborating Outside of Their Program When They View Their Program as Contributing to a Great or Very Great Extent to the Achievement of the APGs:

[Refer to PDF for image: combined vertical bar and line graph]

My program contributes to the achievement of APGs to a small or no extent: 21%;
I have collaborated outside my program to help achieve APGs to a great or very great extent: 1%.

My program contributes to the achievement of APGs to a moderate extent: 25%;
I have collaborated outside my program to help achieve APGs to a great or very great extent: 2%.

My program contributes to the achievement of APGs to a great or very great extent: 54%;
I have collaborated outside my program to help achieve APGs to a great or very great extent: 34%.

Source: GAO.
Notes: The percentages shown in this figure are based on the 82 percent of managers who reported being somewhat or very familiar with APGs, and who answered on the extent scale. Percentage estimates have 95 percent confidence intervals within +/-5 percentage points of the estimated percentage. Some of the survey items were abbreviated. For the full text, see items 18e and 18f in the e-supplement to this report, [hyperlink, http://www.gao.gov/products/GAO-13-519SP]. Survey items were introduced in 2013.

[End of figure]

While the questions on our survey were designed to examine collaboration outside individual programs, they were not designed to distinguish between collaboration within or outside agency boundaries. As discussed in table 1, we found that collaboration was more common within agencies than between agencies. This may be appropriate in some cases; however, in other cases this might point to a need for broader inclusion of external stakeholders. Similarly, we found that more managers reported collaborating with officials external to their agency to a great or very great extent when they also reported that their programs were involved in QPRs to a similar extent.

The Executive Branch Still Lacks a Framework for Reviewing the Contributions of Tax Expenditures to Broader Federal Efforts:

Tax expenditures represent a significant federal investment.[Footnote 42] If the Department of the Treasury (Treasury) estimates are summed, an estimated $1 trillion in revenue was forgone from the 169 tax expenditures reported for fiscal year 2012, nearly the same as discretionary spending that year. For some mission areas, forgone revenue can be of the same magnitude as, or larger than, related federal spending. For example, in fiscal year 2010, tax expenditures represented about 78 percent ($132 billion) of federal support for housing. Since 1994, we have recommended greater scrutiny of tax expenditures, as periodic reviews could help determine how well specific tax expenditures work to achieve their goals and how their benefits and costs compare to those of spending programs with similar goals. In November 2012, we issued a guide that identifies criteria for assessing tax expenditures and provides questions for the Congress to ask about a tax expenditure's effectiveness.[Footnote 43] However, OMB has not developed a framework for reviewing tax expenditure performance, as we recommended in June 1994 and again in September 2005.[Footnote 44] Because OMB has not yet established such a framework, little is known about how tax expenditures contribute to broad federal outcomes and how they are related to spending programs seeking the same or a similar outcome.[Footnote 45]

OMB guidance has shown some progress in addressing how agencies should incorporate tax expenditures in strategic plans and annual performance plans and reports, as we first recommended in September 2005.[Footnote 46] GPRAMA specifically requires OMB to identify tax expenditures among the various federal activities that contribute to each CAP goal, when applicable. Although the act does not explicitly require agencies to identify tax expenditures among the various federal programs and activities that contribute to their performance goals, OMB's guidance directs agencies to do so for their APGs, which are a small subset of their performance goals. However, our review of the APGs developed for 2012 to 2013 found that only one agency, for one of its APGs, identified two relevant tax expenditures.
We recently reported that OMB was missing an opportunity to more broadly identify how tax expenditures contribute to each agency's overall performance.[Footnote 47]

Even among the CAP goals, OMB and agencies are missing opportunities to identify tax expenditures as contributors. In the original information on Performance.gov in February 2012, OMB included tax expenditures as potential contributors for 5 of the 14 CAP goals (veteran career readiness, entrepreneurship and small businesses, energy efficiency, job training, and improper payments). In the December 2012 and March 2013 updates to Performance.gov, only two goals (veteran career readiness and improper payments) discussed two tax expenditures, which represent $2.7 billion, or 0.3 percent, of the $1 trillion sum across the tax expenditures listed by Treasury. Tax expenditures were no longer mentioned as contributing to the entrepreneurship and small businesses, energy efficiency, and job training CAP goals. For example, under the energy efficiency CAP goal, OMB originally listed both spending programs and tax expenditures that contribute to the goal. However, in the December 2012 update to Performance.gov, OMB had deleted all of the tax expenditures even though many of these tax expenditures remained unchanged. In one case, OMB deleted the credit for energy efficiency improvements to existing homes (estimated at $780 million for fiscal year 2012), but highlighted the Department of Energy's (DOE) weatherization assistance spending program (estimated at $68 million in obligations for fiscal year 2012), even though both fund residential energy efficiency. Overall, we identified eight tax expenditures, totaling $2.4 billion in forgone revenue, that share the purpose of achieving energy efficiency but are no longer identified as potential contributors.

When asked about these changes, OMB staff told us that for the entrepreneurship and small business CAP goal the goal leaders narrowed the focus of the goal, which resulted in an updated list of contributing programs and activities that no longer included tax expenditures. For the energy efficiency and job training CAP goals, OMB staff told us that the exclusion of tax expenditures from the December 2012 and March 2013 updates was an oversight. OMB staff said they planned to add the appropriate tax expenditures as contributors to those goals in the next quarterly update to Performance.gov, which occurred in June 2013. However, none were added to the job training CAP goal update, and as of June 19, 2013, the energy efficiency CAP goal had not yet been updated.

These examples also raise concerns as to whether OMB previously ensured all relevant tax expenditures were identified as contributors to the 14 CAP goals when they were published in February 2012, especially since only 5 CAP goals listed tax expenditures as contributors at that time. We have previously reported that, as with spending programs, tax expenditures represent a substantial federal commitment to a wide range of mission areas.[Footnote 48] Given the lack of scrutiny tax expenditures receive compared to spending programs--especially absent a comprehensive framework for reviewing them--it is possible that additional tax expenditures should have been identified and included as contributors to one or more of the other 9 CAP goals.
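The magnitudes at issue can be checked directly from the figures cited above. The short sketch below is ours, for illustration only, using the report's own dollar estimates; the variable names are not from the report or Performance.gov.

```python
# Quick arithmetic check using the report's estimates (variable names
# are illustrative, not from the report or Performance.gov).
total_tax_expenditures_b = 1_000.0   # ~$1 trillion in forgone revenue, FY2012
still_listed_b = 2.7                 # the two tax expenditures still listed as contributors

print(f"{still_listed_b / total_tax_expenditures_b:.1%}")   # -> 0.3%

# Energy efficiency CAP goal: the deleted tax credit was far larger than
# the spending program that remained listed.
existing_homes_credit_m = 780        # FY2012 estimate for the deleted credit
weatherization_m = 68                # FY2012 obligations for DOE's program
print(f"{existing_homes_credit_m / weatherization_m:.0f}x larger")  # ~11x
```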
Moreover, for the 2 CAP goals where tax expenditures were listed as contributors and mistakenly removed, it is unclear if OMB and the goal leaders assessed the contributions of those tax expenditures toward the CAP goal efforts, since they were not listed in the December 2012 and March 2013 updates. Without information about which tax expenditures support these goals and measures of their performance, Congress and other decision makers will not have the information needed to assess overall federal contributions towards desired results, and the costs and relative effectiveness associated with those contributions.

Ensuring Performance Information Is Useful and Used by Managers to Improve Results Remains a Weakness, but Key Performance Management Practices Hold Promise:

GAO Continues to Find Widespread Progress Is Needed to Use Data to Drive Performance:

We have previously reported that data-driven decision making leads to better results.[Footnote 49] Moreover, we have reported that if agencies do not use performance measures and performance information to track progress toward goals, they may be at risk of failing to achieve their goals.[Footnote 50] The text box below illustrates this problem in the high-risk area of the Department of Defense's (DOD) approach to business transformation.

[Text box]

DOD Is Not Regularly Reviewing Performance Information to Assess Progress towards Goals in Transforming Its Business Operations:

In 2005, we identified DOD's approach to business transformation as high risk because DOD had not established clear and specific management responsibility, accountability, and control over its business transformation and it lacked a plan with specific goals, measures, and mechanisms to monitor progress.[A] We subsequently reported that DOD made improvements to strengthen its management approach, but we also identified additional steps that are needed. For example, DOD has broadly outlined a performance management approach, and established governance structures, such as the Defense Business Council, to help monitor progress in its business transformation efforts. However, we found the Council had not regularly reviewed performance data and, when reviews did occur, it did not have sufficient information to assess progress. To enhance DOD's ability to set strategic direction for its business transformation efforts, better assess overall progress toward business transformation goals, and take any necessary corrective actions, we recommended in February 2013 that DOD take a number of steps to improve its approach to performance management.[B] DOD agreed with this recommendation and said it would continue to improve and institutionalize the Council's operations.

[A] GAO, High-Risk Series: An Update, [hyperlink, http://www.gao.gov/products/GAO-05-207] (Washington, D.C.: January 2005).

[B] GAO, Defense Business Transformation: Improvements Made but Additional Steps Needed to Strengthen Strategic Planning and Assess Progress, [hyperlink, http://www.gao.gov/products/GAO-13-267] (Washington, D.C.: Feb. 12, 2013). We made another recommendation in this report, concerning DOD's strategic management plan, with which DOD partially concurred.

[End of text box]

In the first 4 months of 2013 alone, we issued numerous testimonies and reports that illustrate how performance management weaknesses can hinder agencies' abilities to achieve critical results.
This work also illustrates that the scope of these problems is widespread, affecting agencies such as DOD, Treasury, the Departments of Transportation (DOT), Homeland Security (DHS), Health and Human Services, Housing and Urban Development (HUD), and State. The impact of these weaknesses is far-reaching as well: These agencies are responsible for performing functions that affect every aspect of Americans' lives, from education, healthcare, and housing to national security and illicit drug use, as described in the textbox. [Text box] Office of National Drug Control Policy Has Established a Performance Monitoring System to Address Illicit Drug Use, but Has Not Yet Reported on Results: The public health, social, and economic consequences of illicit drug use, coupled with the nation's constrained fiscal environment, highlight the need for federal programs to use resources efficiently and effectively to address this problem. However, we reported in March 2013 that the Office of National Drug Control Policy and federal agencies have not made progress toward achieving most of the goals in the 2010 National Drug Control Strategy, although they reported being on track to implement most Strategy action items in support of these goals.[A] In April 2012, the Office established the Performance Reporting System, a monitoring mechanism intended to provide specific, routine information on progress toward Strategy goals and help identify factors underlying performance gaps and options for improvement. We reported that this could help increase accountability for improving results and identify ways to bridge the gap that existed between the lack of progress toward the Strategy's goals and the strong progress made on implementing the Strategy's actions. While this was promising, the Office does not plan to report on results until later in 2013; until then, information is not available to evaluate the system's effectiveness. [A] GAO, Office of National Drug Control Policy: Office Could Better Identify Opportunities to Increase Program Coordination, [hyperlink, http://www.gao.gov/products/GAO-13-333] (Washington, D.C.: Mar. 26, 2013). [End of text box] Surveys of Federal Managers Show that Managers' Use of Performance Information for Decision Making Has Stagnated: Our prior work has shown that performance information can be used across a range of management functions to improve results, from setting program priorities and allocating resources to taking corrective action to solve program problems. Since our 2007 survey, there has been statistically significant improvement on two survey items related to the use of performance information. More managers reported in 2013--after GPRAMA's enactment and initial implementation--that they used performance information to a great or very great extent in developing program strategy and refining program performance measures. However, the 2013 improvement on the refining program performance measures item followed an earlier decline and does not represent an improvement in comparison to our 1997 survey results. While there is also a statistically significant change between 1997 and 2013 in the percentage of managers who reported to a great or very great extent that they used performance information in adopting new program approaches or changing work processes, the initial decline on this item occurred between our 1997 and 2000 surveys, with no significant changes since then.
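Statements about statistical significance in these comparisons rest on standard tests for differences between two survey proportions. The Python sketch below shows the basic computation for one of the items cited above; the respondent counts are hypothetical placeholders rather than our actual survey counts, and a full analysis of a stratified sample would also need to account for the survey design and weights, so this illustrates only the general technique.

[Code example]
import math

def two_proportion_z_test(p1, n1, p2, n2):
    """Two-sided z-test for a difference between two independent proportions."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal distribution.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Reported percentages for "developing program strategy" (figure 4);
# the sample sizes below are hypothetical, for illustration only.
z, p = two_proportion_z_test(p1=0.58, n1=1500, p2=0.51, n2=1500)
print(f"z = {z:.2f}, p = {p:.4f}")  # p below 0.05 would indicate significance
[End of code example]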
Overall, our periodic surveys of federal managers since 1997 indicate that, with the few exceptions described above, the use of performance information has not changed significantly at the government-wide level, as shown in figure 4. Figure 4: Little Change in Percentage of Federal Managers Reporting That They Use Performance Information to a "Great" or "Very Great" Extent for Various Management Activities: [Refer to PDF for image: horizontal bar graph] Adopting new program approaches or changing work processes[A]: 1997: 66%; 2007: 53%; 2013: 54%. Setting program priorities: 1997: 66%; 2007: 58%; 2013: 61%. Allocating resources: 1997: 63%; 2007: 59%; 2013: 59%. Setting new or revising existing performance goals: 1997: 59%; 2007: 52%; 2013: 53%. Refining program performance measures[B]: 1997: 51%; 2007: 46%; 2013: 53%. Developing program strategy[C]: 2007: 51%; 2013: 58%. Identifying program problems to be addressed[D]: 2007: 51%; 2013: 63%. Taking corrective action to solve program problems[D]: 2007: 59%; 2013: 62%. Streamlining programs to reduce duplicative activities[E]: 2013: 44%. Source: GAO. Notes: The percentages shown in this figure are based on the 83 percent (in 2013), 88 percent (in 2007), and 76 percent (in 1997) of managers who reported having performance measures for the programs they were involved in and those answering on the extent scale. Percentage estimates for 2013 and 2007 have 95 percent confidence intervals within +/-5 percentage points of the estimates, and the percentage estimates for 1997 have confidence intervals within +/-7.3 percentage points of the estimates. Survey items were abbreviated. For full text, see items 8a, 8b, 8c, 8d, 8e, 8f, 8h, 8i, and 8o in the e-supplement to this report, [hyperlink, http://www.gao.gov/products/GAO-13-519SP]. [A] Statistically significant decrease between 1997 and 2013. [B] Statistically significant increase between 2007 and 2013. However, the 2013 percentages are similar to those initially reported in 1997. [C] Statistically significant increase between 2007 and 2013. Survey item was introduced in 2007. [D] Survey item was introduced in 2007. [E] Survey item was introduced in 2013. [End of figure] In addition, we introduced an item in the 2013 survey on streamlining programs, a performance management activity that can help address the overlap and duplication challenges and opportunities described earlier in this report. Less than half of federal managers (44 percent) reported to a great or very great extent that they used performance information for "streamlining programs to reduce duplicative activities." Surveys of Federal Managers Demonstrate Continued Weaknesses in Employing Management Practices that Can Promote the Use of Performance Information: Our prior work has identified practices that can promote the use of performance information for management decision making, such as leadership demonstrating commitment to using performance information, communicating performance information frequently and effectively, ensuring that performance information is useful, and building capacity to use performance information.[Footnote 51] Moreover, many of the requirements put in place by GPRAMA reinforce the importance of these practices. Our past government-wide surveys of federal managers indicated that these key practices were not always being employed across various agencies.[Footnote 52] Our 2013 survey suggests that effectively adopting these practices continues to be a substantial weakness across the government, as described below.
Demonstrating leadership commitment: Our prior work has shown that the demonstrated commitment of leadership and management to achieving results and using performance information can encourage the federal workforce to apply the principles of performance management.[Footnote 53] GPRAMA requires top leadership involvement in performance management, for example by requiring agency leadership to routinely review performance information and progress toward APGs during the QPRs. However, results from our 2013 survey show almost no statistically significant changes in managers' perceptions of their leaders' and supervisors' attention and commitment to the use of performance information since our last survey in 2007. The only statistically significant change from 2007 to 2013 was a decline in the percentage of managers that agreed to a great or very great extent that their agencies' top leadership demonstrates a strong commitment to achieving results, from 67 percent to 60 percent. Moreover, less than two-thirds of managers agreed to a great or very great extent with other survey items related to leadership commitment and attention to performance information, as shown in figure 5. Figure 5: Less than Two-Thirds of Federal Managers Agreed in 2013 to a "Great" or "Very Great" Extent with Statements about Leadership and Supervisor Commitment and Attention to Performance Information: [Refer to PDF for image: horizontal bar graph] My agency's top leadership demonstrates a strong commitment to achieving results[A]: 1997: 57%; 2007: 67%; 2013: 60%. My agency's top leadership demonstrates a strong commitment to using performance information to guide decision making[B]: 2007: 49%; 2013: 45%. The individual I report to pays attention to performance information in decision making[B]: 2007: 56%; 2013: 54%. The individual I report to periodically reviews with me the results or outcomes of my program(s)[C]: 1997: 42%; 2007: 50%; 2013: 51%. Source: GAO. Notes: Percentage estimates for 2013 and 2007 have 95 percent confidence intervals within +/-4 percentage points of the estimate, and percentage estimates for 1997 have confidence intervals within +/-6.1 percentage points of the estimate. Some survey items were abbreviated. For the full text, see items 10g, 10h, 11a, and 12c in the e-supplement to this report, [hyperlink, http://www.gao.gov/products/GAO-13-519SP]. [A] Statistically significant decrease between 2007 and 2013. [B] Survey item was introduced in 2007. [C] Statistically significant increase between 1997 and 2013. [End of figure] Communicating performance information: Our prior work showed that communicating performance information frequently and effectively throughout an agency can help managers inform staff and other stakeholders of their commitment to achieve the agency's goals and to keep these goals in mind as they pursue their day-to-day activities.[Footnote 54] Frequently reporting progress toward achieving performance targets also allows managers to review the information in time to make improvements.[Footnote 55] GPRAMA includes requirements for communicating performance information, such as sharing performance information at least quarterly and directing agencies to update performance indicators on their websites at least annually.
However, there was no statistically significant change between 2007 and 2013 in the percentage of federal managers agreeing to a great or very great extent that agency managers at their level effectively communicate performance information on a routine basis (41 percent in 2013 and 43 percent in 2007).[Footnote 56] Our analysis suggests that easy access to performance information is related to the effective communication of performance information. Of the 49 percent of federal managers who agreed to a great or very great extent that performance information is easily accessible to managers at their level, 62 percent also agreed that agency managers at their level effectively communicate performance information on a routine basis to a great or very great extent. Conversely, of the 19 percent who agreed to only a small or no extent that performance information is easily accessible to managers at their level, only 9 percent also agreed that agency managers at their level effectively communicate performance information on a routine basis to a great or very great extent. Ensuring performance information is useful: As we previously reported, to facilitate the use of performance information, agencies should ensure that information meets various users' needs for completeness, accuracy, consistency, timeliness, validity, and ease of use.[Footnote 57] GPRAMA introduced several requirements that could help to address these various dimensions of usefulness. For example, agencies must disclose more information about the accuracy and validity of their performance data and actions to address limitations to the data.[Footnote 58] Without useful performance information, it is difficult to monitor agencies' progress toward critical goals, such as improving veterans' access to health care provided by the Department of Veterans Affairs (VA), as illustrated in the textbox. [Text box] Performance Information on Veterans' Wait Times for Medical Appointments Was Unreliable: The Veterans Health Administration (VHA), within VA, provided nearly 80 million outpatient medical appointments to veterans in fiscal year 2011. Although access to timely medical appointments is important to ensuring veterans obtain needed care, long wait times and inadequate scheduling processes have been persistent problems. VHA is implementing a number of initiatives to improve veterans' access to medical appointments, such as using technology to interact with patients and provide care. However, we testified in March 2013 that certain aspects of VHA's policies and policy implementation contributed to unreliable performance information on veterans' wait times.[A] Moreover, VHA's ability to ensure and accurately monitor access to timely medical appointments is critical to providing quality health care to veterans, who may have medical conditions that worsen if access is delayed. In December 2012, we recommended that the Secretary of VA direct the Under Secretary for Health to take several actions to improve oversight of appointment scheduling and related performance measures.[B] VA concurred with our recommendations and identified actions planned or under way to address them. [A] GAO, VA Health Care: Appointment Scheduling Oversight and Wait Time Measures Need Improvement, [hyperlink, http://www.gao.gov/products/GAO-13-372T] (Washington, D.C.: Mar. 14, 2013).
[B] GAO, VA Health Care: Reliability of Reported Outpatient Medical Appointment Wait Times and Scheduling Oversight Need Improvement, [hyperlink, http://www.gao.gov/products/GAO-13-130] (Washington, D.C.: Dec. 21, 2012). [End of text box] Responses to four survey items on hindrances related to the usefulness of performance information indicate some limited improvement. There was a statistically significant improvement between the 2007 and 2013 surveys on two of these four items (shown as declines because they concern hindrances), but no significant change otherwise, as illustrated in figure 6. Figure 6: Percentage of Federal Managers Characterizing Specific Factors as a "Great" or "Very Great" Hindrance to Using Performance Information Shows Some Improvement, though Many Continue to Report Hindrances: [Refer to PDF for image: horizontal bar graph] Difficulty obtaining data in time to be useful: 1997: 20%; 2007: 24%; 2013: 21%. Difficulty obtaining valid or reliable data[A]: 1997: 27%; 2007: 30%; 2013: 25%. Different parties are using different definitions to measure performance: 1997: 35%; 2007: 35%; 2013: 32%. Difficulty determining meaningful measures[A]: 1997: 35%; 2007: 38%; 2013: 30%. Source: GAO. Notes: Because these survey items concern hindrances, a decline in percentages of managers reporting hindrances to a great or very great extent represents improvement. Percentage estimates for 2013 and 2007 have 95 percent confidence intervals within +/-4 percentage points of the estimate, and percentage estimates for 1997 have confidence intervals within +/-6 percentage points of the estimate. Survey items were abbreviated. For the full text, see items 9a, 9b, 9c, and 9d in the e-supplement to this report, [hyperlink, http://www.gao.gov/products/GAO-13-519SP]. [A] Statistically significant change between 2007 and 2013. [End of figure] In addition, related survey items introduced after 1997 showed no significant change between 2007 and 2013, with about 40 percent of managers agreeing to a great or very great extent that "agency managers at my level take steps to ensure that performance information is useful and appropriate" and 36 percent agreeing to the same extent that "I have sufficient information on the validity of the performance data I use to make decisions." Despite these limited improvements, the overall picture from the 2013 results--with about one-fifth to nearly one-third of managers reporting hindrances, as indicated in figure 6, and less than half agreeing with most of the positive statements about the format, timeliness, and accessibility of their performance information in figure 7--remains a major concern. Figure 7: Less than Half of Federal Managers Agree to a "Great" or "Very Great" Extent with Most Statements about the Usefulness of Performance Information: [Refer to PDF for image: horizontal bar graph] Performance information is available in time to manage my program(s): 44%. I have access to the performance information I need to manage my program(s): 52%. My agency's performance information is available in a format that is easy to use: 32%. My agency's performance information is easily accessible to employees: 34%. My agency's performance information is easily accessible to managers at my level: 49%. Source: GAO. Notes: Percentage estimates have 95 percent confidence intervals within +/-5 percentage points of the estimated percentage. Survey items were abbreviated. 
For full text, see items 7a, 7b, 7d, 7f, and 7g in the e-supplement to this report, [hyperlink, http://www.gao.gov/products/GAO-13-519SP]. Survey items introduced in 2013. [End of figure] Building capacity to use performance information: We have previously reported that building the capacity to use performance information is critical to using performance information in a meaningful fashion, and that inadequate staff expertise, among other factors, can hinder agencies from using performance information.[Footnote 59] GPRAMA lays out specific requirements for OPM to identify skills and competencies for performance management functions, among other actions, which reinforce the importance of staff capacity to use performance information.[Footnote 60] Managers' survey responses and our recent work indicate areas of weakness in agencies' analysis and evaluation tools and staff's skills and competencies, both of which are critical components of performance management capacity. About a third (36 percent) of managers reported in 2013 that they agreed to a great or very great extent that their agencies have sufficient analytical tools for managers at their levels to collect, analyze, and use performance information. Furthermore, less than a third of managers reported that their agencies were investing resources to improve the use and quality of performance information.[Footnote 61] Thirty percent of managers reported that they agreed to a great or very great extent that the programs they are involved with have sufficient staff with the knowledge and skills needed to analyze performance information. [Side bar] SBA Officials Identified and Addressed Skills Gaps: In our work on quarterly performance reviews, we reported that SBA officials told us that some of the agency's staff were less comfortable working with data. The agency addressed this skills gap in part through training. For example, as part of its leadership training, SBA began developing courses related to "decision support," designed to lead to competencies in spreadsheet development and analysis, presentation delivery, and other analytic and presentation skills. Participants began training in late summer 2012 with courses titled "Principles of Analytics" and "Analytic Boot Camp." Source: [hyperlink, http://www.gao.gov/products/GAO-13-228]. [End of side bar] Additionally, our recent work found gaps in performance management competencies among agency staff. Although PIOs we surveyed at 24 agencies in 2012 for our April 2013 report on performance management leadership roles reported that their staff generally possessed core competencies identified by OPM for performance management staff, certain competencies--performance measurement, information management, organization performance analysis, and planning and evaluating--were present to a lesser extent.[Footnote 62] Training is one way agencies can address a lack of staff capacity to use performance information, as illustrated in the sidebar. Between 1997 and 2013, there was a statistically significant increase in the percentage of managers reporting that their agencies made training available in the past 3 years on most of the performance management tasks we asked about. However, between 2007 and 2013, there was either no significant change or a decline in the percentage of managers responding positively to the same items, as shown in figure 8.
Figure 8: After Increasing from 1997 Levels, Percentage of Federal Managers Responding "Yes" to Most Items on Whether Their Agencies Made Training Available in the Past 3 Years on Specific Performance Management Tasks Has Leveled Off or Declined: [Refer to PDF for image: horizontal bar graph] Conduct strategic planning: 1997: 39%; 2007: 47%; 2013: 45%. Set program performance goals[A]: 1997: 35%; 2007: 54%; 2013: 52%. Develop program performance measures[B]: 1997: 36%; 2007: 53%; 2013: 47%. Use program performance information to make decisions[A]: 1997: 31%; 2007: 47%; 2013: 44%. Link program or operations performance to the achievement of agency strategic goals[B]: 1997: 28%; 2007: 51%; 2013: 45%. Assess the quality of performance data[C]: 2007: 43%; 2013: 39%. Source: GAO. Notes: Percentage estimates for 2013 and 2007 have 95 percent confidence intervals within +/-4 percentage points of the estimate, and percentage estimates for 1997 have confidence intervals within +/-6.1 percentage points of the estimate. Survey items were abbreviated. For the full text, see items 13A, 13B, 13C, 13D, 13E, and 13F in the e-supplement to this report, [hyperlink, http://www.gao.gov/products/GAO-13-519SP]. [A] Statistically significant increase between 1997 and 2013. [B] Statistically significant increase between 1997 and 2013; statistically significant decrease between 2007 and 2013. [C] Survey item introduced in 2000. [End of figure] Data-Driven Performance Reviews Show Promise for Encouraging the Use of Performance Data to Improve Results: Our prior work has indicated that effective data-driven reviews can serve as a leadership strategy, requiring leadership and other responsible parties to come together to review performance information and progress toward results, and to identify important opportunities to drive performance improvements. According to our 2012 survey of PIOs at 24 agencies, the majority (21 of 24) reported that actionable opportunities for performance improvement are identified through the reviews at least half the time.[Footnote 63] In addition, most officials we interviewed at DOE, Treasury, and the Small Business Administration (SBA) attributed improvements in performance and decision making to their QPRs. The textbox presents one such improvement described by officials at Treasury. [Text box] Treasury Credits QPRs with Decision to Stop Minting $1 Coins for Circulation, Saving the U.S. Government Millions[A]: Treasury's Deputy Secretary said that it was a performance review session with the U.S. Mint that first led him to question the direction being taken with the $1 coin. Performance data he reviewed for the meeting indicated that the Mint was producing 400 million new $1 coins annually, while the Federal Reserve already had 1.4 billion existing ones in storage. Digging deeper, he learned that the Federal Reserve had previously estimated that there were enough $1 coins to meet demand for more than a decade.[B] He and other Treasury officials explained the reason for the imbalance: because the Mint does not bear the burden of storing the oversupply of coins, it had no signal to stop production. The Deputy Secretary ordered additional analysis to determine the true costs to the U.S. government as a whole. Ultimately, the Treasury Secretary stopped the minting of $1 coins for circulation, saving an estimated $50 million in production and storage costs. [A] [hyperlink, http://www.gao.gov/products/GAO-13-228].
GAO's prior work indicated that while stopping production of $1 coins may save millions of dollars in production costs in the short term, eliminating $1 notes and replacing them with a $1 coin will have a larger net benefit over time. See [hyperlink, http://www.gao.gov/products/GAO-12-342SP]. [B] This estimate was based on the assumption that demand would remain at 2012 levels. [End of text box] While our case studies and survey of PIOs indicated the benefits of QPRs, our 2013 government-wide federal managers' survey indicated that the majority of federal managers are not familiar with the QPRs at their agencies, although a greater percentage of Senior Executive Service (SES) managers reported that they were familiar with the QPRs, as shown in figure 9. Figure 9: SES Managers Reported Greater Familiarity than Non-SES Managers with QPRs at Their Agencies in 2013 Survey: [Refer to PDF for image: 2 pie-charts] SES familiarity with QPRs: Very familiar: 22; Somewhat familiar: 27; Not familiar: 50; Missing or nonresponse: 1. Non-SES familiarity with QPRs: Very familiar: 11; Somewhat familiar: 21; Not familiar: 67; Missing or nonresponse: 1. Source: GAO analysis. Notes: Percentage estimates have 95 percent confidence intervals within +/-5.2 percentage points of the estimated percentage. Survey item was abbreviated. For full text, see item 19 in the e-supplement to our report, [hyperlink, http://www.gao.gov/products/GAO-13-519SP]. [End of figure] Our analysis suggests that, while familiarity with QPRs may be somewhat limited government-wide, it is positively related to managers' perceptions of their leadership's demonstrated commitment to using performance information. Of the 12 percent of all federal managers who reported they were very familiar with QPRs, 76 percent agreed that their top leadership demonstrates a strong commitment to using performance information to guide decision making to a great or very great extent.[Footnote 64] In contrast, of the 66 percent who reported they were not familiar with QPRs, 36 percent agreed to a great or very great extent with the same statement. Similarly, our analysis suggests that being the subject of a QPR is positively related to the extent to which managers view the QPRs as being used to accomplish certain purposes to a great or very great extent. For example, federal managers who reported that their programs have been the subject of a QPR to a great or very great extent were more likely to report that their agencies use QPRs to identify problems or opportunities than those who reported that their programs have been the subject of a QPR to a moderate or small or no extent. Figure 10 shows this trend, along with a similar one for federal managers' ratings of agency leadership use of QPRs to help achieve performance goals. Figure 10: More Managers Agreed to a "Great" or "Very Great" Extent with Statements on QPR Uses When Their Programs Have Been the Subject of QPRs to a Greater Extent: [Refer to PDF for image: combined vertical bar and line graph] My program has been the subject of a QPR to a small or no extent: 23%; Agency leadership uses QPRs to help achieve performance goals to a great or very great extent: 5%; My agency uses QPRs to identify problems or opportunities to a great or very great extent: 5%.
My program has been the subject of a QPR to a moderate extent: 33%; Agency leadership uses QPRs to help achieve performance goals to a great or very great extent: 14%; My agency uses QPRs to identify problems or opportunities to a great or very great extent: 14%. My program has been the subject of a QPR to a great or very great extent: 44%; Agency leadership uses QPRs to help achieve performance goals to a great or very great extent: 37%; My agency uses QPRs to identify problems or opportunities to a great or very great extent: 37%. Source: GAO. Notes: Percentages shown in this figure are based on the 33 percent of managers who reported being somewhat or very familiar with QPRs, and who answered on the extent scale for both of these QPR items. Percentage estimates have 95 percent confidence intervals within +/-7.9 percentage points of the estimated percentage. Survey items were abbreviated. For the full text, see items 19, 20A, 20E, and 20F in the e-supplement to this report, [hyperlink, http://www.gao.gov/products/GAO-13-519SP]. [End of figure] Our analysis also suggests that being the subject of a QPR may be positively related to managers' perceptions of their agencies' employment of key practices that we have previously reported can promote successful data-driven performance reviews.[Footnote 65] For example, federal managers who reported that their programs have been the subject of a QPR to a great or very great extent were more likely to report that the reviews included key practices, such as leadership actively participating in reviews, than those who reported that their programs have been the subject of QPRs to a moderate or small or no extent. This trend and similar ones for other key practices are shown in figure 11. Figure 11: More Managers Agreed to a "Great" or "Very Great" Extent with Statements on QPR Practices When Their Programs Have Been the Subject of QPRs to a Greater Extent: [Refer to PDF for image: combined vertical bar and line graph] My program has been the subject of a QPR to a small or no extent: 21%; Agency leadership actively participates in QPRs to a great or very great extent: 9%; QPRs include external officials who contribute to goals discussed to a great or very great extent: 4%; QPRs include staff needed to solve problems and identify improvements to a great or very great extent: 5%. My program has been the subject of a QPR to a moderate extent: 33%; Agency leadership actively participates in QPRs to a great or very great extent: 17%; QPRs include external officials who contribute to goals discussed to a great or very great extent: 10%; QPRs include staff needed to solve problems and identify improvements to a great or very great extent: 14%. My program has been the subject of a QPR to a great or very great extent: 46%; Agency leadership actively participates in QPRs to a great or very great extent: 39%; QPRs include external officials who contribute to goals discussed to a great or very great extent: 29%; QPRs include staff needed to solve problems and identify improvements to a great or very great extent: 37%. Source: GAO. Notes: Percentages shown in this figure are based on the 33 percent of managers who reported being somewhat or very familiar with QPRs, and who answered on the extent scale for these QPR items. Percentage estimates in this figure have 95 percent confidence intervals within +/-8.5 percentage points of the estimate. Survey items were abbreviated.
For the full text, see items 19, 20A, 20D, 20G, and 20H in the e-supplement to this report, [hyperlink, http://www.gao.gov/products/GAO-13-519SP]. [End of figure] Federal managers' responses to items about other key practices--holding QPRs on a regular, routine basis and having a process for following up on QPRs--were similarly related to the extent to which managers' programs were the subject of a QPR. Agencies Have Taken Steps to Align Daily Operations with Agency Results, but Some Continue to Face Difficulties Measuring Performance: Agencies Have Taken Steps to Align Individual Performance with Results: It is important for individuals to see a connection between their daily operations and results to help them understand how individual performance can contribute to organizational success. While our past work has shown that agencies have encountered challenges linking individual performance with broader organizational results,[Footnote 66] progress has been made over the last decade in establishing this linkage and holding individuals accountable for organizational results through performance management systems. For example, while agencies have been required since 2000 to hold senior executives accountable for their individual and organizational performance by linking performance expectations with GPRA-required goals, OPM and OMB have continued to reinforce the importance of this alignment through improvements in SES performance management. Most recently, in January 2012, OPM and OMB released a government-wide performance appraisal system for senior executives that provides agencies with a standard framework for managing the performance of their executives. While the new system strives to provide greater clarity and equity in the development of performance standards and their link to compensation, among other things, the Directors of OPM and OMB stated that it is intended to provide agencies with the flexibility and capability to customize the system to meet their needs. As part of this framework, agencies are to identify expectations for the senior executives that focus on measurable outcomes from the strategic plan or other measurable outputs and outcomes clearly aligned to organizational goals and objectives. In addition, the Goals-Engagement-Accountability-Results (GEAR) model, established in 2011, focuses on aligning employee performance with organizational performance, creating a culture of engagement, and implementing accountability at all levels, among other things.[Footnote 67] The GEAR model outlines a series of recommended actions for agencies to adopt in order to help improve employee and organizational performance. We reported in September 2012 that DOE's GEAR implementation plan includes aligning employee performance management with organizational performance management and developing training to support these goals, which, along with initiating knowledge-sharing activities, will promote improvement of DOE's organizational performance, according to DOE officials.[Footnote 68] We have ongoing work looking at GEAR implementation in the five pilot agencies and plan to issue the results of our work later in 2013. To further institutionalize individual accountability for achieving results, GPRAMA established in law several mechanisms that help individuals and agencies see this connection and hold them accountable for their contributions to agency and government-wide goals.
As we recently reported, agency leaders should hold goal leaders and other responsible managers accountable for knowing the progress being made in achieving goals and, if progress is insufficient, understanding why and having a plan for improvement, including improvements in the quality of the data to help ensure the data are sufficient for decision making.[Footnote 69] For example, PIOs are responsible for, among other things, assisting the agency head and COO in developing and using performance measures specifically for assessing individual performance in the agency. QPRs offer an opportunity for organizational performance to be assessed and responsible officials to be held accountable for addressing problems and identifying strategies for improvement. As agencies implement the accountability provisions of GPRAMA, they will need to ensure managers have decision-making authority commensurate with the responsibility to identify and address performance problems as they arise. Since our 1997 government-wide survey of federal managers, SES managers have reported improvements in accountability for agency goals and results and in the decision-making authority to help achieve agency goals. However, since our initial survey in 1997, there has been a gap between SES managers' perceptions of their accountability for program performance and their perceptions of their decision-making authority. In 2013, 80 percent of SES managers reported that they are held accountable for the results of the programs for which they are responsible to a great or very great extent, while 61 percent reported that they have the decision-making authority they need to help the agency achieve its strategic goals, a 19 percentage point difference. See figure 12. Figure 12: Gap Remains between Percent of SES Managers Reporting They Are Being Held Accountable and Percent Reporting They Have Decision-Making Authority to a "Great" or "Very Great" Extent: [Refer to PDF for image: horizontal bar graph] Managers at my level are held accountable for the results of the programs they are responsible for[B]: 1997: 62%; 2007: 81%; 2013: 80%. Managers at my level are held accountable for agency accomplishment of its strategic goals[A]: 2007: 75%; 2013: 73%. Managers at my level have the decision-making authority they need to help the agency accomplish its strategic goals[B]: 1997: 51%; 2007: 61%; 2013: 61%. Source: GAO. Notes: Percentage estimates have 95 percent confidence intervals within +/-8 percentage points of the estimates. Survey items were abbreviated. For full text, see items 10a, 10b, and 10c in the e-supplement to this report, [hyperlink, http://www.gao.gov/products/GAO-13-519SP]. [A] Survey item introduced in 2003. [B] There was a statistically significant increase between 1997 and 2013. [End of figure] Using performance information in employee performance management helps individuals track their performance and progress toward achieving organizational goals and can help emphasize the importance of individual contributions to organizational success. However, the percentage of federal managers reporting use of performance information in employee performance management to a great or very great extent has stagnated, with no statistically significant change in reported use from 1997 to 2013.[Footnote 70] See figure 13.
Figure 13: Little Change over Time in Federal Managers Reporting Use of Performance Information in Employee Performance Management Issues to a "Great" or "Very Great" Extent: [Refer to PDF for image: horizontal bar graph] I use performance information when rewarding employees I manage or supervise[A]: 1997: 53%; 2007: 61%; 2013: 57%. I use performance information in setting individual expectations for employees I manage or supervise[A]: 1997: 61%; 2007: 62%; 2013: 62%. Managers at my level use performance information to recognize employees for their performance[B]: 2007: 51%; 2013: 49%. Source: GAO. Notes: Percentage estimates for 1997 have 95 percent confidence intervals within +/-7.2 percentage points of the estimate. For 2007 and 2013, the confidence intervals are within +/-5 percentage points. Survey items were abbreviated. For full text, see items 8j, 8k, and 10d in the e-supplement to this report, [hyperlink, http://www.gao.gov/products/GAO-13-519SP]. [A] Percentages are based on the 83 percent (in 2013), 88 percent (in 2007), and 76 percent (in 1997) of managers who reported having performance measures in place for the program(s) they were involved with and those answering on the extent scale for the use of performance information. [B] Survey item was introduced in 2007. [End of figure] Agencies Continue to Face Difficulties Measuring Performance in Some Areas: A fundamental element in an organization's efforts to manage for results is its ability to set meaningful goals for performance and to measure progress toward those goals. In our 1996 Executive Guide, we underscored the importance of taking a balanced approach to setting goals and measuring performance.[Footnote 71] If a balance across an organization's various priorities does not exist, the measures in place can overemphasize some goals and create skewed incentives. This need for agencies to have a balanced set of performance measures was reinforced in GPRAMA, which calls for agencies to develop a variety of measures, such as output, outcome, customer service, and efficiency, across program areas. As we have previously reported, based on our government-wide federal managers' surveys, federal managers reported a statistically significant increase in the presence of different types of performance measures for their programs to a great or very great extent following initial implementation of GPRA.[Footnote 72] Despite this early progress in establishing a variety of performance measures, since our 2003 federal managers' survey, there generally has been no statistically significant increase in the reported presence of these measures to a great or very great extent.[Footnote 73] More recently, as illustrated in figure 14, the only statistically significant increase between 2007 and 2013 is in the percentage of managers reporting the presence of quality measures. Figure 14: Generally No Statistically Significant Increase since 2007 in the Reported Presence of Performance Measures Available to a "Great" or "Very Great" Extent: [Refer to PDF for image: horizontal bar graph] Output: 1997: 51%; 2007: 62%; 2013: 67%. Efficiency: 1997: 35%; 2007: 51%; 2013: 52%. Outcome: 1997: 44%; 2007: 57%; 2013: 59%. Quality[A]: 1997: 42%; 2007: 47%; 2013: 55%. Customer service: 1997: 43%; 2007: 49%; 2013: 51%. Source: GAO.
Notes: Percentages are based on the 83 percent (in 2013), 88 percent (in 2007), and 76 percent (in 1997) of managers who reported that they have performance measures in place for the program(s) they were involved with and those answering on the extent scale for the types of performance measures in place. Percentage estimates for 1997 have 95 percent confidence intervals within +/-7.3 percentage points of the estimate. For 2007 and 2013, the confidence intervals are within +/-5 percentage points. There was a statistically significant increase in managers' reporting the presence of output, efficiency, quality, and outcome measures between 1997 and 2013. Survey items were abbreviated. For full text, see items 6a, 6b, 6c, 6d, and 6e in the e-supplement to this report, [hyperlink, http://www.gao.gov/products/GAO-13-519SP]. [A] Between 2007 and 2013 there was a statistically significant increase. However, the 2013 results are similar to the 2003 results. [End of figure] Over the years and through our more recent work, we have found uneven development of outcome-oriented performance measures across federal programs, even though agencies have been responsible for measuring program outcomes, among other things, since the passage of GPRA in 1993. As demonstrated in the textbox, outcome-oriented performance measures help agencies determine whether a program is achieving its intended purpose. Additionally, these performance measures are essential for assessing the results of federal efforts that span multiple agencies and organizations. [Text box] GAO Has Reported on Agency Difficulties in Developing and Using Outcome Measures: * In May 2006, we recommended that USDA and DHS adopt meaningful performance measures for assessing the effectiveness of the Agriculture Quarantine Inspection (AQI) program at intercepting foreign pests and disease on agricultural materials entering the country by all pathways and posing a risk to U.S. agriculture.[A] While the agencies expanded existing performance measures in response to our recommendation, we reported in September 2012 that the performance measures developed were not sufficient to assess the program's overall effectiveness.[B] Further, the AQI program did not have a strategic plan that could serve as a framework for defining the mission, setting goals, and identifying measures for gauging progress. We recommended that USDA and DHS, as part of a coordinated effort, develop a strategic plan for the AQI program that lays out its joint mission and program goals, and then identify meaningful performance measures for monitoring progress toward goals. The agencies agreed with this recommendation and expect to have a strategic plan by the summer of 2013. * We reported in March 2013 that the Federal Emergency Management Agency has not yet established clear, objective, and quantifiable capability requirements and performance measures to identify capability gaps in a national preparedness assessment,[C] as recommended in our March 2011 report.[D] As a result, it is unclear what gaps currently exist and what level of federal resources is needed to close the gaps. Although the agency did not fully agree with our assessment, it has made some progress in addressing this recommendation.
* We reported in April 2013 that the Federal Communications Commission, DHS, DOD, and the Department of Commerce had taken a variety of actions to support the security of the nation's communications networks, including ones related to developing cyber policy and standards, securing Internet infrastructure, sharing information, supporting national security and emergency preparedness, and promoting sector protection efforts.[E] However, DHS and its partners had not yet initiated the process for developing outcome-based performance measures related to the cyber protection of key parts of the communications infrastructure. We recommended that DHS collaborate with its public and private sector partners to develop, implement, and track sector outcome-oriented performance measures for cyber protection activities related to the nation's communications networks. DHS agreed with this recommendation. [A] GAO, Homeland Security: Management and Coordination Problems Increase the Vulnerability of U.S. Agriculture to Foreign Pests and Disease, [hyperlink, http://www.gao.gov/products/GAO-06-644] (Washington, D.C.: May 19, 2006). [B] GAO, Homeland Security: Agriculture Inspection Program Has Made Some Improvements, but Management Challenges Persist, [hyperlink, http://www.gao.gov/products/GAO-12-885] (Washington, D.C.: Sept. 27, 2012). [C] GAO, National Preparedness: FEMA Has Made Progress in Improving Grant Management and Assessing Capabilities, but Challenges Remain, [hyperlink, http://www.gao.gov/products/GAO-13-456T] (Washington, D.C.: Mar. 19, 2013). [D] [hyperlink, http://www.gao.gov/products/GAO-11-318SP]. [E] GAO, Communications Networks: Outcome-Based Measures Would Assist DHS in Assessing Effectiveness of Cybersecurity Efforts, [hyperlink, http://www.gao.gov/products/GAO-13-275] (Washington, D.C.: Apr. 3, 2013). [End of text box] Our work over the last 20 years has identified difficulties agencies face in measuring performance across various program types, such as regulations and grants.[Footnote 74] Some commonly reported difficulties that cut across the various program types include: * accounting for factors that are outside of an agency's control but affect the results of a program; * developing appropriate performance measures, especially for programs without a clearly defined purpose or that require a long time period to achieve intended results; and: * obtaining complete, timely, and accurate performance information on the program. Illustrative examples from our recent work that show how agencies have experienced difficulties in measuring program performance are provided in table 2.
In our 2013 annual report on fragmentation, overlap, and duplication, we identified the need for improving the measurement of performance and results--including program evaluation--as a theme that cuts across our suggested actions to address fragmentation, overlap, and duplication in federal agencies.[Footnote 75] Table 2: Illustrative Examples of Reported Difficulties Agencies Face in Measuring Performance by Program Type: Program type: Contracts; Program type definition: * Contracts are business arrangements between a government agency and a private entity in which the private entity promises, generally in exchange for money, to deliver certain products or services to the government agency or to others on the government's behalf; * Federal agencies collectively spend more than $550 billion annually through contracts[A]; Illustrative examples from our work: * DOE contract management has been a high-risk area since 1990; we reported in our February 2013 update that, while DOE has made progress in addressing long-standing contract management weaknesses, the agency needs to ensure that federal managers are receiving and validating accurate and reliable information from contractors that can be used to make decisions and to hold contractors and DOE accountable for performance[B,C]; * DOD contract management has been a high-risk area since 1992; we reported in our February 2013 update that DOD does not have an action plan in place that provides baseline data, goals, milestones, and metrics for assessing the effectiveness of DOD's efforts to improve contract management.[B] As a result, DOD is not well positioned to determine whether its policies are having the intended effects, readily identify when policies are not being appropriately implemented, or take corrective actions. DOD generally agreed with related recommendations we made on contract management in our earlier reports, and at the time of our 2013 report DOD had efforts under way to address these recommendations. Program type: Direct services; Program type definition: * Direct services involve the delivery of a good or service by government employees and can be measured, for example, through an agency's customer service efforts; Illustrative examples from our work: * We reported in April 2013 that in conducting an operational analysis of its information technology system, the Transportation Security Administration--as directed by OMB guidance--conducted surveys to measure customer satisfaction, but the analysis did not include measures to assess whether the investment was delivering the goods and services it was designed to deliver[D]; * We reported in April 2013 that the Internal Revenue Service (IRS) does not have a long-term measurable goal with a related time frame to increase below-average taxpayer satisfaction with its website[E]. Without such a goal, it will be difficult for IRS to determine whether its long-term taxpayer satisfaction plan is successful. We recommended that IRS establish a measurable goal and related time frame to improve taxpayer satisfaction. IRS neither agreed nor disagreed with our recommendation.
Program type: Grants; Program type definition: * Grants are payments from a donor government to a recipient organization with the aim of either stimulating or supporting a service or activity by the recipient; * Federal grant outlays to state and local governments in fiscal year 2012 totaled over $544 billion, equal to 3.5 percent of the gross domestic product that year[F]; Illustrative examples from our work: * We reported in June 2013 on two Federal Emergency Management Agency (FEMA) grant programs that collect performance information and feed the resulting data into a higher-level DHS strategic goal[G]. We found that the data are self-reported by recipients and that FEMA has varied and inconsistent approaches to verifying and validating the data. We recommended that FEMA ensure that there are consistent procedures in place to verify and validate grant performance data [FEMA's comments will be added once they are provided in early June]; * The Federal Aviation Administration (FAA) is charged with implementing and enforcing regulations for airports' noise mitigation efforts and provides grants to airports to assist in these efforts. We reported in September 2012 that FAA's strategic goal for noise reduction is disconnected from its primary tool to address noise--the noise grants program--because the goal does not reflect the results of these grants.[H] As a result, FAA had insufficient performance information about the effects of noise grants and the extent to which noise exposure remains a constraint on airport growth. We recommended that the agency align its strategic goal for noise reduction with the results of the noise grant program and establish corresponding performance measures, which the agency agreed to consider. As of April 2013, the agency had not addressed this recommendation. Program type: Regulations; Program type definition: * Regulations are rules issued by a federal agency that are intended to specify a desired action or prohibit certain actions; Illustrative examples from our work: * The Federal Transit Administration (FTA) received authority in 2012 to regulate transit rail system safety. We reported in January 2011, prior to FTA's receiving this authority, that FTA did not have specific performance goals in place that identify the intended direct results of its safety activities or related measures.[I] We also found that FTA's transit safety database--which comprises data provided by state safety oversight and transit agencies--was unreliable. At that time, we recommended that FTA improve this database and develop performance goals and measures that, among other things, identify the direct results its safety efforts are trying to achieve. In March 2013, we reported that FTA officials have taken steps to improve their transit rail safety data, including establishing internal controls over the data collection process, but FTA is still developing goals and measures for rail-transit safety efforts[J]; * In recent years, DOT introduced regulations to enhance passenger protections and help minimize costly delays and cancellations in the event of flight disruptions. In September 2011, we reported that DOT's performance data on airports' rates of delays and cancellations were incomplete given that DOT collects these data only from larger airlines.[K] In our analysis of more comprehensive flight data, we found substantial differences in flight performance trends by community size.
By collecting data only from the largest airlines, DOT does not obtain and therefore cannot provide consumers with a complete picture of flight performance, particularly at airports in rural communities or among smaller airlines. We recommended that DOT collect and publicize more comprehensive data on airlines' on-time performance, among other things. DOT did not comment directly on the recommendations. As of April 2013, the agency had not addressed these recommendations. Program type: Research and development; Program type definition: * Research and development includes military and civilian research and development efforts, which can lead to technological advancements that improve the productivity of American workers and, correspondingly, the nation's standard of living; Illustrative examples from our work: * In May 2012, we reported that a national strategy for federal nanotechnology research efforts was developed to establish shared goals and research needs, among other things[L]. While the strategy documents included goals and objectives, they did not include or did not fully develop priorities, milestones, or outcome-related performance measures that would allow for monitoring and reporting on progress. We recommended that the Director of the Office of Science and Technology Policy coordinate with relevant agencies to develop performance information for this research. The relevant agencies neither agreed nor disagreed with this recommendation; * In March 2011, we reported that while federal agencies had conducted at least 144 research projects on oil pollution since 2003, the interagency committee set up to coordinate federal oil pollution research had not used these studies to set the priorities and goals for federal research on oil pollution technology. The key source used--a 1997 research plan mandated by law--has not been updated since its creation. Further, no assessment of completed and ongoing research has been conducted to determine whether the existing research priorities are being met. The interagency committee plans to update the 1997 plan in the future[M].
Program type: Tax expenditures; Program type definition: * Tax expenditures are reductions in a taxpayer's tax liability that are the result of special exemptions and exclusions from taxation, deductions, credits, deferrals of tax liability, or preferential tax rates; * As of fiscal year 2012, there were 169 tax expenditures representing over $1 trillion in forgone federal revenue[N]; Illustrative examples from our work: * We reported in April 2013 that, based on our analysis of fiscal year 2011 tax expenditures, IRS, which is responsible for administering tax expenditures, does not collect data sufficient for identifying who claims a tax expenditure and how much they claim.[O] Such basic data were not available for $492 billion of tax expenditures because the expenditures were not reported on tax forms or did not have their own line items; * We reported in February 2012 that while IRS had data on the numbers of taxpayers and aggregate amounts claimed for community development tax credits, the data often did not tie the use of the tax credits to specific communities, which is critical to determining the effect of the tax benefit on local economic development[P]; * We reported in March 2010 that HUD was unable to validate performance information from local program administrators on the use of some Empowerment Zone tax incentives to revitalize selected urban and rural communities.[Q] Further, HUD was tracking only a portion of the credits and was not using an outcome-oriented performance measure that attempted to measure any resulting benefits from the credits used. In our February 2012 report, we found that HUD continued to experience these difficulties[R]. Source: GAO. [A] GAO, Interagency Contracting: Agency Actions Address Key Management Challenges, but Additional Steps Needed to Ensure Consistent Implementation of Policy Changes, [hyperlink, http://www.gao.gov/products/GAO-13-133R] (Washington, D.C.: Jan. 29, 2013). [B] [hyperlink, http://www.gao.gov/products/GAO-13-283]. [C] While no recommendation was made to address this difficulty as outlined in the example, we have made related recommendations in this area, including that DOE track the performance of its contract projects. [D] [hyperlink, http://www.gao.gov/products/GAO-13-279SP]. We did not make any recommendation to address this difficulty. [E] GAO, IRS Website: Long-Term Strategy Needed to Improve Interactive Services, [hyperlink, http://www.gao.gov/products/GAO-13-435] (Washington, D.C.: Apr. 16, 2013). [F] OMB, Budget of the United States Government, Fiscal Year 2014 (Washington, D.C.: Apr. 10, 2013). [G] GAO, Grants Performance: Justice and FEMA Collect Performance Data For Selected Grants, but Action Needed to Validate FEMA Performance Data, [hyperlink, http://www.gao.gov/products/GAO-13-552] (Washington, D.C.: June 24, 2013). [H] GAO, Airport Noise Grants: FAA Needs to Better Ensure Project Eligibility and Improve Strategic Goal and Performance Measures, [hyperlink, http://www.gao.gov/products/GAO-12-890] (Washington, D.C.: Sept. 12, 2012). [I] GAO, Rail Transit: FTA Programs Are Helping Address Transit Agencies' Safety Challenges, but Improved Performance Goals and Measures Could Better Focus Efforts, [hyperlink, http://www.gao.gov/products/GAO-11-199] (Washington, D.C.: Jan. 31, 2011). [J] GAO, Department of Transportation: Key Issues and Management Challenges, 2013, [hyperlink, http://www.gao.gov/products/GAO-13-402T] (Washington, D.C.: Mar. 14, 2013).
[K] GAO, Airline Passenger Protections: More Data and Analysis Needed to Understand Effects of Flight Delays, [hyperlink, http://www.gao.gov/products/GAO-11-733] (Washington, D.C.: Sept. 7, 2011). [L] GAO, Nanotechnology: Improved Performance Information Needed for Environmental, Health, and Safety Research, [hyperlink, http://www.gao.gov/products/GAO-12-427] (Washington, D.C.: May 21, 2012). [M] GAO, Federal Oil and Gas: Interagency Committee Needs to Better Coordinate Research on Oil Pollution Prevention and Response, [hyperlink, http://www.gao.gov/products/GAO-11-319] (Washington, D.C.: Mar. 25, 2011). While no recommendation was made to address this difficulty as outlined in the example, we made related recommendations on reporting the results of these evaluations. [N] Based on the Department of the Treasury's estimates from the President's Fiscal Year 2014 Budget of the U.S. Government. [O] GAO, Tax Expenditures: IRS Data Available for Evaluations Are Limited, [hyperlink, http://www.gao.gov/products/GAO-13-479] (Washington, D.C.: Apr. 30, 2013). We did not make a recommendation to address this difficulty. [P] GAO, Community Development: Limited Information on the Use and Effectiveness of Tax Expenditures Could Be Mitigated through Congressional Attention, [hyperlink, http://www.gao.gov/products/GAO-12-262] (Washington, D.C.: Feb. 29, 2012). We did not make a recommendation to address this difficulty. [Q] GAO, Revitalization Programs: Empowerment Zones, Enterprise Communities, and Renewal Communities, [hyperlink, http://www.gao.gov/products/GAO-10-464R] (Washington, D.C.: Mar. 12, 2010). [R] [hyperlink, http://www.gao.gov/products/GAO-12-262]. For more information on the Empowerment Zone and Renewal Communities Programs, see [hyperlink, http://www.gao.gov/products/GAO-10-464R]; Empowerment Zone and Enterprise Community Program: Improvements Occurred in Communities, but the Effect of the Program is Unclear, [hyperlink, http://www.gao.gov/products/GAO-06-727] (Washington, D.C.: Sept. 22, 2006); and Community Development: Federal Revitalization Programs Are Being Implemented, but Data on the Use of Tax Benefits are Limited, [hyperlink, http://www.gao.gov/products/GAO-04-306] (Washington, D.C.: Mar. 5, 2004). Renewal community tax incentives expired after December 31, 2009, and empowerment zone tax incentives expired on December 31, 2011, but were retroactively extended through December 31, 2013, by the American Taxpayer Relief Act of 2012. [End of table] While some agencies have faced difficulties in measuring program performance, others have made progress in developing performance measures and using the resulting performance information to assess performance in the applicable program area. For example: * HUD has made progress in measuring grant program performance. As we reported in November 2011, HUD measured progress toward some green building goals by collecting energy consumption data--both before and after retrofitting--for participating properties receiving grants or loans under its Green Retrofit Program for Multifamily Housing, and planned to use these data to calculate savings and evaluate effectiveness.[Footnote 76] * In January 2011, we reported that the Federal Railroad Administration (FRA) had created a set of performance goals and measures that address important dimensions of program performance related to its regulatory safety activities.
In its proposed fiscal year 2011 budget, FRA included specific safety goals to reduce the rate of train accidents caused by various factors, including human errors and track defects. These goals were quantitative, with a targeted accident rate per million train miles. Collecting such accident data equips FRA with a clear way to measure whether those safety goals are met. FRA's budget request also linked its performance goals and measures with DOT's strategic goals.[Footnote 77] Moving forward, we will continue to examine the availability and use of performance measures across a variety of program types and update our work in this area. Given that we have found that agencies across the federal government have experienced similar difficulties in measuring the performance of different program types and have not made consistent progress in addressing them, a comprehensive examination of these difficulties is needed. The PIC could help facilitate this examination. As discussed earlier, GPRAMA requires the PIC, in part, to resolve crosscutting performance issues and facilitate the exchange of practices that have led to performance improvements within specific programs or agencies or across agencies. Although measuring the performance of different program types is a significant and long-standing challenge, the PIC has not yet addressed this issue in a systematic way, such as through a working group to identify common difficulties in developing and using performance measures to assess program performance and to share best practices from instances in which agencies have overcome these difficulties. Without a comprehensive examination, it will be difficult for the PIC and agencies to fully understand these measurement issues and develop a crosscutting approach to help address them, which will likely result in agencies experiencing difficulties in measuring program performance in the future. Communication of Performance Information Could Better Meet Users' Needs: Federal Managers Reported that Performance Information Is Not Always Available or Easily Accessible to Federal Managers or the Public: According to our 2013 survey of federal managers, 34 percent reported that performance information is easily accessible to agency employees to a great or very great extent, while 17 percent reported that their agency's performance information is easily accessible to the public to a great or very great extent.[Footnote 78] Survey data also indicate that agencies are not communicating with their employees about contributions to CAP goals or their progress toward achieving APGs. In fact, of the 58 percent of federal managers who indicated they were familiar with CAP goals, only 22 percent reported that their agency had communicated with its employees about those goals to a great or very great extent. Of the 82 percent of federal managers who indicated familiarity with APGs, 40 percent reported that their agency had communicated about progress toward achieving them to a great or very great extent.
Leading Practices for the Development of Federal Websites Should Guide the Continued Development of Performance.gov: We recently reported that Performance.gov, as the central repository for federal government performance information, can assist in oversight and lead to a greater focus within government on the activities and efforts necessary to improve performance.[Footnote 79] OMB's stated goals for Performance.gov include, among others, providing a public view into government performance to support transparency, as well as providing executive branch management capabilities to enhance senior leadership decision making. According to OMB staff, OMB will maintain responsibility for the website, but going forward the effort is planned to be driven more by the General Services Administration (GSA) and the PIC, with GSA continuing to provide technological support. For future development of Performance.gov, OMB, the PIC, and GSA are working with federal agencies to develop the Performance Management Line of Business that, according to OMB staff, will standardize the collection and reporting of performance information by agencies.[Footnote 80] Performance.gov has the potential to increase the accessibility of performance information for users both inside and outside the federal government. However, our analysis of statements from OMB and GSA staff and agency officials, along with feedback we obtained from potential users, indicates that there are varying expectations regarding the primary uses of Performance.gov. For example, OMB and GSA staff emphasized that they have viewed Performance.gov as a tool for agencies to support cross-agency coordination and efforts to achieve agency goals. Consistent with this, OMB staff said that Performance.gov has been used to facilitate conversations between OMB examiners and agency managers about progress on APGs. While most officials we interviewed said that OMB had collected feedback from the agencies in developing Performance.gov, officials from most of these agencies also said that Performance.gov is not being used as a resource by agency leadership or other staff. These officials explained that they have information sources tailored to meet their needs, and that Performance.gov does not contain critical indicators or the ability to display some visualizations used for internal agency performance reviews. In addition, a performance management practitioner and other potential users of the website noted that the detailed, technical nature of Performance.gov seemed primarily oriented toward a government rather than a public audience. According to OMB staff, the specific legal requirements of GPRAMA have been the primary framework used to guide efforts to develop Performance.gov thus far. They noted that they have been focused on working to comply with these requirements by providing information on CAP goals and APGs, and by establishing a phased development plan for the integration of additional information from agency strategic plans, performance plans, and performance reports. OMB and GSA staff members have said, however, that leading practices for developing federal websites will be helpful in guiding the future development of Performance.gov. OMB and GSA staff have also noted that as the phased development of Performance.gov unfolds, they expect to use broader outreach to, and usability testing with, a wider audience, including members of the public, to make Performance.gov more "public-facing" and "citizen-centric."
Consistent with this transition, we recommended in June 2013 that OMB work with GSA and the PIC to clarify the specific ways that intended audiences could use the information on Performance.gov.[Footnote 81] HowTo.gov, a leading source of best practices and guidance on the development of federal government websites, recommends identifying the purposes of a website and the ways in which specific audiences could use it to accomplish various tasks, and then structuring information and providing tools to help visitors quickly complete those tasks.[Footnote 82] With greater clarity about the intended uses of Performance.gov, OMB and GSA should have sufficient direction to design Performance.gov as a relevant and accessible source of information for a variety of potential users, including those specified under GPRAMA--members and committees of Congress and the public. In the same report, we also recommended that OMB work with GSA and the PIC to systematically collect information on the needs of intended audiences and track recommended performance metrics that help identify improvements to the website. For example, HowTo.gov practices recommend that a website use consistent navigation. Although users we interviewed had mixed opinions on the organization and navigation of Performance.gov, simplifying the website's navigation, adding an effective internal search engine, and providing an appropriate level of detail and information for intended audiences could increase the site's overall usability. Outreach and testing on the ease of navigation and searching would help OMB systematically collect information on the needs of various audiences and how these could be addressed through Performance.gov. With performance goals and measures for the website, it would also be possible for the developers of Performance.gov to identify the gap between current capabilities and what is needed to fulfill stated goals, and to identify and set priorities for improvements. OMB staff agreed with these recommendations. Agency Performance Information Is Not Always Useful for Congressional Decision Making: Congressional support has played a critical role in sustaining interest in management improvement initiatives over time. As we have previously reported, Congress has served as an institutional champion for many government-wide management reform initiatives over the years, such as the CFO Act and GPRA in the 1990s and, more recently, GPRAMA.[Footnote 83] Further, Congress has often played an important role in performance improvement and management reforms at individual agencies. Congress has also provided a consistent focus on oversight and has reinforced important policies. As we have previously reported, having pertinent and reliable performance information available is necessary for Congress to adequately assess agencies' progress in making performance and management improvements and ensure accountability for results.[Footnote 84] However, our work has found that the performance information that agencies provided to Congress was not always useful for congressional decision making because the information was not clear, directly relevant, or sufficiently detailed.[Footnote 85] As stated earlier, for performance information to be useful, it needs to meet the needs of different users--including Congress--in terms of completeness, accuracy, consistency, timeliness, validity, and ease of use.
GPRA required agencies to consult with Congress and obtain the views of interested stakeholders as part of developing their strategic plans. However, according to the Senate committee report that accompanied the bill that ultimately became GPRAMA,[Footnote 86] agencies did not adequately consider the input of Congress in developing strategic plans, often because the agencies waited until strategic plans were substantially drafted and reviewed within the executive branch before consulting with Congress. In doing so, agencies limited the opportunities for Congress to provide input on their strategic plans and related goals, as well as on the performance information that would be most useful for congressional oversight. To help ensure agency performance information is useful for congressional decision making, GPRAMA strengthens the consultation requirement. The act requires agencies to consult at least once every two years with relevant appropriations, authorization, and oversight committees, obtaining majority and minority views, when developing or updating strategic plans--which include APGs. Subsequently, agencies are to describe how congressional input was incorporated into those plans and goals.[Footnote 87] Similarly, OMB is required to consult with relevant committees with broad jurisdiction at least once every two years when developing or updating CAP goals, and to describe how that input was incorporated into those goals.[Footnote 88] At the request of Congress, in June 2012, we developed a guide to assist Members of Congress and their staffs in ensuring the consultations required under GPRAMA are useful to the Congress.[Footnote 89] The guide outlines general approaches for successful consultations, including creating shared expectations and engaging the right people in the process at the right time. The guide also provides key questions that Members and congressional staff can ask as part of the consultation process to ensure that agency performance information reflects congressional priorities. However, it is unclear whether agencies incorporated congressional input into their updated strategic plans and the APGs published in 2012, and therefore whether this information will be useful for congressional decision making. In our recent review of APGs, we found that agencies reported engaging Congress during the development of their strategic plans and goals to varying degrees, and that only 1 of the 24 agencies we reviewed explained how congressional input was incorporated into its APGs, as required by GPRAMA.[Footnote 90] We recommended in April 2013 that OMB ensure that agencies adhere to OMB's guidance for website updates by providing a description of how input from congressional consultations was incorporated into each goal. OMB staff concurred with our recommendation. In addition, our recent work indicated that the performance information provided on Performance.gov also may not be meeting congressional needs. We found that outreach from OMB to congressional staff was limited, as were opportunities for staff to provide input on the development of Performance.gov.[Footnote 91] According to OMB staff, they met several times with staff from the Senate Homeland Security and Governmental Affairs Committee, the House Oversight and Government Reform Committee, and the Senate Budget Committee to discuss the development of Performance.gov, and used this outreach to identify several specific website modifications.
Of the three congressional staff members we spoke with who said they had received briefings on the development of Performance.gov, however, only one told us she had been consulted for input on the website. In addition, since 2010, OMB staff have not held meetings on the development of Performance.gov with staff from other committees in the House or Senate that might use the website to inform their oversight of federal agencies. As previously mentioned, we also found that OMB has not articulated how various intended audiences, including Congress, can use the site to accomplish specific tasks, such as supporting coordination and decision making to advance shared goals. At the request of Congress, in December 2011 and June 2012, we highlighted several instances in which Congress has used agency performance information in various oversight and legislative activities, including (1) identifying issues that the federal government should address; (2) measuring the federal government's progress toward addressing those issues; and (3) identifying better strategies to address the issues when necessary.[Footnote 92] For example, to help promote the use of e-filing of tax returns with the IRS, Congress used performance information to set clear expectations for agency performance, support oversight activities, and inform the development of additional legislation to help IRS achieve its goals. For further information, see the text box. [Text box] Congressional Use of Performance Information to Promote E-filing of Tax Returns[A]: Congress sought to promote the use of e-filing, which allows taxpayers to receive refunds faster, is less prone to errors, and provides IRS significant cost savings. Congress took the following actions to increase the use of e-filing: * Setting Expectations: As part of the Internal Revenue Service Restructuring and Reform Act of 1998,[B] Congress established a performance goal of having 80 percent of individual tax returns e-filed by 2007; * Oversight: Congress monitored IRS's progress in meeting the established goal for e-filings; held 22 hearings related to IRS filing seasons and e-filings; and requested annual GAO reports to Congress on filing season performance, including e-filing; * Additional Legislation: Congress saw the need for further actions to help IRS achieve the goal, and subsequently passed legislation to require tax return preparers who file more than 10 returns per year to do so electronically[C]; Although IRS did not meet the 80 percent e-filing target by 2007 (58 percent were e-filed that year), increased use of e-filing has substantially reduced IRS's cost to process returns. IRS subsequently met this goal for individual tax returns as of the 2012 tax filing season, with 82 percent of individual returns e-filed.[D] [A] [hyperlink, http://www.gao.gov/products/GAO-12-215R]. [B] Pub. L. No. 105-206, 112 Stat. 685 (1998). [C] The Worker, Homeownership, and Business Assistance Act of 2009, Pub. L. No. 111-92, § 17, 123 Stat. 2984, 2996 (2009). [D] IRS has yet to reach the 80 percent e-file goal for some types of returns other than individual income tax returns. See GAO, 2012 Tax Filing: IRS Faces Challenges Providing Service to Taxpayers and Could Collect Balances Due More Effectively, [hyperlink, http://www.gao.gov/products/GAO-13-156] (Washington, D.C.: Dec. 18, 2012).
[End of text box] Conclusions: Moving forward, the federal government will need to make tough choices in setting priorities as well as in reforming programs and management practices to address the pressing and complex economic, social, security, sustainability, and other issues the nation confronts. GPRAMA provides a number of tools that could help address these challenges. Since the act's enactment in 2011, the executive branch has taken a number of important steps to implement its key provisions by developing interim CAP goals and APGs, conducting quarterly reviews, assigning key performance management roles and responsibilities, and communicating results more frequently and transparently through Performance.gov. However, the executive branch needs to do more to fully implement and leverage the act's provisions to address these challenges. Our recent work reviewing federal performance issues and implementation of the act has pointed to several areas where improvements are needed and, accordingly, we recommended a number of actions. In addition, examples from our past work, along with the most recent results from our survey of federal managers, show that the executive branch has made little progress addressing long-standing governance challenges related to improving coordination and collaboration to address crosscutting issues, using performance information to drive decision making, measuring the performance of certain types of federal programs, and engaging Congress in a meaningful way in agency performance management efforts to ensure the resulting information is useful for congressional decision making. Of particular concern, OMB has yet to develop a framework for reviewing the performance of tax expenditures, which represented approximately $1 trillion in forgone revenue in fiscal year 2012. In some areas, forgone revenue due to tax expenditures is nearly equal to or greater than spending for federal outlay programs. Since 1994, we have recommended that OMB take this action, and the act puts in place explicit requirements for OMB to identify tax expenditures related to the CAP goals and to measure their contributions to broader federal outcomes. While early implementation of CAP goals showed some promise, with tax expenditures identified as contributing to 5 of the 14 goals, many of those tax expenditures were subsequently removed. For example, our work shows that eight tax expenditures, representing about $2.4 billion in forgone revenue, should be listed as contributing to the energy efficiency CAP goal. The few tax expenditures that continue to be listed as contributors to a CAP goal represent only about $2.7 billion in forgone revenue--approximately 0.3 percent of the total estimated forgone revenue from tax expenditures. While OMB staff told us the removal of these tax expenditures was an oversight and that they will be added back as contributors in the near future, this raises concerns about whether OMB previously ensured that all relevant tax expenditures were identified as contributors to the 14 CAP goals when they were published in February 2012. Tax expenditures represent a substantial federal commitment to a wide range of mission areas, but they do not receive the same scrutiny as spending programs. Therefore, it is possible that additional tax expenditures should have been identified and included as contributors to one or more of the other 9 CAP goals.
Moreover, for the 2 CAP goals where tax expenditures were mistakenly removed, it is unclear whether OMB and the goal leaders assessed the contributions of those tax expenditures toward the CAP goal efforts, since they were not listed in the December 2012 and March 2013 updates. Without information about which tax expenditures support these goals and measures of their performance, Congress and other decision makers will not have the information needed to assess overall federal contributions toward desired results and the costs and relative effectiveness associated with those contributions. OMB took another promising action in 2012 by directing agencies to identify tax expenditures among the various federal programs and activities that contribute to their APGs--above and beyond what the act requires for all performance goals, which include APGs. However, the 103 APGs developed for 2012 to 2013 at 24 agencies represent only a small subset of all performance goals across the government. In addition, our review of the APGs for 2012 to 2013 found that only one agency, for one of its APGs, identified two relevant tax expenditures. OMB and agencies are therefore missing important opportunities to more broadly identify how tax expenditures contribute to each agency's overall performance. Beyond the difficulties of measuring the contributions of tax expenditures to agency goals, our work has found that agencies have experienced common issues in measuring the performance of various other types of programs and have not made consistent progress in addressing them over the last 20 years. As such, a comprehensive and concerted effort is needed to address these long-standing difficulties. With responsibilities to resolve crosscutting performance issues and facilitate the exchange of proven practices, the PIC should lead such an assessment. The PIC has not yet addressed this issue in a systematic way, and without a comprehensive examination, it will be difficult for the PIC and agencies to fully understand these measurement issues and develop a crosscutting strategy to address them. This, in turn, would likely result in agencies continuing to experience difficulties in measuring program performance in the future. The PIC's upcoming strategic planning effort provides a venue for developing an approach to tackle this issue by putting in place the necessary plans and accountability. The PIC's strategy should detail specific actors and actions to be taken within set time frames to ensure that these persistent measurement challenges are adequately addressed. Recommendations for Executive Action: To improve implementation of GPRAMA and help address pressing governance issues, we make the following four recommendations. To help ensure that the contributions made by tax expenditures toward the achievement of agency goals and broader federal outcomes are properly recognized, we recommend that the Director of OMB take the following three actions: * Revise relevant OMB guidance to direct agencies to identify relevant tax expenditures among the list of federal contributors for each appropriate agency goal. * Review whether all relevant tax expenditures that contribute to a CAP goal have been identified, and as necessary, include any additional tax expenditures in the list of federal contributors for each goal. * Assess the contributions relevant tax expenditures are making toward the achievement of each CAP goal.
Given the common, long-standing difficulties agencies continue to face in measuring the performance of various types of federal programs and activities--contracts, direct services, grants, regulations, research and development, and tax expenditures--we also recommend that the Director of OMB work with the PIC to develop a detailed approach to examine these difficulties across agencies, including identifying and sharing any promising practices from agencies that have overcome difficulties in measuring the performance of these program types. This approach should include goals, planned actions, and deliverables, along with specific time frames for their completion, as well as the identification of the parties responsible for each action and deliverable. Agency Comments: We provided a draft of this report for review and comment to the Director of OMB. Via e-mail, staff from OMB's Office of Performance and Personnel Management agreed with the recommendations in this report. The staff also provided technical comments, which we incorporated as appropriate. We are sending copies of this report to the Director of OMB as well as to interested congressional committees and other interested parties. This report will also be available at no charge on the GAO website at [hyperlink, http://www.gao.gov]. If you or your staff have any questions about this report, please contact me at (202) 512-6806 or mihmj@gao.gov. Contact points for our Offices of Congressional Relations and Public Affairs may be found on the last page of our report. Key contributors to this report are listed in appendix III. Signed by: J. Christopher Mihm: Managing Director, Strategic Issues: [End of section] Appendix I: Objectives, Scope, and Methodology: The GPRA Modernization Act of 2010 (GPRAMA) lays out a schedule for gradual implementation of its provisions during a period of interim implementation--from its enactment in January 2011 to February 2014, when a new planning and reporting cycle begins. GPRAMA also includes provisions requiring us to review implementation of the act at several critical junctures and provide recommendations for improvements to its implementation. This report is the last in a series responding to the mandate to assess initial implementation of the act by June 2013, and it pulls together findings from our recent work related to the act, the results of our periodic survey of federal managers, and our other recent work on federal performance and coordination issues.[Footnote 93] Our specific objectives for this report were to assess the executive branch's (1) progress in implementing the act and (2) effectiveness in using tools provided by the act to address challenges the federal government faces. To address both objectives, we reviewed GPRAMA, related congressional documents and Office of Management and Budget (OMB) guidance, and our past and recent work related to managing for results and the act. We also interviewed OMB staff. In addition, to further address the second objective, we administered a web-based questionnaire on organizational performance and management issues to a stratified random sample of 4,391 persons from a population of approximately 148,300 mid-level and upper-level civilian managers and supervisors working in the 24 executive branch agencies covered by the Chief Financial Officers (CFO) Act of 1990, as amended.[Footnote 94] The survey results provided information about the extent to which key performance management practices are in place to help address challenges.
The sample was drawn from the Office of Personnel Management's (OPM) Central Personnel Data File (CPDF) as of March 2012, using file designators indicating performance of managerial and supervisory functions.[Footnote 95] In reporting the questionnaire data, when we use the term "government-wide" and the phrases "across the government" or "overall," we are referring to these 24 CFO Act executive branch agencies, and when we use the terms "federal managers" and "managers," we are referring to both managers and supervisors. The questionnaire was designed to obtain the observations and perceptions of respondents on various aspects of results-oriented management topics, such as the presence and use of performance measures, hindrances to measuring performance and using performance information, agency climate, and program evaluation use. In addition, to address implementation of GPRAMA, the questionnaire included a section requesting respondents' views on various provisions of GPRAMA, such as cross-agency priority goals, agency priority goals, and quarterly performance reviews. For the agency priority goal questions, we directed the federal managers from the Nuclear Regulatory Commission not to answer these questions, since OMB did not require the agency to develop agency priority goals for 2012 to 2013. This survey is comparable to the four surveys we previously conducted at the 24 CFO Act agencies--in 1997, 2000, 2003, and 2007. The 1997 survey was conducted as part of the work we did in response to a GPRA requirement that we report on implementation of the act. The 2000, 2003, and 2007 surveys were designed to update the results from each of the previous surveys.[Footnote 96] The 2007 survey also included a section requesting the respondent's views on OMB's Program Assessment Rating Tool and the priority that should be placed on various potential improvements to it. The 2000 and 2007 surveys, unlike the other two surveys, were designed to support analysis of the data at the department and agency level as well as government-wide. For this report, we focus on comparing the 2013 survey results with those from the 1997 baseline survey and with the results of the 2007 survey, the most recent survey conducted before GPRAMA was enacted in 2011. We noted results from the other two surveys--2000 and 2003--when statistically significant differences compared to 2013 occurred. Similar to the four previous surveys, the sample was stratified by agency and by whether the manager or supervisor was a member of the Senior Executive Service (SES) or non-SES. The management levels covered general schedule (GS) or equivalent schedules at levels comparable to GS-13 through GS-15 and career SES or equivalent. Similar to our 2000, 2003, and 2007 surveys, we also incorporated managers or supervisors in other pay plans at levels generally equivalent to the GS-13 through career SES levels into the population and the selected sample to ensure at least 90 percent coverage of all mid- to upper-level managers and supervisors at the departments and agencies we surveyed. Most of the items on the questionnaire were closed-ended, meaning that, depending on the particular item, respondents could choose one or more response categories or rate the strength of their perception on a 5-point extent scale ranging from "to no extent" at the low end of the scale to "to a very great extent" at the high end. On most items, respondents also had an option of choosing the response category "no basis to judge/not applicable."
A few items had yes, no, or do not know options for respondents. Many of the items on the questionnaire were asked in our earlier surveys; the sections of the questionnaire asking about GPRAMA, program evaluations, and availability of performance information are new. For these new questions, we conducted pretests with federal managers in several of the 24 CFO Act agencies. For the 2013 survey, based on feedback we obtained from our pretests with managers, we moved the placement of question 8 in the survey to accommodate the insertion of a new question.[Footnote 97] In previous surveys, only those respondents who answered yes to question 5--that they had performance measures available for their programs--were asked to answer question 8--a series of items about the extent to which they used information obtained from performance measurement when participating in certain activities. Respondents answering "no" or "do not know" to question 5 could skip past the question 8 items. For the 2013 survey, all respondents were asked to answer question 8, given the new question added. To maintain consistency and comparability with how we previously analyzed and reported question 8 results, we applied the skip pattern used in prior surveys to question 8 by removing those individuals who did not answer yes to question 5 (and who in the past would have been directed to skip the question). However, in the e-supplement we report the results as the federal managers answered the questionnaire, regardless of how they had answered question 5.[Footnote 98] To administer the survey, an e-mail was sent to managers in the sample notifying them of the survey's availability on the GAO website and including instructions on how to access and complete the survey. With the exception of the managers at the Department of Justice (DOJ), which is discussed below, managers in the sample who did not respond to the initial notice were sent up to four subsequent e-mail reminders and follow-up phone calls asking them to participate in the survey. In our prior surveys, we worked with OPM to obtain the names of the managers and supervisors in our sample as selected through the CPDF. However, since our last survey in 2007, some agencies had requested that OPM withhold from the CPDF the names of individuals within selected subcomponents. We worked with officials at these agencies to attempt to gain access to these individuals to maintain continuity of the population of managers surveyed from previous years.[Footnote 99] Due to DOJ's national security concerns about providing us identifying information (e.g., names, e-mail addresses, phone numbers) of federal agents, we administered the current survey to all DOJ managers in our sample through a DOJ official. To identify the sample of managers whose names were withheld from the CPDF, we provided DOJ with the last four digits of Social Security numbers, the subcomponent, duty location, and pay grade information. To ensure, to the extent possible, that DOJ managers received the same survey administration process as the rest of the managers in our sample, we provided DOJ with copies of the notification, activation (including the web link to our survey), and follow-up e-mails that managers at other agencies received from us. DOJ administered the survey to its managers and conducted follow-up with the nonrespondents. We administered the survey to all 24 agencies from November 2012 through February 2013.
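To illustrate the stratified selection described above, the following is a minimal sketch in Python. The agencies, stratum sizes, and sample allocation are hypothetical stand-ins; the actual sample was drawn from the CPDF and stratified by each of the 24 agencies and by SES/non-SES status, with additional eligibility rules as described in this appendix.

```python
import random

# Minimal sketch of stratified random selection from a toy frame.
# Agencies, stratum sizes, and the allocation are hypothetical.
def build_frame():
    frame = []
    person_id = 0
    for agency in ["Agency A", "Agency B"]:             # hypothetical agencies
        for ses, size in [(True, 200), (False, 5000)]:  # hypothetical sizes
            for _ in range(size):
                frame.append({"id": person_id, "agency": agency, "ses": ses})
                person_id += 1
    return frame

def stratified_sample(frame, allocation, seed=2013):
    """Draw a simple random sample, without replacement, within each
    (agency, ses) stratum; allocation maps stratum -> sample size."""
    rng = random.Random(seed)
    sample = []
    for (agency, ses), n in allocation.items():
        stratum = [p for p in frame if p["agency"] == agency and p["ses"] == ses]
        sample.extend(rng.sample(stratum, n))
    return sample

# Hypothetical allocation: the small SES strata are sampled at a
# higher rate than the large non-SES strata.
allocation = {
    ("Agency A", True): 50, ("Agency A", False): 150,
    ("Agency B", True): 50, ("Agency B", False): 150,
}
print(len(stratified_sample(build_frame(), allocation)))  # 400 selected
```

Because the small SES strata are oversampled relative to their population share, a design like this requires the weighting described below to produce unbiased government-wide estimates.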
To help determine the reliability and accuracy of the CPDF data elements used to draw our sample of federal managers, we checked the data for reasonableness and the presence of any obvious or potential errors in accuracy and completeness. For example, we identified cases where the managers' names were withheld and contacted OPM to determine the reason and extent of this issue. We also checked the names of the managers in our selected sample, as provided by OPM, with the applicable agency contacts to verify that these managers were still employed with the agency in that role. We noted discrepancies when they occurred and excluded those individuals from our population of interest, as applicable. We also reviewed our past analyses of the reliability of the CPDF data.[Footnote 100] On the basis of these procedures, we believe the data we used from the CPDF are sufficiently reliable for the purpose of this report.[Footnote 101] Of the 4,391 managers selected for this survey, we found that 266 of the sampled managers had retired, separated, died, or otherwise left the agency or had some other reason that excluded them from the population of interest. We received usable questionnaires from 2,762 sample respondents, or about 69 percent of the remaining eligible sample. In addition, there were 29 persons whom we were unable to locate and therefore could not ask to participate in the survey.[Footnote 102] The response rate across the 24 agencies ranged from 57 percent to 88 percent. The overall survey results are generalizable to the population of managers as described above at each of the 24 agencies and government-wide. The responses of each eligible sample member who provided a usable questionnaire were weighted in the analyses to account statistically for all members of the population. All results are subject to some uncertainty or sampling error as well as nonsampling error. Because we followed a probability procedure based on random selections, our sample is only one of a large number of samples that we might have drawn. Since each sample could have provided different estimates, we express our confidence in the precision of our particular sample's results as a 95 percent confidence interval. This is the interval that would contain the actual population value for 95 percent of the samples we could have drawn. The percentage estimates presented in this report based on our sample for the 2013 survey have 95 percent confidence intervals within plus or minus 5 percentage points of the estimate itself, unless otherwise noted.[Footnote 103] An online e-supplement shows the questions asked on the survey along with the percentage estimates and associated 95 percent confidence intervals for each item for each agency and government-wide.[Footnote 104] Because a complex survey design was used in the current survey as well as the four previous surveys, and because different types of statistical analyses are being done, the magnitude of sampling error will vary across the particular surveys, groups, or items being compared due to differences in the underlying sample sizes, usable sample respondents, and associated variances of estimates. For example, the 2000 and 2007 surveys were designed to produce agency-level estimates and had effective sample sizes of 2,510 and 2,943, respectively. However, the 1997 and 2003 surveys were designed to obtain government-wide estimates only, and their sample sizes were 905 and 503, respectively.
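To illustrate how such weighted estimates and 95 percent confidence intervals are computed for a stratified sample, the following is a minimal sketch; the stratum sizes and response counts are hypothetical, and GAO's actual weighting and variance estimation for the full 24-agency design are more involved than this illustration.

```python
import math

# Minimal sketch: weighted proportion and 95 percent confidence interval
# for a stratified sample. All counts below are hypothetical.
strata = [
    # population size N_h, respondents r_h, respondents answering "yes" y_h
    {"N": 9000, "r": 150, "y": 60},   # e.g., non-SES managers
    {"N": 1000, "r": 80,  "y": 40},   # e.g., SES managers (oversampled)
]

N = sum(s["N"] for s in strata)

# Weighted proportion: each respondent in stratum h represents N_h / r_h
# managers, which is equivalent to weighting each stratum's proportion
# by its population share N_h / N.
p_hat = sum((s["N"] / N) * (s["y"] / s["r"]) for s in strata)

# Stratified variance with a finite population correction per stratum.
var = sum(
    (s["N"] / N) ** 2
    * (1 - s["r"] / s["N"])                      # finite population correction
    * (s["y"] / s["r"]) * (1 - s["y"] / s["r"])  # p_h * (1 - p_h)
    / (s["r"] - 1)
    for s in strata
)

half_width = 1.96 * math.sqrt(var)  # normal approximation, 95 percent CI
print(f"estimate: {p_hat:.1%} +/- {half_width:.1%}")  # 41.0% +/- 7.1%
```

In this toy example the interval is wider than the plus or minus 5 percentage points cited above because the hypothetical sample is much smaller than the actual survey's; the half-width shrinks as the number of respondents in each stratum grows.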
Consequently, given these differences in survey design, in some instances a difference of a certain magnitude may be statistically significant, while in other instances, depending on the nature of the comparison being made, a difference of equal or even greater magnitude may not achieve statistical significance. We note in this report when we are 95 percent confident that a difference is statistically significant. Also, as part of any interpretation of observed shifts in individual agency responses between the 2013 and the 2000 surveys, it should be kept in mind that components of some agencies and all of the Federal Emergency Management Agency became part of the Department of Homeland Security. In addition to sampling errors, the practical difficulties of conducting any survey may also introduce other types of errors, commonly referred to as nonsampling errors. For example, difficulties in how a particular question is interpreted, in the sources of information available to respondents, or in how the data were entered into a database or analyzed can introduce unwanted variability into the survey results. With this survey, we took a number of steps to minimize these nonsampling errors. For example, our staff with subject matter expertise designed the questionnaire in collaboration with our survey specialists. As noted earlier, the new questions added to the survey were pretested to ensure they were relevant and clearly stated. When the data were analyzed, a second GAO analyst independently verified the analysis programs to ensure the accuracy of the code and the appropriateness of the methods used for the computer-generated analysis. Since this was a web-based survey, respondents entered their answers directly into the electronic questionnaire, eliminating the need to key the data into a database and thereby avoiding a source of data entry error. We conducted this performance audit from August 2012 to June 2013 in accordance with generally accepted government auditing standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives. [End of section] Appendix II: Status of Key Recommendations Related to GPRAMA: Title and GAO product number: Managing for Results: Leading Practices Should Guide the Continued Development of Performance.gov, [hyperlink, http://www.gao.gov/products/GAO-13-517], June 6, 2013; Summary of related recommendations: (1) The Director of OMB--working with the Performance Improvement Council (PIC) and General Services Administration (GSA)--should, consistent with GPRAMA purpose and audience requirements, outline the specific ways that intended audiences could use the information on the Performance.gov website to accomplish tasks, and specify the design changes, such as the integration of Web 2.0 technologies, that would be required to facilitate that use; Status: OMB staff agreed with our recommendation. (2) Once the specific audiences of the site have been identified and defined, the Director of OMB, working with the PIC and GSA, should seek to more systematically collect information on the needs of intended audiences, including through the use of customer satisfaction surveys, and insights into how these could be addressed through the design of Performance.gov.
They should also seek both to ensure that all performance and customer satisfaction metrics, consistent with leading practices outlined in HowTo.gov, are tracked for the website, and to create goals for those metrics to help identify and prioritize potential improvements to Performance.gov; Status: OMB staff agreed with our recommendation. Title and GAO product number: Managing for Results: Agencies Should More Fully Develop Priority Goals under the GPRA Modernization Act, [hyperlink, http://www.gao.gov/products/GAO-13-174], April 19, 2013; Summary of related recommendations: (1) To ensure that agencies can compare actual results to planned performance on a more frequent basis, as appropriate, and demonstrate how they plan to accomplish their goals as well as contribute to the accomplishment of broader federal efforts, the Director of OMB should revise relevant guidance documents to: * provide a definition of what constitutes "data of significant value;" * direct agencies to develop and publish on Performance.gov interim quarterly performance targets for their agency priority goal performance measures when the above definition applies; * direct agencies to provide and publish on Performance.gov both near-term and longer-term completion dates for their milestones; and: * direct agencies to describe in their performance plans how the agency's performance goals--including priority goals--contribute to any of the cross-agency priority goals. When such revisions are made, the Director of OMB should work with the PIC to test and implement these provisions; Status: OMB staff agreed with our recommendation. (2) As OMB works with agencies to enhance Performance.gov to include additional information about agency priority goals, the Director of OMB should ensure that agencies adhere to OMB's guidance for website updates by providing: * complete information about the organizations, program activities, regulations, policies, tax expenditures, and other activities--both within and external to the agency--that contribute to each goal; and: * a description of how input from congressional consultations was incorporated into each goal; Status: OMB staff agreed with our recommendation. Title and GAO product number: Managing for Results: Agencies Have Elevated Performance Management Roles, but Additional Training Is Needed, [hyperlink, http://www.gao.gov/products/GAO-13-356], April 16, 2013; Summary of related recommendations: (1) To improve performance management staff capacity to support performance management in federal agencies, the Director of OPM should, in coordination with the PIC and the Chief Learning Officer Council, work with agencies to: * identify competency areas needing improvement within agencies; * identify agency training that focuses on needed performance management competencies; and: * share information about available agency training on competency areas needing improvement; Status: OPM agreed with our recommendation and explained that it will work with agencies, and in particular with PIOs, to assess the competencies of the performance management workforce. OPM also stated that it will support the use of the PIC's performance learning website to facilitate the identification and sharing of training related to competencies in need of improvement.
(2) To ensure that the PIC has a clear plan for accomplishing its goals and evaluating its progress, the Director of OMB should work with the PIC to: * collect formal feedback on the performance of the PIC from member agencies on an ongoing basis; and: * update its strategic plan and review the PIC's goals, measures, and strategies for achieving performance, and revise them if appropriate; Status: OMB staff agreed with our recommendation. Title and GAO product number: Managing for Results: Data-Driven Performance Reviews Show Promise but Agencies Should Explore How to Involve Other Relevant Agencies, [hyperlink, http://www.gao.gov/products/GAO-13-228], February 27, 2013; Summary of related recommendations: To better leverage agency quarterly performance reviews as a mechanism to manage performance toward agency priority and other agency-level performance goals, the Director of OMB should--working with the PIC and other relevant groups--identify and share promising practices to help agencies extend their quarterly performance reviews to include, as relevant, representatives from outside organizations that contribute to achieving their agency performance goals; Status: OMB staff agreed with our recommendation. Title and GAO product number: Managing for Results: GAO's Work Related to the Interim Crosscutting Priority Goals under the GPRA Modernization Act, [hyperlink, http://www.gao.gov/products/GAO-12-620R], May 31, 2012; Summary of related recommendations: The Director of OMB, in considering additional programs with the potential to contribute to the crosscutting goals, should review the additional departments, agencies, and programs that we have identified, and consider including them in the federal government performance plan, as appropriate; Status: OMB staff agreed with our recommendation. In December 2012 and March 2013, OMB updated the CAP goal information on Performance.gov. OMB included some of the agencies we identified for select goals, but in other instances eliminated key contributors that were previously listed. Source: GAO. [End of table] [End of section] Appendix III: GAO Contact and Staff Acknowledgments: GAO Contact: J. Christopher Mihm, (202) 512-6806 or mihmj@gao.gov: Staff Acknowledgments: In addition to the above contact, Elizabeth Curda (Assistant Director) and Benjamin T. Licht supervised this review and the development of the resulting report. Tom Beall, Peter Beck, Mallory Barg Bulman, Virginia Chanley, Laura Miller Craig, Sara Daleski, Karin Fangman, Stuart Kaufman, Don Kiggins, Judith Kordahl, Jill Lacey, Janice Latimer, Adam Miles, Kathleen Padulchick, Mark Ramage, Daniel Ramsey, Marylynn Sergent, Megan Taylor, Sarah Veale, Kate Hudson Walker, and Dan Webb made significant contributions to this report. Pawnee Davis, Shannon Finnegan, Quindi Franco, Ellen Grady, Robert Gebhart, Tom James, Donna Miller, Michael O'Neill, Robert Robinson, and Stephanie Shipman also made key contributions. [End of section] Footnotes: [1] GAO, 2013 Annual Report: Actions Needed to Reduce Fragmentation, Overlap, and Duplication and Achieve Other Financial Benefits, [hyperlink, http://www.gao.gov/products/GAO-13-279SP] (Washington, D.C.: Apr. 9, 2013); 2012 Annual Report: Opportunities to Reduce Duplication, Overlap and Fragmentation, Achieve Savings, and Enhance Revenue, [hyperlink, http://www.gao.gov/products/GAO-12-342SP] (Washington, D.C.: Feb.
28, 2012); and Opportunities to Reduce Potential Duplication in Government Programs, Save Tax Dollars, and Enhance Revenue, [hyperlink, http://www.gao.gov/products/GAO-11-318SP] (Washington, D.C.: Mar. 1, 2011). For additional information about this body of work, see [hyperlink, http://www.gao.gov/duplication]. [2] GAO, High-Risk Series: An Update, [hyperlink, http://www.gao.gov/products/GAO-13-283] (Washington, D.C.: February 2013). For additional information about this body of work, see [hyperlink, http://www.gao.gov/highrisk]. [3] Pub. L. No. 103-62, 107 Stat. 285 (Aug. 3, 1993). [4] Pub. L. No. 111-352, 124 Stat. 3866 (Jan. 4, 2011). [5] GAO, Managing for Results: GPRA Modernization Act Implementation Provides Important Opportunities to Address Government Challenges, [hyperlink, http://www.gao.gov/products/GAO-11-617T] (Washington, D.C.: May 10, 2011). [6] [hyperlink, http://www.gao.gov/products/GAO-13-279SP]. [7] For additional information about GPRAMA requirements and our related work, see our web page on leading practices for results-oriented management at [hyperlink, http://www.gao.gov/key_issues/managing_for_results_in_government]. [8] Pub. L. No. 111-352, § 15(b)(1). Other reports issued pursuant to this mandate include GAO, Managing for Results: Leading Practices Should Guide the Continued Development of Performance.gov, [hyperlink, http://www.gao.gov/products/GAO-13-517] (Washington, D.C.: June 6, 2013); Managing for Results: Agencies Should More Fully Develop Priority Goals under the GPRA Modernization Act, [hyperlink, http://www.gao.gov/products/GAO-13-174] (Washington, D.C.: Apr. 19, 2013); Managing for Results: Agencies Have Elevated Performance Management Roles, but Additional Training Is Needed, [hyperlink, http://www.gao.gov/products/GAO-13-356] (Washington, D.C.: Apr. 16, 2013); Managing for Results: Data-Driven Performance Reviews Show Promise But Agencies Should Explore How to Involve Other Relevant Agencies, [hyperlink, http://www.gao.gov/products/GAO-13-228] (Washington, D.C.: Feb. 27, 2013); and Managing for Results: GAO's Work Related to the Interim Crosscutting Priority Goals under the GPRA Modernization Act, [hyperlink, http://www.gao.gov/products/GAO-12-620R] (Washington, D.C.: May 31, 2012). [9] 31 U.S.C. § 901(b). The 24 CFO Act agencies, generally the largest federal agencies, are the Departments of Agriculture, Commerce, Defense, Education, Energy, Health and Human Services, Homeland Security, Housing and Urban Development, the Interior, Justice, Labor, State, Transportation, the Treasury, and Veterans Affairs, as well as the Agency for International Development, Environmental Protection Agency, General Services Administration, National Aeronautics and Space Administration, National Science Foundation, Nuclear Regulatory Commission, Office of Personnel Management, Small Business Administration, and Social Security Administration. [10] See GAO, Government Performance: Lessons Learned for the Next Administration on Using Performance Information to Improve Results, [hyperlink, http://www.gao.gov/products/GAO-08-1026T] (Washington, D.C.: July 24, 2008); Results-Oriented Government: GPRA Has Established a Solid Foundation for Achieving Greater Results, [hyperlink, http://www.gao.gov/products/GAO-04-38] (Washington, D.C.: Mar.
10, 2004); Managing for Results: Federal Managers' Views on Key Management Issues Vary Widely Across Agencies, [hyperlink, http://www.gao.gov/products/GAO-01-592] (Washington, D.C.: May 25, 2001); and The Government Performance and Results Act: 1997 Governmentwide Implementation Will Be Uneven, GAO/GGD-97-109 (Washington, D.C.: June 2, 1997). [11] GAO, Managing for Results: 2013 Federal Managers Survey on Organizational Performance and Management Issues, an E-Supplement to [hyperlink, http://www.gao.gov/products/GAO-13-518], [hyperlink, http://www.gao.gov/products/GAO-13-519SP] (Washington, D.C.: June 2013). This supplement does not include responses for demographic and open-ended items. [12] As with previous surveys, we drew our sample from the Office of Personnel Management's Central Personnel Data File. [13] [hyperlink, http://www.gao.gov/products/GAO-04-38]. [14] The White House, Presidential Memorandum, Implementing Government Reform (Washington, D.C.: July 11, 2001), and Presidential Memorandum, Implementing Management Reform in the Executive Branch (Washington, D.C.: Oct. 1, 1993) outlined a chief operating officer role in the federal government; Executive Order No. 13450, Improving Government Program Performance, 72 Fed. Reg. 64519 (Nov. 13, 2007) established performance improvement officer positions in federal agencies; and OMB, Planning for the President's Fiscal Year 2011 Budget and Performance Plans, M-09-20 (Washington, D.C.: June 11, 2009) directed agencies to identify a lead official for achieving each high-priority performance goal. [15] 72 Fed. Reg. 64519. [16] The 14 interim CAP goals cover science, technology, engineering, and math education; veteran career readiness; broadband; entrepreneurship and small businesses; energy efficiency; exports; job training; cybersecurity; sustainability; improper payments (financial management); critical skills gaps (human capital management); data center consolidation (information technology management); strategic sourcing (procurement and acquisition management); and real property management. See [hyperlink, http://www.gao.gov/products/GAO-12-620R] for additional information. [17] See [hyperlink, http://www.gao.gov/products/GAO-13-174] for our assessment of these APGs. [18] See [hyperlink, http://www.gao.gov/products/GAO-13-356] for more information about the scope and methodology for our survey of PIOs at these agencies. [19] All estimates based on our 2013 survey are subject to sampling error. The 95 percent confidence interval for this estimate is within +/-5 percentage points of the estimate. Unless otherwise noted, estimates from our 2013 survey have 95 percent confidence intervals within +/-5 percentage points of the estimate. Appendix I contains additional information about this survey and sampling error. [20] See, for example, OMB, Circular No. A-11, Preparation, Submission, and Execution of the Budget, pt. 6 (August 2012); Delivering an Efficient, Effective, and Accountable Government, M-11-31 (Washington, D.C.: Aug. 17, 2011); and Delivering on the Accountable Government Initiative and Implementing the GPRA Modernization Act of 2010, M-11-17 (Washington, D.C.: Apr. 14, 2011). [21] [hyperlink, http://www.gao.gov/products/GAO-13-356]. [22] [hyperlink, http://www.gao.gov/products/GAO-13-356]. [23] GAO, Managing for Results: Key Considerations for Implementing Interagency Collaborative Mechanisms, [hyperlink, http://www.gao.gov/products/GAO-12-1022] (Washington, D.C.: Sept.
[24] [hyperlink, http://www.gao.gov/products/GAO-13-279SP], [hyperlink, http://www.gao.gov/products/GAO-12-342SP], and [hyperlink, http://www.gao.gov/products/GAO-11-318SP].

[25] [hyperlink, http://www.gao.gov/products/GAO-13-283].

[26] GAO/GGD-97-109.

[27] [hyperlink, http://www.gao.gov/products/GAO-04-38].

[28] [hyperlink, http://www.gao.gov/products/GAO-12-1022].

[29] [hyperlink, http://www.gao.gov/products/GAO-13-283], [hyperlink, http://www.gao.gov/products/GAO-13-279SP], [hyperlink, http://www.gao.gov/products/GAO-12-342SP], and [hyperlink, http://www.gao.gov/products/GAO-11-318SP].

[30] GAO, High-Risk Series: An Update, [hyperlink, http://www.gao.gov/products/GAO-07-310] (Washington, D.C.: Jan. 31, 2007).

[31] In March 2009, the President established the Food Safety Working Group to coordinate federal efforts and establish food safety goals to make food safer.

[32] [hyperlink, http://www.gao.gov/products/GAO-13-283].

[33] GAO, Climate Change Adaptation: Strategic Federal Planning Could Help Government Officials Make More Informed Decisions, [hyperlink, http://www.gao.gov/products/GAO-10-113] (Washington, D.C.: Oct. 7, 2009).

[34] The Council on Environmental Quality coordinates federal environmental efforts and works closely with agencies and other offices in the Executive Office of the President in the development of environmental policies and initiatives.

[35] GAO, Employment for People with Disabilities: Little Is Known about the Effectiveness of Fragmented and Overlapping Programs, [hyperlink, http://www.gao.gov/products/GAO-12-677] (Washington, D.C.: June 29, 2012).

[36] The Domestic Policy Council coordinates the domestic policy-making process in the Executive Office of the President and offers policy advice to the President on domestic policy issues. The council also supervises the execution of domestic policy and represents the President's priorities to Congress.

[37] [hyperlink, http://www.gao.gov/products/GAO-13-283].

[38] GAO, Government Efficiency and Effectiveness: Opportunities to Reduce Fragmentation, Overlap, and Duplication through Enhanced Performance Management and Oversight, [hyperlink, http://www.gao.gov/products/GAO-13-590T] (Washington, D.C.: May 22, 2013).

[39] [hyperlink, http://www.gao.gov/products/GAO-11-318SP].

[40] [hyperlink, http://www.gao.gov/products/GAO-13-283].

[41] Pub. L. No. 112-141, 126 Stat. 405 (July 6, 2012).

[42] Tax expenditures are reductions in a taxpayer's tax liability that are the result of special exemptions and exclusions from taxation, deductions, credits, deferrals of tax liability, or preferential tax rates. For more information, see our key issues page on tax expenditures at [hyperlink, http://www.gao.gov/key_issues/tax_expenditures].

[43] GAO, Tax Expenditures: Background and Evaluation Criteria and Questions, [hyperlink, http://www.gao.gov/products/GAO-13-167SP] (Washington, D.C.: Nov. 29, 2012).

[44] GAO, Government Performance and Accountability: Tax Expenditures Represent a Substantial Federal Commitment and Need to Be Reexamined, [hyperlink, http://www.gao.gov/products/GAO-05-690] (Washington, D.C.: Sept. 23, 2005) and Tax Policy: Tax Expenditures Deserve More Scrutiny, GAO/GGD/AIMD-94-122 (Washington, D.C.: June 3, 1994).
[45] The President's fiscal year 2012 budget stated that developing an evaluation framework is a significant challenge and that the administration's focus is on addressing challenges with data availability and analytical constraints so that it can work toward crosscutting analyses examining tax expenditures alongside related spending programs. The President's fiscal year 2013 and 2014 budgets did not provide an update on these efforts.

[46] [hyperlink, http://www.gao.gov/products/GAO-05-690].

[47] This was reported as part of our updates to specific suggestions for improvements identified in our annual reports on duplication and cost savings, as of March 2013. For this particular update, see tax expenditures, available at [hyperlink, http://www.gao.gov/duplication/action_tracker/1721].

[48] [hyperlink, http://www.gao.gov/products/GAO-13-167SP].

[49] [hyperlink, http://www.gao.gov/products/GAO-13-228].

[50] [hyperlink, http://www.gao.gov/products/GAO-08-1026T].

[51] GAO, Managing for Results: Enhancing Agency Use of Performance Information for Management Decision Making, [hyperlink, http://www.gao.gov/products/GAO-05-927] (Washington, D.C.: Sept. 9, 2005).

[52] See GAO, Results-Oriented Management: Strengthening Key Practices at FEMA and Interior Could Promote Greater Use of Performance Information, [hyperlink, http://www.gao.gov/products/GAO-09-676] (Washington, D.C.: Aug. 17, 2009) and [hyperlink, http://www.gao.gov/products/GAO-08-1026T] for more information on how federal managers from different agencies reported on their use of performance information and various management practices in the 2007 and earlier surveys. We have ongoing work analyzing the agency-level results from our 2013 survey, which will be published in a future report.

[53] GAO, Managing for Results: Federal Managers' Views Show Need for Ensuring Top Leadership Skills, [hyperlink, http://www.gao.gov/products/GAO-01-127] (Washington, D.C.: Oct. 20, 2000).

[54] [hyperlink, http://www.gao.gov/products/GAO-05-927].

[55] [hyperlink, http://www.gao.gov/products/GAO-13-174].

[56] This survey item was introduced in 2007.

[57] See [hyperlink, http://www.gao.gov/products/GAO-05-927] and [hyperlink, http://www.gao.gov/products/GAO-04-38].

[58] 31 U.S.C. §§ 1115(b)(8) and 1116(c)(6).

[59] [hyperlink, http://www.gao.gov/products/GAO-09-676].

[60] Subsection 12(a) of Pub. L. No. 111-352 states, "Not later than 1 year after the date of enactment of this Act, the Director of the Office of Personnel Management, in consultation with the Performance Improvement Council, shall identify the key skills and competencies needed by Federal Government personnel for developing goals, evaluating programs, and analyzing and using performance information for the purpose of improving Government efficiency and effectiveness." Subsections 12(b) and (c) direct OPM, within 2 years after GPRAMA's enactment, to incorporate these skills and competencies into relevant position classifications and to work with each agency to incorporate key skills into training for relevant employees.

[61] Twenty-eight percent of managers agreed to a great or very great extent that their agencies were investing in resources to improve the agencies' capacity to use performance information. Twenty-eight percent responded this way about their agencies investing in resources to ensure that performance data are of sufficient quality. These responses did not change significantly from our 2007 survey.

[62] [hyperlink, http://www.gao.gov/products/GAO-13-356].
[63] [hyperlink, http://www.gao.gov/products/GAO-13-356] and [hyperlink, http://www.gao.gov/products/GAO-13-228].

[64] The 95 percent confidence interval for this estimate is from 67.7 to 83 percent.

[65] For more information on practices that can promote successful data-driven reviews at the federal level, see [hyperlink, http://www.gao.gov/products/GAO-13-228].

[66] GAO, Results-Oriented Management: Opportunities Exist for Refining the Oversight and Implementation of the Senior Executive Performance-Based Pay System, [hyperlink, http://www.gao.gov/products/GAO-09-82] (Washington, D.C.: Nov. 21, 2008) and Managing for Results: Emerging Benefits From Selected Agencies' Use of Performance Agreements, [hyperlink, http://www.gao.gov/products/GAO-01-115] (Washington, D.C.: Oct. 30, 2000).

[67] A workgroup of the National Council on Federal Labor-Management Relations partnered with members of the Chief Human Capital Officers Council to develop GEAR. OPM is piloting this model at five agencies--HUD, DOE, Coast Guard, OPM, and VA.

[68] GAO, Federal Training Investments: Office of Personnel Management and Agencies Can Do More to Ensure Cost-Effective Decisions, [hyperlink, http://www.gao.gov/products/GAO-12-878] (Washington, D.C.: Sept. 17, 2012).

[69] [hyperlink, http://www.gao.gov/products/GAO-13-228]. See also OMB Circular No. A-11, pt. 6 (August 2012).

[70] Across all the years of survey results, the only statistically significant change relative to the 2013 results was between 2000 and 2013 for the item asking about managers' use of performance information in setting individual expectations for employees; however, following the dip in 2000, reported responses returned to the 1997 level.

[71] GAO, Executive Guide: Effectively Implementing the Government Performance and Results Act, GAO/GGD-96-118 (Washington, D.C.: June 1996).

[72] [hyperlink, http://www.gao.gov/products/GAO-08-1026T].

[73] Between the 2007 and 2013 surveys, there was a statistically significant increase in the percentage of managers reporting, to a great or very great extent, that they had quality measures. However, following the dip in reported responses in 2007, the percentage of managers reporting in 2013 that they had quality measures to a great or very great extent was back to 2003 levels.

[74] For example, see [hyperlink, http://www.gao.gov/products/GAO-04-38] and [hyperlink, http://www.gao.gov/products/GAO/GGD-97-109]. Different program types are also referred to as tools of government. See Lester M. Salamon, The Tools of Government: A Guide to the New Governance (New York, NY: Oxford University Press, 2002), 9.

[75] [hyperlink, http://www.gao.gov/products/GAO-13-279SP]. For our most recent work on program evaluation, see GAO, Program Evaluation: Strategies to Facilitate Agencies' Use of Evaluation in Program Management and Policy Making, [hyperlink, http://www.gao.gov/products/GAO-13-570] (Washington, D.C.: June 26, 2013).

[76] GAO, Green Building: Federal Initiatives for the Nonfederal Sector Could Benefit from More Interagency Collaboration, [hyperlink, http://www.gao.gov/products/GAO-12-79] (Washington, D.C.: Nov. 2, 2011).

[77] [hyperlink, http://www.gao.gov/products/GAO-11-199].

[78] Federal managers may be less clear about the availability of their agency's performance information to the public than about its availability within the agency. Twenty-nine percent of managers opted not to respond to the question about publicly available performance information, compared with the four percent who chose not to answer the question about the availability of performance information within their agency.
[79] [hyperlink, http://www.gao.gov/products/GAO-13-517].

[80] A line of business initiative is a cross-agency effort to define, design, implement, and monitor a set of common solutions for a government-wide business function or service. The initiatives' goals generally include improved agency mission performance, reduced government costs through consolidation and standardization, and simplified service delivery.

[81] [hyperlink, http://www.gao.gov/products/GAO-13-517].

[82] [hyperlink, http://www.howto.gov/].

[83] GAO, Managing for Results: Opportunities for Congress to Address Government Performance Issues, [hyperlink, http://www.gao.gov/products/GAO-12-215R] (Washington, D.C.: Dec. 9, 2011).

[84] GAO, Managing for Results: A Guide for Using the GPRA Modernization Act to Help Inform Congressional Decision Making, [hyperlink, http://www.gao.gov/products/GAO-12-621SP] (Washington, D.C.: June 15, 2012).

[85] See, for example, [hyperlink, http://www.gao.gov/products/GAO-12-621SP]; GAO, Congressional Oversight: FAA Case Study Shows How Agency Performance, Budgeting, and Financial Information Could Enhance Oversight, [hyperlink, http://www.gao.gov/products/GAO-06-378] (Washington, D.C.: Mar. 8, 2006); and [hyperlink, http://www.gao.gov/products/GAO-04-38].

[86] Committee on Homeland Security and Governmental Affairs, GPRA Modernization Act of 2010, S. Rep. No. 111-372 (2010), at 5.

[87] 5 U.S.C. § 306(a)(5) and 31 U.S.C. § 1122(b)(1).

[88] OMB is required to consult with the Senate and House Committees on Appropriations, the Senate and House Committees on the Budget, the Senate Committee on Homeland Security and Governmental Affairs, the House Committee on Oversight and Government Reform, the Senate Committee on Finance, the House Committee on Ways and Means, and any other committees as determined appropriate. 31 U.S.C. § 1120(a)(3).

[89] [hyperlink, http://www.gao.gov/products/GAO-12-621SP].

[90] [hyperlink, http://www.gao.gov/products/GAO-13-174].

[91] [hyperlink, http://www.gao.gov/products/GAO-13-517].

[92] [hyperlink, http://www.gao.gov/products/GAO-12-621SP] and [hyperlink, http://www.gao.gov/products/GAO-12-215R].

[93] Pub. L. No. 111-352, § 15(b)(1). Other reports issued pursuant to this mandate include GAO, Managing for Results: Leading Practices Should Guide the Continued Development of Performance.gov, [hyperlink, http://www.gao.gov/products/GAO-13-517] (Washington, D.C.: June 6, 2013); Managing for Results: Agencies Should More Fully Develop Priority Goals under the GPRA Modernization Act, [hyperlink, http://www.gao.gov/products/GAO-13-174] (Washington, D.C.: Apr. 19, 2013); Managing for Results: Agencies Have Elevated Performance Management Roles, but Additional Training Is Needed, [hyperlink, http://www.gao.gov/products/GAO-13-356] (Washington, D.C.: Apr. 16, 2013); Managing for Results: Data-Driven Performance Reviews Show Promise But Agencies Should Explore How to Involve Other Relevant Agencies, [hyperlink, http://www.gao.gov/products/GAO-13-228] (Washington, D.C.: Feb. 27, 2013); and Managing for Results: GAO's Work Related to the Interim Crosscutting Priority Goals under the GPRA Modernization Act, [hyperlink, http://www.gao.gov/products/GAO-12-620R] (Washington, D.C.: May 31, 2012).
[94] 31 U.S.C. § 901(b). The 24 CFO Act agencies, generally the largest federal agencies, are the Departments of Agriculture, Commerce, Defense, Education, Energy, Health and Human Services, Homeland Security, Housing and Urban Development, the Interior, Justice, Labor, State, Transportation, the Treasury, and Veterans Affairs, as well as the Agency for International Development, Environmental Protection Agency, General Services Administration, National Aeronautics and Space Administration, National Science Foundation, Nuclear Regulatory Commission, Office of Personnel Management, Small Business Administration, and Social Security Administration.

[95] OPM has transitioned from the CPDF to the Enterprise Human Resources Integration-Statistical Data Mart (EHRI-SDM) as of fiscal year 2010, but the CPDF still exists as a quarterly extract from the EHRI-SDM. We used the March 2012 extract to draw our sample. Additionally, Foreign Service officials from the Department of State are not in the CPDF. We drew a sample for that group with assistance from State.

[96] For information on the design and administration of the four earlier surveys, see GAO, Government Performance: Lessons Learned for the Next Administration on Using Performance Information to Improve Results, [hyperlink, http://www.gao.gov/products/GAO-08-1026T] (Washington, D.C.: July 24, 2008); Results-Oriented Government: GPRA Has Established a Solid Foundation for Achieving Greater Results, [hyperlink, http://www.gao.gov/products/GAO-04-38] (Washington, D.C.: Mar. 10, 2004); Managing for Results: Federal Managers' Views on Key Management Issues Vary Widely Across Agencies, [hyperlink, http://www.gao.gov/products/GAO-01-592] (Washington, D.C.: May 25, 2001); and The Government Performance and Results Act: 1997 Governmentwide Implementation Will Be Uneven, GAO/GGD-97-109 (Washington, D.C.: June 2, 1997).

[97] In this discussion about placement of survey questions, we refer to the questions by the numbering in the 2013 survey; the numbering is slightly different in previous surveys.

[98] [hyperlink, http://www.gao.gov/products/GAO-13-519SP].

[99] We worked with the following agencies to identify the names of the individuals we selected for our sample--Department of Justice (Federal Bureau of Investigation; the Bureau of Alcohol, Tobacco, Firearms and Explosives; and the Drug Enforcement Administration); Department of Homeland Security (U.S. Secret Service); U.S. Department of Agriculture (Food Safety and Inspection Service); and the Department of the Treasury (Alcohol and Tobacco Tax and Trade Bureau).

[100] For example, GAO, Federal Workers: Results of Studies on Federal Pay Varied Due to Differing Methodologies, [hyperlink, http://www.gao.gov/products/GAO-12-564] (Washington, D.C.: June 22, 2012) and Women's Pay: Gender Pay Gap in the Federal Workforce Narrows as Differences in Occupation, Education, and Experience Diminish, [hyperlink, http://www.gao.gov/products/GAO-09-279] (Washington, D.C.: Mar. 17, 2009).

[101] We previously reported that government-wide data from the CPDF were 96 percent or more accurate. See GAO, OPM's Central Personnel Data File: Data Appear Sufficiently Reliable to Meet Most Customer Needs, GAO/GGD-98-199 (Washington, D.C.: Sept. 30, 1998). Also, in a document dated February 28, 2008, an OPM official confirmed that OPM continues to follow the CPDF data quality standards and procedures contained in our 1998 report.

[102] These 29 are included among the nonrespondents to this survey.
[103] Government-wide estimates are for the population of all managers at the 24 agencies combined.

[104] [hyperlink, http://www.gao.gov/products/GAO-13-519SP].

[End of section]

GAO’s Mission:

The Government Accountability Office, the audit, evaluation, and investigative arm of Congress, exists to support Congress in meeting its constitutional responsibilities and to help improve the performance and accountability of the federal government for the American people. GAO examines the use of public funds; evaluates federal programs and policies; and provides analyses, recommendations, and other assistance to help Congress make informed oversight, policy, and funding decisions. GAO’s commitment to good government is reflected in its core values of accountability, integrity, and reliability.

Obtaining Copies of GAO Reports and Testimony:

The fastest and easiest way to obtain copies of GAO documents at no cost is through GAO’s website [hyperlink, http://www.gao.gov]. Each weekday afternoon, GAO posts on its website newly released reports, testimony, and correspondence. To have GAO e-mail you a list of newly posted products, go to [hyperlink, http://www.gao.gov] and select “E-mail Updates.”

Order by Phone:

The price of each GAO publication reflects GAO’s actual cost of production and distribution and depends on the number of pages in the publication and whether the publication is printed in color or black and white. Pricing and ordering information is posted on GAO’s website, [hyperlink, http://www.gao.gov/ordering.htm]. Place orders by calling (202) 512-6000, toll free (866) 801-7077, or TDD (202) 512-2537. Orders may be paid for using American Express, Discover Card, MasterCard, Visa, check, or money order. Call for additional information.

Connect with GAO:

Connect with GAO on Facebook, Flickr, Twitter, and YouTube. Subscribe to our RSS Feeds or E-mail Updates. Listen to our Podcasts. Visit GAO on the web at [hyperlink, http://www.gao.gov].

To Report Fraud, Waste, and Abuse in Federal Programs:

Contact:
Website: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm];
E-mail: fraudnet@gao.gov;
Automated answering system: (800) 424-5454 or (202) 512-7470.

Congressional Relations:

Katherine Siggerud, Managing Director, siggerudk@gao.gov:
(202) 512-4400:
U.S. Government Accountability Office:
441 G Street NW, Room 7125:
Washington, DC 20548.

Public Affairs:

Chuck Young, Managing Director, youngc1@gao.gov:
(202) 512-4800:
U.S. Government Accountability Office:
441 G Street NW, Room 7149:
Washington, DC 20548.

[End of document]