Program Evaluation: Studies Helped Agencies Measure or Explain Program Performance

GGD-00-204: Published: Sep 29, 2000. Publicly Released: Sep 29, 2000.

Contact:

Nancy R. Kingsbury
(202) 512-2700
contact@gao.gov

Office of Public Affairs
(202) 512-4800
youngc1@gao.gov

Pursuant to a congressional request, GAO reviewed how federal agencies used evaluation studies to report on their achievements, focusing on: (1) how program evaluation studies or methods were used in performance reporting; and (2) the circumstances that led agencies to conduct evaluations.

GAO noted the following:

  • Evaluations helped the agencies improve their measurement of program performance or their understanding of performance and how it might be improved; some studies did both.
  • To help improve their performance measurement, two agencies used the findings of effectiveness evaluations to provide data on program results that were otherwise unavailable.
  • One agency supported a number of studies to help states prepare the groundwork for, and pilot-test, future performance measures.
  • Another agency used evaluation methods to validate the accuracy of existing performance data.
  • To better understand program performance, one agency reported evaluation and audit findings that addressed other operational concerns about the program.
  • Four agencies drew on evaluations to explain the reasons for observed performance or to identify ways to improve performance.
  • Three agencies compared their programs' results with estimates of what might have happened in the programs' absence in order to assess each program's net impact or contribution to results.
  • Two of the evaluations GAO reviewed were initiated in response to legislative provisions, but most of the studies were self-initiated by agencies in response to concerns about a program's performance or the availability of outcome data.
  • Some studies were initiated by agencies for reasons unrelated to meeting Government Performance and Results Act requirements and thus served purposes beyond those they were designed to address.
  • In some cases, evaluations were launched to identify the reasons for poor program performance and to learn how it could be remedied.
  • In other cases, agencies initiated special studies because they faced challenges in collecting outcome data on an ongoing basis.
  • One departmentwide study was initiated to direct attention to an issue that cut across program boundaries and agencies' responsibilities.
  • As agencies governmentwide update their strategic and performance plans, the examples in this report might help them identify ways that evaluations can contribute to understanding their programs' performance.
  • These cases also provide examples of ways agencies might leverage their evaluation resources: drawing on the findings of a wide array of evaluations and audits, making multiple use of an evaluation's findings, mining existing databases, and collaborating with state and local program partners to develop mutually useful performance data.
