This is the accessible text file for GAO report number GAO-11-87 
entitled 'Recovery Act: Department of Justice Could Better Assess 
Justice Assistance Grant Program Impact' which was released on October 
18, 2010. 

This text file was formatted by the U.S. Government Accountability 
Office (GAO) to be accessible to users with visual impairments, as 
part of a longer term project to improve GAO products' accessibility. 
Every attempt has been made to maintain the structural and data 
integrity of the original printed product. Accessibility features, 
such as text descriptions of tables, consecutively numbered footnotes 
placed at the end of the file, and the text of agency comment letters, 
are provided but may not exactly duplicate the presentation or format 
of the printed version. The portable document format (PDF) file is an 
exact electronic replica of the printed version. We welcome your 
feedback. Please E-mail your comments regarding the contents or 
accessibility features of this document to Webmaster@gao.gov. 

This is a work of the U.S. government and is not subject to copyright 
protection in the United States. It may be reproduced and distributed 
in its entirety without further permission from GAO. Because this work 
may contain copyrighted images or other material, permission from the 
copyright holder may be necessary if you wish to reproduce this 
material separately. 

Report to Congressional Requesters: 

United States Government Accountability Office:
GAO: 

October 2010: 

Recovery Act: 

Department of Justice Could Better Assess Justice Assistance Grant 
Program Impact: 

GAO-11-87: 

This report was revised on November 1, 2010. Specifically, the opening 
paragraph of Appendix VI and the map that followed were deleted. The 
paragraph was replaced with the following: "This appendix provides the 
full printed text of the interactive content in figure 4 on page 22 in 
the body of the report. Specifically, the following figures describe 
planned uses of Recovery Act Justice Assistance Grant (JAG) funds by 
each State Administering Agency (SAA) across our 14 sample states, 
which are listed in alphabetical order by state name." None of these 
changes affect this report's conclusions or recommendations. 

GAO Highlights: 

Highlights of GAO-11-87, a report to congressional requesters. 

Why GAO Did This Study: 

Under the American Recovery and Reinvestment Act of 2009 (Recovery 
Act), the U.S. Department of Justice’s (DOJ) Bureau of Justice 
Assistance (BJA) awarded nearly $2 billion in 4-year Edward Byrne 
Memorial Justice Assistance Grant (JAG) funds to state and local 
governments for criminal justice activities. As requested, GAO 
examined: (1) how Recovery Act JAG funds are awarded and how 
recipients in selected states and localities used their awards; (2) 
challenges, if any, selected recipients reported in complying with 
Recovery Act reporting requirements; (3) the extent to which states 
shared promising practices related to use and management of funds, and 
how, if at all, DOJ encouraged information sharing; and (4) the extent 
to which DOJ’s JAG Recovery Act performance measures were consistent 
with promising practices. GAO analyzed recipient spending and 
performance data submitted as of June 30, 2010; interviewed officials 
in a nonprobability sample of 14 states and 62 localities selected 
based on the amount of their awards, planned activities, and their 
reported project status; assessed 19 JAG performance measures against 
a set of key attributes; and interviewed agency officials. 

What GAO Found: 

Recipients of Recovery Act JAG funding in the 14 states GAO reviewed 
received more than $1 billion either through direct allocations from 
DOJ or through an indirect “pass-through” of funds that states 
originally received from the department. These recipients reported 
using their funds for a variety of purposes, though predominantly for 
law enforcement and corrections, which included equipment purchases or 
the hiring or retaining of personnel. More than half of the funding 
that state administering agencies (SAA) passed through to localities 
was reported to be specifically for law enforcement and corrections 
activities, while localities receiving direct awards more often 
reported planning to use their funds for multiple types of criminal 
justice activities. Officials in all 14 states and 19 percent of 
localities in GAO’s sample (12 of 62) said that without Recovery Act 
JAG funding, support for certain ongoing local law enforcement 
programs or activities would have been eliminated or cut. Overall, 
about $270 million or 26 percent of Recovery Act JAG funds had been 
reported as expended as of June 30, 2010, but the expenditure rates of 
funds awarded through SAAs showed considerable variation, ranging from 
5 to 41 percent of SAAs' total awards. 

State officials cited challenges in meeting quarterly Recovery Act 
reporting time frames. Officials from the majority of states in GAO’s 
sample said that workload demands and personnel shortages made meeting 
Recovery Act deadlines within the prescribed reporting period 
difficult; however, all states reported that they were able to do so. 

States reported sharing information and promising practices related to 
JAG activities in a variety of ways and DOJ encouraged this sharing 
through a number of programs. More than half of state agencies in 
GAO’s sample generally reported sharing promising practices or lessons 
learned on topics, such as grant management and administration, with 
other states and localities through participating in law enforcement 
and government association conferences, DOJ training, and Web 
postings, among other methods. 

DOJ established new performance measures to assess the Recovery Act 
JAG program and is working to refine them; however, these measures 
lack key attributes of successful performance assessment systems that 
GAO has previously identified, such as clarity, reliability, a linkage 
to strategic or programmatic goals, and objectivity and measurability 
of targets. Including such attributes could facilitate accountability 
and management’s ability to meaningfully assess and monitor Recovery 
Act JAG’s results. DOJ officials acknowledge that weaknesses exist and 
they plan to improve their performance measures. For example, the 
department already took initial steps to incorporate feedback from 
some states with regard to clarifying the definitions of some 
performance measures; however, its assessment tool lacks a process to 
verify the accuracy of the data that recipients self-report to gauge 
their progress. By including attributes consistent with promising 
practices in its performance measures, DOJ could be better positioned 
to determine whether Recovery Act JAG recipients’ programs are meeting 
DOJ and Recovery Act goals. In addition, by establishing a mechanism 
to verify the accuracy of recipient reports, DOJ can better ensure the 
reliability of the information that recipients provide. 

What GAO Recommends: 

GAO recommends that DOJ (1) continue to revise Recovery Act JAG 
performance measures and consider, as appropriate, including key 
attributes of successful performance measurement systems, and (2) 
develop a mechanism to validate the integrity of self-reported 
performance data. DOJ concurred with these recommendations. 

View [hyperlink, http://www.gao.gov/products/GAO-11-87] or key 
components. For more information, contact David C. Maurer at (202) 512-
9627 or maurerd@gao.gov. 

[End of section] 

Contents: 

Letter: 

Background: 

Recovery Act JAG Funds Are Awarded in Different Ways and Recipients 
Report Using Their Awards to Support Law Enforcement and Corrections 
Activities Among Other Things: 

State Administering Agencies Cited Challenges Meeting Quarterly 
Recovery Act Reporting Time Frames: 

States Reported Sharing Information and Promising Practices in a 
Variety of Ways and DOJ Encouraged This through a Number of Programs: 

DOJ's Performance Measures Could Better Assess Progress Consistent 
with Characteristics of Successful Performance Measurement Systems: 

Conclusions: 

Recommendations: 

Agency Comments and Our Evaluation: 

Appendix I: Scope and Methodology: 

Appendix II: Recovery Act JAG Performance Measures: 

Appendix III: GAO Assessment of Whether DOJ's Recovery Act JAG 
Performance Measures Possessed Certain Key Attributes: 

Appendix IV: Recovery Act JAG Award Drawdowns and Expenditures: 

Appendix V: Examples of Use of Recovery Act JAG Funds for Equipment 
Purchases: 

Appendix VI: Full Text for Figure 4 Map of SAAs and Planned Uses of 
Recovery Act JAG Awards by the Seven Allowable Program Categories 
across 14 Sample States: 

Appendix VII: Comments from the Department of Justice: 

Appendix VIII: GAO Contact and Staff Acknowledgments: 

Tables: 

Table 1: Recovery Act JAG Program Areas with Illustrative Examples of 
Possible Fund Use: 

Table 2: Recovery Act JAG Awards across Our 14 Sample States, as of 
June 30, 2010: 

Table 3: Recovery Act JAG Disparate Jurisdiction Awards across Our 14 
Sample States, as of June 30, 2010: 

Table 4: State and Local Recipients' Reported Use of Recovery Act JAG 
Funds to Prevent Staff, Programs, or Services from Being Cut or 
Eliminated: 

Table 5: Percent Share of SAA's Reported Recovery Act JAG Obligations 
by Program Area and across Our 14 Sample States, as of June 30, 2010: 

Table 6: Key Characteristics of Individual Performance Measures: 

Table 7: Activity Types Included in Our Recovery Act JAG Performance 
Measure Review: 

Table 8: Recovery Act JAG Performance Measures Associated with the 
Activities Predominantly Undertaken by Recipients across Our 14 Sample 
States: 

Table 9: GAO Assessment of Whether DOJ's Recovery Act JAG Performance 
Measures Possessed Certain Key Attributes: 

Table 10: Recovery Act JAG Drawdowns across Our Sample States, as of 
May 2010: 

Figures: 

Figure 1: Illustration of a Disparate Jurisdiction: 

Figure 2: Recovery Act JAG Funds Expended by the SAAs across our 14 
Sample States, as of June 30, 2010: 

Figure 3: SAA Awards of Recovery Act JAG Funds by the Seven Allowable 
Program Categories across Our 14 Sample States: 

Figure 4: Map of SAAs and Planned Uses of Recovery Act JAG Awards by 
the Seven Allowable Program Categories Across our 14 Sample States: 

Figure 5: Planned Uses of Recovery Act JAG Awards to Direct Recipients 
by the Seven Allowable Program Categories across Localities Within our 
14 Sample States: 

Figure 6: Illustrative Examples of Equipment Purchased with Recovery 
Act JAG Funding across Localities within our 14 Sample States: 

Abbreviations: 

ARRA: American Recovery and Reinvestment Act of 2009: 

BJA: Bureau of Justice Assistance: 

BJS: Bureau of Justice Statistics: 

COPS: Community Oriented Policing Services: 

CHRP: COPS Hiring Recovery Program: 

CSG: Council of State Governments: 

DOJ: Department of Justice: 

FBI: Federal Bureau of Investigation: 

JAG: Justice Assistance Grant: 

MOU: Memorandum of Understanding: 

NCJA: National Criminal Justice Association: 

NGA: National Governors Association: 

OMB: Office of Management and Budget: 

PMT: Performance Measurement Tool: 

SAA: State Administering Agency: 

TASER: Thomas A. Swift's Electric Rifle: 

UCR: Uniform Crime Report: 

[End of section] 

United States Government Accountability Office:
Washington, DC 20548: 

October 15, 2010: 

The Honorable John Conyers, Jr. 
Chairman: 
The Honorable Lamar Smith: 
Ranking Member: 
Committee on the Judiciary: 
House of Representatives: 

The Honorable Robert C. "Bobby" Scott: 
Chairman: 
The Honorable Louie Gohmert: 
Ranking Member: 
Subcommittee on Crime, Terrorism, and Homeland Security: 
Committee on the Judiciary: 
House of Representatives: 

The recession that began in December 2007 caused states and localities 
significant immediate fiscal pressures in the form of reduced tax 
revenues and increased demand for certain programs, including criminal 
justice programs. Under the American Recovery and Reinvestment Act of 
2009[Footnote 1] (Recovery Act), the existing Edward Byrne Memorial 
Justice Assistance Grant (JAG) Program, which the Department of 
Justice's (DOJ) Bureau of Justice Assistance (BJA) administers, 
provided an additional $2 billion to state and local governments 
through 4-year, formula-based grants.[Footnote 2] JAG Program funds 
support local efforts to prevent and control crime and improve the 
criminal justice system through activities such as drug reduction and 
domestic violence prevention. The Recovery Act JAG Program also 
attempts to meet the overall purposes of the Recovery Act, which 
include promoting economic recovery, making investments to provide 
long-term economic benefits, and stabilizing state and local 
government budgets to minimize and avoid reductions in essential 
services. 

The Recovery Act emphasizes the need for accountability and 
transparency in the expenditure of Recovery Act funds and makes it a 
central principle of the act's implementation. Importantly, the 
transparency envisioned for tracking Recovery Act spending and results 
is an extensive undertaking for the federal government; tracking 
billions of dollars disbursed to thousands of recipients is an 
enormous effort. The administration expects that 
achieving this degree of visibility will be iterative, whereby both 
the reporting process and the information recipients provide improve 
over time and, if successful, could be a model for transparency and 
oversight beyond the Recovery Act. Thus, Recovery Act JAG funding 
recipients are required to meet federal reporting requirements that 
are in addition to the requirements DOJ established for non-Recovery 
Act JAG program recipients. Specifically, Recovery Act JAG recipients 
are required to provide quarterly status reports on the amount and use 
of such funds and information concerning jobs created or retained by 
the use of these funds. Other than the additional reporting 
requirements, however, the Recovery Act JAG program did not alter the 
structure, purpose, or funding allocation methods of the preexisting 
JAG program.[Footnote 3] 

Consistent with the preexisting program, states and localities can use 
their Recovery Act JAG grant funds over a period of 4 years to support 
a range of activities in seven broad statutorily established program 
areas: (1) law enforcement; (2) prosecution and courts; (3) crime 
prevention and education; (4) corrections; (5) drug treatment and 
enforcement; (6) program planning, evaluation, and technology 
improvement; and (7) crime victim and witness programs. Across the 
seven areas, recipients can use JAG funds for state and local 
initiatives--which are generally designed to improve a program, 
service, or system, or support training, personnel, or equipment. 

You requested that we examine the Recovery Act JAG Program. This 
report addresses the following questions: 

* How are Recovery Act JAG funds awarded and how have recipients in 
selected states and localities used their awards? 

* What challenges, if any, have selected Recovery Act JAG recipients 
reported in complying with Recovery Act reporting requirements? 

* To what extent do states share promising practices related to the 
use and management of Recovery Act JAG funds, and how, if at all, does 
DOJ encourage information sharing? 

* To what extent are DOJ's Recovery Act JAG performance measures 
consistent with promising practices? 

This report expands upon our May 2010 Recovery Act report, which 
described selected states' uses of JAG funding and accountability 
provisions related to Recovery Act JAG, as well as our July 2009 
Recovery Act report, which discussed observations of Recovery Act JAG 
fund obligations and planned uses of the funds.[Footnote 4] In July 
2009, we reported that the 16 states and the District of Columbia in 
our review had not obligated their total Recovery Act JAG awards, in 
part because they were determining how the funds would be used and 
passed through to local entities. In our May 2010 report, we visited 7 
of the states from our July 2009 sample and found that all 7 had 
obligated their Recovery Act JAG awards and reported planned uses 
consistent with their states' priorities and BJA's allowable uses of 
JAG funds.[Footnote 5] 

To conduct our work for this review, we evaluated Recovery Act JAG 
awards in a nonprobability sample of 14 states. The states we selected 
for our review of Recovery Act JAG spending are a subset of a 16-state 
(plus the District of Columbia) sample that we used for our earlier 
Recovery Act work, but we did not include Florida, New Jersey, or the 
District of Columbia since the DOJ Office of the Inspector General was 
already engaged in audit work on the JAG program in these states. 
[Footnote 6] The awards to the 14 states in this review accounted for 
approximately 50 percent of all of the Recovery Act JAG funds 
provided. Where statements are attributed to state and local 
officials, we did not analyze state and locality data sources but 
relied on state and local officials and other state sources for 
relevant state data and materials. We also tabulated and analyzed some 
recipient-reported data submitted to Recovery.gov for the quarterly 
reports that had been due as of June 30, 2010.[Footnote 7] We used 
these data because they are the official source of Recovery Act 
spending data and determined that they were sufficiently reliable for 
the purposes of this report.[Footnote 8] We reviewed the relevant 
guidance DOJ provides to Recovery Act JAG recipients on financial and 
program reporting as well as Recovery Act guidance related to federal 
recipient reporting to understand federal reporting requirements and 
associated time frames and interviewed DOJ officials who administer 
the Recovery Act JAG program.[Footnote 9] 

We also conducted interviews with officials in the state agencies that 
administer Recovery Act JAG funds--known as State Administering 
Agencies (SAA)--in the 14 states we selected for review. In addition, 
we selected a nonprobability sample of 62 local law enforcement 
agencies and other recipients receiving Recovery Act JAG funds within 
these 14 states and conducted interviews with cognizant officials from 
those jurisdictions that received the awards. These jurisdictions were 
selected based on award amount, degree of project completion, planned 
use of funds, and how they received their funds (either as 
pass-through funding from their SAA or as awards received directly 
from DOJ--and in some cases as part of disparate jurisdictions). Our 
interviews addressed the use and perceived impact 
of Recovery Act JAG funds, program performance measurement and 
reporting challenges, and the sharing of promising practices. Findings 
from our nonprobability samples cannot be generalized to all states 
and localities that were recipients of Recovery Act JAG funds; 
however, our samples provided us with illustrative examples of uses of 
funds, oversight processes, and reporting issues. Finally, we 
discussed DOJ's performance measurement efforts with DOJ staff and 
conducted an assessment of the performance measures applicable to the 
Recovery Act JAG activities commonly undertaken by the grant 
recipients in our sample to assess the extent to which they contained 
elements consistent with promising practices. Specifically, from DOJ's 
86 Recovery Act JAG performance measures, we selected a nonprobability 
sample of 19 that were (1) related to the largest share of reported 
Recovery Act JAG expenditures across certain activity types and (2) 
most often reported by the recipients in our sample.[Footnote 10] We 
then analyzed this sample against a set of key characteristics that we 
have previously reported as being associated with individual measures 
in successful performance measurement systems.[Footnote 11] See 
appendix I for a more complete description of our methodology and 
appendix II for a list and definition of the 19 performance measures 
we assessed. 

We conducted this performance audit from January 2010 through October 
2010 in accordance with generally accepted government auditing 
standards. Those standards require that we plan and perform the audit 
to obtain sufficient, appropriate evidence to provide a reasonable 
basis for our findings and conclusions based on our audit objectives. 
We believe that the evidence obtained provides a reasonable basis for 
our findings and conclusions based on our audit objectives. 

Background: 

JAG Purpose Areas: 

According to DOJ officials, the JAG program provides states and 
localities with federal funds to support all components of the 
criminal justice system while providing a great deal of flexibility in 
how they do so. Recovery Act JAG-funded projects may provide services 
directly to communities or improve the effectiveness and efficiency of 
criminal justice systems, processes, or procedures. Like non-Recovery 
Act JAG funds, Recovery Act JAG awards are to be used within the 
context of seven statutorily established areas. The seven statutorily 
[Footnote 12] established areas and examples of how JAG funds may be 
used within these areas are outlined in table 1 below. 

Table 1: Recovery Act JAG Program Areas with Illustrative Examples of 
Possible Fund Use: 

Program area: Law enforcement; 
Examples of some allowable uses of funds: Funds may be used for 
personnel costs and purchasing equipment. 
Personnel: 
Hiring, training, and employing on a continuing basis new or 
additional law enforcement officers and support personnel; Paying 
overtime to employed law enforcement officers and support personnel 
for the purposes of increasing the number of hours worked by such 
personnel; 
Equipment: 
Procuring equipment, computer technology, and other materials directly 
related to basic law enforcement functions. 

Program area: Prosecution and courts; 
Examples of some allowable uses of funds: Funds may be used for 
improving the operational effectiveness of the court process by 
expanding prosecutorial, defender and judicial resources and 
implementing court delay reduction programs. 

Program area: Crime prevention and education; 
Examples of some allowable uses of funds: Funds may be used for 
providing community and neighborhood programs that assist citizens in 
preventing and controlling crime, including special programs that 
address the problems of crime committed against the elderly and 
special programs for rural jurisdictions. Funds may be used for 
establishing cooperative crime prevention programs between community 
residents. 

Program area: Corrections and community corrections; 
Examples of some allowable uses of funds: Funds may be used for 
programs designed to provide additional public correctional resources 
and improve the corrections system, including treatment in prisons and 
jails, intensive supervision programs and long-range corrections and 
sentencing strategies. Programs can include: (1) intensive 
supervision, probation, and parole; (2) substance abuse treatment; 
(3) correctional facilities planning/population projections; and (4) 
sentencing strategies development. 

Program area: Drug treatment and enforcement; 
Examples of some allowable uses of funds: Funds may be used for 
establishing or supporting drug court programs that include continuing 
judicial supervision over nonviolent offenders with substance abuse 
problems. Funds may also be used for programs, such as substance abuse 
treatment and relapse prevention, as well as multijurisdictional drug 
task forces. 

Program area: Planning, evaluation, and technology improvement; 
Examples of some allowable uses of funds: Funds may be used for 
criminal justice information systems to assist law enforcement, 
prosecution, courts, and corrections organizations. Examples of such 
information systems can include criminal justice records improvement 
and automated fingerprint identification systems. 

Program area: Crime victim and witness; 
Examples of some allowable uses of funds: Funds may be used to develop 
and implement programs which provide assistance to witnesses and 
assistance (other than compensation) to victims of crime. 

Source: GAO. 

[End of table] 

Financial Requirements and Internal Controls: 

DOJ requires that all Recovery Act JAG award recipients establish and 
maintain adequate accounting systems, financial records, and internal 
controls to accurately account for funds awarded to them and their 
subrecipients. Award recipients must also ensure that Recovery Act JAG 
funds are accounted for separately and not commingled with funds from 
other sources or federal agencies. If a recipient or subrecipient's 
accounting system cannot comply with the requirement to account for 
the funds separately, then the recipient/subrecipient is to establish 
a system to provide adequate fund accountability for each project that 
has been awarded. 

Recipient Reporting and Performance Measurement Requirements: 

All state and local Recovery Act JAG recipients are required to meet 
both Recovery Act and BJA quarterly reporting requirements. The 
Recovery Act requires that nonfederal recipients of Recovery Act funds 
(including recipients of grants, contracts, and loans) submit 
quarterly reports, which include a description of each project or 
activity for which Recovery Act funds were expended or obligated, and 
an estimate of the number of jobs created and the number of jobs 
retained by these projects and activities.[Footnote 13] In particular, 
the Recovery Act requires recipients to report on quarterly activities 
within 10 days of the end of each quarter. For Recovery Act JAG 
grants, BJA has added language in the grant awards that requires that 
grantees meet the federal reporting requirements and provides 
sanctions if they do not. Because the Recovery Act JAG program 
includes a pass-through element, SAAs must gather the required data 
elements for all pass-through recipients during the same 10-day time 
frame in order to meet their own reporting requirements. 

Separately, BJA requires that states and those localities receiving 
their funds directly through DOJ report on their progress in meeting 
established performance measures related to funded activities. 
[Footnote 14] BJA also requires all Recovery Act JAG recipients to 
submit an annual programmatic report with narrative information on 
accomplishments, barriers, and planned activities, as well as a 
quarterly financial status report as required by the Office of 
Management and Budget (OMB). In early 2010, after a year-long 
development and initial refinement period, BJA officially launched a 
new, online Performance Measurement Tool (PMT) to improve upon its 
previous grants management system and allow online performance 
measurement data submission.[Footnote 15] BJA plans to use the PMT to 
help evaluate performance outcomes in at least 13 grant programs, 
including Recovery Act JAG. According to the Standards for Internal 
Control in the Federal Government, activities need to be established 
to monitor performance measures and indicators. Such controls should 
be aimed at validating the integrity of performance measures and 
indicators--in other words, ensuring they are reliably designed to 
collect consistent information from respondents. BJA is also planning 
on using the PMT to assess performance measurement data and direct 
improvement efforts in 5 additional programs by the end of 2010. 
[Footnote 16] However, given that grantees were not required to submit 
their PMT reports until the second quarter of fiscal year 2010, some 
grantees did not begin submitting their first completed PMT reports 
until March 2010.[Footnote 17] 

BJA requires Recovery Act JAG recipients to use the PMT for quarterly 
reporting on their status in meeting the Recovery Act JAG program's 86 
individual performance measures, such as percent of staff who reported 
an increase in skills and percent of Recovery Act JAG-funded programs 
that have implemented recommendations based on program evaluation. 

Recovery Act JAG Funds Are Awarded in Different Ways and Recipients 
Report Using Their Awards to Support Law Enforcement and Corrections 
Activities among Other Things: 

Recipients of Recovery Act JAG funding receive their money in one of 
two ways--either as a direct payment from BJA or as a pass-through 
from an SAA--and they reported using their funds primarily for law 
enforcement and corrections. According to state officials from our 
sample states, more than half of the funding that localities received 
as pass-through awards from their SAAs was obligated specifically for 
law enforcement and corrections support, while about a quarter of the 
funds that recipients of direct awards received was dedicated 
exclusively to law enforcement. Regardless of the source, officials in 
states and localities reported using Recovery Act JAG funds to 
preserve jobs and activities that without Recovery Act JAG funds would 
have been cut or eliminated; however, expenditure rates across states 
in our sample showed considerable variation. 

Localities Receive Funding either Directly from BJA or as a 
Pass-Through from an SAA: 

BJA allocates Recovery Act JAG funds the same way it allocates 
non-Recovery Act JAG funds, by combining a statutory formula determined by 
states' populations and violent crime statistics with a statutory 
minimum allocation to ensure that each state and eligible territory 
receives some funding. Under this statutory JAG formula, the total 
award allocated to a state is derived from two sources, each given 
equal value: half of the allocation is based on a state's respective 
share of the U.S. population, and the other half is based on the 
state's respective share of violent crimes, as reported in the Federal 
Bureau of Investigation's (FBI) Uniform Crime Report (UCR) Part I for 
the 3 most recent years for which data are available.[Footnote 18] Of 
the amount allocated to each state, 60 percent is awarded directly to 
the state's SAA, and each SAA must in turn allocate a formula-based 
share of these funds to local entities, which is known as the 
"pass-through portion."[Footnote 19] 
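
The following sketch illustrates the arithmetic of the formula 
described above. It is a simplified, hypothetical example: the 
function name, dollar amounts, and population and crime shares are 
invented for illustration, and the statutory minimum allocation and 
BJA's rounding rules are omitted. 

```python
# Simplified, hypothetical sketch of the JAG allocation formula described
# above. Half of a state's allocation follows its share of the U.S.
# population and half follows its share of UCR Part I violent crime; 60
# percent of the result goes to the SAA and 40 percent directly to units
# of local government. Statutory minimums and rounding rules are omitted.

TOTAL_APPROPRIATION = 2_000_000_000  # approximate nationwide Recovery Act JAG funds


def state_allocation(pop_share, crime_share, total=TOTAL_APPROPRIATION):
    """Return (total state award, SAA share, direct-to-locality share)."""
    award = total * (0.5 * pop_share + 0.5 * crime_share)
    return award, 0.60 * award, 0.40 * award


# Hypothetical state with 2.0 percent of the U.S. population and 2.4
# percent of reported violent crime:
award, saa_share, local_share = state_allocation(0.020, 0.024)
print(f"Total award: ${award:,.0f}")        # $44,000,000
print(f"SAA (60%):   ${saa_share:,.0f}")    # $26,400,000
print(f"Local (40%): ${local_share:,.0f}")  # $17,600,000
```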

BJA awards the remaining 40 percent of the state's allocation directly 
to eligible units of local government within the state.[Footnote 20] 
The eligible units of local governments that receive direct awards 
from DOJ either get them individually or as part of awards to 
"disparate" jurisdictions which jointly use correctional facilities or 
prosecutorial services.[Footnote 21] In the cases of the disparate 
jurisdiction awards, to qualify for funds, the units of local 
government involved must submit a joint application to DOJ and sign a 
memorandum of understanding (MOU) outlining how they will share funds. 
They also are to determine amongst themselves which local government 
will serve as the fiscal agent, and thereby be responsible for 
reporting to DOJ on behalf of the others and ensuring that all members 
of the disparate jurisdiction follow applicable federal financial 
guidance and meet reporting requirements. The following figure 
illustrates the participation of localities in a disparate 
jurisdiction award. In the example, High Point city is the fiscal 
agent and Greensboro city and Guilford County are both subrecipients. 

Figure 1: Illustration of a Disparate Jurisdiction: 

[Refer to PDF for image: map of North Carolina; inset of Guilford 
County] 

Fiscal agent (prime recipient) of the grant: High Point; 
Subrecipients of the grant: Greensboro and Guilford County. 

Sources: U.S. Census Bureau, Census 2000 (state map); Guilford County, 
NC Department of Geographic Information Services (county map). 

[End of figure] 

The total awards that DOJ allocates directly to units of local 
government--the 40 percent share--are to be based solely on the local 
jurisdiction's proportion of the state's total violent crime 3-year 
average based on reports from the FBI's UCR Part I. Units of local 
government that could receive $10,000 or more after the Bureau of 
Justice Statistics (BJS) analyzes the UCR data are eligible for a 
direct award from DOJ. Funds that could have been distributed to 
localities through awards of less than $10,000 are grouped together 
and then provided to the SAA. Under the JAG program, SAAs and direct 
grant recipient agencies may draw down funds from the Treasury 
immediately rather than requiring up-front expenditure and 
documentation for reimbursement. Such funds are required to be 
deposited into an interest-bearing trust fund and, in general, any 
interest income that states and localities earn from the funds drawn 
down is to be accounted for and used for program purposes. 
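
As a simplified illustration of the local allocation rule described 
above, the following sketch distributes a hypothetical state's 40 
percent share in proportion to each locality's share of the state's 
3-year violent crime average and pools any potential award below the 
$10,000 threshold for award to the SAA instead. The localities, crime 
counts, and dollar amounts are invented for illustration. 

```python
# Hypothetical sketch of the direct-award rule described above. Each
# locality's potential award is proportional to its share of the state's
# 3-year violent crime average; potential awards below $10,000 are pooled
# and provided to the SAA rather than awarded directly.

def allocate_direct_awards(state_40_percent_share, crime_by_locality):
    total_crime = sum(crime_by_locality.values())
    direct_awards = {}
    returned_to_saa = 0.0
    for locality, crimes in crime_by_locality.items():
        amount = state_40_percent_share * crimes / total_crime
        if amount >= 10_000:
            direct_awards[locality] = amount
        else:
            returned_to_saa += amount
    return direct_awards, returned_to_saa


# Hypothetical 40 percent share and 3-year average violent crime counts:
awards, to_saa = allocate_direct_awards(
    8_000_000, {"City A": 6_000, "City B": 3_991, "Town C": 9})
print(awards)   # {'City A': 4800000.0, 'City B': 3192800.0}
print(to_saa)   # 7200.0 -- Town C's share falls below $10,000
```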

Table 2 shows the total allocation of Recovery Act JAG funding across 
our sample states, including the grant amounts BJA made directly to 
the SAAs (the 60 percent share); the number of pass-through grants the 
SAAs made in turn; and the grant amounts and number of grants BJA made 
directly to localities (the 40 percent share). The 14 states in our 
sample received $1,033,271,865 in JAG Recovery Act funds, which was 
more than half of the funds awarded nationwide for the program. 

Table 2: Recovery Act JAG Awards across Our 14 Sample States, as of 
June 30, 2010: 

State: Arizona; 
Total Recovery Act JAG allocation: $41,966,266; 
Awards that went directly to the SAA--the "60 percent share"[A]: 
$25,306,956; 
Number of pass-through awards the SAA made: 41; 
Awards that went directly to localities-the "40 percent share"[B]: 
$16,659,310; 
Number of direct awards DOJ made: 37. 

State: California; 
Total Recovery Act JAG allocation: $225,354,622; 
Awards that went directly to the SAA--the "60 percent share"[A]: 
$135,641,945; 
Number of pass-through awards the SAA made: 226; 
Awards that went directly to localities-the "40 percent share"[B]: 
$89,712,677; 
Number of direct awards DOJ made: 149. 

State: Colorado; 
Total Recovery Act JAG allocation: $29,858,171; 
Awards that went directly to the SAA--the "60 percent share"[A]: 
$18,323,383; 
Number of pass-through awards the SAA made: 77; 
Awards that went directly to localities-the "40 percent share"[B]: 
$11,534,788; 
Number of direct awards DOJ made: 65. 

State: Georgia; 
Total Recovery Act JAG allocation: $59,045,753; 
Awards that went directly to the SAA--the "60 percent share"[A]: 
$36,210,659; 
Number of pass-through awards the SAA made: 232; 
Awards that went directly to localities-the "40 percent share"[B]: 
$22,835,094; 
Number of direct awards DOJ made: 181. 

State: Illinois; 
Total Recovery Act JAG allocation: $83,663,470; 
Awards that went directly to the SAA--the "60 percent share"[A]: 
$50,198,081; 
Number of pass-through awards the SAA made: 33; 
Awards that went directly to localities-the "40 percent share"[B]: 
$33,465,389; 
Number of direct awards DOJ made: 7. 

State: Iowa; 
Total Recovery Act JAG allocation: $18,702,718; 
Awards that went directly to the SAA--the "60 percent share"[A]: 
$11,777,401; 
Number of pass-through awards the SAA made: 38; 
Awards that went directly to localities-the "40 percent share"[B]: 
$6,925,317; 
Number of direct awards DOJ made: 47. 

State: Massachusetts; 
Total Recovery Act JAG allocation: $40,793,878; 
Awards that went directly to the SAA--the "60 percent share"[A]: 
$25,044,649; 
Number of pass-through awards the SAA made: 49; 
Awards that went directly to localities-the "40 percent share"[B]: 
$15,749,229; 
Number of direct awards DOJ made: 100. 

State: Michigan; 
Total Recovery Act JAG allocation: $67,006,344; 
Awards that went directly to the SAA--the "60 percent share"[A]: 
$41,198,830; 
Number of pass-through awards the SAA made: 123; 
Awards that went directly to localities-the "40 percent share"[B]: 
$25,807,514; 
Number of direct awards DOJ made: 87. 

State: Mississippi; 
Total Recovery Act JAG allocation: $18,394,045; 
Awards that went directly to the SAA--the "60 percent share"[A]: 
$11,199,389; 
Number of pass-through awards the SAA made: 7; 
Awards that went directly to localities-the "40 percent share"[B]: 
$7,194,656; 
Number of direct awards DOJ made: 75. 

State: New York; 
Total Recovery Act JAG allocation: $110,592,269; 
Awards that went directly to the SAA--the "60 percent share"[A]: 
$67,280,689; 
Number of pass-through awards the SAA made: 45; 
Awards that went directly to localities-the "40 percent share"[B]: 
$43,311,580; 
Number of direct awards DOJ made: 71. 

State: North Carolina; 
Total Recovery Act JAG allocation: $56,345,356; 
Awards that went directly to the SAA--the "60 percent share"[A]: 
$34,491,558; 
Number of pass-through awards the SAA made: 139; 
Awards that went directly to localities-the "40 percent share"[B]: 
$21,853,798; 
Number of direct awards DOJ made: 165. 

State: Ohio; 
Total Recovery Act JAG allocation: $61,645,375; 
Awards that went directly to the SAA--the "60 percent share"[A]: 
$38,048,939; 
Number of pass-through awards the SAA made: 189; 
Awards that went directly to localities-the "40 percent share"[B]: 
$23,596,436; 
Number of direct awards DOJ made: 72. 

State: Pennsylvania; 
Total Recovery Act JAG allocation: $72,372,843; 
Awards that went directly to the SAA--the "60 percent share"[A]: 
$45,453,997; 
Number of pass-through awards the SAA made: 120; 
Awards that went directly to localities-the "40 percent share"[B]: 
$26,918,846; 
Number of direct awards DOJ made: 51. 

State: Texas; 
Total Recovery Act JAG allocation: $147,530,755; 
Awards that went directly to the SAA--the "60 percent share"[A]: 
$90,295,773; 
Number of pass-through awards the SAA made: 478; 
Awards that went directly to localities-the "40 percent share"[B]: 
$57,234,982; 
Number of direct awards DOJ made: 231. 

Sources: GAO analysis of Bureau of Justice Assistance and SAA data. 

[A] Due to rounding, these amounts may not exactly equal 60 percent of 
the total JAG award. 

[B] Due to rounding, these amounts may not exactly equal 40 percent of 
the total JAG award. 

[End of table] 

DOJ Made a Large Percentage of Direct Award Funds Available as 
Disparate Jurisdiction Awards: 

Of the 1,338 direct awards that DOJ made to localities in the 14 
states in our sample, approximately one-third, or 436, went to 
disparate jurisdictions and are split by 
agreement among the designated jurisdictions. Under these 
arrangements, one jurisdiction functions as the prime recipient and 
fiscal agent, who is supposed to be responsible for submitting all 
programmatic and financial reports on behalf of the disparate group as 
well as for monitoring neighboring localities' use of funds on 
activities covered by the grants. In our sample states, while one- 
third of the total number of direct grant awards were made to 
disparate jurisdictions, these arrangements accounted for 72 percent 
of the funds DOJ awarded directly to local recipients. For example, in 
Illinois, 100 percent of direct awards were provided to disparate 
jurisdictions, and in 8 of the other 13 states DOJ awarded more than 
70 percent of funds in this manner. Officials we met with in 
localities that received funds under this type of arrangement reported 
that they provided varying amounts of oversight in their role as 
fiscal agent. The DOJ Inspector General has raised the oversight of 
subgrantee awards as an issue for DOJ's attention and has recommended 
that DOJ develop further training for recipients; DOJ concurred with 
the recommendation.[Footnote 22] Table 3 summarizes the distribution 
of direct award funds to disparate jurisdictions in our sample states. 

Table 3: Recovery Act JAG Disparate Jurisdiction Awards across Our 14 
Sample States, as of June 30, 2010: 

State: Arizona; 
Awards that went directly to localities-the "40 percent share"[A]: 
$16,659,310; 
Percent of funds awarded to disparate jurisdictions: 87.9%; 
Value of disparate jurisdiction awards[B]: $14,648,987. 

State: California; 
Awards that went directly to localities-the "40 percent share"[A]: 
$89,712,677; 
Percent of funds awarded to disparate jurisdictions: 79.3%; 
Value of disparate jurisdiction awards[B]: $71,158,804. 

State: Colorado; 
Awards that went directly to localities-the "40 percent share"[A]: 
$11,534,788; 
Percent of funds awarded to disparate jurisdictions: 61.3%; 
Value of disparate jurisdiction awards[B]: $7,073,073. 

State: Georgia; 
Awards that went directly to localities-the "40 percent share"[A]: 
$22,835,094; 
Percent of funds awarded to disparate jurisdictions: 47.2%; 
Value of disparate jurisdiction awards[B]: $10,777,559. 

State: Illinois; 
Awards that went directly to localities-the "40 percent share"[A]: 
$33,465,389; 
Percent of funds awarded to disparate jurisdictions: 100.0%; 
Value of disparate jurisdiction awards[B]: $33,465,419. 

State: Iowa; 
Awards that went directly to localities-the "40 percent share"[A]: 
$6,925,317; 
Percent of funds awarded to disparate jurisdictions: 96.5%; 
Value of disparate jurisdiction awards[B]: $6,680,835. 

State: Massachusetts; 
Awards that went directly to localities-the "40 percent share"[A]: 
$15,749,229; 
Percent of funds awarded to disparate jurisdictions: 24.9%; 
Value of disparate jurisdiction awards[B]: $3,918,486. 

State: Michigan; 
Awards that went directly to localities-the "40 percent share"[A]: 
$25,807,514; 
Percent of funds awarded to disparate jurisdictions: 90.6%; 
Value of disparate jurisdiction awards[B]: $23,392,722. 

State: Mississippi; 
Awards that went directly to localities-the "40 percent share"[A]: 
$7,194,656; 
Percent of funds awarded to disparate jurisdictions: 70.0%; 
Value of disparate jurisdiction awards[B]: $5,039,822. 

State: New York; 
Awards that went directly to localities-the "40 percent share"[A]: 
$43,311,580; 
Percent of funds awarded to disparate jurisdictions: 25.2%; 
Value of disparate jurisdiction awards[B]: $10,920,032. 

State: North Carolina; 
Awards that went directly to localities-the "40 percent share"[A]: 
$21,853,798; 
Percent of funds awarded to disparate jurisdictions: 72.4%; 
Value of disparate jurisdiction awards[B]: $15,830,038. 

State: Ohio; 
Awards that went directly to localities-the "40 percent share"[A]: 
$23,596,436; 
Percent of funds awarded to disparate jurisdictions: 97.7%; 
Value of disparate jurisdiction awards[B]: $23,047,606. 

State: Pennsylvania; 
Awards that went directly to localities-the "40 percent share"[A]: 
$26,918,846; 
Percent of funds awarded to disparate jurisdictions: 49.2%; 
Value of disparate jurisdiction awards[B]: $13,246,575. 

State: Texas; 
Awards that went directly to localities-the "40 percent share"[A]: 
$57,234,982; 
Percent of funds awarded to disparate jurisdictions: 87.3%; 
Value of disparate jurisdiction awards[B]: $49,978,561. 

Total: 
Awards that went directly to localities-the "40 percent share"[A]: 
$402,799,616; 
Percent of funds awarded to disparate jurisdictions: 71.8%; 
Value of disparate jurisdiction awards[B]: $289,178,519. 

Source: GAO analysis of BJA data. 

[A] Due to rounding, these amounts may not exactly equal 40 percent of 
the total JAG award. 

[B] Due to rounding, these amounts may not exactly equal the total 
amount. 

[End of table] 

SAAs Passed Through about 50 Percent of Their Total Recovery Act 
Awards: 

The 14 SAAs in our sample received more than $630 million collectively 
as their share of the Recovery Act JAG funds. JAG statutory provisions 
require that each state pass through no less than a specific 
designated minimum percentage of the funds it receives as subgrants to 
localities, municipal governments, and nonprofit 
organizations. Among our sample states, this mandatory pass-through 
percentage varied from a high of 67.3 percent in California to a low 
of 35.5 percent in Massachusetts. SAAs are also allowed to retain up 
to 10 percent of the funds that they receive for administrative 
purposes. The completion of these pass-through award processes 
occurred at different rates across the 14 states that we sampled and 
resulted in some states expending their Recovery Act JAG funds faster 
than others. As of June 30, 2010, the SAAs we reviewed had made nearly 
all of their pass-through awards, with the exception of Mississippi 
and Pennsylvania. In addition, many local pass-through recipients 
reported that there was a time lag in being reimbursed by their SAAs 
for funds that they had spent. Additional information on amounts drawn 
down and expended is included in appendix IV. 

SAAs and Localities Expended Their Awards at Varying Rates: 

According to Recovery.gov, the SAAs and localities that received grant 
funds directly from DOJ in our sample of 14 states were awarded 
approximately $1.028 billion in Recovery Act JAG funds. This amount 
represents about 52 percent of the nearly $2 billion awarded to SAAs 
and directly funded localities across the nation. As of June 30, 2010, 
the SAAs and the directly funded localities in our sample expended 
over $270.7 million or about 26.4 percent of the total amount awarded. 
Recovery Act JAG fund recipients may spend their respective awards 
over a 4-year period. 

As depicted in figure 2 below, in the 14 states in our sample, the 
expenditure of Recovery Act JAG funds generally lags behind the amount 
of funds awarded by the SAAs and drawn down. For example, as of June 
30, 2010, California--whose SAA received the largest direct award in 
our sample--had expended only about $6.6 million of the $135 million, 
or nearly 5 percent, of JAG grant funds the state received. Texas 
reported expending the most--more than $37 million--after combining 
expenditures the SAA made independently with the expenditures made by 
the more than 400 pass-through recipients. 

Figure 2: Recovery Act JAG Funds Expended by the SAAs across our 14 
Sample States, as of June 30, 2010: 

[Refer to PDF for image: stacked vertical bar graph] 

State: Arizona	
State retained funds: $2.43 million; 
Pass-through to localities: $8.13 million. 

State: California	
State retained funds: $623,417; 
Pass-through to localities: $5.99 million. 

State: Colorado	
State retained funds: $613,506; 
Pass-through to localities: $3.65 million. 

State: Georgia	
State retained funds: $2.23 million; 
Pass-through to localities: $1.69 million. 

State: Illinois	
State retained funds: $9.83 million; 
Pass-through to localities: $3.61 million. 

State: Iowa	
State retained funds: $340,000; 	
Pass-through to localities: $3.58 million. 

State: Massachusetts
State retained funds: $13.27 million; 
Pass-through to localities: $3.34 million. 

State: Michigan	
State retained funds: $3.42 million; 
Pass-through to localities: $4.75 million. 

State: Mississippi	
State retained funds: $398,565; 
Pass-through to localities: $176,354. 

State: North Carolina	
State retained funds: $5.87 million; 
Pass-through to localities: $3.61 million. 

State: New York	
State retained funds: $2.85 million; 
Pass-through to localities: $3.87 million. 

State: Ohio	
State retained funds: $203,385; 
Pass-through to localities: $8.92 million. 

State: Pennsylvania	
State retained funds: $112,389; 
Pass-through to localities: $4.29 million. 

State: Texas	
State retained funds: $12.03 million; 
Pass-through to localities: $25.13 million. 

Source: GAO analysis of SAA data. 

[End of figure] 

California SAA officials stated that they delayed awarding JAG funds 
while designing two new programs focused on probation and drug 
offender treatment services, which accounted for $90 million of the 
$135 million in grant funds the SAA received. As of June 30, 2010, 100 
percent of California's subrecipient awards had been finalized through 
grant award agreements, but many projects had only recently become 
fully operational, resulting in the slow expenditure of funds, which 
are handled on a reimbursement basis.[Footnote 23] In Pennsylvania, SAA 
officials said the state faced two challenges in expending Recovery 
Act JAG funds quickly: (1) a state budget impasse, which delayed the 
allocation of Recovery Act JAG awards; and (2) Recovery Act JAG 
funding for state projects focused on technology costs, which require 
lengthy procurement times. Further, they noted that state pass-through 
funding to localities is recorded on a quarterly basis after expenses 
are incurred, so the pace of expenditure could be somewhat misleading. 

Other SAA officials we contacted cited additional reasons for more 
slowly expending Recovery Act JAG funds. For example, all of the SAAs 
we contacted have procedures in place that require subrecipients to 
make their purchases up-front with local funds and request 
reimbursement from the SAA after documentation is received. Two states 
we contacted have policies that restricted Recovery Act JAG funding to 
shorter time limits with an option for renewal rather than providing 
localities authority to use grants during the 4-year grant period 
applicable to the initial recipient of the grant. In addition, 1 of 
the 14 SAAs preferred to retain Recovery Act JAG funds and expend them 
gradually on longer-term projects, such as technology improvements, as 
allowed during the 4-year grant period. 

SAAs and Localities Reported Using Recovery Act JAG Funds to Preserve 
Jobs and Programs, and a Relatively Large Percentage of Both 
Pass-Through and Direct Funds Were Used to Support Law Enforcement 
Activities: 

All states reported using Recovery Act JAG funds received through 
direct and pass-through awards to prevent staff, programs, or 
essential services from being cut. In addition, local officials 
reported that without Recovery Act JAG funding, law enforcement 
personnel, equipment purchases, and key local law enforcement programs 
would have been eliminated or cut. SAAs reported 
that they passed through about 50 percent of their funds and 
collectively they planned to use the largest share--about 30 percent, 
or almost $168 million--for law enforcement purposes. Direct 
recipients reported that funds were most often to be used for multiple 
purposes. 

States and Localities Used Recovery Act JAG Funds to Help Preserve 
Jobs and Services: 

Officials from all states in our sample reported using Recovery Act 
JAG funds to prevent staff, programs, or essential services from being 
cut. Also, 19 percent of localities in GAO's sample, or officials in 
12 of 62 localities, provided specific examples of ongoing local law 
enforcement programs or activities, such as juvenile recidivism 
reduction programs, prisoner re-entry initiatives, and local foot or 
bicycle patrols in high-crime neighborhoods that would not have 
continued without the addition of these funds. Table 4 provides some 
examples that state and local recipients reported regarding how they 
used Recovery Act JAG funds to help them preserve jobs and essential 
services. 

Table 4: State and Local Recipients' Reported Use of Recovery Act JAG 
Funds to Prevent Staff, Programs, or Services from Being Cut or 
Eliminated: 

State: California; 
Illustrative examples of projects, activities, or staff positions 
reported preserved through Recovery Act JAG funding: Recovery Act JAG 
funds helped support jobs and programs including substance abuse 
treatment. 

State: Iowa; 
Illustrative examples of projects, activities, or staff positions 
reported preserved through Recovery Act JAG funding: Funds allowed the 
state to continue regional drug task forces and community crime 
prevention programs. 

State: Illinois; 
Illustrative examples of projects, activities, or staff positions 
reported preserved through Recovery Act JAG funding: Local officials 
said that prisoner re-entry programs and information technology 
improvements would have been eliminated without Recovery Act JAG funds. 

State: Massachusetts; 
Illustrative examples of projects, activities, or staff positions 
reported preserved through Recovery Act JAG funding: Law enforcement 
personnel were retained and core health services for inmates were 
maintained using Recovery Act JAG funds. 

State: Michigan; 
Illustrative examples of projects, activities, or staff positions 
reported preserved through Recovery Act JAG funding: Budget gaps 
across multiple criminal justice agencies were filled 
by Recovery Act JAG funds. Replacement vehicles and equipment were 
purchased and sworn officers and other personnel were retained with 
Recovery Act JAG funds. 

State: Ohio; 
Illustrative examples of projects, activities, or staff positions 
reported preserved through Recovery Act JAG funding: Police officers 
and other staff were retained who would otherwise have been laid off 
without Recovery Act JAG funds. 

State: Pennsylvania; 
Illustrative examples of projects, activities, or staff positions 
reported preserved through Recovery Act JAG funding: Staff in 
prosecution and probation offices were retained and juvenile services 
programs were spared from cuts using JAG Recovery Act funds. 

State: Texas; 
Illustrative examples of projects, activities, or staff positions 
reported preserved through Recovery Act JAG funding: Necessary 
equipment or technology improvements were made and law enforcement 
personnel, such as one entire police academy class of 41 officers, 
were retained using Recovery Act JAG funds. 

Source: GAO analysis of SAA and locality data. 

[End of table] 

SAAs Report That More than Half of Funding They Passed Through Was 
Designated for Law Enforcement and Corrections, but Funded Activities 
Varied: 

SAAs reported that they awarded the largest share--about 30 percent, 
or almost $168 million--for law enforcement purposes, such as hiring 
or retaining staff who might otherwise have been laid off, or 
purchasing equipment in direct support of law enforcement activities, 
as shown in figure 3. In addition, SAAs reported awarding 
approximately 24 percent, or more than $137 million, to support 
corrections programs or activities. SAAs reported allocating the 
smallest share for crime victim and witness programs, 2.1 percent or 
approximately $11.8 million. 

Figure 3: SAA Awards of Recovery Act JAG Funds by the Seven Allowable 
Program Categories across Our 14 Sample States[A]: 

[Refer to PDF for image: pie-chart] 

Law enforcement: $168,452,562 (29.8%); 
Corrections: $137,673,969 (24.4%); 
Prosecution and courts: $75,390,630 (13.4%); 
Drug treatment and enforcement: $75,132,681 (13.3%); 
Program planning, evaluation, and technology improvement: $68,760,405 
(12.2%); 
Crime prevention and education: $27,323,285 (4.8%); 
Crime victim and witness programs: approximately $11.8 million (2.1%). 

Source: GAO analysis of SAA data. 

[A] Figure does not include the approximately $64 million--or about 10 
percent of the total amount awarded across the 14 states in our 
sample--in state-retained funds for administration, funds yet to be 
awarded, or funds designated for other purposes. 

[End of figure] 

Within the category of law enforcement, equipment expenditures spanned 
a wide range of law enforcement gear, but vehicles and weapons 
purchases were often reported. Frequent types of purchases included: 

* police cruisers; 

* weapons, such as TASERs, and ammunition;[Footnote 24] 

* communications devices, such as hand-held two-way radios, and mobile 
laptops in police cruisers; and: 

* safety equipment, such as protective vests and shields. 

See appendix V for examples of selected equipment purchased with JAG 
funds. 

Overall, localities in 13 out of the 14 states we contacted reported 
using Recovery Act JAG funds to maintain positions or pay officer 
overtime for activities related to law enforcement. Individual SAAs, 
however, reported obligating their Recovery Act JAG funds in a variety 
of ways as shown in table 5. The percentages do not include the funds 
that the SAAs retained for administrative purposes or funds not yet 
awarded. 

Table 5: Percent Share of SAA's Reported Recovery Act JAG Obligations 
by Program Area and across Our 14 Sample States, as of June 30, 
2010[A]: 

State: Arizona[B]; 
Law enforcement: 38.7; 
Corrections: 0; 
Drug treatment and enforcement: 0; 
Prosecution and courts: 48.2; 
Program planning, evaluation and technology improvements: 5.5; 
Crime prevention and education: 0; 
Crime victim & witness programs: 0. 

State: California; 
Law enforcement: 22.5; 
Corrections: 33.3; 
Drug treatment and enforcement: 33.1; 
Prosecution and courts: 9.0; 
Program planning, evaluation and technology improvements: 0.1; 
Crime prevention and education: 0.6; 
Crime victim & witness programs: 1.4. 

State: Colorado; 
Law enforcement: 11.6; 
Corrections: 37.8; 
Drug treatment and enforcement: 13.7; 
Prosecution and courts: 12.0; 
Program planning, evaluation and technology improvements: 13.2; 
Crime prevention and education: 9.5; 
Crime victim & witness programs: 2.3. 

State: Georgia; 
Law enforcement: 43.7; 
Corrections: 15.9; 
Drug treatment and enforcement: 0.8; 
Prosecution and courts: 27.5; 
Program planning, evaluation and technology improvements: 4.7; 
Crime prevention and education: 0.6; 
Crime victim & witness programs: 6.9. 

State: Illinois; 
Law enforcement: 25.0; 
Corrections: 34.1; 
Drug treatment and enforcement: 1.0; 
Prosecution and courts: 18.1; 
Program planning, evaluation and technology improvements: 9.2; 
Crime prevention and education: 12.6; 
Crime victim & witness programs: 0. 

State: Iowa; 
Law enforcement: 0; 
Corrections: 18.2; 
Drug treatment and enforcement: 76.7; 
Prosecution and courts: 0; 
Program planning, evaluation and technology improvements: 0.4; 
Crime prevention and education: 4.7; 
Crime victim & witness programs: 0. 

State: Massachusetts; 
Law enforcement: 27.6; 
Corrections: 56.0; 
Drug treatment and enforcement: 0; 
Prosecution and courts: 0; 
Program planning, evaluation and technology improvements: 2.7; 
Crime prevention and education: 13.8; 
Crime victim & witness programs: 0. 

State: Michigan; 
Law enforcement: 56.8; 
Corrections: 3.8; 
Drug treatment and enforcement: 0; 
Prosecution and courts: 33.4; 
Program planning, evaluation and technology improvements: 3.5; 
Crime prevention and education: 2.5; 
Crime victim & witness programs: 0. 

State: Mississippi; 
Law enforcement: 37.8; 
Corrections: 0; 
Drug treatment and enforcement: 26.1; 
Prosecution and courts: 8.2; 
Program planning, evaluation and technology improvements: 26.0; 
Crime prevention and education: 2.0; 
Crime victim & witness programs: 0. 

State: New York; 
Law enforcement: 1.7; 
Corrections: 53.7; 
Drug treatment and enforcement: 26.3; 
Prosecution and courts: 15.1; 
Program planning, evaluation and technology improvements: 3.3; 
Crime prevention and education: 0; 
Crime victim & witness programs: 0. 

State: North Carolina; 
Law enforcement: 7.2; 
Corrections: 6.1; 
Drug treatment and enforcement: 0; 
Prosecution and courts: 1.9; 
Program planning, evaluation and technology improvements: 71.9; 
Crime prevention and education: 13.0; 
Crime victim & witness programs: 0. 

State: Ohio; 
Law enforcement: 35.0; 
Corrections: 23.6; 
Drug treatment and enforcement: 2.7; 
Prosecution and courts: 8.0; 
Program planning, evaluation and technology improvements: 10.2; 
Crime prevention and education: 13.0; 
Crime victim & witness programs: 7.6. 

State: Pennsylvania; 
Law enforcement: 3.6; 
Corrections: 18.6; 
Drug treatment and enforcement: 0; 
Prosecution and courts: 15.7; 
Program planning, evaluation and technology improvements: 21.0; 
Crime prevention and education: 24.0; 
Crime victim & witness programs: 17.1. 

State: Texas; 
Law enforcement: 65.8; 
Corrections: 2.5; 
Drug treatment and enforcement: 0.1; 
Prosecution and courts: 2.5; 
Program planning, evaluation and technology improvements: 27.9; 
Crime prevention and education: 0.1; 
Crime victim & witness programs: 1.1. 

Source: GAO analysis of SAA data. 

[A] Percentages do not include the approximately $64 million--or about 
10 percent of the total amount awarded across the 14 states in our 
sample--in state-retained funds for administration, funds yet to be 
awarded, or funds designated for other purposes. 

Due to rounding, some percentage figures may not total to exactly 100 
percent. 

[B] Arizona SAA officials reported using approximately 7.7 percent of 
their almost $23 million in obligated funds for Forensic Laboratory 
Services, which they did not include among the seven program areas 
above. 

[End of table] 

All SAAs in our sample states except Iowa, which reported using most 
of its funds to support drug treatment and enforcement activities, 
reported using Recovery Act JAG funds to support law enforcement 
activities. With the exception of Iowa, at the state level the share 
of Recovery Act JAG funds used to support direct equipment purchases 
and personnel expenses ranged from a high of 65.8 percent in Texas to 
a low of 1.7 percent in New York. 

Localities in more than a third of the states in our sample (5 of 14) 
reported that uncertainties about the availability of future JAG 
funding steered them toward one-time equipment purchases, such as the 
procurement of license plate readers and in-car laptop computers, 
rather than investments, such as hiring new personnel, that would 
require an ongoing commitment of funds and whose sustainability could 
be threatened when Recovery Act JAG funds expire. 

In addition, officials in about a quarter of the localities in our 
sample (15) discussed how they coordinate the use of their Recovery 
Act JAG funds with resources that they received from other federal 
funding streams. For example, the cities of Austin, Texas, and 
Greensboro, North Carolina, were each waiting to receive a separate 
federal grant specifically for the purpose of hiring police officers 
so that they could determine whether to spend Recovery Act JAG funds 
to equip the officers once hired.[Footnote 25] See figure 4 for an 
interactive map with additional information on Recovery Act JAG funds 
purchases and activities in our sample states. 

Figure 4: Map of SAAs and Planned Uses of Recovery Act JAG Awards by 
the Seven Allowable Program Categories Across our 14 Sample States: 

[Refer to PDF for image: map of the U.S.] 

Interactive features: Click your mouse over the state highlighted in 
blue for more information on the state's planned use of JAG awards by 
the seven allowable program categories. Information on the SAA's 
planned use of JAG awards, illustrative examples, and some pictures 
will also appear. To see the full text, see appendix VI. 

14 highlighted states are: 

Arizona; 
California; 
Colorado; 
Georgia; 
Illinois; 
Iowa; 
Massachusetts; 
Michigan; 
Mississippi; 
New York; 
North Carolina; 
Ohio; 
Pennsylvania; 
Texas. 

Sources: GAO analysis; Map Resources (map). 

[End of figure] 

Direct Award Recipients Reported Using Recovery Act JAG Funds for a 
Wider Array of Purposes, Including Law Enforcement and Technology 
Programs: 

As shown in figure 5, data reported by direct recipient localities in 
the 14 states that we sampled[Footnote 26] indicate that they 
obligated the largest share--more than 63 percent, or over $256 
million--for multiple purposes and 21.5 percent, or about $86.8 
million, to directly support law enforcement programs or 
activities.[Footnote 27] Program planning, evaluation, and technology 
improvement funds, which accounted for approximately 8 percent of 
spending, were primarily used to enhance communications equipment or 
purchase computer hardware and software for all types of criminal 
justice agencies and programs. Based on the information grantees 
reported to Recovery.gov, the number of projects reported has dropped 
slightly over the last three reporting periods because completed 
projects discontinue reporting. This was the case most often 
when funds were used for discrete equipment purchases, such as law 
enforcement vehicles, laptop computers in police cars, or weapons. 

Figure 5: Planned Uses of Recovery Act JAG Awards to Direct Recipients 
by the Seven Allowable Program Categories across Localities Within our 
14 Sample States[A]: 

[Refer to PDF for image: pie-chart] 

Multiple program areas funded: $256,196,435 (63.4%); 
Law enforcement: $86,826,186 (21.5%); 
Program planning, evaluation and technology improvement: $32,246,688 
(8%); 
Not enough information to identify purpose: $18,354,724 (4.5%); 
Crime prevention and education: $4,823,075 (1.2%); 
Prosecution and courts: $2,683,042 (0.7%); 
Drug treatment and enforcement: $1,325,453 (0.3%); 
Crime victim and witness programs: $955,017 (0.2%); 
Corrections: $733,038 (0.2%). 

Source: GAO analysis of Recovery Act data. 

[A] Data from approximately 10 recipients who have likely completed 
activities and discontinued reporting by June 30, 2010, are not 
included in this figure. 

[End of figure] 

State Administering Agencies Cited Challenges Meeting Quarterly 
Recovery Act Reporting Time Frames: 

A majority of the SAA officials we interviewed said that workload 
demands and personnel shortages made meeting Recovery Act-mandated 
deadlines within the prescribed reporting period difficult. Section 
1512(c) of the Recovery Act requires that each Recovery Act award 
recipient submit a report no later than 10 days after the end of each 
quarter to the federal awarding agency. In the case of Recovery Act 
JAG, the federal awarding agency is DOJ. The Section 1512(c) report 
that Recovery Act recipients, such as Recovery Act JAG recipients, are 
required to submit must contain the following data: (1) the total 
amount of recovery funds received from the federal awarding agency; 
(2) the amount of recovery funds received that were expended or 
obligated to projects or activities; and (3) a detailed list of all 
projects or activities for which recovery funds were expended or 
obligated.[Footnote 28] All 14 SAAs we contacted said that they had 
the necessary systems in place to account for Recovery Act JAG funds 
received and that subrecipients were generally in compliance with 
their financial reporting requirements. 
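
As a minimal illustration of the three required data elements listed 
above, the following Python sketch models a quarterly 1512(c) 
submission as a simple record. The class and field names are our own 
assumptions for illustration and do not represent OMB's or DOJ's 
actual reporting format or systems. 

# Illustrative sketch only; names are assumptions, not OMB's or DOJ's schema. 
from dataclasses import dataclass, field 
from typing import List 

@dataclass 
class ProjectEntry: 
    # One entry in the detailed list of projects or activities (data element 3). 
    name: str 
    description: str 
    amount_expended_or_obligated: float 

@dataclass 
class Section1512cReport: 
    # (1) Total Recovery Act funds received from the federal awarding agency (DOJ for JAG). 
    total_funds_received: float 
    # (2) Amount of those funds expended or obligated to projects or activities. 
    funds_expended_or_obligated: float 
    # (3) Detailed list of all projects or activities funded. 
    projects: List[ProjectEntry] = field(default_factory=list) 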

Officials in 10 out of 14 SAAs in our sample specifically cited the 
Recovery Act's requirement to report no later than 10 days after the 
end of each quarter as challenging. Officials in 8 out of 14 SAAs in 
our sample said that meeting federal Recovery Act reporting 
requirements increased staff workload, and about one-third of the SAAs 
told us that personnel shortages created challenges in their ability 
to meet Recovery Act reporting deadlines. For example, 
officials for one county in Colorado noted that increased reporting 
responsibilities associated with Recovery Act JAG grants resulted in 
one full-time staff member spending nearly 2 full work weeks on 
federal oversight and reporting requirements over a 5 ½-month time 
frame. Officials noted that the same individual spent 16 hours on 
reporting requirements for a non-Recovery Act JAG award and a state 
pass-through award during the same time period. Furthermore, officials 
in Texas, New York, and Mississippi said they required additional 
personnel to manage Recovery Act awards and meet reporting 
requirements. In addition, an official in one SAA told us that 
because of short data collection time frames, the SAA initially 
submitted incomplete quarterly data and likely underreported the 
impact of the Recovery Act JAG program in the first two quarterly 
1512(c) reports. 

While state and local officials we interviewed said that meeting the 
1512(c) report's 10-day time frame remains challenging, none of the 
states in our sample said that they were unable to meet the 1512(c) 
reporting deadline. In addition, the number of direct award recipients 
that completed the report has generally remained constant (around 800) 
over the three reporting quarters from October 1, 2009, to June 30, 
2010.[Footnote 29] 

DOJ awarded over 70 percent, or more than $289 million, of direct 
award funds to 436 disparate jurisdictions. DOJ guidance states that 
the 
recipient (i.e., fiscal agent) in each disparate jurisdiction is 
responsible for monitoring "subawards" and for "oversight of 
subrecipient spending and monitoring of specific outcomes and benefits 
attributable to the use of Recovery Act funds by its subrecipients." 
[Footnote 30] DOJ guidance provides detailed information on financial 
and accounting requirements for direct recipients and subrecipients of 
DOJ grant programs. The guidance also states that fiscal agents must 
implement and communicate a policy for reviewing subrecipient data. 
DOJ guidance, however, does not provide instruction on what a 
subrecipient monitoring or data policy should include; nor does it 
state how outcomes and benefits tied to the Recovery Act should be 
monitored. The DOJ Office of the Inspector General issued a report in 
August 2010 which included the results of grant audits it performed 
across 12 state and local recipients of both Recovery Act and non-
Recovery Act JAG program funds.[Footnote 31] The Inspector General 
found that 7 of the 12 grant recipients had deficiencies in the area 
of monitoring of subrecipients and contractors. The Inspector General 
recommended that DOJ's Office of Justice Programs provide additional 
training and oversight of JAG recipients to ensure that they establish 
policies and procedures for monitoring subrecipients' activities to 
provide reasonable assurance that subrecipients administer JAG funds 
in accordance with program guidelines. DOJ concurred with the 
recommendation that it provide additional training and oversight of 
the monitoring of subrecipient activities, and plans to review 
financial training course content to ensure that proper internal 
control guidance on subrecipient monitoring is included. DOJ 
anticipates developing a training module specific to subrecipient 
monitoring by March 31, 2011. 

States Reported Sharing Information and Promising Practices in a 
Variety of Ways and DOJ Encouraged This through a Number of Programs: 

All of the SAAs we contacted (14 of 14) reported that they generally 
shared Recovery Act JAG information, promising practices, or lessons 
learned with other states and localities using a variety of 
techniques. Furthermore, DOJ had developed a number of programs that 
encourage the sharing of information and promising practices.[Footnote 
32] 

SAA officials told us that efforts to share information with one 
another or among the localities in their jurisdictions include 
in-person meetings, telephone calls, e-mail, Web postings, and hosted 
conferences. In addition, the SAA officials told us they find value in 
sharing information by attending DOJ training sessions and conferences 
and participating in programs and events sponsored by associations, 
such as the National Governors Association (NGA), the National 
Criminal Justice Association (NCJA), and the Council of State 
Governments (CSG). For example: 

* Texas officials developed an electronic state government grant 
management and tracking system that they stated is helpful and 
efficient in managing Recovery Act JAG funds. Texas officials told us 
they shared the design of this online system with several states. In 
addition, during BJA conferences and other national training 
conferences, Texas officials noted that they took the opportunity to 
discuss with other states the promising practices and lessons learned 
related to grant management and the administration of JAG funds using 
their system. 

* Colorado officials said that SAA staff made presentations at 
national and regional conferences regarding the following: (1) grant 
management and monitoring of state uses for effective grant 
administration, (2) various programs the state has funded, and (3) 
outcomes the state has achieved. SAA officials said that the state 
encourages subgrantees that have demonstrated successful programs to 
respond to requests for presenters at state and national conferences. 
Officials told us that staff from three Colorado Recovery Act JAG 
subgrantee projects made presentations at the NCJA Western Regional 
Conference in April 2010. For example, Colorado officials told us that 
one presentation involved the retraining of probation and parole 
officers to reduce recidivism by working with other agencies in taking 
an overall supportive approach to working with ex-offenders that 
included assistance in such areas as housing, health, and finding work. 

* Ohio officials told us they take the initiative to contact other 
SAAs to discuss and share experiences, lessons learned, and promising 
practices regarding problems encountered in administering Recovery Act 
JAG grants. They also said that NCJA provides SAAs with a forum to 
share information and challenges associated with administering 
recovery funds, which Ohio has leveraged. For example, they stated 
that at the 2010 NCJA Mid-Western Regional Conference that Ohio 
officials attended, there were sessions where SAAs shared experiences 
about the administration of Recovery Act funds, as well as workshops 
on model projects funded through the Recovery Act. According 
to Ohio officials, the information was helpful both in terms of 
planning their own initiatives and in reaffirming decisions they had 
made regarding Recovery Act and Recovery Act JAG programs. 

* Illinois officials told us that they hosted a 2-day criminal justice 
planning summit in September 2010 for all state actors in the criminal 
justice system, including Recovery Act JAG practitioners, policymakers, 
academics, and legislators. According to SAA officials, the focus of 
the summit was on how to fight crime more effectively in a time of 
diminishing resources by using promising evidence-based practices. 
State summit planners told us that both presentations by state and 
national experts and workshops focused on implementing promising 
practices, while the emphasis in follow-up work groups was on 
producing a long-range criminal justice plan for the state of 
Illinois. In addition, SAA officials told us that they share promising 
practices and lessons learned by participating in regional training 
conferences, Web-based seminars, and informational conferences 
provided by OMB, DOJ, and Illinois state agencies. 

DOJ encourages information sharing through regional training 
conferences, Web sites, and Web-based clearinghouses. For example, 
training meetings and Webinars provide a forum that states find 
valuable for sharing information and promising practices, according to 
a majority (9 of 14) of the states we interviewed. In addition, BJA 
has developed a Web site that illustrates examples of successful 
and/or innovative Recovery Act JAG programs. The Web site highlights 
JAG subgrantees and/or statewide projects that BJA believes show 
promise in meeting the objectives and goals of Recovery Act JAG. In 
particular, the site describes the planned Illinois criminal justice 
information strategic planning initiative and summit discussed above. 
Further, DOJ's Office of Justice Programs is in the process of 
developing an informational Web-based clearinghouse of promising 
practice information for the criminal justice community through a 
public Web site where researchers, grant applicants, and others may 
find a list of model programs proven to be effective. According to DOJ 
officials, it will also be a site that SAAs can use to help find best 
practices and model programs, thereby funding discretionary programs 
that show promise based upon evidence. While the focus of the DOJ 
information-sharing programs is broader than Recovery Act JAG, they 
offer methods and mechanisms to share information related to program 
priorities, such as law enforcement, corrections, and technology 
improvement. SAA officials, in a majority of the states we 
interviewed, indicated that they were supportive of these efforts. 

In addition, national associations such as NGA, CSG, and NCJA 
encourage states to share information and promising practices. The 
focus of these programs is generally broader than Recovery Act JAG, 
but some exclusively focus on Recovery Act JAG priorities such as law 
enforcement, corrections, and technology improvement. For example, BJA 
has funded NCJA to provide on-site training and technical assistance, 
Webinars, and regional conferences, and to create and disseminate 
publications that assist SAAs in developing their statewide criminal 
justice plans and ensuring the effective use of Recovery Act JAG 
funds. NCJA 
also serves as an information clearinghouse on innovative programming 
from across the nation, and coordinates information sharing for the 
justice assistance community. 

DOJ's Performance Measures Could Better Assess Progress Consistent 
with Characteristics of Successful Performance Measurement Systems: 

DOJ developed and implemented 86 new performance measures for the 
Recovery Act JAG program in 2009 and continues to make efforts to 
improve them, but the current set of performance measures varies in 
the degree to which it includes key characteristics of successful 
performance measurement systems. According to DOJ officials, these 
performance measures are currently being refined in consultation with 
stakeholders, such as SAAs and the external contractor hired to 
maintain the PMT. We acknowledge that creating such measures is 
difficult, given that the performance measurement system is under 
development, but until these measures are refined, they could hinder 
the department's ability to assess and communicate whether the goals 
of the Recovery Act JAG program are being achieved. In addition, 
states conveyed mixed perspectives about the utility of DOJ's 
performance measurement tool, which enables recipients to self-identify 
activities associated with their grant and then self-report on the 
relevant set of performance measures under each activity. DOJ has not 
yet completed development of a mechanism to verify the accuracy of 
this recipient-reported information in the PMT.[Footnote 33] 

DOJ's Performance Measures Lack Some Key Characteristics of Successful 
Assessment Systems: 

From the more than 80 Recovery Act JAG performance measures, we 
analyzed a nonprobability sample of 19 (see appendix II) and found 
several areas where the measures could better reflect the 
characteristics that our prior work has shown to support successful 
assessment systems (see appendix III).[Footnote 34] For example, the 19 
Recovery Act JAG performance measures we reviewed generally lacked, in 
varying degrees, several key attributes of successful performance 
measurement systems, such as clarity, reliability, linkages with 
strategic or programmatic goals, objectivity, and the measurability of 
targets. DOJ officials acknowledge the limitations of the current 
system and are undertaking efforts to refine Recovery Act JAG 
performance measures. As we have previously reported, performance 
measures that evaluate program results can help decision makers make 
more informed policy decisions regarding program achievements and 
performance.[Footnote 35] By incorporating key attributes of 
successful performance measurement systems into its performance 
measure revisions, DOJ could facilitate accountability, be better 
positioned to monitor and assess results, and subsequently improve its 
grants management.[Footnote 36] 

Table 6 describes 5 of 9 key characteristics of successful assessment 
systems and the potentially adverse consequences agencies face when 
omitting these attributes from their measurement design. These 5 
characteristics--clarity, reliability, linkage to strategic goals, 
objectivity, and measurable targets--are attributes that may be most 
effectively used when reviewing performance measures individually. 
There are 4 others--governmentwide priorities, core program 
activities, limited overlap, and balance--that are best used when 
reviewing a complete set of measures. Since we selected a 
nonprobability sample of 19 measures that were most closely associated 
with the majority of expenditures, we focused our analysis on the 5 
that could be applied to individual measures and did not assess the 
sample for the other 4 attributes that are associated with an 
evaluation of a full set of measures. Nevertheless, these 4 attributes 
also can provide useful guidance when establishing or revising a set 
of performance measures as a whole.[Footnote 37] 

Table 6: Key Characteristics of Individual Performance Measures: 

Characteristic: Clarity; 
Definition: Measure is clearly stated and the name and definition are 
consistent with the methodology used to calculate it. A measure that 
is not clearly stated is one that contains extraneous information or 
omits key data elements or has a name or definition that is 
inconsistent with how it is calculated; 
Potentially adverse consequences of omission: Data could be confusing 
and misleading to users. 

Characteristic: Reliability; 
Definition: Measure produces the same result under similar conditions; 
Potentially adverse consequences of omission: Reported performance 
data are inconsistent and add uncertainty. 

Characteristic: Linkage to strategic goals; 
Definition: Measure is aligned with unit and agencywide goals/missions 
and is clearly communicated throughout the organization; 
Potentially adverse consequences of omission: Behaviors and incentives 
created by measures do not support the fulfillment of division or 
agencywide goals/mission. 

Characteristic: Objectivity; 
Definition: Measure is reasonably free from significant bias or 
manipulation; 
Potentially adverse consequences of omission: Performance assessments 
may be systematically over- or understated. 

Characteristic: Measurable targets; 
Definition: Measure has a numerical goal; 
Potentially adverse consequences of omission: Cannot tell whether 
performance is meeting expectations. 

Sources: GAO-03-143, GAO-10-835, Drug Control: DOD Needs to Improve 
Its Performance Measurement System to Better Manage and Oversee Its 
Counternarcotics Activities (July 2010), and GAO-10-837, Merida 
Initiative: The United States Has Provided Counternarcotics and 
Anticrime Support but Needs Better Performance Measures (July 2010). 

[End of table] 

In conducting our analysis, we applied the 5 characteristics most 
applicable to the assessment of individual performance measures to the 
19 measures in our nonprobability sample. Our analysis found that 5 of 
the 19 
measures were clearly defined but the remaining 14 were not, which is 
inconsistent with DOJ's guidance to grant recipients for assessing 
program performance. In particular, DOJ advises that states' grant 
programs should have performance measures with "clearly specified 
goals and objectives."[Footnote 38] In addition, 14 of the 19 measures 
were not linked to DOJ's strategic or programmatic goals. We also 
found that while 9 out of the 19 measures were objective, 13 out of 19 
were not reliable, and 17 out of the 19 measures did not have 
measurable targets. 
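
To show how such attribute-by-attribute results can be tallied, the 
short Python sketch below counts, for a set of measures, how many were 
judged to meet each of the five attributes. The two example ratings 
are hypothetical placeholders for illustration; our actual per-measure 
determinations are reflected only in the aggregate counts reported 
above. 

# Illustrative tally of attribute ratings; the example ratings are hypothetical. 
ATTRIBUTES = ["clarity", "reliability", "linkage to strategic goals", 
              "objectivity", "measurable targets"] 

def tally(assessments): 
    """Count how many measures were judged to meet each attribute.""" 
    return {attr: sum(attr in met for met in assessments.values()) 
            for attr in ATTRIBUTES} 

# Hypothetical example with two measures (not our actual ratings): 
example = { 
    "amount of funds used to purchase equipment and/or supplies": {"clarity", "objectivity"}, 
    "percent of departments that report desired efficiency": set(), 
} 
print(tally(example)) 
# A full run over all 19 sampled measures would reproduce the counts reported above. 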

In addition to our analysis, we provided a standard set of questions 
to officials across our sample states seeking their perspectives on 
how effectively the Recovery Act JAG performance measures evaluate 
program results. These officials provided their comments about the PMT 
and raised concerns about how the performance measures lack clarity, 
reliability, and linkage to strategic goals. 

Clarity: 

We determined that 14 out of the 19 measures we analyzed lacked 
sufficient descriptive detail to facilitate precise measurement. For 
example, 1 of DOJ's measures associated with evaluating personnel 
activities is the "percent of departments that report desired 
efficiency." However, the definition of this measure provided in DOJ's 
guidance lacks key data elements that would make the measure 
clearer--namely, which departments should be included in the measure 
or how states and localities should interpret "desired efficiency." 

In addition, officials we interviewed from 9 of the 14 SAAs in our 
sample stated that DOJ's Recovery Act JAG performance measures were 
unclear. Some examples of states' perspectives follow: 

* In particular, an official from the Texas SAA told us that Texas 
refined its state data collection tool to clarify performance measure 
guidance and eliminate instances where DOJ rejected data entries 
because the measure was not clear. As another example, according to 
Texas officials, one of the DOJ performance measures related to 
training is "Other forms of training conducted during the reporting 
period." However, Texas state officials noted that BJA did not clarify 
whether this measure would include non-Recovery Act training. As a 
result, the Texas state data collection tool revised the performance 
measure for better context and asked for "the number of other forms of 
training conducted during the reporting period and paid with ARRA JAG 
funds." 

* Other state officials from Michigan and Georgia cited challenges in 
understanding what is being asked by the 13 measures listed under the 
activity type, "state and local initiatives." In particular, one of 
these states noted confusion and lack of clarity related to the 
measure, "number of defined groups receiving services," since in many 
instances their initiatives were associated with equipment purchases, 
and it would be difficult to determine who and how many benefited from 
a new computer system or the acquisition of new ammunition, for 
example. 

* Ohio and Pennsylvania state officials noted that DOJ uses 
terminology such as "efficiency" and "quality" that is not clearly 
defined. 

Officials we interviewed from another five states stated that they 
could not understand whether the term "personnel" should include the 
entire agency or department that was awarded the Recovery Act JAG 
grant or if it should include only the portion of staff within a 
department that is directly affected by the funding. When we discussed 
with DOJ officials our concerns that the performance measure 
definitions at times lacked clarity, they stated that each was 
defined, but that further work was being done to solicit feedback from 
grantees on the measures and their definitions. However, as we 
discussed above, our analysis determined that 14 out of the 19 
measures do not have clear definitions. DOJ officials noted that the 
department hosts several training opportunities designed to provide 
grantees opportunities for clarification, including two Webinars every 
quarter and ongoing field training. DOJ officials also explained that 
they hired an external contractor to operate the PMT Help Desk to 
provide grantees guidance from 8:30 a.m. to 5:00 p.m. EST. However, 
officials from 
three states we contacted noted that while the PMT Help Desk provided 
useful technical assistance, the Help Desk provided limited guidance 
to clarify the definition of performance measures. Therefore, 
officials from these states reported being confused about what to 
report. In July 2010, we reported that a measure not clearly stated 
can confuse users and cause managers or other stakeholders to think 
that performance was better or worse than it actually was.[Footnote 39] 

Reliability: 

Our analysis showed that 13 out of 19 measures could lead to 
unreliable findings because respondents could interpret and report on 
the measures inconsistently. A performance measure is considered 
reliable when it is designed to collect data or calculate results such 
that each time the measure is applied--in the same situation--a 
similar result is likely to be reported. Respondents' inconsistent 
interpretation of the measures could preclude using many of the 
measures as indicators of performance. For example, we found that one 
measure, "the percent of departments that report desired efficiency," 
was measured and reported on differently by different recipients. 
According to SAA officials in one state, different police department 
units in a single large metropolitan area counted themselves as 
separate departments, while according to SAA officials in another 
state, all police department units were counted collectively as one. 
In another state, SAA staff stated that BJA's guidance document for 
the Recovery Act JAG performance measures did not provide enough 
instruction to ensure that agencies reported the correct data. For 
example, the staff said they could not determine whether the PMT 
measure for "the number of personnel retained with Recovery Act JAG 
funds during the reporting period" was to include any personnel 
position paid for with Recovery Act JAG funds during the reporting 
period, or to represent an unduplicated number of personnel positions 
retained with Recovery Act JAG funds during the reporting period. 
Given the confusion, the officials sought and received guidance from 
the Help Desk on how to interpret and report the measure. Further, 
officials from 4 of the 14 SAAs in our sample expressed concern about 
possible inconsistent data entry among the subrecipients of their pass-
through grants. For example, officials from Ohio noted that since 
subrecipients had their own interpretation of how to report on the 
measures, they believed that there would be a lack of consistency and 
reliability within the state as well as across all states once BJA 
attempted to aggregate the responses. 

In addition, a related issue is how DOJ validates the information 
states and localities submit in order to ensure that the results the 
department reports are accurate and reliable. We have previously 
reported that weaknesses in monitoring processes for verifying 
performance data can raise concerns about the accuracy of the self- 
reported data received from grantees.[Footnote 40] We also reported 
that if errors occur in the collection of data or the calculation of 
results, they may affect conclusions about the extent to which 
performance goals have been achieved.[Footnote 41] For example, self- 
reported performance information that is not reported accurately could 
provide data that are less reliable for decision making. 

DOJ officials acknowledged that they have not verified the accuracy of 
states' and localities' self-reported performance data. However, they 
told us they have been meeting with their contractor to review a draft 
verification and validation plan, but have not yet implemented a 
system to verify and validate grantees' performance data or implement 
data reliability checks on the performance measures in the PMT. DOJ 
officials also attributed their challenges in ensuring data integrity 
to limited resources, stating that they lack adequate full-time staff 
to improve, develop, and implement performance measures at this time. 
Specifically, DOJ officials told us that they rely on a contractor 
because they have only one staff person overseeing states' and 
localities' completion of the measures and improving and developing 
the tool. 

Until a data verification process is in place, DOJ could experience 
difficulty in ensuring performance results are reported reliably 
across state and local grantee recipients. 

Linkage to Programmatic or Strategic Goals: 

DOJ communicated specific Recovery Act goals, such as jobs created or 
retained, to recipients, but did not provide information on how its 
Recovery Act JAG performance measures aligned with programmatic or 
strategic goals. Our analysis showed that 5 of the 19 measures were 
linked to Recovery Act goals.[Footnote 42] For example, DOJ recently 
included a performance measure for Recovery Act jobs reporting, which 
is the "number of personnel retained with Recovery Act JAG funds." The 
remaining 14 measures lacked a clear linkage to any of DOJ's goals. 
For example, 1 of the measures related to the activity type 
"information systems" is the "percent of departments that completed 
improvements in information systems for criminal justice." However, 
DOJ does not explain how the performance measure for "improvements to 
information systems for criminal justice" relates or links to 
agencywide goals. When we asked DOJ officials to describe how the 
Recovery Act JAG performance measures align with broader departmental 
goals, they explained that the JAG authorizing legislation guides the 
states' use of the funds within the seven general purpose areas for 
JAG and that they do not link these purpose areas to current year DOJ 
goals. However, DOJ officials explained that Recovery Act JAG 
performance measures are linked to the department's strategic goal 2, 
"Prevent Crime, Enforce Federal Laws, and Represent the Rights and 
Interests of the American People," and strategic goal 3, "Ensure the 
Fair and Efficient Administration of Justice."[Footnote 43] DOJ 
officials did not provide written documentation or guidance to 
Recovery Act JAG recipients that explained this linkage to facilitate 
understanding of how performance measures were being used consistently 
with DOJ's strategic and programmatic goals. Further, with the 
exception of Recovery Act goals, officials from all 14 of the SAAs 
noted that they did not see a direct linkage between the Recovery Act 
JAG performance measures and DOJ's overall agencywide goals. 

As we have previously reported, successful organizations try to link 
specific performance goals and measures to the organization's overall 
strategic goals and, to the extent possible, have performance goals 
that will show annual progress toward achieving their long-term 
strategic goals.[Footnote 44] In addition, we have previously reported 
that, without performance measures linked to goals for the results that 
an organization expects the program to achieve, several consequences 
can occur: (1) managers may be held accountable for performance that 
is not mission critical or at odds with the mission, and (2) staff 
will not have a road map to understand how the measures support 
overall strategic and operating goals. 

Objectivity: 

In our assessment, we determined that 9 out of the 19 measures were 
objective. We previously reported that to be objective, performance 
measures should (1) be reasonably free of significant bias; and (2) 
indicate specifically what is to be observed, in which population or 
conditions, and in what time frame. An example of a BJA performance 
measure that we determined is objective is the measure "amount of 
Recovery Act JAG funds used to purchase equipment and/or supplies 
during the reporting period." This measure provides a specific time 
frame in which expenditures for equipment and/or supplies must have 
occurred and clearly explains that the amount of funds used for 
purchasing equipment and/or supplies is what should be reported. An 
example of a BJA performance measure that we determined lacks 
objectivity is the "percent of staff that directly benefit from 
equipment or supplies purchased by Recovery Act JAG funds, who report 
a desired change in their job performance." We determined that 
this measure lacks objectivity because it does not indicate 
specifically what is to be observed, in which population, and in what 
time frame, and is not free from opinion and judgment. For example, it 
requires those reporting to subjectively determine which staff members 
directly benefit from an equipment or supplies purchase and which 
staff members do not. It also requires a subjective determination of 
how the purchase of equipment or supplies affected a desired change in 
the performance of staff members who directly benefited from the 
purchase. When we discussed the issue of objectivity with DOJ, 
officials stated that BJA instructs grantees to report only on BJA-
funded activities that occurred during the reporting period. However, 
they conceded that the measures were open to interpretation and that 
this was a weakness, but suggested that it was the best option given 
the need to have universal measures that apply to a broad range of 
uses. We do 
not agree that all the measures we reviewed were defined sufficiently 
to prevent subjective interpretation. 

In addition, Texas officials expressed concern that DOJ will not be 
able to obtain useful data from the PMT because of the subjective 
interpretation involved in responding to certain Recovery Act 
JAG performance measures. For example, Texas officials identified 
responses to questions, such as the "percent of departments that 
report desired program quality" or "percent of staff who reported an 
increase in skills" as illustrative of the kinds of questions that are 
open to wide interpretation based on the size of the law enforcement 
organization and the classification of individuals within the 
organization. 

Measurable Targets: 

In our assessment, we determined that 17 out of the 19 measures lacked 
measurable targets. For these 17 measures, the absence of measurable 
targets meant that, outside of their original grant applications, 
award recipients had no opportunity to establish in advance a target 
level of performance against which actual performance for the 
reporting period could be compared. For example, in 
the measure "Number of overtime hours paid with Recovery Act JAG 
funds," BJA did not design the measure to allow award recipients to 
specify their target number of hours paid prior to receiving funding. 

DOJ did recognize that the "project objectives," i.e., the funded 
activities, should be linked to meaningful and measurable outcomes 
associated with the Recovery Act and that the likelihood of achieving 
such outcomes should be assessed. For example, language in the 
Recovery Act JAG 
application instructions requires that, where possible and 
appropriate, an estimate of the number of jobs created and retained be 
developed. In addition, the Recovery Act JAG application for funds 
also requires that the narrative include performance measures 
established by the organization to assess whether grant objectives are 
being met and a timeline or plan to identify when the goals and 
objectives are completed. However, measurable targets against which to 
benchmark results are not explicitly required in the narrative. 

As noted, two measures did include measurable targets, and as such 
will facilitate future assessments of whether overall goals and 
objectives are achieved because comparisons can be easily made between 
projected performance and actual results. For example, in these two 
measures--"the change in the number of individuals arrested in a 
targeted group by crime type" and "the change in reported crime rates 
in a community by crime type"--DOJ provides a list of expectations, 
such as "we expected number of individuals arrested to increase as a 
result of our efforts" or "we expected number of individuals arrested 
to decrease as a result of our efforts," from which the department 
expects respondents to choose, to facilitate comparison between the 
actual and expected number of arrests and reported crimes during a 
particular quarter. 

DOJ officials said that they believed that states could better 
establish measurable targets for the funds than the department could 
since the SAA would have the primary responsibility for establishing 
priorities and grant monitoring. While we agree that this is 
appropriate for individual projects, overall the lack of targets or 
other measurable values limits the Recovery Act JAG performance 
measures' usefulness as part of a successful performance measurement 
tool. As we previously reported, the performance measures should 
translate goals into observable conditions that determine what data to 
collect to learn whether progress was made toward achieving goals. 

State Officials Had Varying Views of the PMT and Recovery Act 
Performance Measures: 

State officials had mixed perspectives on the PMT and Recovery Act 
performance measures, with some critiquing them even as they 
acknowledged their utility in principle. For example, five SAAs noted 
that DOJ's measures were in development and acknowledged the 
difficulty for DOJ in developing a tool that could be used nationwide 
for assessing outputs and outcomes across multiple programs. They also 
were hopeful that the tool would increase uniform program data 
collection and allow for meaningful comparisons of data and outcomes 
across states and different jurisdictions. State officials also had 
positive comments about DOJ's Help Desk and the staff who provided 
technical support for the use of the tool. 

In addition, while eight states were silent on the issue, state 
officials from our remaining seven states stressed that reporting on 
the JAG Recovery Act performance measures is time-consuming and 
duplicative of other existing state performance measurement reporting 
systems. For example, officials from Colorado, Pennsylvania, and 
Illinois had concerns about limited staff availability to handle the 
workload associated with meeting both Recovery Act and PMT 
reporting requirements. Specifically, officials stated that they have 
to monitor subrecipient activities and provide monthly and quarterly 
information--as well as validate jobs reporting through payroll, 
expenses, and timesheets--to ensure job counts are calculated 
accurately and consistently. In other examples, officials from 
Colorado and Iowa expressed concern that the PMT duplicates their 
existing state performance measurement systems with similar measures 
and results in duplication of effort. 

In addition, the burden of complying with both BJA and state 
requirements led some states, such as Michigan, Ohio, and Texas, to 
eliminate some of their state performance systems even though 
officials told us that they believed that these systems measured 
performance outcomes better than the PMT performance measures. For 
example, Michigan state officials explained that their preexisting 
state quarterly performance reports provided specific data on grant 
outcomes that were of interest to state legislators and policymakers, 
and which are not included in the PMT performance measures. In 
particular, Michigan's state performance system included measures 
related to drug courts, such as the number of drug-free babies that 
are born to participants. 

While DOJ officials believe they ultimately will use the PMT to target 
the need for technical assistance for Recovery Act JAG recipients, 
they have developed a phased approach for system refinement, 
acknowledging the weaknesses that exist in the current performance 
measures. BJA has adopted some suggestions from JAG stakeholders, 
including SAAs and the DOJ Office of the Inspector General, during the 
year-long revision period for the PMT, which ended in early 2010. In 
addition, BJA plans to update information based on discussions with 
some SAAs in working groups and use some of their insights and 
recommendations to clarify the department's performance measures, and 
plans a new request for proposal in 2011 to augment an existing 3-year 
contract totaling about $3.4 million to help maintain, support, and 
improve the PMT.[Footnote 45] BJA also plans to complete an internal 
report of Recovery Act JAG findings--due by September 2010--and 
expects to release updated performance measures during the fall of 
2010 for use with JAG grants. 

Conclusions: 

Under the Recovery Act, the JAG program made available nearly $2 
billion in additional funds for states and local governments, which 
states and localities reported using primarily for law enforcement 
activities while also maintaining some programs that would have been 
eliminated or cut. Although reporting challenges remain with regard to 
the Recovery Act itself, states and localities took steps to share 
information about promising practices funded through JAG, and DOJ has 
measures in place to facilitate such information sharing. In addition, 
the new performance measures that DOJ has developed capture 
information on the use of Recovery Act JAG funds. 

However, while DOJ's performance measures include attributes of 
successful measures, further improvements are possible. Because the 
Recovery Act JAG program supports a wide array of activities, as well 
as the personnel to implement them, having clear performance measures 
that allow grant recipients to demonstrate results would provide 
useful information to DOJ regarding how Recovery Act JAG funds are 
being used. Our previous work has identified key attributes of 
successful performance measurement systems that would help assess 
progress and make performance information useful for key management 
decisions. 

Based on the sample we reviewed, DOJ's performance measures do not 
consistently exhibit key attributes of successful performance 
measurement systems, such as clarity, reliability, linkage, 
objectivity, and measurable targets. Measures that are not clearly 
stated can confuse users and cause managers or other stakeholders to 
think that performance was better or worse than it actually was. The 
lack of data reliability can create challenges in ensuring accurate 
information is recorded for performance purposes. Further, the lack of 
measurable targets also limits the ability to assess program 
performance and provides limited information to Congress about the 
success of the program. Moreover, successful organizations try to link 
performance goals and measures to the organization's strategic goals 
and should have performance goals that will show annual progress 
toward achieving long-term strategic goals. In addition, by 
establishing a mechanism to verify accuracy of self-reported data, DOJ 
can better ensure reliability of information that is reported. By 
addressing attributes consistent with promising performance 
measurement practices as it works to revise its performance measures, 
DOJ could be better positioned to determine whether Recovery Act JAG 
recipients' programs are used to support all seven JAG program 
purposes and are meeting DOJ and Recovery Act program goals. 

Recommendations: 

Recognizing that DOJ is already engaged in efforts to refine its 
Recovery Act JAG performance measures in the PMT, we recommend that 
the Acting Director of the Bureau of Justice Assistance take the 
following two actions to better monitor Recovery Act JAG program 
performance and demonstrate results through use of this instrument: 

* in revising the department's Recovery Act JAG performance measures, 
consider, as appropriate, key attributes of successful performance 
measurement systems, such as clarity, reliability, linkage, 
objectivity, and measurable targets; and: 

* develop a mechanism to validate the integrity of Recovery Act JAG 
recipients' self-reported performance data. 

Agency Comments and Our Evaluation: 

We provided a draft of this report to DOJ for review and comments. DOJ 
provided written comments on the draft report, which are reproduced in 
full in Appendix VII. DOJ concurred with the recommendations in the 
report and stated that BJA plans to take actions that will address 
both of our recommendations by October 1, 2011. Specifically, in 
response to our first recommendation that DOJ revise the Recovery Act 
JAG performance measures to consider, as appropriate, key attributes 
of successful performance measurement systems, DOJ stated that BJA is 
taking steps to revise the Recovery Act JAG performance measures--in 
conjunction with State Administering Agencies--and that it 
specifically will consider clarity, reliability, linkage, objectivity, 
and measurable targets in redesigning its performance measures. In 
response to our second recommendation relating to data quality, DOJ 
stated that BJA will develop and implement a mechanism to validate the 
integrity of Recovery Act JAG recipients' self-reported performance 
data. DOJ also provided technical comments on a draft of this report, 
which we incorporated as appropriate. 

We are sending copies of this report to the Attorney General, selected 
congressional committees, and other interested parties. In addition, 
the report will be available at no charge on the GAO Web site at 
[hyperlink, http://www.gao.gov]. Please contact David Maurer at (202) 
512-9627 if you or your staff has any questions concerning this 
report. Contact points for our Offices of Congressional Relations and 
Public Affairs may be found on the last page of this report. Key 
contributors to this report are listed in appendix VIII. 

Signed by: 

David C. Maurer: 
Director, Homeland Security and Justice Issues: 

[End of section] 

Appendix I: Scope and Methodology: 

This report addresses the following four questions: (1) How are 
Recovery Act Justice Assistance Grant (JAG) funds awarded and how have 
recipients in selected states and localities used their awards? (2) 
What challenges, if any, have Recovery Act JAG recipients reported in 
complying with Recovery Act reporting requirements? (3) To what extent 
do states share promising practices related to the use and management 
of Recovery Act JAG funds, and how, if at all, does the Department of 
Justice (DOJ) encourage information sharing? (4) To what extent are 
DOJ's Recovery Act JAG performance measures consistent with promising 
practices? 

As agreed with your office, we focused our review on Recovery Act JAG 
grants in a nonprobability sample of 14 states. The grants made to 
these states included both direct awards that DOJ made to State 
Administering Agencies (SAAs) and localities, as well as pass-through 
awards SAAs made to localities. A portion of this work was done in 
conjunction with our other Recovery Act reviews that focused on 16 
states, as well as the District of Columbia, which together account 
for the majority of Recovery Act spending.[Footnote 46] The 16 states 
included 
Arizona, California, Colorado, Florida, Georgia, Illinois, Iowa, 
Massachusetts, Michigan, Mississippi, New Jersey, New York, North 
Carolina, Ohio, Pennsylvania, and Texas. We selected these states and 
the District of Columbia on the basis of federal outlay projections, 
percentage of the U.S. population represented, unemployment rates and 
changes, and a mix of states' poverty levels, geographic coverage, and 
representation of both urban and rural areas. Collectively, these 
states contain about 65 percent of the U.S. population and are 
estimated to receive about two-thirds of the intergovernmental 
assistance available through the Recovery Act. However, for the 
purposes of this report, we limited our scope to a subset of 14 of 
these states so as not to duplicate ongoing work in the other 3 
(Florida, New Jersey, and the District of Columbia) that the DOJ 
Office of Inspector General was conducting. The awards to these 14 
states accounted for approximately 50 percent of all Recovery Act JAG 
funds provided. 

To identify how recipients of direct and pass-through funds received 
and used their Recovery Act JAG awards in selected states and 
localities, we conducted in-person and telephone interviews with 
officials from SAAs in all 14 states as well as officials from a 
nonprobability sample of 62 localities in these states. Where 
statements are attributed to state and local officials, we did not 
analyze state and locality data sources but relied on state and local 
officials and other state sources for relevant state data and 
materials. We selected these localities based on the amount of their 
grant awards, the activities that they were undertaking with grant 
funds, whether they reported that they had completed 50 percent or 
more of their grant activities according to their responses provided 
in Recovery Act reporting, and how they received their funds (either 
as pass-through funding from their SAA or as awards made directly by 
DOJ--in some cases as part of disparate jurisdictions). Our 
interviews addressed the use and perceived impact of Recovery Act JAG 
funds, program performance measurement and reporting challenges, and 
sharing of promising practices. Also, we reviewed DOJ direct award 
data and SAA pass-through awards in 14 SAAs. We also reviewed Recovery 
Act quarterly reports from Recovery.gov (4th quarter 2009, 1st quarter 
2010, and 2nd quarter 2010) to identify additional information on the 
use of JAG funds. Based on this information, we assigned the grants to 
one of the seven JAG general purpose areas. Grants for which multiple 
purposes were indicated were identified as such. In cases where a 
purpose could not be identified, we placed the grant in the category 
of "not enough information." We collected and used these funding data 
because they are the official source of Recovery Act spending. Based 
on our limited examination of the data thus far, we consider them to 
be sufficiently reliable for our purposes. Findings from our 
nonprobability samples cannot be generalized to all states and 
localities that were recipients of Recovery Act JAG funds; however, 
our samples provided us with illustrative examples of uses of funds, 
oversight processes, and reporting issues. 

To determine the extent to which Recovery Act JAG recipients faced 
challenges in complying with Recovery Act requirements, we interviewed 
representatives from the 14 SAAs and 62 localities and asked them 
about their experience with 1512(c) reporting requirements and Office 
of Management and Budget (OMB) guidance. In addition, we reviewed our 
previous reports that discuss Recovery Act recipient reporting issues. 

To identify how states share promising practice information, and the 
extent to which DOJ encourages information sharing, we conducted in- 
person and telephone interviews with representatives from all 14 of 
the SAAs. We also reviewed DOJ information, interviewed DOJ officials, 
and consulted reports from the National Criminal Justice Association, 
the National Governors' Association, and others that describe their 
information-sharing activities. 

To identify the extent to which DOJ's performance measurement approach 
is consistent with promising practices to assess progress, we 
interviewed representatives from the 14 SAAs and 62 localities and 
asked them about their experience with the Performance Measurement 
Tool (PMT). We also discussed the PMT's design and Recovery Act JAG 
performance measure improvement efforts with DOJ staff. Further, we 
conducted a review of the performance measures that were required for 
use under the Recovery Act JAG activities commonly reported to have 
been undertaken by the grant recipients in our sample. From the 86 
Recovery Act JAG performance measures under 10 activity types, we 
analyzed a nonprobability sample of the 19 performance measures 
required under 4 of the activity areas (Personnel, Equipment and 
Supplies, Information Systems for Criminal Justice, and the category 
Outcomes for all Activity Types). We selected these activity types and 
measures because they were the ones associated with the largest share 
of reported Recovery Act JAG expenditures and therefore most often 
encountered by the grant recipients. We then assessed these measures 
against a set of key characteristics that we have identified in our 
previous work as being associated with promising practices and 
successful performance measures.[Footnote 47] Some of the nine key 
characteristics of successful performance 
measures are attributes that may be most effectively used when 
reviewing performance measures individually and some are best used 
when reviewing a complete set of measures. Since we selected a 
nonprobability sample of measures that was most closely associated 
with the majority of expenditures, we focused our analysis most 
heavily on those attributes that could be applied to individual 
measures--clarity, reliability, linkage to strategic goals, 
objectivity, and measurable targets. We did not assess the subset of 
19 performance measures for the attributes of governmentwide 
priorities, core program activities, limited overlap, or balance that 
are associated with an evaluation of a full set of measures. To 
evaluate the sample, four analysts independently assessed each of the 
performance measures against attributes of successful performance 
measures previously identified by GAO. Those analysts then met to 
discuss and resolve any differences in the results of their analysis. 
In conducting this analysis, we analyzed program performance measure 
information contained in DOJ's Performance Measurement Tool for the 
American Recovery and Reinvestment Act (Recovery Act, or ARRA) and 
fiscal year 2009 Justice Assistance Grant programs. We did not conduct a 
detailed assessment of DOJ's methodology for developing the measures, 
but looked at the issues necessary to assess whether a particular 
measure met the overall characteristics of a successful performance 
measure. We also reviewed our previous reports that discuss the 
importance of performance measurement system attributes and obtained 
information on the extent to which such systems may impact agencies' 
planning.[Footnote 48] The activity types and number of measures 
selected are listed in table 7. 

Table 7: Activity Types Included in Our Recovery Act JAG Performance 
Measure Review: 

Activity type: Personnel; 
DOJ Description: May include the employment of new staff either 
through new recruitment activities or payment to existing staff for 
work over and beyond (overtime) the normal work period; 
also allows use of funds for retention of positions otherwise lost due 
to budget cutbacks; 
Number of performance measures: 7; 
Number selected for evaluation: 7. 

Activity type: Equipment and supplies; 
DOJ Description: Includes the purchase of new or replacement equipment 
or supplies to improve or replace what currently exists; 
Number of performance measures: 4; 
Number selected for evaluation: 4. 

Activity type: Information systems for criminal justice system; 
DOJ Description: Includes the development, implementation or 
improvements made to benefit staff or departments; 
Number of performance measures: 5; 
Number selected for evaluation: 5. 

Activity type: Outcomes for all activity types; 
DOJ Description: As they apply to grant funded activities; 
Number of performance measures: 3; 
Number selected for evaluation: 3. 

Activity type: State/local initiatives; 
DOJ Description: Includes activities that are planned for 
implementation of a new program to provide a direct service to improve 
a criminal justice system by implementing a new process, procedure, or 
policy or improve a program, service, or system; 
Number of performance measures: 13; 
Number selected for evaluation: 0. 

Activity type: Training; 
DOJ Description: Includes the provision of different types of 
training, as well as the purchase of training services for or to staff 
or departments; 
Number of performance measures: 11; 
Number selected for evaluation: 0. 

Activity type: Technical assistance; 
DOJ Description: Includes the provision of technical assistance for 
staff; 
Number of performance measures: 7; 
Number selected for evaluation: 0. 

Activity type: Contractual support; 
DOJ Description: Includes activities that address issues that help to 
improve the effectiveness and/or efficiency in various points of the 
criminal justice system; 
Number of performance measures: 4; 
Number selected for evaluation: 0. 

Activity type: Research, evaluation, and product development; 
DOJ Description: Includes research and evaluation activities that have 
a goal of informing decisions and providing information as to what 
works; 
Number of performance measures: 14; 
Number selected for evaluation: 0. 

Activity type: Task force activities; 
DOJ Description: Applies to grantees who will utilize ARRA JAG funds 
to cover task force activities not otherwise captured in other 
activity areas; 
Number of performance measures: 18; 
Number selected for evaluation: 0. 

Activity type: Total; 
DOJ Description: [Empty]; 
Number of performance measures: 86; 
Number selected for evaluation: 19. 

Source: GAO review of Recovery Act JAG performance measures. 

[End of table] 

We conducted this performance audit from January 2010 through October 
2010 in accordance with generally accepted government auditing 
standards. Those standards require that we plan and perform the audit 
to obtain sufficient, appropriate evidence to provide a reasonable 
basis for our findings and conclusions based on our audit objectives. 
We believe that the evidence obtained provides a reasonable basis for 
our findings and conclusions based on our audit objectives. 

[End of section] 

Appendix II: Recovery Act JAG Performance Measures: 

The following table contains the 19 Performance Measurement Tool (PMT) 
performance measures that were required for use under the Recovery Act 
JAG activity types commonly undertaken by the grant recipients in our 
sample. 

Table 8: Recovery Act JAG Performance Measures Associated with the 
Activities Predominantly Undertaken by Recipients across Our 14 Sample 
States: 

Measure: Number of new personnel hired with (ARRA) JAG funds (System 
Improvement); 
DOJ Definition: The purpose of this output indicator is to measure the 
extent of personnel hours hired with (ARRA) JAG funds (system 
capacity). Appropriate for grantees in purpose areas that use (ARRA) 
JAG funds for system improvement. Report the number of new personnel 
hired with (ARRA) JAG funds during the reporting period. Personnel 
hired from the represented agency are defined by the grantee or 
subrecipients as hired for either a department, division, agency, or 
organization. "Other funding" refers to all other funding sources that 
are not JAG or ARRA JAG funds. Source: Agency records are preferred 
data source. 
Applies to purpose areas: Law Enforcement, Prosecution and Court, 
Prevention and Education, Corrections and Community Corrections, Drug 
Treatment and Enforcement, Planning, Evaluation and Technology 
Improvement, Crime Victim and Witness; 
Data Grantee Reports: 1. Number of NEW personnel hired with (ARRA) JAG 
funds during the reporting period. Only Report New Personnel Hired 
During The Quarter. This Number Will Be Aggregated Across All 
Reporting Periods; 
2. Total number of new personnel hired with all OTHER (as applicable 
to non-ARRA JAG or JAG) sources during the reporting period; 
3. Total Auto-calculated by PMT; 
4. Percent Auto-calculated by PMT. 
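
This measure, like several others in this appendix, asks grantees to 
report two counts and notes that a total and a percent are 
auto-calculated by the PMT. As an illustration only (the report does 
not publish the PMT's actual formula), the following minimal Python 
sketch assumes the total is the sum of the two reported counts and the 
percent is the (ARRA) JAG count's share of that total: 

# Minimal illustrative sketch, not DOJ's code: we assume the PMT's
# auto-calculated total is the sum of the two reported counts and the
# percent is the (ARRA) JAG count's share of that total.
def pmt_auto_fields(arra_jag_count, other_funds_count):
    """Return the (total, percent) values the PMT presumably auto-calculates."""
    total = arra_jag_count + other_funds_count
    percent = 100.0 * arra_jag_count / total if total else 0.0
    return total, round(percent, 1)

# Example: 3 new hires paid with (ARRA) JAG funds and 9 paid from other
# sources would yield a total of 12 and a percent of 25.0.
print(pmt_auto_fields(3, 9))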

Measure: Indicate the type of new personnel paid with (ARRA) JAG funds 
(System Improvement); 
DOJ Definition: The purpose of this output indicator is to measure 
accountability. Appropriate for grantees in purpose areas that use 
(ARRA) JAG funds for system improvement. Check all boxes applicable to 
the type of NEW personnel paid with (ARRA) JAG funds during the 
reporting period. Source: Agency records are preferred data source. 
Applies to purpose areas: Law Enforcement, Prosecution and Court, 
Prevention and Education, Corrections and Community Corrections, Drug 
Treatment and Enforcement, Planning, Evaluation and Technology 
Improvement, Crime Victim and Witness; 
Data Grantee Reports: 
1. Law Enforcement Personnel; 
2. Prosecution and Court Personnel; 
3. Prevention and Education Personnel; 
4. Corrections and Community Corrections Personnel; 
5. Drug Treatment and Enforcement Personnel; 
6. Planning, Evaluation, and Technology Improvement Personnel; 
7. Crime Victim and Witness Personnel. 

Measure: Number of personnel retained with (ARRA) JAG funds (System 
Improvement); 
DOJ Definition: The purpose of this output indicator is to measure the 
extent of personnel retained as a result of (ARRA) JAG funds. This 
measure is only appropriate for the (ARRA) JAG because the (ARRA) JAG 
allows use of funds for retention of positions otherwise lost due to 
budget cutbacks. The intent of this measure is to capture the number 
of personnel retained each quarter for the life of the award. Report 
the number of personnel maintained with (ARRA) JAG funds during the 
reporting period. SOURCE: Agency records are the preferred data source. 
Applies to purpose areas: Law Enforcement, Prosecution and Court, 
Prevention and Education, Corrections and Community Corrections, Drug 
Treatment and Enforcement, Planning, Evaluation and Technology 
Improvement, Crime Victim and Witness; 
Data Grantee Reports: 
1. Number of personnel retained with (ARRA) JAG funds during the 
reporting period. Does Not Include New Personnel Hired During The 
Current And Previous Reporting Periods; 
2. Total number of existing personnel paid by all OTHER (as applicable 
to non-ARRA or JAG) sources during the reporting period; 
3. Total Auto-calculated by PMT; 
4. Percent Auto-calculated by PMT. 

Measure: Indicate the type of retained personnel paid with (ARRA) JAG 
funds (System Improvement); 
DOJ Definition: The purpose of this output indicator is to measure 
accountability. Appropriate for grantees in purpose areas that use 
(ARRA) JAG funds for system improvement. This measure is only 
appropriate for the (ARRA) JAG because the (ARRA) JAG allows use of 
funds for retention of positions otherwise lost due to budget 
cutbacks. Check all boxes applicable to the type of Retained personnel 
paid with (ARRA) JAG funds during the reporting period. SOURCE: Agency 
records are the preferred data source. 
Applies to purpose areas: Law Enforcement, Prosecution and Court, 
Prevention and Education, Corrections and Community Corrections, Drug 
Treatment and Enforcement, Planning, Evaluation and Technology 
Improvement, Crime Victim and Witness; 
Data Grantee Reports: 
1. Law Enforcement Personnel; 
2. Prosecution and Court Personnel; 
3. Prevention and Education Personnel; 
4. Corrections and Community Corrections Personnel; 
5. Drug Treatment and Enforcement Personnel; 
6. Planning, Evaluation, and Technology Improvement Personnel; 
7. Crime Victim and Witness Personnel. 

Measure: Number of overtime hours paid with (ARRA) JAG funds (System 
Improvement); 
DOJ Definition: The purpose of this output indicator is to measure 
system/program capacity. Appropriate for grantees in purpose areas 
that use (ARRA) JAG funds for system improvement. Report the number of 
overtime hours paid with (ARRA) JAG funds during the reporting period. 
Report total hours of overtime paid by all OTHER (as applicable to non-
ARRA JAG or JAG) sources. This is a total of hours from the agency 
represented in the grant application. Source: Agency records are the 
preferred data source. 
Applies to purpose areas: Law Enforcement, Prosecution and Court, 
Prevention and Education, Corrections and Community Corrections, Drug 
Treatment and Enforcement, Planning, Evaluation and Technology 
Improvement, Crime Victim and Witness; 
Data Grantee Reports: 
1. Number of overtime hours paid with (ARRA) JAG funds during the 
reporting period Report Hours Of Overtime Not Dollars; 
2. Total number of hours of overtime paid by all Other (as applicable 
to non-ARRA JAG or JAG) sources during the reporting period; 
3. Total Auto-calculated by the PMT; 
4. Percent Auto-calculated by the PMT. 

Measure: Percent of departments that report desired efficiency (System 
Improvement); 
DOJ Definition: The Purpose Of This Outcome Indicator Is To Measure 
Desired Efficiency. Appropriate For Grantees Under Any Purpose Area 
That Use (ARRA) Jag Funds For System Improvement Activities. Report 
The Number Of Departments That Report Desired Efficiency As A Result 
Of New Personnel Or Overtime Paid With (ARRA) Jag Funds. Source: 
Agency Records Are The Preferred Data Source. 
Applies to purpose areas: Law Enforcement, Prosecution and Court, 
Prevention and Education, Corrections and Community Corrections, Drug 
Treatment and Enforcement, Planning, Evaluation and Technology 
Improvement, Crime Victim and Witness; 
Data Grantee Reports: 
1. Number of departments that report desired efficiency during the 
reporting period; 
2. Total number of departments that used (ARRA) JAG funds to hire new 
personnel, maintain personnel or pay for overtime hours; 
3. Percent Auto-calculated by PMT. 

Measure: Percent of departments that report desired program quality 
(System Improvement); 
DOJ Definition: The purpose of this outcome indicator is to measure 
increased program quality. Appropriate for grantees under any purpose 
area that use (ARRA) JAG funds for system improvement activities. 
Report the number of departments that report desired program quality 
as a result of new personnel and overtime paid with (ARRA) JAG funds. 
Source: Agency records are preferred data source. 
Applies to purpose areas: Law Enforcement, Prosecution and Court, 
Prevention and Education, Corrections and Community Corrections, Drug 
Treatment and Enforcement, Planning, Evaluation and Technology 
Improvement, Crime Victim and Witness; 
Data Grantee Reports: 
1. Number of departments that report desired program quality during 
the reporting period; 
2. Total number of departments that used (ARRA) JAG funds to hire new 
personnel, maintain personnel or pay for overtime hours; 
3. Percent Auto-calculated by PMT. 

Measure: Amount of (ARRA) JAG funds used to purchase equipment and/or 
supplies (System Improvement); 
DOJ Definition: The purpose of this output measure is to document the 
extent of equipment and/or supplies purchased with (ARRA) JAG funds. 
Appropriate for grantees in all purpose areas that use (ARRA) JAG 
funds for system improvement. Report the amount of (ARRA) JAG funds 
used to purchase equipment and/or supplies. Source: Program records 
are preferred data source. 
Applies to purpose areas: Law Enforcement, Prosecution and Court, 
Prevention and Education, Corrections and Community Corrections, Drug 
Treatment and Enforcement, Planning, Evaluation and Technology 
Improvement, Crime Victim and Witness; 
Data Grantee Reports: 
1. Amount of (ARRA) JAG funds used to purchase equipment and/or 
supplies during the reporting period. 

Measure: Indicate the quantity for each type of equipment and/or 
supplies purchased with (ARRA) JAG funds (Report Quantity Not Dollars) 
(System Improvement); 
DOJ Definition: The purpose of this output indicator is to measure 
accountability. Appropriate for grantees in purpose areas that use JAG 
funds for system improvement. Report the quantity for each type of 
equipment or supplies purchased with JAG funds during the reporting 
period. Source: Program records are preferred data source; 
Applies to purpose areas: Law Enforcement, Prosecution and Court, 
Prevention and Education, Corrections and Community Corrections, Drug 
Treatment and Enforcement, Planning, Evaluation and Technology 
Improvement, Crime Victim and Witness; 
Data Grantee Reports: 
1. Weapons; 
2. Equipment for police cruisers; 
3. Uniforms; 
4. CAD; 
5. RMS; 
6. Software; 
7. Computers; 
8. Mobile access equipment (ex. Aircards for Verizon, Sprint, AT&T, 
etc.); 
9. Security systems (station or evidence room); 
10. Biometric equipment (Live scans, fingerprint readers, etc.); 
11. In-car camera systems; 
12. Video observation (station, community, pole cams); 
13. Undercover surveillance equipment (microphones, video); 
14. License plate readers; 
15. Kiosk units for community access or registration; 
16. Vehicles; 
17. Radios; 
18. Other (please specify type and quantity). 

Measure: Should Only Answer If Your Agency Received Requests 
Considered For Funding With JAG Funds. Number of equipment and/or 
supply requests funded with (ARRA) JAG funds (System Improvement); 
DOJ Definition: The purpose of this output measure is to document the 
extent of equipment and/or supplies funded with (ARRA) JAG funds. 
Appropriate for grantees in all purpose areas that use (ARRA) JAG 
funds for system improvement. Report the number of equipment and/or 
supply requests received and of that, the number funded with (ARRA) 
JAG funds. Source: Program records are preferred data source. 
Applies to purpose areas: Law Enforcement, Prosecution and Court, 
Prevention and Education, Corrections and Community Corrections, Drug 
Treatment and Enforcement, Planning, Evaluation and Technology 
Improvement, Crime Victim and Witness; 
Data Grantee Reports: 
1. Number of equipment and/or supply requests funded with (ARRA) JAG 
funds during the reporting period; 
2. Number of equipment and/or supply requests received for 
consideration with JAG funding; 
3. Percent Auto-calculated by PMT. 

Measure: Percent of staff that directly benefit from equipment or 
supplies purchased by (ARRA) JAG funds, who report a desired change in 
their job performance (System Improvement); 
DOJ Definition: The purpose of this outcome indicator is to measure 
efficiency. Appropriate for grantees in purpose areas that use (ARRA) 
JAG funds for system improvement. Report the number of staff that 
directly benefit from equipment or supplies purchased with (ARRA) JAG 
funds, who report a desired change in their job performance during 
this reporting period. Source: This is a count of direct staff that 
benefit from the equipment and/or supplies purchased. Program records 
are preferred data source. 
Applies to purpose areas: Law Enforcement, Prosecution and Court, 
Prevention and Education, Corrections and Community Corrections, Drug 
Treatment and Enforcement, Planning, Evaluation and Technology 
Improvement, Crime Victim and Witness; 
Data Grantee Reports: 
1. Number of staff that directly benefit from equipment or supplies 
purchased by (ARRA) JAG funds, who report a desired change in their 
job performance; 
2. Number of staff to receive equipment or supplies purchased with 
(ARRA) JAG funds during the reporting period; 
3. Percent Auto-calculated by PMT; 
4. Explain the impact on job performance for the reporting period. 

Measure: Amount of (ARRA) JAG funds used for improvements to 
information systems for criminal justice systems (System Improvement); 
DOJ Definition: The purpose of this output indicator is to improve 
system effectiveness and/or capacity. Appropriate for grantees under 
any purpose area that uses (ARRA) JAG funds for system improvement. 
Report the amount of (ARRA) JAG funds used to improve information 
systems for criminal justice systems during the reporting period. 
Source: Agency records are a preferred data source. 
Applies to purpose areas: Law Enforcement, Prosecution and Court, 
Prevention and Education, Corrections and Community Corrections, Drug 
Treatment and Enforcement, Planning, Evaluation and Technology 
Improvement, Crime Victim and Witness; 
Data Grantee Reports: 
1. Amount of (ARRA) JAG funds used for improvements to information 
systems for criminal justice systems during the reporting period. 

Measure: Number of departments that used (ARRA) JAG funds to make 
improvements to information systems for criminal justice (System 
Improvement); 
DOJ Definition: The purpose of this output measure is for 
system/program capacity based on the idea that new, enhanced or 
improved information systems can provide staff with better efficiency 
to do their jobs. Appropriate for grantees under any purpose area that 
uses (ARRA) JAG funds for system improvement activities. Report the 
number of departments that uses (ARRA) JAG funds to make improvements 
to information systems for criminal justice during the reporting 
period. Source: Agency records are a preferred data source. 
Applies to purpose areas: Law Enforcement, Prosecution and Court, 
Prevention and Education, Corrections and Community Corrections, Drug 
Treatment and Enforcement, Planning, Evaluation and Technology 
Improvement, Crime Victim and Witness; 
Data Grantee Reports: 
1. Number of departments that used (ARRA) JAG funds to make 
improvements to criminal justice information systems started in the 
previous period; 
2. Number of NEW departments that use (ARRA) JAG funds to make 
improvements to criminal justice information systems that were added 
during the reporting period; 
3. Total Auto-calculated by PMT. 

Measure: Percent of departments that completed improvements to 
information systems for criminal justice (System Improvement); 
DOJ Definition: The purpose of this outcome measure is for system 
accountability. Appropriate for grantees under any purpose area that 
uses (ARRA) JAG funds for system improvement activities. Report the 
number of departments that completed improvements to criminal justice 
information systems during the reporting period. Source: Agency 
records are a preferred data source. 
Applies to purpose areas: Law Enforcement, Prosecution and Court, 
Prevention and Education, Corrections and Community Corrections, Drug 
Treatment and Enforcement, Planning, Evaluation and Technology 
Improvement, Crime Victim and Witness; 
Data Grantee Reports: 
1. Number of departments that completed improvements to information 
systems for criminal justice during the reporting period as a result 
of (ARRA) JAG funds; 
2. Number of departments to use (ARRA) JAG funds to make improvements 
to information systems for criminal justice; 
3. Percent Auto-calculated by PMT. 

Measure: Percent of departments that report a desired change in 
efficiency (System Improvement); 
DOJ Definition: The purpose of this outcome measure is to document 
improved efficiency. Appropriate for grantees under any purpose area 
that uses (ARRA) JAG funds for system improvement activities. Report 
the number of departments that report a desired change in efficiency 
as a result of improved information systems for criminal justice 
systems as a result of (ARRA) JAG funds during the reporting period. 
Source: Agency records are a preferred data source. 
Applies to purpose areas: Law Enforcement, Prosecution and Court, 
Prevention and Education, Corrections and Community Corrections, Drug 
Treatment and Enforcement, Planning, Evaluation and Technology 
Improvement, Crime Victim and Witness; 
Data Grantee Reports: 
1. Number of departments that report a desired change in efficiency 
during the reporting period; 
2. Number of departments that completed improvements to information 
systems for criminal justice systems as a result of (ARRA) JAG funds; 
3. Percent Auto-calculated by PMT; 
4. Explain the impact on efficiency for the reporting period. 

Measure: Percent of departments that report a desired change in 
program quality (System Improvement); 
DOJ Definition: The purpose of this outcome measure is to document 
improved program quality. Appropriate for grantees under any purpose 
area that use (ARRA) JAG funds for system improvement activities. 
Report the number of departments that report a desired change in 
program quality (e.g. per staff caseloads meet professional standards, 
increased availability of specialized services) as a result of (ARRA) 
JAG funds. Source: Agency records are preferred data source. 
Applies to purpose areas: Law Enforcement, Prosecution and Court, 
Prevention and Education, Corrections and Community Corrections, Drug 
Treatment and Enforcement, Planning, Evaluation and Technology 
Improvement, Crime Victim and Witness; 
Data Grantee Reports: 
1. Number of departments that report a desired change in program 
quality during the reporting period; 
2. Number of departments that completed improvements to information 
systems for criminal justice systems during the reporting period; 
3. Percent Auto-calculated by PMT; 
4. Explain the impact on program quality during the reporting period. 

Measure: Change in number of individuals arrested in a targeted group 
by crime type; 
DOJ Definition: The purpose of this outcome indicator is to measure 
rates of individuals arrested in a targeted group by crime type. 
Appropriate for grantees in purpose areas that provide direct service 
to individuals with (ARRA) JAG funds. Report the number of individuals 
arrested for a targeted group by crime type. For the first reporting 
period, the "a" value reflects available data for the quarter prior to 
the start of grant-funded activities. For subsequent reporting 
periods, the "a" value will reflect the number of individuals arrested 
during the quarter before the start of the award. Population numbers 
will vary based on target population/sub-population of the program/ 
initiative. Crime types are identified by the target of grant-funded 
activity. Source: Program records; 
Data Grantee Reports: 
1. The number of individuals (by related crime) arrested during the 
quarter before the start of the award; 
2. Total number of individuals arrested (by related crime) during the 
reporting period; 
3. Pick One: 
4. We expected number of individuals arrested to increase as a result 
of our efforts; 
5. We expected number of individuals arrested to decrease as a result 
of our efforts; 
6. We expected number of individuals arrested to remain stable (no 
change) as a result of our efforts; 
7. We had no expectations about changes in the number of individuals 
arrested as a result of our efforts; 
8. Not applicable for this reporting period. 

Measure: Change in reported crime rates in a community by crime type; 
DOJ Definition: The purpose of this outcome indicator is to measure 
rates of related crimes in a targeted community. Appropriate for 
grantees in purpose areas that provide direct service to communities 
or organizations with (ARRA) JAG funds. Report the number of related 
crimes reported during the reporting period. Population numbers will 
vary based on target populations/sub-population of the program/ 
initiative. The "a" value reflects the quarter prior to the start of 
the award. This measure is intended to collect rates of crime targeted 
by (ARRA) JAG award. Crime types are identified by the target of grant-
funded activity. Source: Program records; 
Data Grantee Reports: 
1. Number of reported crimes (targeted by (ARRA) JAG funds) during the 
quarter before the start of the award; 
2. Total number of reported crimes (targeted by (ARRA) JAG funds) 
during the period; 
3. Pick One: 
4. We expected the crime rate to increase as a result of our efforts; 
5. We expected the crime rate to decrease as a result of our efforts; 
6. We expected the crime rate to remain stable (no change) as a result 
of our efforts; 
7. We had no expectations about the crime rate as a result of our 
efforts; 
8. Not applicable for this reporting period. 

Measure: Type of crime; 
DOJ Definition: Provide the type of crime targeted; 
Data Grantee Reports: 
1. Homicides; 
2. Forcible Rapes; 
3. Robberies; 
4. Aggravated Assaults; 
5. Other [types of crimes targeted], please define. 

Source: BJA. 

[End of table] 

[End of section] 

Appendix III: GAO Assessment of Whether DOJ's Recovery Act JAG 
Performance Measures Possessed Certain Key Attributes: 

Table 9: GAO Assessment of Whether DOJ's Recovery Act JAG Performance 
Measures Possess Certain Key Attributes: 

Performance Measure: Number of new personnel hired with (ARRA) JAG 
funds; 
Clarity: No; 
Reliability: Yes; 
Linkage to strategic goals: Yes; 
Objectivity: Yes; 
Measurable targets: No. 

Performance Measure: Indicate the type of new personnel paid with 
(ARRA) JAG funds; 
Clarity: Yes; 
Reliability: No; 
Linkage to strategic goals: Yes; 
Objectivity: Yes; 
Measurable targets: No. 

Performance Measure: Number of personnel retained with (ARRA) JAG 
funds; 
Clarity: No; 
Reliability: No; 
Linkage to strategic goals: Yes; 
Objectivity: Yes; 
Measurable targets: No. 

Performance Measure: Type of retained personnel paid with (ARRA) JAG 
funds; 
Clarity: Yes; 
Reliability: No; 
Linkage to strategic goals: Yes; 
Objectivity: Yes; 
Measurable targets: No. 

Performance Measure: Number of overtime hours paid with (ARRA) JAG 
funds; 
Clarity: No; 
Reliability: Yes; 
Linkage to strategic goals: Yes; 
Objectivity: Yes; 
Measurable targets: No. 

Performance Measure: Percent of departments that report desired 
efficiency; 
Clarity: No; 
Reliability: No; 
Linkage to strategic goals: No; 
Objectivity: No; 
Measurable targets: No. 

Performance Measure: Percent of departments that report desired 
program quality; 
Clarity: No; 
Reliability: No; 
Linkage to strategic goals: No; 
Objectivity: No; 
Measurable targets: No. 

Performance Measure: Amount of (ARRA) JAG funds used to purchase 
equipment and/or supplies; 
Clarity: Yes; 
Reliability: Yes; 
Linkage to strategic goals: No; 
Objectivity: Yes; 
Measurable targets: No. 

Performance Measure: Indicate the quantity for each type of equipment 
and/or supplies purchased with (ARRA) JAG funds; 
Clarity: Yes; 
Reliability: Yes; 
Linkage to strategic goals: No; 
Objectivity: Yes; 
Measurable targets: No. 

Performance Measure: Number of equipment and/or supply requests funded 
with (ARRA) JAG funds; 
Clarity: No; 
Reliability: No; 
Linkage to strategic goals: No; 
Objectivity: No; 
Measurable targets: No. 

Performance Measure: Percent of staff that directly benefit from 
equipment or supplies purchased by (ARRA) JAG funds, who report a 
desired change in their job performance; 
Clarity: No; 
Reliability: No; 
Linkage to strategic goals: No; 
Objectivity: No; 
Measurable targets: No. 

Performance Measure: Amount of (ARRA) JAG funds used for improvements 
to information systems for criminal justice systems; 
Clarity: Yes; 
Reliability: Yes; 
Linkage to strategic goals: No; 
Objectivity: Yes; 
Measurable targets: No. 

Performance Measure: Number of departments that used (ARRA) JAG funds 
to make improvements to information systems for criminal justice; 
Clarity: No; 
Reliability: No; 
Linkage to strategic goals: No; 
Objectivity: No; 
Measurable targets: No. 

Performance Measure: Percent of departments that completed 
improvements to information systems for criminal justice; 
Clarity: No; 
Reliability: No; 
Linkage to strategic goals: No; 
Objectivity: No; 
Measurable targets: No. 

Performance Measure: Percent of departments that report a desired 
change in efficiency; 
Clarity: No; 
Reliability: No; 
Linkage to strategic goals: No; 
Objectivity: No; 
Measurable targets: No. 

Performance Measure: Percent of departments that report a desired 
change in program quality; 
Clarity: No; 
Reliability: No; 
Linkage to strategic goals: No; 
Objectivity: No; 
Measurable targets: No. 

Performance Measure: Change in number of individuals arrested in a 
targeted group by crime type; 
Clarity: No; 
Reliability: No; 
Linkage to strategic goals: No; 
Objectivity: No; 
Measurable targets: Yes. 

Performance Measure: Change in reported crime rates in a community by 
crime type; 
Clarity: No; 
Reliability: No; 
Linkage to strategic goals: No; 
Objectivity: No; 
Measurable targets: Yes. 

Performance Measure: Type of crime; 
Clarity: No; 
Reliability: Yes; 
Linkage to strategic goals: No; 
Objectivity: Yes; 
Measurable targets: No. 

Source: GAO. 

[End of table] 

[End of section] 

Appendix IV: Recovery Act JAG Award Drawdowns and Expenditures: 

Department of Justice (DOJ) records indicate that all 14 of the states 
in our sample have drawn down a majority of their Recovery Act 
Justice Assistance Grant (JAG) awards as of May 2010. Specifically, 
the amounts drawn down range from less than 53 percent to almost 98 
percent of the amounts awarded. Table 10 shows the amount and 
percentage of these funds that have been drawn down and expended by 
State Administering Agencies (SAAs), their subrecipients, and 
localities. 
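
Across the table, the "Drawn down (%)" column is consistent with the 
amount drawn down divided by the amount awarded. As an illustration 
only, the following minimal Python sketch reproduces that arithmetic 
using the Arizona row as reported in table 10: 

# Illustrative arithmetic only (not an official calculation), using the
# Arizona row as reported in table 10: the drawdown percentage is the
# amount drawn down divided by the amount awarded.
amount_awarded = 41_953_775
amount_drawn_down = 40_314_122

drawn_down_pct = 100.0 * amount_drawn_down / amount_awarded
print(round(drawn_down_pct, 1))  # prints 96.1, matching the Arizona row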

Table 10: Recovery Act JAG Drawdowns across Our Sample States, as of 
May 2010: 

State: Arizona; 
Total Recovery Act JAG allocation: $41,966,266; 
Amount awarded[A]: $41,953,775; 
Amount drawn down: $40,314,122; 
Drawn down (%): 96.1; 
Amount expended[B]: $15,258,007. 

State: California; 
Total Recovery Act JAG allocation: $225,354,622; 
Amount awarded[A]: $225,308,016; 
Amount drawn down: $213,948,344; 
Drawn down (%): 95.0; 
Amount expended[B]: $34,627,855. 

State: Colorado; 
Total Recovery Act JAG allocation: $29,858,171; 
Amount awarded[A]: $29,806,448; 
Amount drawn down: $25,065,819; 
Drawn down (%): 84.1; 
Amount expended[B]: $8,029,790. 

State: Georgia; 
Total Recovery Act JAG allocation: $59,045,753; 
Amount awarded[A]: $58,883,245; 
Amount drawn down: $49,539,594; 
Drawn down (%): 84.1; 
Amount expended[B]: $13,182,007. 

State: Illinois; 
Total Recovery Act JAG allocation: $83,663,470; 
Amount awarded[A]: $83,663,470; 
Amount drawn down: $81,661,161; 
Drawn down (%): 97.6; 
Amount expended[B]: $22,854,366. 

State: Iowa; 
Total Recovery Act JAG allocation: $18,702,718; 
Amount awarded[A]: $18,702,304; 
Amount drawn down: $17,306,900; 
Drawn down (%): 92.5; 
Amount expended[B]: $6,250,415. 

State: Massachusetts; 
Total Recovery Act JAG allocation: $40,793,878; 
Amount awarded[A]: $40,737,593; 
Amount drawn down: $21,430,523; 
Drawn down (%): 52.6; 
Amount expended[B]: $23,920,025. 

State: Michigan; 
Total Recovery Act JAG allocation: $67,006,344; 
Amount awarded[A]: $67,076,152; 
Amount drawn down: $64,762,546; 
Drawn down (%): 96.6; 
Amount expended[B]: $11,096,373. 

State: Mississippi; 
Total Recovery Act JAG allocation: $18,394,045; 
Amount awarded[A]: $18,013,882; 
Amount drawn down: $14,069,121; 
Drawn down (%): 78.1; 
Amount expended[B]: $2,176,030. 

State: New York; 
Total Recovery Act JAG allocation: $110,592,269; 
Amount awarded[A]: $110,496,533; 
Amount drawn down: $103,197,464; 
Drawn down (%): 93.4; 
Amount expended[B]: $20,382,971. 

State: North Carolina; 
Total Recovery Act JAG allocation: $56,345,356; 
Amount awarded[A]: $56,103,394; 
Amount drawn down: $50,230,759; 
Drawn down (%): 89.5; 
Amount expended[B]: $18,496,403. 

State: Ohio; 
Total Recovery Act JAG allocation: $61,645,375; 
Amount awarded[A]: $61,604,789; 
Amount drawn down: $55,601,917; 
Drawn down (%): 90.3; 
Amount expended[B]: $22,489,444. 

State: Pennsylvania; 
Total Recovery Act JAG allocation: $72,372,843; 
Amount awarded[A]: $72,361,289; 
Amount drawn down: $65,846,268; 
Drawn down (%): 91.0; 
Amount expended[B]: $15,406,221. 

State: Texas; 
Total Recovery Act JAG allocation: $147,530,755; 
Amount awarded[A]: $147,102,910; 
Amount drawn down: $135,929,639; 
Drawn down (%): 92.4; 
Amount expended[B]: $56,607,213. 

Source: GAO analysis of Bureau of Justice Assistance, SAA data, and 
Recovery.gov. 

[A] Amounts awarded are reported by DOJ as of May 12, 2010. 

[B] Amount expended includes data reported by SAAs and direct local 
recipients to Recovery.gov during the quarter ending June 30, 2010. 
Note that during the quarter ending June 30, 2010, 797 direct 
recipients, including SAAs, reported expenditure data to Recovery.gov, 
compared to 807 direct recipients, including SAAs, in the quarter 
ending December 31, 2009. DOJ officials said that when a direct 
recipient expends all funds in its award, the recipient is no longer 
required to report data to DOJ or the Office of Management and Budget 
(OMB), which manages Recovery.gov. All reporting is completed when 
expenditures are completed. Therefore, the approximately 10 local 
direct recipients who spent all award funds prior to the quarter 
ending June 30, 2010, are not included in the amount expended column. 

[End of table] 

[End of section] 

Appendix V: Examples of Use of Recovery Act JAG Funds for Equipment 
Purchases: 

The following figure illustrates the types of equipment purchases 
recipients within our 14 sample states have made using Recovery Act 
Justice Assistance Grant (JAG) funds. 

Figure 6: Illustrative Examples of Equipment Purchased with Recovery 
Act JAG Funding across Localities within our 14 Sample States: 

[Refer to PDF for image: photograph and associated information] 

Locality: Utica, New York. 
Equipment purchase: A mobile computer system for police vehicles. 
Reported impact of equipment purchase: Police officers reported that 
the new mobile computer systems helped replace larger, more cumbersome 
computers that hit the dashboard. In addition to providing 
flexibility, the new computers have touch screen Global Positioning 
System capability that improves their ability to locate cars in the 
county to fight crime more effectively. 

Locality: Inkster, Michigan; 
Equipment purchase: A canine (Belgian Malinois breed). 
Reported impact of equipment purchase: Officials reported that, in 
addition to providing a 
critical function in tracking narcotics, the canine will help with 
general article recovery and locating missing children. 

Locality: El Paso, Texas; 
Equipment purchase: 1,145 Colt M-4 carbine semiautomatic urban rifles. 
Reported impact of equipment purchase: Police officers reported that 
the M-4 urban rifle has capabilities including improved ease of use 
and increased firepower, which will allow El Paso Police Department 
officers to support border security initiatives and be adequately 
equipped to ensure the protection of citizens of El Paso. 

Locality: Ottawa County, Michigan; 
Equipment purchase: A 28-foot Tiara Pursuit patrol boat. 
Reported impact of equipment purchase: According to officials, Ottawa 
County has one of the largest boating populations in the state of 
Michigan, and the Sheriff's Office will use the patrol boat to assist 
neighboring jurisdictions in responding to boating incident calls. In 
addition, the office will use the boat to assist the United States 
Coast Guard with homeland security-related patrols and investigations, 
as well as for underwater recovery operations and rescue calls. 

Sources: Utica Police Department; Inkster Police Department; El Paso 
Police Department; Ottawa County Police Department (from top to 
bottom). 

[End of figure] 

[End of section] 

Appendix VI: Full Text for Figure 4 Map of SAAs and Planned Uses of 
Recovery Act JAG Awards by the Seven Allowable Program Categories 
across 14 Sample States: 

This appendix provides the full printed text of the interactive content 
in figure 4 on page 22 in the body of the report. Specifically, the 
following figures describe planned uses of Recovery Act Justice 
Assistance Grant (JAG) funds by each State Administering Agency (SAA) 
across our 14 sample states, which are listed in alphabetical order by 
state name. 

Arizona[A]: 

According to state officials, without Recovery Act funds, the state 
faced budget cuts and would have had to severely cut or discontinue at 
least half of the projects previously funded with JAG money. In 
particular, about $20.8 million in Recovery Act JAG funds supported 
drug task forces and these drug task forces helped account for 
seizures of 847,665 grams of cocaine; 49,586 grams of heroin; 206,713 
grams of methamphetamine; and 305,082 pounds of marijuana in 2008. 

[Figure: Refer to PDF for image: pie-chart] 

Prosecution and courts: $11,074,062 (48.2%); 
Law enforcement: $8,887,842 (38.7%); 
Crime victim and witness programs: $1,265,348 (5.5%); 
Crime prevention and education: 0%; 
Program planning, evaluation and technology improvement: 0%; 
Drug treatment and enforcement: 0%; 
Corrections: 0%. 

Source: Arizona Criminal Justice Commission. 

[A] The Arizona figure does not include the approximately $1.8 
million--or about 7.7 percent of Arizona state funds--awarded for 
forensic laboratory services. 

[End of figure] 

California: 

According to state and local officials, Recovery Act JAG supported 
local gang and drug reduction efforts, helped prevent human 
trafficking, facilitated a regional approach to reducing 
methamphetamine production and distribution, and helped develop 
communications infrastructure. 

[Figure: Refer to PDF for image: pie-chart] 

Prosecution and courts: $11,981,362 (9%); 
Law enforcement: $30,047,654 (22.5%); 
Crime victim and witness programs: $1,858,242 (1.4%); 
Crime prevention and education: $835,678 (0.6%); 
Program planning, evaluation and technology improvement: $131,213 
(0.1%); 
Drug treatment and enforcement: $44,254,215 (33.1%); 
Corrections: $44,510,237 (33.3%). 

Sources: California Emergency Management Agency; Los Angeles Police 
Department. 

[End of figure] 

Colorado: 

State officials noted that Recovery Act JAG helped maintain services 
in corrections, such as support for problem youth and adult offenders 
and prison treatment programs, that faced cuts given the state’s 
revenue shortfalls and budget reductions. In addition, local officials 
stated that Recovery Act JAG helped support jobs and purchase 
equipment that otherwise would have been eliminated or gone unfunded. 

[Figure: Refer to PDF for image: pie-chart] 

Prosecution and courts: $1,972,990 (12%); 
Law enforcement: $1,916,433 (11.6%); 
Crime victim and witness programs: $381,322 (2.3%); 
Crime prevention and education: $1,557,764 (9.4%); 
Program planning, evaluation and technology improvement: $2,173,632 
(13.2%); 
Drug treatment and enforcement: $2,252,813 (13.7%); 
Corrections: $6,235,927 (37.8%). 

Sources: Colorado Division of Criminal Justice; Colorado Department of 
Public Safety. 

[End of figure] 

Georgia: 

According to state and local officials, Recovery Act JAG funds helped 
support jobs, including retaining public safety personnel, and 
continue delivery of services, such as drug court services, drug 
prevention, and victims’ assistance. In addition, Savannah Police 
Department officials noted that Recovery Act JAG funds were used to 
purchase a fully “patrol-certified” Belgian Malinois breed canine to 
assist with recovery of stolen items, searching for suspects and 
missing persons, and tracking narcotics. 

[Figure: Refer to PDF for image: pie-chart] 

Prosecution and courts: $8,570,732 (27.5%); 
Law enforcement: $13,618,792 (43.7%); 
Crime victim and witness programs: $2,138,127 (6.9%); 
Crime prevention and education: $185,797 (0.6%); 
Program planning, evaluation and technology improvement: $1,468,394 
(4.7%); 
Drug treatment and enforcement: $233,962 (0.8%); 
Corrections: $4,957,258 (15.9%). 

Sources: Georgia Criminal Justice Coordinating Council; Savannah 
Police Department. 

[End of figure] 

Illinois: 

According to state and local officials, Recovery Act JAG funds helped 
purchase law enforcement equipment, such as in-car video systems, that 
would have gone unfunded. Support for other programs and services 
include, for example, support for overtime wages of law enforcement 
agents, mentoring programs and drug treatment programs, domestic 
violence programs, and specialty courts for nonviolent, repeat 
offenders. 

[Figure: Refer to PDF for image: pie-chart] 

Prosecution and courts: $8,142,570 (18.1%); 
Law enforcement: $11,237,969 (25%); 
Crime victim and witness programs: 0%; 
Crime prevention and education: $5,671,274 (12.6%); 
Program planning, evaluation and technology improvement: $4,122,386 
(9.2%); 
Drug treatment and enforcement: $452,965 (1%); 
Corrections: $15,333,173 (34.1%). 

Sources: Illinois Criminal Justice Information Authority, Cook County 
Sheriff’s Office; City of Rockford Police Department. 

[End of figure] 

Iowa: 

Officials in Boone City, Iowa, have used a portion of their Recovery 
Act JAG award to institute cross-training of some employees in the 
city's police and fire departments. Under the city's public safety umbrella 
philosophy, some employees in the city’s police and fire departments 
receive training in firefighting, emergency response, and law 
enforcement. Those who receive this “cross-training” are known as 
public safety employees and can respond to any type of incident where 
a police officer or firefighter is needed. Officials said that this 
type of cross-training has allowed the city to be able to do more with 
limited resources. 

[Figure: Refer to PDF for image: pie-chart] 

Prosecution and courts: 0%; 
Law enforcement: 0%; 
Crime victim and witness programs: 0%; 
Crime prevention and education: $464,214 (4.7%); 
Program planning, evaluation and technology improvement: $36,296 
(0.4%); 
Drug treatment and enforcement: $7,540,845 (76.7%); 
Corrections: $1,792,449 (18.2%). 

Source: Boone City Police Department. 

[End of figure] 

Massachusetts: 

According to local officials, Recovery Act JAG funds helped supplement 
current state public safety programs, retain jobs, and support core 
services, including supporting local police departments through 
funding officer and crime analyst salaries in localities adversely 
affected by local budget conditions. 

[Figure: Refer to PDF for image: pie-chart] 

Prosecution and courts: 0%; 
Law enforcement: $6,200,000 (27.6%); 
Crime victim and witness programs: 0%; 
Crime prevention and education: $3,100,000 (13.8%); 
Program planning, evaluation and technology improvement: $599,672 
(2.7%); 
Drug treatment and enforcement: 0%; 
Corrections: $12,588,916 (56%). 

Source: Worcester Police Department. 

[End of figure] 

Michigan: 

The Ottawa County Sheriff's Department used its Recovery Act JAG funds 
to purchase equipment for law enforcement purposes. The department 
purchased a 20-foot patrol boat, a fingerprint and jail mug-shot 
system, and global positioning satellite (GPS) tracker devices. The 
patrol boat replaces a nearly 20-year-old boat in need of major 
maintenance. The fingerprint and jail mug-shot system improves 
efficiency by enabling the department to identify potential suspects 
with the state’s criminal databases. The GPS tracker devices have 
helped the department in retrieving numerous stolen items and have 
provided evidence useful in the prosecution of defendants. 

[Figure: Refer to PDF for image: pie-chart] 

Prosecution and courts: $14,270,111 (33.4%); 
Law enforcement: $24,249,802 (56.8%); 
Crime victim and witness programs: 0%; 
Crime prevention and education: $1,067,558 (2.5%); 
Program planning, evaluation and technology improvement: $1,511,762 
(3.5%); 
Drug treatment and enforcement: 0%; 
Corrections: $1,611,359 (3.8%). 

Source: Ottawa County Sheriff’s Department. 

[End of figure] 

Mississippi: 

According to state and local officials, Recovery Act JAG funds helped 
support jobs to manage the state JAG program, and supported local 
police departments by filling positions, retaining other positions, 
and funding overtime to provide increased patrols and surveillance. 
JAG funds will support a variety of programs including 
multijurisdictional task forces, victim witness assistance, juvenile 
justice, drug courts, family violence, and increased law enforcement 
training. Recovery Act JAG funds were also used to purchase law 
enforcement equipment including crime lab equipment, computers, police 
cruisers, and integrated software for patrol car laptops. 

[Figure: Refer to PDF for image: pie-chart] 

Prosecution and courts: $825,000 (8.2%); 
Law enforcement: $3,809,668 (37.8%); 
Crime victim and witness programs: 0%; 
Crime prevention and education: $200,000 (2%); 
Program planning, evaluation and technology improvement: $2,619,462 
(26%); 
Drug treatment and enforcement: $2,625,320 (26%); 
Corrections: 0%. 

Source: Mississippi Division of Public Safety Planning. 

[End of figure] 

New York: 

According to state and local officials, Recovery Act JAG funds 
supported the implementation of recent drug law reform, including 
helping assistant district attorneys reduce the number of prison 
commitments, and helped continue recidivism pilot programs. New York City 
officials estimate that JAG funds enabled New York City to retain 158 
jobs that would otherwise have been eliminated due to budget cuts, and 
helped create 51 new jobs. 

[Figure: Refer to PDF for image: pie-chart] 

Prosecution and courts: $9,586,534 (15.1%); 
Law enforcement: $1,061,718 (1.7%); 
Crime victim and witness programs: 0%; 
Crime prevention and education: 0%; 
Program planning, evaluation and technology improvement: $2,100,000 
(3.3%); 
Drug treatment and enforcement: $16,740,000 (26.3%); 
Corrections: $34,167,234 (53.7%). 

Sources: New York State Division of Criminal Justice Services; New 
York City Office of the Criminal Justice Coordinator. 

[End of figure] 

North Carolina: 

The Rutherford County Sheriff’s Department used its share of Recovery 
Act JAG funds to purchase a tactical vehicle for its officers to use when 
responding to volatile situations. The vehicle replaces an old 1986 
Ford van that subjected officers to unnecessary risk and can 
accommodate a team of up to 16 officers as well as store equipment, 
such as weapons and bullet-resistant vests. The department also 
purchased portable surveillance equipment that can be thrown or rolled 
into a room and can provide a 360-degree view to enable officers to 
identify any potential threats before entering a risky environment. 

[Figure: Refer to PDF for image: pie-chart] 

Prosecution and courts: $577,951 (1.9%); 
Law enforcement: $2,222,408 (7.2%); 
Crime victim and witness programs: 0%; 
Crime prevention and education: $4,035,331 (13%); 
Program planning, evaluation and technology improvement: $22,242,265 
(71.9%); 
Drug treatment and enforcement: 0%; 
Corrections: $1,872,374 (6%). 

Source: Rutherford County Sheriff’s Department. 

[End of figure] 

Ohio: 

According to state and local officials, without Recovery Act JAG 
funds, law enforcement agencies would have faced massive layoffs. 
Additional funds were also used to support the purchase of law 
enforcement equipment such as a license plate reader. 

[Figure: Refer to PDF for image: pie-chart] 

Prosecution and courts: $2,805,401 (8%); 
Law enforcement: $12,315,091 (34.9%); 
Crime victim and witness programs: $2,676,585 (7.6%); 
Crime prevention and education: $4,593,430 (13%); 
Program planning, evaluation and technology improvement: $3,590,904 
(10.2%); 
Drug treatment and enforcement: $934,406 (2.7%); 
Corrections: $8,323,142 (23.6%). 

Sources: Office of Criminal Justice Services; Franklin County. 

[End of figure] 

Pennsylvania: 

State and local officials noted that Recovery Act JAG funds supported 
regional antidrug task forces, juvenile programs, and initiatives such 
as records management improvement, prisoner re-entry programs, and at-
risk youth employment programs. 

[Figure: Refer to PDF for image: pie-chart] 

Prosecution and courts: $3,626,239 (15.7%); 
Law enforcement: $836,894 (3.6%); 
Crime victim and witness programs: $3,930,520 (17.1%); 
Crime prevention and education: $5,522,163 (24%); 
Program planning, evaluation and technology improvement: $4,838,141 
(21%); 
Drug treatment and enforcement: 0%; 
Corrections: $4,289,409 (18.6%). 

Sources: Pennsylvania Commission on Crime and Delinquency; 
Philadelphia Police Department; Dauphin County; City of Harrisburg. 

[End of figure] 

Texas: 

According to state and local officials, Recovery Act JAG funds largely 
helped support equipment purchases and technology improvements, as 
well as support law enforcement personnel, especially police officer 
overtime. 

[Figure: Refer to PDF for image: pie-chart] 

Prosecution and courts: $1,957,678 (2.5%); 
Law enforcement: $52,048,291 (65.8%); 
Crime victim and witness programs: $840,286 (1.1%); 
Crime prevention and education: $90,076 (0.1%); 
Program planning, evaluation and technology improvement: $22,060,930 
(27.9%); 
Drug treatment and enforcement: $98,155 (0.1%); 
Corrections: $1,992,491 (2.4%). 

Sources: Texas Criminal Justice Division; El Paso Police Department; 
City of Dallas. 

[End of figure] 

[End of section] 

Appendix VII: Comments from the Department of Justice: 

U.S. Department of Justice: 
Office of Justice Programs: 
Washington, D.C. 20531: 

October 8, 2010: 

Mr. David C. Maurer: 
Director: 
Homeland Security and Justice Issues: 
Government Accountability Office: 
441 G Street, NW: 
Washington, DC 20548: 

Dear Mr. Maurer: 

Thank you for the opportunity to comment on the draft Government 
Accountability Office (GAO) report entitled "Recovery Act: Department 
of Justice Could Better Assess Justice Assistance Grant Program 
Impact" (GAO-11-87). The draft GAO report contains one Recommendation 
for Executive Action to the U.S. Department of Justice, which is 
restated in bold text below and is followed by our response. 

Recognizing that DOJ is already engaged in efforts to refine its JAG 
Performance Measures in the PMT, we recommend that the Acting Director 
of the Bureau of Justice Assistance take the following two actions to 
better monitor Recovery Act JAG program performance and demonstrate 
results through use of this instrument: 

* in revising the department's JAG performance measures consider, as 
appropriate, key attributes of successful performance measurement 
systems, such as clarity, reliability, linkage, objectivity, and 
measurable targets; and 

* develop a mechanism to validate the integrity of JAG recipients' 
self-reported performance data. 

The Office of Justice Programs (OJP) agrees with the Recommendation 
for Executive Action. The GAO draft report acknowledges the progress 
OJP's Bureau of Justice Assistance (BJA) has made in implementing the 
Edward Byrne Memorial Justice Assistance Grant (JAG) Program, under 
the American Recovery and Reinvestment Act of 2009 (Recovery Act), and 
BJA's efforts in establishing meaningful performance measures to 
assist the Administration, the Congress, and the taxpayers in 
evaluating the effectiveness of the Recovery Act JAG Program. 

BJA initially established the JAG Program performance measures 
contained within this report in the fall of 2008. BJA worked directly 
with many State Administering Agencies (SAAs), OJP grantees, and 
stakeholders in the criminal justice field to develop these measures, 
which had a comprehensive review and comment period. When the Recovery 
Act was signed into law, BJA decided that the progress made on the JAG 
measures would be applied to the Recovery Act JAG Program, as well. 
BJA created broad performance measures for the seven purpose areas of 
the Recovery Act JAG Program with the recognition that most criminal 
justice functions could be supported with Recovery Act JAG funds. The 
mere establishment of performance measures that began to assess the 
value of this multi-purpose program was an initial success. 

BJA has always maintained that these performance measures are its 
first attempt at capturing the breadth of activities under the 
Recovery Act JAG Program, and in examining the performance of the JAG 
Program at the national level. BJA acknowledges the outstanding work 
of its partners in the JAG Program, the SAAs, in creating their own 
performance measures, which may give the taxpayers a better sense of 
the impact the JAG funds are having on crime and public safety within 
their community. BJA has recently initiated an effort to reconvene all 
the stakeholders to refine these performance measures and create new 
measures that capture the essence of the JAG Program on a national 
level. 

As BJA works toward revising the Recovery Act JAG performance measures 
to better monitor program performance, it will consider key attributes 
of successful performance measurement systems, such as clarity, 
reliability, linkage, objectivity, and measurable targets. In 
addition, BJA will develop and implement a mechanism to validate the 
integrity of Recovery Act JAG recipients' self-reported performance 
data. BJA anticipates completing both of these actions by October 1, 
2011. 

If you have any questions regarding this response, you or your staff 
may contact Maureen Henneberg, Director, Office of Audit, Assessment, 
and Management, at (202) 616-3282. 

Sincerely, 

Signed by: 

Laurie O. Robinson: 
Assistant Attorney General: 

cc: 

Beth McGarry: 
Deputy Assistant Attorney General for Operations and Management: 

James H. Burch, II: 
Acting Director: 
Bureau of Justice Assistance: 

Leigh Benda: 
Chief Financial Officer: 

Maureen Henneberg: 
Director: 
Office of Audit, Assessment, and Management: 

Richard P. Theis: 
Audit Liaison: 
Department of Justice: 

[End of section] 

Appendix VIII: GAO Contact and Staff Acknowledgments: 

GAO Contact: 

David Maurer, (202) 512-9627 or Maurerd@gao.gov: 

Staff Acknowledgments: 

In addition to the contact named above, Joy Gambino, Assistant 
Director, managed this assignment. Dorian Dunbar, George Erhart, 
Richard Winsor, and Yee Wong made significant contributions to the 
work. Geoffrey Hamilton provided significant legal support and 
analysis. Elizabeth Curda and Cindy Gilbert provided significant 
assistance with design and methodology. Adam Vogt and Linda Miller 
provided assistance in report preparation, and Tina Cheng made 
contributions to the graphics presented in the report. 

[End of section] 

Footnotes: 

[1] Pub. L. No. 111-5, 123 Stat. 115 (2009). 

[2] JAG awards are provided to all states, the District of Columbia, 
Guam, American Samoa, the Commonwealth of Puerto Rico, the Virgin 
Islands, and the Northern Mariana Islands. 

[3] Section 1512 of the Recovery Act requires recipients of recovery 
funds to report on those funds each calendar quarter. The term 
"recipient" means any entity, such as a state, other than an 
individual, that receives recovery funds directly from the federal 
government (including through grants, contracts, or loans). Quarterly 
reports are to include a list of each project or activity for which 
Recovery Act funds were expended or obligated and information 
concerning the amount and use of funds and an estimate of the number 
of jobs created and the number of jobs retained by these projects and 
activities. These recipient reports are to be filed for any quarter in 
which a recipient receives Recovery Act funds directly from the 
federal government. The recipient reporting requirement covers all 
funds made available by appropriations in division A of the Recovery 
Act. See Recovery Act: States' and Localities' Uses of Funds and 
Actions Needed to Address Implementation Challenges and Bolster 
Accountability, [hyperlink, http://www.gao.gov/products/GAO-10-604] 
(Washington, D.C.: May 26, 2010). 

[4] In response to a requirement in section 901 of the Recovery Act 
mandating certain GAO reviews and reports, we have conducted bimonthly 
reviews of programs for which states and localities have received 
major funding. Two of these prior reviews address Recovery Act JAG: 
[hyperlink, http://www.gao.gov/products/GAO-10-604] as well as 
Recovery Act: States' and Localities' Current and Planned Uses of 
Funds While Facing Fiscal Stresses, [hyperlink, 
http://www.gao.gov/products/GAO-09-829] (Washington, D.C.: July 8, 
2009). 

[5] The seven states visited were Arizona, California, Illinois, 
Massachusetts, New York, Ohio, and Pennsylvania. 

[6] The 14 states we selected are a subset of a 16-state (plus the 
District of Columbia) sample that we used for our broader Recovery Act 
work as discussed in GAO-10-604 and GAO-09-829. The 16-state sample 
contains about 65 percent of the U.S. population and is estimated to 
receive collectively about two-thirds of the intergovernmental 
assistance available through the Recovery Act. The 16 states included 
Arizona, California, Colorado, Florida, Georgia, Illinois, Iowa, 
Massachusetts, Michigan, Mississippi, New Jersey, New York, North 
Carolina, Ohio, Pennsylvania, and Texas. We selected these states and 
the District of Columbia on the basis of federal outlay projections, 
percentage of the U.S. population represented, unemployment rates and 
changes, and a mix of states' poverty levels, geographic coverage, and 
representation of both urban and rural areas. 

[7] The Recovery Act requires recipients of funding under the act to 
report quarterly on the use of these funds, including an estimate of 
the number of jobs created and the number of jobs retained with 
Recovery Act funding. The first recipient reports filed in October 
2009 cover activity from February 2009 through September 30, 2009. The 
second quarterly recipient reports were filed in January 2010 and 
cover activity through December 31, 2009. The third quarterly 
recipient reports were filed in April 2010 and cover activity through 
March 31, 2010. The fourth quarterly recipient reports were filed in 
July 2010 and cover activity through June 30, 2010. 

[8] For information about Recovery Act data reliability, see prior 
reviews that address this: [hyperlink, 
http://www.gao.gov/products/GAO-10-604] and [hyperlink, 
http://www.gao.gov/products/GAO-09-829]. 

[9] Financial Guide, U.S. Department of Justice, Office of Justice 
Programs, Office of the Chief Financial Officer (October 2009). 

[10] DOJ characterizes these activity types as: "Personnel", 
"Equipment and Supplies", "Information Systems for Criminal Justice", 
and a category of "Outcomes for all Categories". 

[11] GAO, Tax Administration: IRS Needs to Further Refine Its Tax 
Filing Season Performance Measures, [hyperlink, 
http://www.gao.gov/products/GAO-03-143] (Washington, D.C.: Nov. 22, 
2002). 

[12] 42 U.S.C. § 3751(a)(1). 

[13] See [hyperlink, http://www.gao.gov/products/GAO-10-604]. 

[14] While BJA is responsible for overseeing the activities and 
reporting of the direct grant recipients, the SAA in each state is 
responsible for overseeing the activities and reporting of localities 
receiving pass-through awards. 

[15] According to DOJ officials, the PMT was officially launched in 
2007 with 2 pilot programs and Recovery Act JAG was added to the PMT 
in June 2009. Recipients were not required to officially report on the 
Recovery Act JAG program until 2010. 

[16] New programs include: John R. Justice; Project Safe 
Neighborhoods; Earmarks; Economic Cybercrime; Tribal Courts; and 
Indian Alcohol and Substance Abuse Prevention. 

[17] January to March 2010 represents the second quarter of fiscal 
year 2010. 

[18] Uniform Crime Report Part I violent crimes include murder, 
robbery, aggravated assault, and forcible rape (See FBI publication 
Crime in the United States). To be eligible for such funding, 
localities must have submitted such Uniform Crime Report data in at 
least 3 of the preceding 10 years. 

[19] SAAs are designated agencies in each state that establish funding 
priorities and coordinate JAG funds among state and local justice 
initiatives. 

[20] Some localities receive funds from both their SAA (via the 
competitive, pass-through process) and DOJ (via direct formula). 

[21] According to the Bureau of Justice Statistics (BJS), when a unit 
of local government (such as a county) bears more than 50 percent of 
the costs of prosecution or incarceration in association with violent 
crimes reported for another unit of local government (along with other 
factors), the local governments must submit a joint application for 
funds allocated to the units of local government and agree on the 
amount of funds allocated to each jurisdiction. 

[22] DOJ Office of the Inspector General Audit Division, Office of 
Justice Programs' Recovery Act and Non-Recovery Act Programs for 
Edward Byrne Memorial Justice Assistance Grants and Byrne Competitive 
Grants, Audit Report 10-43 (Washington, D.C.: August 2010). 

[23] The California State Auditor recently raised concerns about the 
pace of Recovery Act JAG expenditures; however, in response to the 
auditor's work, California officials stated they anticipate expending 
all Recovery Act JAG awards in 2 years, well within the 4-year 
spending period allowed. California State Auditor, Bureau of State 
Audits, California Emergency Management Agency: Despite Receiving $136 
Million in Recovery Act Funds in June 2009, It Only Recently Began 
Awarding These Funds and Lacks Plans to Monitor Their Use, Letter 
Report 2009-119.4 (Sacramento, Calif.: May 4, 2010). 

[24] TASER is a trademark and an acronym for Thomas A. Swift's 
Electric Rifle, which is a product line of hand-held devices that 
deliver an electric shock designed to incapacitate an individual. 
Ammunition includes TASER cartridges. 

[25] The Recovery Act's Community Oriented Policing Services (COPS) 
Hiring Recovery Program (CHRP) is a competitive grant program 
administered by DOJ that provided $1 billion in fiscal year 2009 
funding to law enforcement agencies to create and preserve jobs and to 
increase community policing capacity and crime-prevention efforts. 
CHRP grants to local law enforcement agencies provide 100 percent 
funding for approved entry-level salaries and benefits for 3 years for 
newly hired, full-time sworn police officers. See GAO-10-604. 

[26] BJA awarded over $400 million in direct grants to 1,338 
localities within our 14 sample states. However, while BJA requires 
grantees to identify the use of funds across seven broad program 
areas, BJA has not yet reported national data on how grantees use 
Recovery Act JAG funds within the seven broad program areas. 
Therefore, in order to determine how directly awarded Recovery Act JAG 
funds were used, we reviewed direct recipients' quarterly data 
submissions to Recovery.gov and assigned the awards to one of the 
seven allowable program categories based on our analysis. 

[27] When a local award recipient indicated that it was funding 
projects in more than one of the seven general purpose areas, the 
award was categorized as having multiple purposes. Awards that were 
used across multiple purposes or could not be clearly categorized were 
assigned to these additional categories. 

[28] This detailed list must include (a) the name of the project or 
activity; (b) a description of the project or activity; (c) an 
evaluation of the completion status of the project or activity; (d) an 
estimate of the number of jobs created and the number of jobs retained 
by the project or activity; and (e) for infrastructure investments 
made by the state and local governments, the purpose, the total costs, 
and rationale of the agency for funding the Recovery Act. 

[29] Once recipients complete their respective projects, they are no 
longer required to submit data into Recovery.gov. For the period 
ending March 31, 2010, there were 814 Recovery Act JAG recipients, 
including SAAs, submitting reports into Recovery.gov. As of June 30, 
2010, there were 797 Recovery Act JAG recipients submitting reports. 
Based on Recovery.gov reports, 17 recipients stopped submitting 
reports into Recovery.gov over this time period, likely because they 
completed their Recovery Act JAG-funded projects and closed out their 
grants. 

[30] DOJ, Office of Justice Programs Financial Guide (Washington, 
D.C.: October 2009). 

[31] DOJ Office of the Inspector General Audit Division, Audit Report 
10-43. 

[32] We did not assess the quality of the information being shared. 

[33] Recovery Act JAG recipients are required to use the PMT to report 
on performance measures for activities funded by the Recovery Act. 
While recipients are not required to report on all 86 performance 
measures, DOJ requires them to select those associated with the 
activities their awards have funded and self-report on the measures 
they deem most applicable. 

[34] We selected 19 performance measures that were associated with the 
largest share of Recovery Act JAG expenditures, such as personnel, 
equipment and supplies, and information system improvements. For more 
information, see appendix I (Scope and Methodology). 

[35] [hyperlink, http://www.gao.gov/products/GAO-05-356] and 
[hyperlink, http://www.gao.gov/products/GAO-07-660]. 

[36] GAO, Managing for Results: Enhancing Agency Use of Performance 
Information for Management Decision Making, [hyperlink, 
http://www.gao.gov/products/GAO-05-927] (Washington, D.C.: Sept. 9, 
2005). 

[37] See GAO-03-143 for more information on these attributes. 

[38] See BJA, Guide Related to Program Evaluation and Performance 
Measurement (2010) available at: [hyperlink, 
http://www.ojp.usdoj.gov/BJA/evaluation/guide/ap1.htm].  

[39] GAO, Drug Control: DOD Needs to Improve Its Performance 
Measurement System to Better Manage and Oversee Its Counternarcotics 
Activities, GAO-10-835 (Washington, D.C.: July 2010). 

[40] [hyperlink, http://www.gao.gov/products/GAO-10-886] and 
[hyperlink, http://www.gao.gov/products/GAO-03-143]. 

[41] [hyperlink, http://www.gao.gov/products/GAO-10-835] and 
[hyperlink, http://www.gao.gov/products/GAO-10-886]. 

[42] Stated purposes of the Recovery Act are to preserve and create 
jobs and promote economic recovery; to assist those most impacted by 
the recession; to provide investments needed to increase economic 
efficiency by spurring technological advances in science and health; 
to invest in transportation, environmental protection, and other 
infrastructure that will provide long-term economic benefits; and to 
stabilize state and local government budgets, in order to minimize and 
avoid reductions in essential services and counterproductive state and 
local tax increases. 

[43] DOJ has three agencywide strategic goals: (1) Prevent Terrorism 
and Promote the Nation's Security; (2) Prevent Crime, Enforce Federal 
Laws, and Represent the Rights and Interests of the American People; 
and (3) Ensure the Fair and Efficient Administration of Justice. 

[44] GAO, The Results Act: An Evaluator's Guide to Assessing Agency 
Annual Performance Plans, [hyperlink, 
http://www.gao.gov/products/GAO/GGD-10.1.20] (Washington, D.C.: April 
1998). 

[45] Programs using the PMT include: Targeting Violent Crime 
Initiative; Drug Courts; Comprehensive Anti-Gang Initiative; ARRA 
Rural Law Enforcement; Residential Substance Abuse Treatment; Justice 
Assistance Grant (FY09, FY10, and ARRA); ARRA Southern Border; ARRA 
Byrne Competitive; Justice and Mental Health; Training and Technical 
Assistance Grants; Tribal Construction; Second Chance; and 
Prescription Drug Monitoring Program. 

[46] See [hyperlink, http://www.gao.gov/products/GAO-10-604] and 
[hyperlink, http://www.gao.gov/products/GAO-09-829]. 

[47] See [hyperlink, http://www.gao.gov/products/GAO-03-143]. 

[48] See [hyperlink, http://www.gao.gov/products/GAO-05-457], 
[hyperlink, http://www.gao.gov/products/GAO-10-835] and [hyperlink, 
http://www.gao.gov/products/GAO-10-886]. 

[End of section] 

GAO's Mission: 

The Government Accountability Office, the audit, evaluation and 
investigative arm of Congress, exists to support Congress in meeting 
its constitutional responsibilities and to help improve the performance 
and accountability of the federal government for the American people. 
GAO examines the use of public funds; evaluates federal programs and 
policies; and provides analyses, recommendations, and other assistance 
to help Congress make informed oversight, policy, and funding 
decisions. GAO's commitment to good government is reflected in its core 
values of accountability, integrity, and reliability. 

Obtaining Copies of GAO Reports and Testimony: 

The fastest and easiest way to obtain copies of GAO documents at no 
cost is through GAO's Web site [hyperlink, http://www.gao.gov]. Each 
weekday, GAO posts newly released reports, testimony, and 
correspondence on its Web site. To have GAO e-mail you a list of newly 
posted products every afternoon, go to [hyperlink, http://www.gao.gov] 
and select "E-mail Updates." 

Order by Phone: 

The price of each GAO publication reflects GAO’s actual cost of
production and distribution and depends on the number of pages in the
publication and whether the publication is printed in color or black and
white. Pricing and ordering information is posted on GAO’s Web site, 
[hyperlink, http://www.gao.gov/ordering.htm]. 

Place orders by calling (202) 512-6000, toll free (866) 801-7077, or
TDD (202) 512-2537. 

Orders may be paid for using American Express, Discover Card,
MasterCard, Visa, check, or money order. Call for additional 
information. 

To Report Fraud, Waste, and Abuse in Federal Programs: 

Contact: 

Web site: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm]: 
E-mail: fraudnet@gao.gov: 
Automated answering system: (800) 424-5454 or (202) 512-7470: 

Congressional Relations: 

Ralph Dawn, Managing Director, dawnr@gao.gov: 
(202) 512-4400: 
U.S. Government Accountability Office: 
441 G Street NW, Room 7125: 
Washington, D.C. 20548: 

Public Affairs: 

Chuck Young, Managing Director, youngc1@gao.gov: 
(202) 512-4800: 
U.S. Government Accountability Office: 
441 G Street NW, Room 7149: 
Washington, D.C. 20548: