This is the accessible text file for GAO report number GAO-09-850R 
entitled 'Juvenile Justice: A Time Frame for Enhancing Grant Monitoring 
Documentation and Verification of Data Quality Would Help Improve 
Accountability and Resource Allocation Decisions' which was released on 
September 22, 2009. 

This text file was formatted by the U.S. Government Accountability 
Office (GAO) to be accessible to users with visual impairments, as part 
of a longer term project to improve GAO products' accessibility. Every 
attempt has been made to maintain the structural and data integrity of 
the original printed product. Accessibility features, such as text 
descriptions of tables, consecutively numbered footnotes placed at the 
end of the file, and the text of agency comment letters, are provided 
but may not exactly duplicate the presentation or format of the printed 
version. The portable document format (PDF) file is an exact electronic 
replica of the printed version. We welcome your feedback. Please E-mail 
your comments regarding the contents or accessibility features of this 
document to Webmaster@gao.gov. 

This is a work of the U.S. government and is not subject to copyright 
protection in the United States. It may be reproduced and distributed 
in its entirety without further permission from GAO. Because this work 
may contain copyrighted images or other material, permission from the 
copyright holder may be necessary if you wish to reproduce this 
material separately. 

GAO-09-850R: 

United States Government Accountability Office: 
Washington, DC 20548: 

September 22, 2009: 

The Honorable Robert C. Scott:
Chairman:
Subcommittee on Crime, Terrorism, and Homeland Security: 
Committee on the Judiciary:
House of Representatives: 

Subject: Juvenile Justice: A Time Frame for Enhancing Grant Monitoring 
Documentation and Verification of Data Quality Would Help Improve 
Accountability and Resource Allocation Decisions: 

Dear Mr. Chairman: 

From fiscal years 2006 through 2008, the Office of Juvenile Justice and 
Delinquency Prevention (OJJDP) within the Department of Justice (DOJ) 
awarded $1.2 billion in funds through approximately 2,000 grants in 
support of its mission to help states and communities prevent and 
reduce juvenile delinquency and victimization and improve their 
juvenile justice systems.[Footnote 1] OJJDP awards grants to states, 
territories, localities, and organizations to address a variety of 
issues, such as reducing juvenile substance abuse, combating Internet 
crimes against children, preventing youth gang involvement, and 
providing youth mentoring services. The scope and administration of 
OJJDP grants also vary, ranging from private organization recipients 
that implement programs directly in a single community to states that 
administer grants by awarding the funds they receive to subgrantees to 
implement programs locally and statewide. 

Assessing the performance of these programs through grant monitoring is 
a key management tool to hold grantees accountable for implementing 
programs as agreed to in their awards, to verify they are making 
progress toward the objectives of their programs, and to ensure that 
grant funds are used in support of OJJDP's mission. DOJ's Office of 
Justice Programs (OJP) establishes grant monitoring policies for its 
components, including OJJDP.[Footnote 2] In 2008, the DOJ Office of the 
Inspector General identified grant management, including maintaining 
proper oversight of grantees to ensure grant funds are used as 
intended, as a critical issue and among the department's top management 
challenges. In the past, we have identified concerns specific to 
OJJDP's grant monitoring activities. In October 2001, we reported that 
OJJDP was not consistently documenting its grant monitoring activities, 
such as required phone contacts between grant managers and grantees, 
and as a result could not determine the level of monitoring being 
performed by grant managers.[Footnote 3] We recommended that OJJDP take 
steps to determine why it was not consistently documenting its grant 
monitoring activities and develop and enforce clear expectations 
regarding monitoring requirements. Since then, partially in response 
to our recommendation, OJJDP has taken steps to address it. For 
example, OJJDP assessed the additional policies and procedures needed 
for grant monitoring and developed a manual that outlines steps for 
completing specific monitoring activities, such as review of grantee 
documentation. 

To help Congress ensure effective use of funds for juvenile justice 
grant programs, you asked us to assess OJJDP's efforts to monitor the 
implementation of its grant programs. This report addresses the 
following questions: 

* What processes does OJJDP have in place to monitor the performance of 
its juvenile justice grants, and to what extent does it record results 
of its monitoring efforts to ensure transparency and accountability? 

* How, if at all, does OJJDP use performance measurement data to make 
programming and funding decisions, and to what extent does it verify 
the quality of these data? 

To identify the processes OJJDP has in place to monitor the performance 
of its grants and assess the extent to which it records the results of 
its efforts, we analyzed relevant OJJDP documentation, including grant 
program monitoring policies, procedures, guidelines, and records, such 
as desk reviews and site visit reports, which grant managers are to 
complete to document monitoring results, including a grantee's level of 
accomplishment relative to stated goals. We interviewed cognizant OJJDP 
officials, including officials responsible for conducting grant 
monitoring activities from each of the three OJJDP divisions that 
manage its grant programs, as well as officials responsible for 
overseeing monitoring across the divisions. To identify federal 
criteria for grant monitoring processes and recording grant monitoring 
activities, we analyzed applicable laws, regulations, and Standards for 
Internal Control in the Federal Government, as well as policies and 
guidelines from OJP.[Footnote 4] In addition, we interviewed OJP 
officials about its grant monitoring policies and procedures. We then 
compared OJJDP's processes for monitoring the performance of its grants 
and recording monitoring activities with federal grant monitoring 
criteria to determine the extent to which OJJDP processes meet federal 
criteria.[Footnote 5] We focused our analysis on OJJDP's processes for 
monitoring the performance of its discretionary, formula, and block 
grant programs.[Footnote 6] We did not assess the extent to which grant 
managers complied with OJJDP processes and procedures because many of 
OJJDP's monitoring processes have been in place only since August 2007 
or are still being implemented. 

To determine how OJJDP uses performance measurement data to make 
programming and funding decisions and the extent to which it verifies 
these data to ensure their quality, we analyzed relevant documentation, 
including performance measurement policies, summary data reports 
generated by OJJDP, and records of grant manager observations. We then 
compared OJJDP's processes for using performance measurement data and 
for verifying these data with federal standards, as articulated in 
sources, including the Office of Management and Budget (OMB) guidelines 
on data verification,[Footnote 7] our Standards for Internal Control in 
the Federal Government, standard practices for program management, 
[Footnote 8] and OJP's Information Quality Guidelines,[Footnote 9] to 
identify the extent to which OJJDP's processes meet these standards. We 
also interviewed the OJJDP official responsible for overseeing grant 
program performance measures as well as officials responsible for 
collecting data from grantees. Due to the volume of OJJDP performance 
measures--about 600 in total--we did not assess the quality of each 
individual metric. 

We conducted this performance audit from March 2009 through September 
2009 in accordance with generally accepted government auditing 
standards. Those standards require that we plan and perform the audit 
to obtain sufficient, appropriate evidence to provide a reasonable 
basis for our findings and conclusions based on our audit objectives. 
We believe that the evidence obtained provides a reasonable basis for 
our findings and conclusions based on our audit objectives. 

Results in Brief: 

In accordance with OJP requirements, OJJDP has processes in place to 
monitor the performance of its juvenile justice grants, including desk 
reviews, site visits, and postsite visit follow-up; however, the office 
does not have a process to record all follow-up steps taken to resolve 
grantee issues identified during site visits. Grant managers are to 
conduct grantee monitoring through three phases: (1) desk review, used 
to review grantee documentation to understand a grantee's level of 
accomplishment relative to its stated goals; (2) site visit, used to 
verify that grantees are implementing programs consistent with proposed 
plans; and (3) postsite visit, used to resolve issues identified during 
the visit. We found, during our review of OJJDP monitoring 
documentation, that desk review and site visit activities are to be 
recorded in OJP's automated repository, the Grant Management System, in 
accordance with OJP requirements. In addition, OJJDP officials said 
that OJJDP requires grant managers to record postsite visit actions 
taken through OJP's Corrective Action Plan process, which OJJDP 
reserves for egregious circumstances, such as a failure to meet basic 
programming requirements.[Footnote 10] However, OJJDP does not require 
that issues resolved informally, such as by e-mail, during the postsite 
visit phase be recorded in the system. Thus, OJJDP is not fully 
addressing OJP's requirement that all follow-up actions taken to 
resolve grantee issues identified during site visits be recorded in the 
Grant Management System. According to the OJJDP official who oversees 
monitoring, prior to OJP's May 2009 Grant Management System 
enhancements, it was cumbersome for grant managers to record resolution 
of grantee issues in the system. Therefore, although this official 
stated that OJJDP intends to fully implement OJP's requirement, the 
official said that the office does not have a time frame for 
implementing the requirement and anticipates grant managers will need 
time to adjust to the enhancements. Standard practices for program 
management state that the successful execution of any plan includes 
identifying in the planning process a timeline for delivering the plan. 
While we understand that it takes time to adjust to changes, 
establishing a time frame for implementing the requirement to record 
resolution of all grantee issues, including those resolved informally, 
will provide OJJDP with a concrete goal for fully implementing this 
requirement. Moreover, it will also help hold OJJDP accountable for 
developing processes, consistent with OJP requirements, to document all 
postsite visit actions taken--whether the issues identified are routine 
or egregious. 

While OJJDP has developed performance measures for its grant programs 
and collects performance measurement data from its grantees, the office 
is making limited use of these data because it is not verifying these 
data to ensure their quality, which is inconsistent with leading 
management practices in performance measurement. As we have reported in 
the past, data verification--assessment of data completeness, accuracy, 
consistency, timeliness, and related quality control practices--helps 
to ensure that users can have confidence in the reported performance 
information.[Footnote 11] According to OJJDP officials, they have not 
taken action to verify performance data because, since 2002, they have 
focused on the collection of such data rather than on their use. 
Specifically, since 2002, OJJDP has developed performance measures for 
each of its grant programs and implemented requirements for all 
grantees to report on measures at least once a year. Although these 
officials said that OJJDP has processes in place to assess whether the 
data are appropriate for the performance measure, they stated that 
OJJDP does not have data verification processes in place and is 
dependent on grantees to report complete, accurate, consistent, and 
timely data. These officials also stated that because OJJDP does not 
know the quality of the data submitted, they do not use performance 
data to make resource allocation decisions. Due to the nature of OJJDP 
grant funding--of the $392 million it awarded in fiscal year 2008, 
OJJDP exercised its discretion to select recipients for 45 percent of 
these funds, and awarded the remaining 55 percent pursuant to a formula 
or fixed level, or based on direction from Congress--it is important 
that OJJDP officials and congressional decision makers have accurate 
performance information to be able to make funding decisions that are 
in line with performance results.[Footnote 12] According to OJJDP 
officials, OJJDP is developing an approach to verify the performance 
measurement data collected from grantees; however, these officials 
could not provide documentation of or a timetable for such an 
approach. According to 
standard practices for program management,[Footnote 13] defining an 
approach, creating a plan for executing the approach, and developing 
timelines for the approach lay the groundwork for the successful 
execution of a program. Therefore, defining an approach to verify its 
performance measurement data, determining how it will execute the 
approach, and establishing timelines for its implementation would better 
position OJJDP to help ensure that it is providing quality information 
to the public, and to internal agency officials and congressional 
decision makers who play a role in determining where to allocate OJJDP 
funding resources. In addition, while OJJDP posts online performance 
reports that aggregate performance data for all grantees within a 
program as a mechanism to apprise the public of its programs' 
contributions to the juvenile justice field, it does not state in these 
reports that the data on which the reports are based have not been 
verified. OJJDP officials acknowledged that the reports do not state 
that the data have not been verified, but did not provide a reason for 
this omission. As we reported in March 2008 and according to OJP 
policy, it is important that the processes used to prepare 
communication products be transparent, clear, and understood.[Footnote 
14] Stating in each communication product containing performance 
measurement data that the limitations of the data are currently unknown 
as the data have not been verified would help OJJDP to ensure that the 
information it disseminates and specifically the limitations thereof 
are transparent, clear, and understood. 

To help ensure the resolution of grantee issues, as well as ensure 
accountability for grantees fulfilling the conditions of their grants, 
we are recommending that the Administrator of OJJDP establish a time 
frame for implementing the requirement established by OJP for grant 
managers to record in the Grant Management System actions taken to 
resolve grantee issues consistent with OJP requirements. To help ensure 
the quality of performance measurement data submitted by grantees and 
improve these data to support agency and congressional decision making, 
we recommend that the Administrator of OJJDP finalize a data 
verification approach that includes how it will execute the approach to 
assess data completeness, accuracy, consistency, and timeliness, and 
time frames for implementing the approach consistent with standard 
practices for program management. Prior to verifying the data, to help 
ensure that the performance information in the communication products 
OJJDP disseminates is transparent, clear, and understood, we recommend 
that the Administrator of OJJDP note in each document containing 
performance measurement data that the limitations of the data are 
currently unknown as the data have not been verified. In commenting on 
a draft of this report, OJP agreed with our recommendations and 
described the actions it plans to take to address them. OJP's comments 
are reprinted in enclosure IV. 

Background: 

OJP is the main grant awarding office within DOJ and comprises OJJDP 
and four other components. In 2006, Congress passed the Violence 
Against Women and Department of Justice Reauthorization Act, which 
established the Office of Audit, Assessment and Management (OAAM) 
within OJP, among other things, to improve the monitoring of OJP grant 
programs and to take more timely action against grantees that do not 
comply with grant terms.[Footnote 15] This act also requires that OJP 
monitor the performance of at least 10 percent of the grant funds it 
awards annually.[Footnote 16] OAAM became operational in fiscal year 
2007 and is charged with conducting assessments of grant programs 
within OJP components, ensuring grantee compliance with the financial 
terms of their grants, auditing OJP internal controls, and acting as a 
central source for OJP grant management policy. As such, OAAM sets 
OJP's guidelines and minimum standards for grant monitoring. While OAAM 
is charged in an oversight capacity with assessing grant programs, 
direct responsibility for monitoring grantees and supporting them in 
carrying out their programs rests with OJP components, including OJJDP. 
These components are required to adhere to OJP's minimum standards for 
grant monitoring and are responsible for ensuring the quality of the 
monitoring performed within the component. However, they may also elect 
to establish additional requirements for the monitoring performed 
within their respective components. 

Within OJJDP, the Deputy Administrator for Programs oversees the 
office's three program divisions--Child Protection, State Relations and 
Assistance, and Demonstration Programs--which are responsible for 
administering OJJDP's grant programs and monitoring the individual 
grants within them.[Footnote 17] OJJDP administers 26 grant programs, 
and makes individual grant awards--a total of approximately 2,000 from 
fiscal years 2006 through 2008--within each of them. Recipients of 
OJJDP awards include a variety of grantees ranging from private 
organizations that implement grant programs directly, such as a youth 
mentoring program, to states that administer grants by awarding the 
funds they receive to various state or local government subgrantees. 
Further, while some grantees operate programs in a single local 
community, others oversee statewide or even national programs, such as 
the Girl Scouts or the Boys and Girls Clubs of America. There are 33 
grant managers across the three program divisions; within each of the 
divisions, grant managers are supervised by a first line supervisor and 
the head of the division. To hold grant managers accountable for the 
monitoring they are required to perform, OJJDP conducts annual 
performance reviews that assess managers on several dimensions. These 
dimensions include knowledge of the Grant Management System; use of 
OJP tools to monitor grants; documentation of desk reviews, site visit 
reports, and other contacts with grantees in the Grant Management 
System; and completion of training to improve and maintain grant 
management skills. Figure 1 summarizes the components involved in OJJDP 
programmatic grant monitoring and their respective roles. 

Figure 1: OJJDP Programmatic Grant Monitoring Organization: 

[Refer to PDF for image: organizational chart] 

Top level: 
Office of Justice Programs: (principal Department of Justice grant 
awarding office);
Reporting to Office of Justice Programs: 
* Office of Audit, Assessment and Management (sets minimum monitoring 
standards for OJP components); 

Second level, reporting to Office of Justice Programs: 
* Office of Juvenile Justice and Delinquency Prevention (responsible 
for monitoring grants in office); 
Reporting to Office of Juvenile Justice and Delinquency Prevention: 
* Grants Management Unit (supports OJJDP division monitoring efforts). 

Third level, reporting to Office of Juvenile Justice and Delinquency 
Prevention: 
* State Relations and Assistance Division (conducts monitoring of OJJDP 
grant programs); 
* Demonstration Programs Division (conducts monitoring of OJJDP grant 
programs); 
* Child Protection Division (conducts monitoring of OJJDP grant 
programs). 

Source: GAO analysis of OJP data. 

Note: OJJDP's organization includes a Grants Management Unit that is to 
support its program divisions by providing grant administration 
assistance and guidance. As of July 2009, the Grants Management Unit 
was not staffed. The director of the unit was selected for another 
position in the agency, and the three staff members in the unit were 
detailed to the three program divisions. 

[End of figure] 

OJP directs that monitoring performed by its components include 
programmatic monitoring, through which grant managers assess the 
performance of grant programs by addressing the content and substance 
of a program.[Footnote 18] During programmatic monitoring, grant 
managers review qualitative information (such as progress reports 
submitted by grantees and site visit reports documenting observations 
of grantee program implementation), and quantitative information (such 
as performance measurement data submitted by grantees), to determine 
grant performance, innovation, and contributions to the field. In grant 
applications, grantees are required to propose grant goals that support 
OJJDP's stated program purpose, the activities through which they aim 
to achieve those goals, and an implementation plan describing timelines 
and steps for the activities. For instance, the purpose of OJJDP's 
Tribal Youth Program is to support tribal efforts to prevent and 
control delinquency and improve tribal justice systems for American 
Indian/Alaska Native youth. The proposed goals of one grantee that 
received funds under this program were to lower the number of 
delinquent activities and increase the decision-making abilities of 
youth within a tribal community, through activities including 
recruiting youth from the community and providing them with mentoring 
services. Through programmatic monitoring, grant managers determine 
whether grant activities are consistent with the grant goals and 
objectives. Among other things, the OJP Grant Manager's Manual, which 
documents OJP grant monitoring policies and procedures, requires grant 
managers to review grantee documentation and converse with grantees to 
assess the extent to which implementation of a grant program is 
consistent with the grantee's planned timelines and proposed activities 
and goals. 

During programmatic monitoring, grant managers help ensure that 
grantees comply with relevant statutory and regulatory requirements. 
For example, the Juvenile Justice and Delinquency Prevention Act 
requires states receiving formula grant funding to submit a state plan 
annually that, among other things, provides for an advisory group. The 
members of this group must include, among others, at least one locally 
elected official, and representatives of law enforcement and juvenile 
justice agencies.[Footnote 19] Grant managers ensure that the state 
advisory groups meet these membership requirements and provide them 
with training as needed to support them in performing their required 
functions. Further, grant managers are to assess grantee performance in 
meeting the objectives of their grants, such as providing services to a 
specific number of program participants; if a grantee is not meeting 
its objectives, grant managers are to collaborate with the grantee to 
resolve the 
issue. Grant managers may also provide or recommend technical 
assistance to help grantees implement their programs consistent with 
their grant awards and help grantees address any challenges they may 
encounter in doing so.[Footnote 20] For instance, if a grantee 
confronts program implementation challenges, such as underenrollment 
of youth in its program, or needs assistance designing a necessary 
survey, grant managers may work with the grantee to identify solutions 
and ensure relevant training opportunities are available to the 
grantee. 

OJP's minimum standards for grant monitoring require that grant 
managers conduct programmatic monitoring activities defined by its 
three phases for conducting and following up on monitoring: (1) desk 
review, (2) site visit, and (3) postsite visit.[Footnote 21] 

* During the desk review phase, grant managers review grant 
documentation to obtain an understanding of a grantee's level of 
accomplishment relative to stated goals and prepare for the site visit 
phase. Among other things, grant managers are required to review 
progress reports submitted by grantees to determine if the reports 
contain information, such as performance measurement data, related to 
the grantee's performance in implementing the grant program.[Footnote 
22] 

* During the site visit phase, grant managers meet with the grantee, 
review documents, such as the grantee's planned implementation 
timeline, and observe program activity to assess how grant program 
objectives are being implemented and to compare planned versus actual 
progress. Grant managers use OJP's site visit checklist to guide their 
assessments. Among other things, this checklist instructs grant 
managers to identify any implementation challenges the grantee is 
experiencing and any technical assistance the grantee requires. 

* During the postsite visit phase, grant managers report their findings 
to the component and to the grantee being monitored, including any 
grantee issues they identify in need of resolution (e.g., program 
services that are untimely or of poor quality). When a grant manager 
identifies issues in need of resolution as part of a site visit, the 
grant manager is responsible for following up with the grantee to make 
sure that issues or questions are resolved. 

In addition to performing activities included in these three phases, 
OJP requires that grant managers perform ongoing monitoring. Ongoing 
monitoring 
entails frequent communication between grant managers and grantees via 
telephone and e-mail, during which grant managers may identify problems 
a grantee is experiencing, provide programmatic advice, or assist in 
resolution of any problems impeding program implementation. For 
example, in the course of ongoing monitoring, a grant manager may 
advise a grantee about how to design a survey or discuss strategies for 
enrolling youth in grantee programs. 

Since 2007, OJP has taken steps to revise its grant monitoring policies 
and procedures and to develop more automated and standard monitoring 
tools. For example, in August 2007 it released a revised and expanded 
OJP Grant Manager's Manual, which documents OJP grant monitoring 
policies and procedures. OJP has also expanded the capabilities of its 
automated Grant Management System. Among other features, the Grant 
Management System now enables grant managers to collect monitoring 
information in a uniform fashion and provides checklists for grant 
managers to use during monitoring activities.[Footnote 23] 
Additionally, OJP has developed an assessment tool intended to provide 
a common approach for grant managers to assess risk associated with 
grantees and prioritize their monitoring activities based on their 
assessments. Using this tool, grant managers apply standard criteria, 
such as the grantee's performance and the complexity of the award, to 
assess the risk associated with the grant, which they then use to 
assign the grant a monitoring priority ranging from low to high. 
[Footnote 24] 
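
To illustrate how such a risk-based prioritization might work, the 
following is a minimal sketch, in Python, that scores a grant on 
assumed criteria and maps the score to a monitoring priority. The 
criteria, weights, and thresholds shown are hypothetical assumptions 
for illustration; this report does not describe the actual scoring 
rules of OJP's tool: 

  def risk_score(prior_findings, award_amount, is_new_grantee):
      """Combine assumed risk criteria into a single score. The
      weights below are invented for this illustration."""
      score = 2 * prior_findings        # past performance issues raise risk
      if award_amount > 1_000_000:      # larger awards are more complex
          score += 2
      if is_new_grantee:                # no monitoring track record yet
          score += 1
      return score

  def monitoring_priority(score):
      """Map a risk score to a monitoring priority from low to high."""
      if score >= 4:
          return "high"
      if score >= 2:
          return "medium"
      return "low"

  # Example: one prior finding on a $2.5 million award yields "high."
  print(monitoring_priority(risk_score(1, 2_500_000, False)))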

Internal controls for ensuring the effectiveness of agency operations 
require timely and accessible documentation of information in order for 
management to carry out its operational responsibilities, such as 
determining whether the agency is meeting its goals. Accordingly, it is 
important that grant managers consistently record their monitoring 
efforts to assess the performance of grant programs.[Footnote 25] OJP 
policy directs managers to record the range of their programmatic 
monitoring activities in the Grant Management System, including desk 
reviews, site visits, and site visit follow-up. Enclosure I provides 
more detailed information about OJP's minimum standards for performing 
and recording programmatic monitoring. 

According to leading practices in performance measurement, performance 
measures should be linked directly to the offices that have 
responsibility for making programs work to reinforce accountability and 
ensure that day-to-day activities contribute to the results the 
organization is trying to achieve.[Footnote 26] As we have previously 
reported, verification and validation--assessing whether data are 
appropriate for the performance measure--are fundamental to agency 
management. Specifically, verification and validation help to ensure 
that data are of sufficient quality and will be available when needed 
to document performance and support decision making.[Footnote 27] 
Additionally, according to OMB, an agency can increase the level of 
confidence Congress and the public have in its performance information 
by applying verification and validation criteria to the performance 
measurement data, such as providing information on data quality by 
defining and documenting data limitations.[Footnote 28] 

OJJDP Has Processes in Place to Monitor the Performance of Its Grants, 
but Could Better Ensure Accountability by Establishing a Time Frame for 
Implementing OJP's Requirement to Record Resolution of Grantee Issues: 

OJJDP has processes in place to monitor the performance of its juvenile 
justice grants and to record the results of activities performed during 
two of its three phases for conducting and following up on monitoring 
in the Grant Management System. According to OJJDP officials, the 
agency intends to fully implement OJP's requirement that grant managers 
record steps taken to resolve all issues identified during site visits, 
but as of August 4, 2009, it had not yet established a time frame for 
doing so. By establishing a time frame, OJJDP will have a concrete goal 
for fully implementing OJP's requirement. Furthermore, such a goal will 
help hold OJJDP accountable for developing processes to ensure that 
grant managers are resolving issues they identify during site visits 
and holding grantees accountable for achieving program goals as 
management intended. 

OJJDP Has Processes in Place to Monitor the Performance of Its Grants 
and to Record the Results of Two of Its Three Monitoring Phases in the 
Grant Management System: 

OJJDP has processes in place in accordance with OJP requirements to 
perform activities included in OJP's three phases for conducting and 
following up on monitoring--desk review, site visit, and postsite 
visit. The official responsible for overseeing grant monitoring across 
OJJDP divisions stated that the processes used by OJJDP grant managers 
to complete monitoring activities are guided by the policy and 
instructions provided in the OJP Grant Manager's Manual. Accordingly, 
at the beginning of each year, OJJDP grant managers use OJP's risk 
assessment tool in conjunction with their own discretion to assign 
grants a monitoring priority level and to select grants for which they 
plan to conduct monitoring site visits. If a site visit is planned for 
a grant, an OJJDP grant manager completes a desk review for the grant 
using an OJP desk review checklist in preparation for the visit. 
[Footnote 29] For all grants for which OJJDP has not planned a site 
visit, a grant manager completes a desk review at some time during the 
year. During site visits, grant managers are to use the OJP site visit 
checklist to guide their activities. 
This checklist includes eight standard programmatic review categories 
on which grant managers are to assess grantees, such as whether the 
grantee is implementing grant activities and providing grant services 
as proposed in the grant application, whether the grantee is 
experiencing any challenges in implementing its program, and whether 
the grantee requests technical assistance. Grant managers determine 
their responses for each 
of these categories based on discussion with the grantee, documentation 
review, and observation of grantee activities. According to OJJDP 
records, in fiscal year 2008 the office completed 316 site visits of 
programs that had a collective value of 28 percent of the total dollar 
amount of OJJDP's open grant awards. 

Following site visits, grant managers are to write a site visit report 
summarizing their findings, including any grantee issues in need of 
resolution. For example, grant managers may find that a grantee has not 
enrolled as many youth from its target population as it had planned or 
that the grantee is not complying with special conditions of the 
grant, such as a requirement that at least 20 percent of its advisory 
board members be under the age of 24 at the time of their appointment. 
Grant managers 
are to work with grantees following site visits to resolve any issues 
they identify. If a grant manager is concerned about an issue, the 
manager may require the grantee to undergo the Corrective Action Plan 
process, which involves developing a plan that describes the issues in 
need of resolution and identifies tasks involved in resolving them, who 
will carry out the tasks, task deadlines, and how the problems will be 
corrected. However, according to OJJDP officials, grant managers 
reserve this course of action for egregious circumstances that rarely 
occur.[Footnote 30] Officials said that the majority of the time, grant 
managers elect to work with the grantee to resolve issues they identify 
during site visits less formally, through means such as troubleshooting 
via telephone or e-mail. According to OJJDP officials, findings for 
which a grant manager would not require a Corrective Action Plan could 
include, for example, a grantee that stated in its application that it 
would enroll 100 children in its service program but has enrolled only 
10. To resolve this issue, a grant manager may 
discuss strategies with a grantee for enrolling youth from its target 
population by telephone or e-mail, or if the grant manager deems 
appropriate, assist the grantee to revise the target population stated 
in the grant award. 

OJJDP also has processes in place for recording the results of two of 
these monitoring phases--desk review and site visit--in the Grant 
Management System in accordance with OJP documentation requirements. 
Specifically, after completing desk reviews, grant managers are to 
upload a copy of the completed checklist used to conduct the review to 
the Grant Management System. Similarly, after completing site visits, 
grant managers are to record the results from the visit by generating a 
site visit report using the template provided in the Grant Management 
System and saving the report in the system. According to OJP officials, 
OJP counts monitoring performed by its components toward the 
requirement that it monitor at least 10 percent of the total value of 
its open grant awards annually if the monitoring is documented by a 
site visit report and site visit checklist in the Grant Management 
System. According to these officials, in 2008 OJJDP grant managers 
documented site visit reports for more than 10 percent of the total 
value of OJJDP's open awards in the Grant Management System. 
Specifically, according to OJJDP data, grant managers recorded site 
visit reports for 28 percent of the value of the office's $1.1 billion 
in open awards. 

According to the senior OJJDP official who oversees monitoring efforts, 
it is important for grant managers to record monitoring activities in 
the Grant Management System in order to assess the performance of grant 
managers and to ensure the completion and quality of monitoring 
activities. The Grant Management System is OJJDP's sole system that 
provides management with ready access to grant monitoring records for 
all of its grants. OJJDP also uses documentation in the system to keep 
track of the tasks grant managers have performed and those outstanding, 
to track the amount of time it takes grant managers to accomplish 
monitoring activities, and to assess the quality of the monitoring 
performed by grant managers. 

Establishing a Time Frame to Fully Implement OJP's Requirement for 
Managers to Record All Actions Taken to Resolve Grantee Issues 
Identified during Site Visits Would Improve Accountability: 

Since the beginning of fiscal year 2008, the OJP Grant Manager's Manual 
has required that grant managers record all of their actions to resolve 
grantee issues identified during site visits--whether they are 
addressed through the Corrective Action Plan process or less formally--
in the Grant Management System. OJP officials identify documentation in 
the Grant Management System as important in order to ensure that 
monitoring occurs and that grant funds are used as intended. Further, 
OJP officials stated that, beyond providing general accountability by 
offering a means to illustrate that monitoring has occurred, 
documentation in the Grant Management System also provides information 
about the performance of grantees that can be used to both determine if 
grantees are using funds to implement grant programs in accordance with 
the objectives of their awards and to inform future funding decisions. 
According to OJP officials, prior to fiscal year 2008, OJP required 
grant managers to document these activities, but they were permitted to 
do so outside of the Grant Management System. OJP officials consider 
recording efforts to resolve grantee issues identified during site 
visits important because it provides assurance that grant managers are 
fulfilling their responsibility to resolve the issues they 
identify. In addition, federal internal control standards identify the 
importance of, and require timely and accessible documentation of, such 
information.[Footnote 31] 

OJJDP has not fully implemented the requirement for grant managers to 
record resolution of grantee issues identified during site visits in 
the Grant Management System. Although OJJDP requires grant managers to 
record resolution of issues addressed through Corrective Action Plans 
in the system, it does not require them to record actions taken to 
informally resolve issues for which a Corrective Action Plan has not 
been required, even though, according to OJJDP officials, this is how 
grant managers resolve these issues the vast majority of the time. 
According to OJP officials, OJJDP grant managers rarely, if ever, 
record these less formal efforts in the Grant Management System. As a 
result, officials have limited assurance that resolution of these 
issues occurs. 

According to the senior OJJDP official responsible for overseeing 
monitoring, OJJDP intends to implement OJP's requirement to record 
resolution of all issues identified during site visits in the Grant 
Management System, including those resolved informally. However, the 
official stated that OJJDP has not yet done so because grant managers 
are adjusting to this requirement, and prior to May 2009, the Grant 
Management System did not provide grant managers with a user-friendly 
means to record these efforts. The OJJDP official stated that, as of 
July 2009, OJJDP encouraged, but did not require, grant managers to 
record informal actions taken to resolve grantee issues. The official 
added that recording these efforts, including those made through 
informal methods, is important because doing so allows the office to 
ensure that resolution is achieved. The official 
explained that experienced grant managers pride themselves on their 
monitoring techniques and ability to provide assistance to grantees, 
and that some view recording their actions in the Grant Management 
System as burdensome. The official stated that the most recent version 
of the Grant Management System, which OJP deployed in May 2009, 
provides enhanced capabilities for tracking resolution of grantee 
issues that will enable grant managers to more easily record in the 
system their less formal efforts to resolve grantee issues. While OJP 
officials acknowledged that the most recent version of the Grant 
Management System provides a more user-friendly means for grant 
managers to record resolution of issues, they emphasized that the 
previous system provided a means to record these efforts, and that OJP 
policy required them to be recorded prior to the deployment of the new 
system as well. Although OJP has begun to provide training to OJJDP 
grant managers in using the new system, the senior official responsible 
for overseeing monitoring believes it will take time to get these grant 
managers to record their site visit follow-up work in the Grant 
Management System. 

However, OJJDP has not established specific time frames, in accordance 
with standard practices for program management, for fully implementing 
OJP's requirement. Standard practices for program management state that 
the successful execution of any plan includes identifying in the 
planning process the schedule that establishes the timeline for 
delivering the plan. The senior OJJDP official responsible for 
overseeing monitoring stated that currently documentation of issue 
resolution may be captured in e-mail correspondence between grant 
managers and grantees. While this may be the case, e-mail 
correspondence is not necessarily captured in the Grant Management 
System and, therefore, is not readily accessible to OJJDP management to 
facilitate its oversight of grant managers. In addition, we understand 
that OJP has been introducing new grant monitoring tools, such as 
checklists for grant managers to use during monitoring activities and 
Grant Management System modules that enable grant managers to collect 
monitoring information in a uniform fashion, and that it takes time to 
adjust to these changes. However, according to OJP officials, prior to 
establishing the requirement at the beginning of fiscal year 2008 for 
grant managers to document resolution of issues identified during site 
visits in the Grant Management System, OJP required that grant managers 
maintain documentation of these efforts outside of this system. 
According to OJJDP officials, OJJDP does not currently have a 
requirement in place to record informal actions taken to resolve issues 
within the Grant Management System or outside this system, nor does it 
have a time frame for implementing such a process, consistent with 
standard practices for program management. Furthermore, documentation 
of grant monitoring is not a new issue for OJJDP, as we reported in 
2001 that the agency was not consistently documenting its grant 
monitoring activities.[Footnote 32] 

Because OJJDP has not fully implemented the requirement for grant 
managers to record in the Grant Management System all of their efforts 
to resolve grantee issues, OJJDP does not have reasonable assurance 
that grantee issues identified during site visits are resolved, nor 
does it have a means by which to hold grant managers or grantees 
accountable for resolving such issues. Although OJJDP has processes in 
place to 
record desk review checklists in the Grant Management System, these 
checklists do not include information about less formal actions taken 
following site visits to resolve grantee issues. Also, while site visit 
reports recorded in the Grant Management System may include information 
about the status of issues identified during previous visits, OJJDP may 
only perform site visits for a grantee once every several years. 
Therefore, these processes do not provide a means to ensure issues are 
resolved in accordance with OJJDP's directives. 

Without ensuring documentation of the resolution of all issues 
identified during site visits, OJJDP lacks reasonable assurance that 
grant managers are fulfilling their responsibility to resolve issues 
for which a Corrective Action Plan is not required. Establishing a 
timeline for implementing this requirement, consistent with standard 
practices for program management, would provide OJJDP with a concrete 
goal for developing processes, consistent with OJP requirements, to 
record all postsite visit actions taken, and thereby better position 
OJJDP to help ensure that grant programs are being implemented in a 
manner that supports program objectives. 

OJJDP Has Taken Action to Collect Performance Measurement Data from 
Grantees, but Ensuring and Reporting on Performance Measurement Data 
Quality Would Help Inform and Improve Resource Allocation Decisions: 

As part of its programmatic monitoring efforts, OJJDP has developed 
performance measures for its grant programs and processes for 
collecting performance measurement data from its grantees. According to 
the senior OJJDP official who oversees monitoring efforts, OJJDP 
initiated efforts in 2002 to develop performance measures to collect 
and report data that measure the results of funded activities 
consistent with the Government Performance and Results Act of 1993 
(GPRA) requirements.[Footnote 33] Since this time, OJJDP has developed 
output and outcome performance measures for each of its grant programs 
and implemented requirements for all grantees to submit specific 
performance measurement data on at least an annual basis that measure 
the results of their work.[Footnote 34] Specific performance measures 
vary by the type of grant program, but examples of performance 
measurement data that grantees are required to report include the 
percentage of youth who reoffend, the percentage of youth who exhibit a 
positive change in behavior, and the extent to which the grantee is 
using an evidence-based program--that is, a program model that research 
has demonstrated to be effective. For more detailed information on 
OJJDP performance measures, see enclosure II.[Footnote 35] 

OJJDP has also developed an online system called the Data Collection 
and Technical Assistance Tool (the Data Tool) to collect performance 
measurement data from grantees. Grantees submit performance measurement 
data through this Data Tool, which centralizes the data and enables 
OJJDP to aggregate performance measurement data across grantees. 
[Footnote 36] According to OJJDP officials, 2006 was the first year 
that all grantees submitted performance measurement data through this 
tool. These officials stated that prior to the development of the Data 
Tool, some grantees reported performance measurement data in the 
narratives of their progress reports; however, it was difficult for 
OJJDP to use these data because the data could not be easily 
aggregated. The Data Tool has some edit checks built into the system 
that require grantees to submit answers within the allowable response 
range (e.g., the Tool generates an error message if a grantee attempts 
to enter 30 when the valid response range is 1 through 6). 
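
As a simple illustration of this kind of edit check, the following 
Python sketch rejects out-of-range responses. Only the 1 through 6 
example range comes from this report; the measure name and the 
implementation are hypothetical assumptions, not the Data Tool's 
actual code: 

  # Hypothetical range edit check; only the 1 through 6 example range
  # comes from the report, the rest is assumed for illustration.
  VALID_RANGES = {
      "example_rating_measure": (1, 6),
  }

  def check_response(measure, value):
      """Return an error message for an out-of-range value, or None."""
      low, high = VALID_RANGES[measure]
      if not low <= value <= high:
          return ("Error: response must be between %d and %d; "
                  "received %d." % (low, high, value))
      return None

  # Entering 30 where the valid range is 1 through 6 produces an error.
  print(check_response("example_rating_measure", 30))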

In addition to these checks in the Data Tool, grant managers are also 
to review performance measurement data to corroborate performance data 
reported in the narrative of grantee progress reports, and examine data 
over time to ensure data patterns are reasonable.[Footnote 37] In 
addition, according to OJJDP officials, as performance measures are 
being developed and during annual reviews, OJJDP validates performance 
measurement data to assess whether the data are appropriate for the 
performance measure. Lastly, OJJDP reported that it initiated several 
technical assistance efforts to help prepare grantees to collect and 
report data. These include conducting teleconference training calls, 
making presentations at grantee meetings, and developing a Web page 
that features performance measure training resources and guidelines. 

According to the OJJDP official who oversees the office's performance 
measures, OJJDP intends to implement a more systematic approach for 
verifying its performance measurement data so as to ensure their 
quality and to use the data to make resource allocation decisions. 
However, according to OJJDP officials, the office's use of performance 
measurement data is limited because it does not have a documented, 
systematic approach for verifying the data to ensure their quality. As 
we have previously reported, data verification should address key 
dimensions of data quality, 
such as: data completeness--the extent to which enough of the required 
data elements are collected from a sufficient portion of the target 
population or sample; data accuracy--the extent to which the data are 
free from significant error; data consistency--the extent to which data 
are collected using the same procedures and definitions across 
collectors and times; and data timeliness--whether data about recent 
performance are available when needed to improve program management and 
report to Congress.[Footnote 38] 
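
To make these dimensions concrete, the following Python sketch shows 
what simple automated checks for two of them--completeness and 
timeliness--might look like. The required fields, sample records, and 
reporting deadline are hypothetical assumptions for illustration and 
do not represent OJJDP's draft verification approach: 

  from datetime import date

  # Assumed required data elements and reporting deadline for this sketch.
  REQUIRED_FIELDS = ("grantee_id", "youth_served", "reporting_period")
  REPORTING_DEADLINE = date(2009, 1, 31)

  def completeness(records):
      """Share of records in which every required element is present."""
      if not records:
          return 0.0
      complete = sum(all(r.get(f) is not None for f in REQUIRED_FIELDS)
                     for r in records)
      return complete / len(records)

  def is_timely(submitted_on):
      """True if a submission arrived by the assumed deadline."""
      return submitted_on <= REPORTING_DEADLINE

  records = [
      {"grantee_id": "A-1", "youth_served": 120, "reporting_period": "FY08"},
      {"grantee_id": "A-2", "youth_served": None, "reporting_period": "FY08"},
  ]
  print("Completeness: %.0f%%" % (100 * completeness(records)))  # 50%
  print(is_timely(date(2009, 1, 15)))                            # True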

Data verification helps to ensure that data are of sufficient quality 
and will be available when needed to document performance and support 
decision making. That is, data must be credible--seen by potential 
users as being of sufficient quality to be trustworthy. As we 
previously reported,[Footnote 39] congressional leadership has stated 
that 
performance plans based on incomplete or inaccurate data would be of 
little use to Congress.[Footnote 40] We have also reported that 
agencies that understand the linkage between expended resources and 
performance results are better able to allocate and manage their 
funding resources effectively because agency officials and 
congressional decision makers are in a better position to make informed 
funding decisions that target resources to improve overall 
results.[Footnote 41] In addition, according to leading management 
practices in performance measurement, primary responsibility for the 
quality of a program's data rests with the manager of that program. As 
such, consistent with these practices, OJJDP has a responsibility to 
ensure the quality of the data it collects from grantees.[Footnote 42] 

OJJDP officials stated they have not yet performed verification efforts 
to ensure the quality of their data because they have focused on the 
collection of performance measurement data rather than on utilization. 
As a result, these officials stated that OJJDP is currently dependent 
on grantees to report complete, accurate, consistent, and timely data. 
This is an issue, according to OJJDP officials, because they do not 
know the quality of the data submitted by grantees and, therefore, do 
not use these data to make resource allocation decisions. According to 
OJP and OJJDP officials, if grantees are reporting that they are 
adhering to the requirements articulated for their awards and acting on 
their proposed activities and timelines, OJJDP is likely to continue 
funding the grant regardless of the program's performance data or 
performance results. The OJJDP official responsible for overseeing 
performance measurement data collection acknowledged a need to verify 
OJJDP's performance measurement data and has drafted an approach for 
program managers to verify the data they collect. However, the approach 
has not been internally vetted, and OJJDP officials hope to further 
develop this approach in the fall of 2009 in conjunction with 
OAAM. Among other things, 
OJJDP officials anticipate that this approach will include specific 
processes for grant managers to verify data during site visits and 
activities for grantees to verify data submitted to them by 
subgrantees. Over the course of our audit work, OJJDP could not 
provide evidence of the approach because it is in draft and is subject 
to change. While OJJDP officials agree with the importance of an 
approach to verify performance measurement data and hope to further 
develop and implement an approach in the future, they stated that OJJDP 
has not established timelines or a schedule for its completion. 
Standard practices for program management state that the successful 
execution of any plan includes identifying in the planning process the 
schedule that establishes the timeline for delivering the plan, in this 
case an approach for verifying performance data.[Footnote 43] 

Having quality performance measurement data could help both Congress 
and OJJDP officials determine how to allocate and distribute OJJDP 
funding resources. This is especially important because, of the $392 
million it awarded in fiscal year 2008, OJJDP exercised its discretion 
to select recipients for 45 percent of these funds, and awarded the 
remaining 55 percent pursuant to a formula or fixed level or based on 
direction from Congress.[Footnote 44] However, because OJJDP has not 
yet verified these data, it cannot be assured of the quality of the 
results it is reporting. For example, OJJDP generates summary 
performance reports through its Data Tool that aggregate performance 
data for all grantees within a program. OJJDP uses these reports, which 
it posts online, as a mechanism to apprise the public of its programs' 
contributions to the juvenile justice field. However, according to 
officials, because OJJDP does not verify the data on which these 
reports are based, it cannot know whether the information contained 
within them is complete, accurate, consistent, and timely. 

In addition, OJJDP does not state in the performance reports that the 
data on which the reports are based have not been verified. As we have 
previously reported, it is important that the processes that agencies 
use to prepare communication products be documented, transparent, and 
understood.[Footnote 45] Further, in order to ensure transparency, OJP 
Information Quality Guidelines require OJJDP to disseminate information 
that is accurate, clear, and complete, in part, by reporting the 
limitations of the information it disseminates. Because OJJDP does not 
acknowledge in the performance reports generated through its Data Tool 
that the source data for the reports have not been verified, it cannot 
provide reasonable assurance to the public, whom the reports are 
intended to inform, that the information they contain is transparent, 
clear, and understandable. OJJDP officials acknowledged that the 
reports do not 
state that the data have not been verified, but did not provide a 
reason for this omission. Additionally, according to OJJDP officials, 
the office provides information to OMB in intermittent updates on its 
performance. In the past, it has also provided information to OMB for 
the Program Assessment Rating Tool (PART) review process.[Footnote 46] 
In fiscal year 2006, OJJDP underwent a PART review and received an 
overall rating of "adequate." One of the three follow-up action items 
recommended included performance information in budget submissions to 
better link resources requested to program performance goals.[Footnote 
47] OJP's fiscal year 2009 budget includes annual and long-term 
performance measures and data, but verifying these data would better 
instill confidence in its quality. 

According to the OJJDP official in charge of OJJDP's performance 
measurement data, OJJDP's long-term goal is to be able to use 
performance measurement data to guide its program planning and grant 
award decisions so OJJDP can allocate its resources in accordance with 
performance results--i.e., toward the better performing programs. 
However, OJJDP has not finalized its approach, or established timelines 
for its completion. As articulated in The Standard for Program 
Management, the successful execution of any plan includes defining an 
approach for the plan and establishing timelines for its completion. 
Following these steps will assist OJJDP in the successful execution of 
its long-term performance measurement goal of verifying grantee data 
and better position it for allocating resources in accordance with 
program results. 

Conclusions: 

As the federal office charged with supporting states and communities in 
their efforts to prevent and reduce juvenile crime and victimization, 
OJJDP provides funding through numerous grants to support a variety of 
programs. Monitoring the performance of these grants and maintaining 
timely and accessible records of these monitoring efforts is critical 
to determine whether OJJDP, and those programs to which it provides 
funding, are meeting their goals and objectives. Although OJJDP has 
processes in place to monitor the performance of its grants and to 
record the results of two of its three monitoring phases, establishing 
a time frame for implementing OJP's requirement that the results of 
OJJDP's third, follow-up phase--resolving grantee issues identified 
during site visits--be fully documented will provide OJJDP with a 
concrete goal for meeting this requirement. Furthermore, it will 
also help hold OJJDP accountable for developing processes, consistent 
with OJP requirements, to record all postsite visit actions taken. As a 
result, OJJDP will be better positioned to help ensure that its grant 
managers are conducting activities in accordance with its directives to 
hold grantees accountable for achieving program goals. 

With respect to performance measures, while OJJDP has developed 
measures and established processes to collect performance data from 
grantees, finalizing its approach to verify the completeness, accuracy, 
consistency, and timeliness of these data and establishing timelines 
for implementing that approach would provide OJJDP with additional 
assurance of the data's quality, so that agency officials and Congress 
can use these data to measure the performance of its grants and make 
funding decisions that are in line with results. In addition, 
discussing the limitations of unverified performance measurement data 
included in its communication products, as required by OJP standards, 
would help OJJDP ensure that the information it disseminates, and 
particularly the limitations thereof, is transparent, clear, and 
understood. 

Recommendations for Executive Action: 

To help ensure the resolution of grantee issues, as well as ensure 
accountability for grantees fulfilling the conditions of their grants, 
we recommend that the Administrator of OJJDP establish a time frame for 
fully implementing the requirement established by OJP for grant 
managers to record in the Grant Management System all actions taken to 
resolve grantee issues consistent with OJP requirements, standards for 
internal control, and standard practices for program management. 

To help ensure the quality of performance measurement data submitted by 
grantees and improve these data to support agency and congressional 
decision making, we recommend that the Administrator of OJJDP take the 
following actions: 

* finalize a data verification approach that includes how it will 
execute the approach to assess data completeness, accuracy, 
consistency, and timeliness; and establish time frames for implementing 
the approach consistent with leading management practices for 
performance measurement and standard practices for program management; 
and: 

* note in each document containing performance measurement data that 
the limitations of the data are currently unknown as the data have not 
been verified to help ensure that the information in the communication 
products OJJDP disseminates is transparent, clear, and understood. 

Agency Comments and Our Evaluation: 

We requested comments on a draft of this report from the Attorney 
General. On August 31, 2009, we received written comments from OJP, 
which are reprinted in enclosure IV. In commenting on the draft report, 
OJP stated that it concurred with our first recommendation that the 
Administrator of OJJDP establish a time frame for fully implementing 
the requirement established by OJP for grant managers to record in the 
Grant Management System all actions taken to resolve grantee issues. In 
addition, OJP noted that although in the past OJJDP grant managers have 
not documented their resolution of grantee issues as required by OJP 
policy, they have always been held accountable for resolving grantee 
performance issues and OJJDP supervisors meet regularly with grant 
managers to discuss grant management activities. While supervisors may 
have regularly discussed grant management activities with grant 
managers, internal control standards for ensuring an agency's 
accountability for stewardship of government resources require timely 
and accessible documentation of information. Therefore, to help ensure 
the resolution of grantee issues, as well as ensure accountability for 
grantees fulfilling the conditions of their grants, it is important 
that grant managers also record in the centralized Grant Management 
System actions taken to resolve grantee issues. To this end, OJP also 
described efforts it plans to take to implement this recommendation. It 
stated that beginning in fiscal year 2010, OJJDP will require that 
grant managers fully record activities in the Grant Management System 
and include this requirement as a dimension in grant managers' annual 
performance reviews. In addition, OJP agreed with our recommendations 
pertaining to performance measurement data and described steps OJJDP 
plans to take to implement them. By January 2010, OJJDP will develop 
and implement a plan for verifying performance measurement data, and by 
October 1, 2009, it will add a note to its performance measures Web 
site to clarify that performance measurement data have not been 
verified. OJP also provided technical comments, which we incorporated 
as appropriate. 

We are sending copies of this report to interested congressional 
committees and the Attorney General. In addition, this report will be 
available at no charge on GAO's Web site at [hyperlink, 
http://www.gao.gov]. 

If you or your staff have any questions concerning this report, please 
contact me at (202) 512-8777, or larencee@gao.gov. Contact points for 
our Offices of Congressional Relations and Public Affairs may be found 
on the last page of this report. Mary Catherine Hult, Assistant 
Director; David Alexander; Katherine Davis; Allyson Goldstein; Dawn 
Locke; Taylor Matheson; and Janet Temko made key contributions to this 
report. 

Sincerely yours, 

Signed by: 

Eileen Regen Larence:
Director, Homeland Security and Justice Issues: 

[End of section] 

Enclosure I: Summary of OJP Grant Monitoring Standards and Procedures: 

While OJP establishes the minimum standards for grant monitoring to 
which its offices must adhere, direct responsibility for monitoring 
grantees rests with the components. This enclosure summarizes the 
minimum standards for programmatic grant monitoring established by OJP 
at the beginning of fiscal year 2008 that its components are 
responsible for implementing.[Footnote 48] 

OJP monitoring standards identify five phases in the grant monitoring 
life cycle. The first two phases--planning and premonitoring--serve to 
prepare for the subsequent monitoring phases. The planning phase takes 
place at the beginning of each year, during which each OJP component is 
to contribute to an OJP-wide annual Monitoring Plan that identifies the 
grants for which each component intends to conduct site visits during 
that year. In order to select grants for inclusion in the Monitoring 
Plan, grant managers are required to use OJP's Grant Assessment Tool to 
appraise the vulnerabilities associated with grants identified by their 
component's leadership.[Footnote 49] Grant managers use the Grant 
Assessment Tool to determine the risk associated with each grant by 
assessing each grant against a set of defined criteria. Grant managers 
then use their discretion to decide whether or not to plan a monitoring 
site visit for the grant.[Footnote 50] The Monitoring Plan is an 
evolving document, and components may use their discretion to modify 
the grants for which they plan to conduct site visits throughout the 
course of the year. 
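
The risk-screening logic described above can be illustrated with a 
short Python sketch. The weights, threshold, and field names below are 
invented for the example and are not OJP's actual Grant Assessment 
Tool scoring; the criteria loosely mirror those OJP identifies (award 
dollar value, past performance, and program complexity):

# Illustrative only: the weights and threshold are invented and do not
# reflect OJP's actual Grant Assessment Tool.
def risk_score(award_amount, performance_concerns, complexity):
    score = 0
    if award_amount > 500_000:  # larger awards carry more exposure
        score += 2
    if performance_concerns:  # e.g., repeated scope or budget changes
        score += 2
    score += {"low": 0, "medium": 1, "high": 2}[complexity]
    return score

# A high score only suggests adding the grant to the Monitoring Plan;
# as noted above, grant managers still apply their own discretion.
if risk_score(750_000, True, "high") >= 4:
    print("Candidate for a monitoring site visit")
else:
    print("Desk review only this year")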

The premonitoring phase is intended to help grant monitors obtain the 
necessary background information to conduct a thorough monitoring site 
visit. If grant managers plan to conduct a site visit for a grant, OJP 
standards for premonitoring require that they complete OJP's 
Pre-Monitoring Checklist, which includes reviewing grant reference 
materials and notifying the grantee of the visit in writing at least 45 
days prior to conducting the visit. 

Once grant managers complete the planning and premonitoring phases, 
they are to move on to the three phases for conducting and following up 
on monitoring: (1) desk review, (2) site visit, and (3) postsite visit. 
Table 1 summarizes OJP's minimum standards for these phases. 

Table 1: Summary of OJP's Monitoring and Postmonitoring Phases: 

Monitoring phase: 1: Desk review--activities to further prepare for 
monitoring site visits and to facilitate monitoring of grants 
throughout the award period; 
Processes included in phase: Desk reviews are intended to further 
prepare grant managers for monitoring site visits and to facilitate 
monitoring of grants throughout the award period. Grant managers are 
required to complete desk reviews for all grants no less than annually. 
If a site visit is planned for a grant, a desk review is to be 
completed no more than 45 days prior to the visit. If a site visit is 
not planned, a desk review is to be completed at some point during the 
year. To complete a desk review, grant managers are to review 
documentation related to the grant, including the application, award 
documents, and results from previous desk reviews, to ensure a complete 
understanding of the project objectives, schedule, and status.[A] They 
are also to review all progress reports submitted by grantees within 
the last year to determine if the reports are complete and contain 
information related to the status of the project, such as: 
* performance measures and associated data as they relate to the 
grantee's performance in executing the grant program; 
* progress achieved on each task in relation to any approved 
schedule; and 
* any problems or issues encountered in implementing the program and 
planned actions for resolution. 
If, after completing a desk review, there is evidence that a site visit 
is necessary, but one is not planned, the grant manager is to plan a 
site visit and add the grant to the Monitoring Plan; 
Standards for documentation: Grant managers are required to record desk 
reviews by uploading a copy of the checklist they use to complete the 
review in the Grant Management System. 

Monitoring phase: 2: Site visit--monitoring to determine how program 
objectives are being implemented; 
Processes included in phase: During site visits, grant managers are to 
visit a grantee to discuss specific issues related to the grantee's 
progress in implementing the program, observe grant activity, and 
assess planned versus actual progress. Site visits are guided by OJP's 
Site Visit Checklist, which directs grant managers, through discussion 
and documentation review, to determine how program objectives are being 
implemented. This determination rests on several questions, including 
the following: 
* Is there evidence that reported activities actually occurred and 
were reported accurately? 
* Are project milestones being achieved according to schedule? 
* Are there any problems implementing the program or is any technical 
assistance required? 
Standards for documentation: Grant managers are required to record 
their site visit observations by uploading a copy of the checklist they 
use to complete the review in the Grant Management System. 

Monitoring phase: 3: Postsite visit--activities to record site visit 
results and resolve grantee issues; 
Processes included in phase: Following site visits, grant managers are 
required to perform several activities. These include: 
* preparing a site visit report that records the results of the site 
visit, highlights promising practices, and identifies areas where the 
grantee is not complying with any terms or conditions of the grant or 
is in need of assistance from OJP; 
* preparing a follow-up letter to share the results from the visit with 
the grantee as articulated in the site visit report; 
* working with the grantee to develop a corrective action plan, if 
deemed necessary by the grant manager; and 
* collaborating with the grantee to resolve issues and ensure their 
resolution. 
Site visit reports and follow-up letters prepared by grant managers are 
to receive supervisory approval; 
Standards for documentation: Grant managers are required to record site 
visit results by saving a site visit report stating their findings in 
the Grant Management System along with any correspondence submitted by 
grantees in response to the site visit or site visit report; 
Grant managers are also required to track any corrective actions taken 
to resolve issues identified through a site visit and their resolution 
in the Grant Management System. 

Source: GAO analysis of OJP documentation. 

Note: Although each of the monitoring activities required by OJP also 
involves administrative and financial monitoring activities, this 
summary includes only the required programmatic monitoring activities. 

[A] According to the OJP Grant Manager's Manual, grant managers are 
required to use OJP's Desk Review Checklist to complete and record desk 
reviews. However, according to OJP officials, as of April 2009, OJP 
policy also permits grant managers to fulfill the desk review 
requirement by completing a Grant Assessment Tool assessment. According 
to OJP, these assessments require grant managers to respond to the same 
questions as the OJP Desk Review Checklist. 

[End of table] 
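
The documentation standards in table 1 can be summarized in a simple 
record structure. The Python sketch below is hypothetical--the field 
names are invented and this is not the schema of OJP's Grant 
Management System--but it shows why a grant's monitoring record is 
complete only when all three phases, including postsite visit results, 
are on file:

from dataclasses import dataclass, field
from typing import Optional

# Hypothetical record mirroring table 1's documentation standards; this
# is not the actual Grant Management System schema.
@dataclass
class MonitoringRecord:
    grant_id: str
    desk_review_checklist: Optional[str] = None  # phase 1 upload
    site_visit_checklist: Optional[str] = None   # phase 2 upload
    site_visit_report: Optional[str] = None      # phase 3 findings
    corrective_actions: list = field(default_factory=list)  # phase 3 tracking

    def fully_documented(self):
        # Complete only when all three monitoring phases are recorded.
        return all([self.desk_review_checklist,
                    self.site_visit_checklist,
                    self.site_visit_report])

record = MonitoringRecord("A-001", "desk_2008.pdf", "visit_2008.pdf")
print(record.fully_documented())  # False: postsite visit results missing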

[End of section] 

Enclosure II: OJJDP's Performance Measures: 

As reported by OJJDP, its performance measures provide a system for 
tracking progress of the chosen activities in accomplishing specific 
goals, objectives, and outcomes. More specifically, according to OJJDP, 
performance measurement (1) is directly related to program goals and 
objectives, (2) measures progress of the activities quantitatively, (3) 
is not exhaustive, and (4) provides a temperature reading--a quick and 
reliable gauge of selected results. According to information OJJDP 
provides to grantees, all recipients of OJJDP funding are required to 
collect and report data that measure the results of funded activities 
consistent with GPRA requirements.[Footnote 51] In its online 
information for grantees, OJJDP states that, according to GPRA, 
reporting performance measures promotes public confidence in the 
federal 
government by systematically holding federal agencies accountable for 
achieving program results. Performance measures also promote program 
effectiveness, service delivery, and accountability by focusing on 
results, service quality, and customer satisfaction. Finally, 
performance measures promote enhanced congressional decision making. 
[Footnote 52] 

According to the senior OJJDP official who oversees performance 
measurement, in 2002 OJJDP intensified efforts to develop performance 
measures to collect and report data that measure the results of funded 
activities consistent with requirements established by GPRA. Since this 
time, for each of its grant programs, OJJDP has developed both output 
and outcome performance measures. According to OJJDP, output measures, 
or indicators, measure the products of a program's implementation or 
activities. These are generally measured in terms of the volume of work 
accomplished, such as the number of services delivered; staff hired; 
systems developed; sessions conducted; materials developed; and 
policies, procedures, or legislation created. Examples of OJJDP output 
measures include the number of juveniles served, number of hours of 
service provided to participants, number of staff trained, number of 
detention beds added, number of materials distributed, number of 
reports written, and number of site visits conducted. Outcome measures, 
or indicators, measure the benefits or changes for individuals, the 
juvenile justice system, or the community as a result of the program. 
Outcomes may be related to behavior, attitudes, skills, knowledge, 
values, conditions, or other attributes. Examples include changes in 
the academic performance of program participants, changes in the 
recidivism rate of program participants, changes in client satisfaction 
level, changes in the conditions of confinement in detention, and 
changes in the county-level juvenile crime rate. According to OJJDP, 
there are two levels of outcomes: 

* Short-term outcomes. For programs that provide a service directly to 
juveniles or families, short-term outcomes are the benefits or changes 
that participants experience by the time they leave or complete the 
program. These generally include changes in behavior, attitudes, 
skills, or knowledge. For programs designed to change the juvenile 
justice system, short-term outcomes include changes to the juvenile 
justice system that occur by the end of the grant funding. 

* Long-term outcomes. These are the key outcomes desired for 
participants, recipients, the juvenile justice system, or the 
community, permanently or over an extended period. For programs that 
provide a service 
directly to juveniles or families, they generally include changes in 
recipients' behavior, attitudes, skills, or knowledge. They also 
include changes in practice, policy, or decision making in the juvenile 
justice system. They are measured within 6 to 12 months after a 
juvenile leaves or completes the program and they should relate back to 
the program's goals (e.g., reducing delinquency). 

OJJDP has developed numerous outcome and output measures for each of 
its formula and block grant programs and for its discretionary grants. 
OJJDP requires grantees to submit data on a minimum set of specific 
performance measures that it identifies for each grant program. 
Additionally, on its Web site, OJJDP provides a list of other measures 
grantees can opt to use to report on their performance, such as the 
number of program staff who have completed training in the program 
area, or the number of program youth who are satisfied with the 
program. Table 2 describes examples of performance measures provided by 
OJJDP to its grantees. 

Table 2: Examples of Five OJJDP Output and Five OJJDP Outcome 
Performance Measures: 

#; Output measure: 1; OJJDP grant funds awarded for intervention 
services (mandatory measure); 
Definition: The amount of OJJDP grant funds in whole dollars that are 
awarded for program intervention services, which are services designed 
to intervene after a juvenile has become involved in the juvenile 
justice system. Program records are the preferred data source. 

#; Output measure: 2; Number of intervention service slots created; 
Definition: The number of new intervention slots created during the 
reporting period as a result of OJJDP grant funds. Program records are 
the preferred reporting source. 

#; Output measure: 3; Number of youth or youth and families served 
(mandatory measure); 
Definition: An unduplicated count of the number of youth (or youth and 
families) served by the program during the reporting period. The 
number served for the reporting period is defined as the number of 
youth (or youth and families) carried over from the previous reporting 
period plus new admissions during the reporting period. 

#; Output measure: 4; Number of programs that implement an evidence-
based program or practice (mandatory measure); 
Definition: Number and percentage of programs that implement an 
evidence-based program or practice. Evidence-based programs and 
practices include program models that have been shown, through rigorous 
evaluation and replication, to be effective at preventing or reducing 
juvenile delinquency or related risk factors, such as substance abuse. 
Model programs can come from many sources (e.g., OJJDP's Model Programs 
Guide, State Model Program resources). 

#; Output measure: 5; Number of youth or youth and families served by a 
program with an evidence-based program or practices intervention model 
(mandatory measure); 
Definition: Number and percentage of youth (or youth and families) 
served using an evidence-based program or practices intervention model. 
Program records are the preferred source of data. 

#; Outcome measure: 1; Number and percentage of youth or youth and 
families completing program; (mandatory measure); 
Definition: Number and percentage of youth (or youth and families) who 
have successfully met all program obligations and requirements. Program 
obligations will vary by program, but should be a predefined list of 
obligations or requirements that clients must meet prior to program 
completion. Program records are the preferred data source. 

#; Outcome measure: 2; Number and percentage of youth exhibiting the 
desired change in targeted behaviors (short-term); (mandatory measure); 
Definition: Must select at least one of OJJDP's 13 suggested measures 
in this category such as the number of youth who exhibited an increase 
in their GPA during the reporting period or the number of youth who had 
completed high school during the reporting period. Short-term data are 
captured by the time participants leave or complete the program. 

#; Outcome measure: 3; Number or percentage of youth who reoffend 
(short-term); (mandatory measure); 
Definition: The number and percentage of youth who were rearrested or 
seen at a juvenile court for a new delinquent offense. Official records 
(e.g., police, juvenile court) are the preferred data source. 
Short-term data are captured by the time participants leave or complete 
the 
program. 

#; Outcome measure: 4; Number or percentage of youth who reoffend (long-
term); (mandatory measure); 
Definition: The number and percentage of youth who were rearrested or 
seen at a juvenile court for a new delinquent offense. Long-term data 
are captured 6 to 12 months after program completion. 

#; Outcome measure: 5; Number or percentage of youth who are 
revictimized; (mandatory measure); 
Definition: The number and percentage of youth who were revictimized. 
Long-term data are captured 6 to 12 months after program completion. 

Source: OJJDP. 

[End of table] 
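
Several of the measures in table 2 are reported as both a number and a 
percentage, with short-term data captured by program completion and 
long-term data 6 to 12 months afterward. The Python sketch below is 
illustrative only--the participant records and field names are 
hypothetical, not OJJDP reporting data--and computes the reoffending 
measures under those capture windows:

from datetime import date, timedelta

# Hypothetical participant records; the field names are invented and do
# not reflect OJJDP's reporting forms.
participants = [
    {"completed": date(2008, 1, 15), "reoffense": None},
    {"completed": date(2008, 2, 1), "reoffense": date(2008, 1, 20)},
    {"completed": date(2008, 3, 10), "reoffense": date(2008, 11, 1)},
]

def reoffense_measure(records, long_term=False):
    """Number and percentage of youth who reoffended. Short-term data
    are captured by program completion; long-term data 6 to 12 months
    (approximated here as 180 to 365 days) after completion."""
    def reoffended(r):
        if r["reoffense"] is None:
            return False
        gap = r["reoffense"] - r["completed"]
        if long_term:
            return timedelta(days=180) <= gap <= timedelta(days=365)
        return gap <= timedelta(days=0)
    n = sum(reoffended(r) for r in records)
    return n, round(100 * n / len(records), 1)

print(reoffense_measure(participants))                  # short-term: (1, 33.3)
print(reoffense_measure(participants, long_term=True))  # long-term: (1, 33.3)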

[End of section] 

Enclosure III: OJJDP's Fiscal Year 2008 Grant Awards: 

According to OJJDP data, in fiscal year 2008 it awarded approximately 
$392 million under its formula, block, and discretionary grant 
programs. As described in figure 2, OJJDP exercised its discretion to 
select recipients for 45 percent of these funds, and awarded the 
remaining 56 percent pursuant to a formula or fixed level, or based on 
direction from Congress. 

Figure 2: Selection of Recipients for OJJDP Fiscal Year 2008 Grant 
Awards: 

[Refer to PDF for image: illustration] 

In fiscal year 2008, OJJDP awarded $392 million under formula, block, 
and discretionary grant programs; 
* $124 million (32%): In general, statutes require formula and block 
grants to be awarded on the basis of states' juvenile populations or at 
a fixed level to all states; 

$268 million was awarded under discretionary grant programs to states, 
units of local government, and private organizations; 
* $93 million (24%) was awarded by OJJDP based on congressional 
direction; 
* 56% of funds were awarded pursuant to statutory requirements or based 
on direction from Congress; 

$175 million was awarded to recipients selected by OJJDP; 
* 45% of funds were awarded based on OJJDP discretion. 

Source: GAO analysis of OJJDP funding data. 

Note: Percentages total to greater than 100 due to rounding. 

[End of figure] 
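
The rounding noted above can be verified directly from the dollar 
amounts in the figure; the few lines of Python below show how the 
individually rounded shares of the $392 million total sum to 101 
percent:

# Shares of fiscal year 2008 awards from figure 2, in millions of dollars.
total = 392
shares = {
    "formula and block (statutory)": 124,
    "congressional direction": 93,
    "OJJDP discretion": 175,
}

rounded = {name: round(100 * amount / total) for name, amount in shares.items()}
print(rounded)                # shares round to 32, 24, and 45 percent
print(sum(rounded.values()))  # 101 -- greater than 100 due to rounding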

[End of section] 

Enclosure IV: Comments from the Department of Justice: 

U.S. Department of Justice: 
Office of Justice Programs: 
Office of the Assistant Attorney General: 
Washington, D.C. 20531: 

August 31, 2009: 

Ms. Eileen R. Larence: 
Director, Homeland Security and Justice Issues: 
Government Accountability Office: 
441 G Street, NW: 
Washington, DC 20548: 

Dear Ms. Larence: 

Thank you for the opportunity to comment on the draft Government 
Accountability Office (GAO) letter report entitled "Juvenile Justice: A 
Timeframe for Enhancing Grant Monitoring Documentation and Verification 
of Data Quality Would Help Improve Accountability and Resource 
Allocation Decisions" (GAO-09-850R). The Office of Justice Programs 
(OJP) agrees with the Recommendations for Executive Action, which are 
restated in bold text below and are followed by our response. 

1. To help ensure the resolution of grantee issues, as well as ensure 
accountability for grantees fulfilling the conditions of their grants, 
we recommend that the Administrator of OJJDP establish a timeframe for 
fully implementing the requirement established by OJP for grant 
managers to record in the Grant Management System all actions taken to 
resolve grantee issues consistent with OJP requirements, standards for 
internal control, and standard practices for program management. 

OJP agrees with this recommendation. Beginning in Fiscal Year (FY) 2010, 
OJP's Office of Juvenile Justice and Delinquency Prevention (OJJDP) 
will require grant managers to systematically and fully record grant 
monitoring activities in the Grants Management System (GMS). OJJDP will 
include the requirement in the performance work plans for grant 
managers. 

Of the 316 site visits conducted in FY 2008, 109 site visits required 
follow-up with the grantee after the site visit. Although grant 
monitoring documentation was not included in GMS or in the format 
prescribed by OJP policy, OJJDP grant managers followed up with 
grantees to address issues noted during site visits. Further, OJJDP 
grant managers have always been held accountable for grant monitoring 
responsibilities and resolving grantee performance issues. OJJDP 
supervisors meet regularly with grant managers to discuss workload and 
grant management activities, which encompass more than site visits. 

2. To help ensure the quality of performance measurement data submitted 
by grantees and improve these data to support agency and congressional 
decision making, we recommend that the Administrator of OJJDP: 

* finalize a data verification approach that includes how it will 
execute the approach to assess data completeness, accuracy, 
consistency, and timeliness; and establish timeframes for implementing 
the approach consistent with leading management practices for 
performance measurement, and standard practices for program management; 
and; 

* note in each document containing performance measurement data that 
the limitations of the data are currently unknown as the data have not 
been verified to help ensure that the information in the communication 
products that OJJDP disseminates is transparent, clear, and understood. 

OJP agrees with this recommendation. OJJDP is in the process of 
auditing and refining its performance measures. Once that process is 
complete, by January 2010, OJJDP will develop and implement a plan for 
verifying performance measurement data. In addition, by October 1, 
2009, OJJDP will add a note to its performance measures website to 
clarify that performance measurement data has not been verified. 

If you have any questions regarding this response, you or your staff 
may contact Maureen Henneberg, Director, Office of Audit, Assessment, 
and Management, on (202) 616-3282. 

Sincerely, 

Signed by: 

Laurie O. Robinson: 
Acting Assistant Attorney General: 

cc: 
Beth McGarry: 
Deputy Assistant Attorney General for Operations and Management: 

Jeffrey Slowikowski: 
Acting Director: 
Office of Juvenile Justice and Delinquency Prevention: 

Maureen Henneberg: 
Director: 
Office of Audit, Assessment, and Management: 

LeToya A. Johnson: 
OJP Audit Liaison: 

Richard P. Theis: 
Audit Liaison: 
Department of Justice: 

[End of section] 

Footnotes: 

[1] As of September 9, 2009, OJJDP was still in the process of 
announcing its grant awards for fiscal year 2009. 

[2] OJP is composed of OJJDP and four other components (i.e., bureaus, 
offices, and institutes): the Bureau of Justice Assistance, the Bureau 
of Justice Statistics, the National Institute of Justice, and the 
Office for Victims of Crime. 

[3] GAO, Juvenile Justice: Better Documentation of Discretionary Grant 
Monitoring Is Needed, [hyperlink, 
http://www.gao.gov/products/GAO-02-65] (Washington, D.C.: Oct. 10, 
2001). 

[4] GAO, Standards for Internal Control in the Federal Government, 
[hyperlink, http://www.gao.gov/products/GAO/AIMD-00-21.3.1] 
(Washington, D.C.: November 1999). 

[5] In addition to Standards for Internal Control in the Federal 
Government and policy articulated by OJP, we also used in this analysis 
applicable laws, such as the Violence Against Women and Department of 
Justice Reauthorization Act of 2005, Pub. L. No. 109-162, 119 Stat. 
2960, and guidance provided in OMB circulars A-102 and A-110. 

[6] Discretionary grants provide funds to states, units of local 
government, and organizations to administer programs. OJJDP awards 
discretionary grants to recipients through an application process or 
based on congressional direction. In general, formula and block grant 
awards provide funds to states in accordance with statutory 
requirements. OJJDP allocates some formula and block grants to states 
on the basis of states' juvenile populations, while others may be 
awarded at a fixed level to all states. The term state means any state 
of the United States, the District of Columbia, the Commonwealth of 
Puerto Rico, the Virgin Islands, Guam, American Samoa, and the 
Commonwealth of the Northern Mariana Islands. 42 U.S.C. § 5603. OJJDP 
awards include grants for training, technical assistance, and research 
and evaluation efforts, which we did not include in our review. 

[7] OMB Circular No. A-11, Preparation, Submission, and Execution of 
the Budget. 

[8] Program management standards we reviewed are reflected in the 
Project Management Institute's The Standard for Program Management © 
(2006). 

[9] Office of Justice Programs, Information Quality Guidelines 
(Washington, D.C.: December 2002), [hyperlink, 
http://www.ojp.usdoj.gov/about/info_quality.htm] (accessed Mar. 18, 
2009). 

[10] OJP's Corrective Action Plan process includes requiring a grantee 
to develop a Corrective Action Plan that describes the issues in need 
of resolution and identifies tasks involved in resolving them, who will 
carry out the tasks, task deadlines, and how the problems will be 
corrected. 

[11] GAO, Performance Plans: Selected Approaches for Verification and 
Validation of Agency Performance Information, [hyperlink, 
http://www.gao.gov/products/GAO/GGD-99-139] (Washington, D.C.: July 30, 
1999). 

[12] In general, requirements pertaining to how OJJDP allocates funding 
for its formula and block grant programs--i.e., based on a formula or 
fixed level--are derived from statutory requirements. 

[13] The Project Management Institute, The Standard for Program 
Management © (2006). 

[14] GAO, Health and Safety Information: EPA and OSHA Could Improve 
Their Processes for Preparing Communications Products, [hyperlink, 
http://www.gao.gov/products/GAO-08-265] (Washington, D.C.: Mar. 31, 
2008). 

[15] Pub. L. No. 109-162, § 1158, 119 Stat. 2960, 3114-16. 

[16] Many OJP grant awards provide several years of funding. This 
requirement covers all open, active awards OJP provides to grantees at 
the beginning of the fiscal year, both new grant awards and those 
under way from previous years. 

[17] OJJDP's three program divisions administer grants to implement 
programs. OJJDP's Office of Policy Development, which assists the OJJDP 
Administrator in coordinating national policy on juvenile justice, also 
administers grants awarded for research and evaluation. 

[18] OJP also requires its components to perform administrative and 
financial monitoring activities. Administrative monitoring addresses 
compliance with grant terms and grantee reporting and documentation 
requirements (e.g., inventory records for property used for the grant), 
and financial monitoring reviews expenditures compared to an approved 
budget. 

[19] 42 U.S.C. § 5633(a)(1). 

[20] OJJDP maintains a network of technical assistance providers who 
are available to offer a range of services and training on a variety of 
issues. 

[21] These standards are articulated in the OJP Grant Manager's Manual, 
which documents grant monitoring policies and procedures for OJP 
bureaus. This manual also identifies two phases that precede the three 
phases described above. For the purposes of this review and examining 
what processes OJJDP has in place to monitor the performance of its 
grants and record the results of its monitoring efforts, we did not 
focus on these first two phases in which grant managers plan and 
prepare for monitoring activities such as reviewing background 
material--e.g., the grantee's organizational charts or lists of key 
personnel--and making appointments for site visits. Instead, we focused 
on the two phases that address the specific monitoring activities laid 
out by the OJP Grant Manager's Manual--i.e., desk reviews and site 
visits--and the final phase that provides postsite visit policies and 
procedures as a result of the monitoring efforts. For more information 
about the first two phases, see enclosure I. According to OJP 
officials, as of June 2009, OJP was in the process of revising the 
current version of this manual, which it expects to complete by the end 
of fiscal year 2009. 

[22] All grantees are required to submit progress reports to OJJDP at 
least annually. While discretionary grantees and select formula and 
block grantees are required to submit reports semiannually, other 
formula and block grantees are required to submit reports annually. 

[23] OJP initially provided these features to grant managers through 
its Grant Monitoring Tool. After recording monitoring activities within 
this tool, grant managers were to upload the tool to the Grant 
Management System. In May 2009, OJP embedded this tool within the Grant 
Management System. 

[24] Grant managers are to consider grantee performance to be of high 
risk if there are concerns related to the implementation of the program 
and ability to meet program objectives, such as if a grantee repeatedly 
requests scope changes, key personnel changes, or project budget 
modifications. Grant managers are to consider awards of high complexity 
as higher risk compared to other programs if they require additional 
oversight (e.g., awards with several distinct purpose areas). 

[25] According to our Standards for Internal Control in the Federal 
Government, internal control is an integral component of an 
organization's management that provides reasonable assurance that the 
following objectives are being achieved: reliability of financial 
reporting; compliance with applicable laws and regulations; and 
effectiveness and efficiency of operations, including the use of the 
entity's resources. [hyperlink, 
http://www.gao.gov/products/GAO/AIMD-00-21.3.1]. 

[26] These practices are based on the Government Performance and 
Results Act of 1993 (GPRA), which requires cabinet-level agencies to 
identify performance measures to measure the results of program 
activities, in order to, among other things, promote public confidence 
in the federal government by systematically holding agencies 
accountable for achieving program results; improve program 
effectiveness by focusing on results, service quality, and customer 
satisfaction; and improve congressional decision making. Pub. L. No. 
103-62, 107 Stat. 285 (codified as amended in scattered sections of 5 
U.S.C., 31 U.S.C., and 39 U.S.C.). 

[27] In 1999, we reported that the approaches agencies use to verify 
and validate performance measurement data should address key dimensions 
of data quality, which include but are not limited to: completeness-- 
the extent to which enough of the required data elements are collected 
from a sufficient portion of the target population or sample; accuracy--
the extent to which the data are free from significant error; 
consistency--the extent to which data are collected using the same 
procedures and definitions across collectors and times; timeliness-- 
whether data about recent performance are available when needed to 
improve program management and report to Congress; and ease of use--how 
readily intended users can access data, aided by clear data 
definitions, user-friendly software, and easily used access procedures. 
GAO, Performance Plans: Selected Approaches for Verification and 
Validation of Agency Performance Information, [hyperlink, 
http://www.gao.gov/products/GAO/GGD-99-139] (Washington, D.C.: July 30, 
1999). 

[28] OMB, Circular No. A-11, Preparation, Submission, and Execution of 
the Budget. 

[29] The OJP desk review checklist instructs grant managers to, among 
other things, review the original grant application and official 
correspondence with the grantee to ensure a complete understanding of 
the project's objectives and status, review progress reports submitted 
by the grantee within the last year to ensure they are complete (e.g., 
that they describe the grantee's progress in achieving each task in 
relation to project milestones), and create a list of questions for the 
grantee based on the review. According to OJJDP officials, consistent 
with OJP policy, grant managers may elect to use an assessment 
completed using OJP's risk assessment tool in lieu of the OJP desk 
review checklist to complete a desk review. According to OJP, risk 
assessments require grant managers to respond to the same questions as 
the OJP desk review checklist. 

[30] According to OJJDP, one example of an egregious circumstance 
arose in fiscal year 2007, when a grant manager required a grantee to 
develop a Corrective Action Plan following a visit in which the grant 
manager 
observed that the grantee was failing to meet several basic 
programmatic requirements, including hiring a coordinator for the 
program and implementing the project design. 

[31] [hyperlink, http://www.gao.gov/products/GAO/AIMD-00-21.3.1]. 

[32] GAO, Juvenile Justice: Better Documentation of Discretionary Grant 
Monitoring Is Needed, [hyperlink, 
http://www.gao.gov/products/GAO-02-65] (Washington, D.C.: Oct. 10, 
2001). 

[33] GPRA requires DOJ to collect and report performance measurement 
data as part of its annual budget process. 31 U.S.C. §§1115-16. 
Although GPRA only requires collecting and reporting of performance 
data at the executive agency level, according to OJJDP, as a component 
of DOJ it assumes responsibility for collecting and reporting on 
performance data to meet DOJ GPRA requirements. 

[34] According to OJJDP, output measures, or indicators, measure the 
products of a program's implementation or activities. These are 
generally measured in terms of the volume of work accomplished, such as 
amount of service delivered; staff hired; systems developed; sessions 
conducted; materials developed; and policies, procedures, or 
legislation created. Outcome measures, or indicators, measure the 
benefits or changes for individuals, the juvenile justice system, or 
the community as a result of the program. Outcomes may be related to 
behavior, attitudes, skills, knowledge, values, conditions, or other 
attributes. 

[35] While we assessed how OJJDP uses performance measurement data to 
make programming and funding decisions and the extent to which OJJDP 
verifies these data, due to the volume of OJJDP performance measures-- 
approximately 600 in total--we did not assess the quality of each of 
these individual metrics. 

[36] OJP officials stated that OJP is looking to integrate the Data 
Tool into the Grant Management System to make it an OJP-wide 
performance measurement data collection system. 

[37] According to the OJJDP official who oversees the office's 
performance measures, grant managers would benefit from more training 
and guidance on how to review performance measurement data. This 
official stated that with their current level of training, it is 
difficult for grant managers to determine if data patterns are 
reasonable. 

[38] [hyperlink, http://www.gao.gov/products/GAO/GGD-99-139], 13. 

[39] [hyperlink, http://www.gao.gov/products/GAO/GGD-99-139], 1. 

[40] GPRA requires each agency to submit a performance plan with its 
annual budget submission. Each agency's plan should: establish 
performance goals to define the level of performance to be achieved by 
a program activity, express such goals in a measurable form, establish 
performance indicators to be used in measuring the goals, provide a 
basis for comparing actual results with goals, and describe a means to 
verify and validate measured values. 31 U.S.C. § 1115. 

[41] GAO, Executive Guide: Effectively Implementing the Government 
Performance and Results Act, [hyperlink, 
http://www.gao.gov/products/GAO-GGD-96-118] (Washington, D.C.: June 
1996). 

[42] [hyperlink, http://www.gao.gov/products/GAO/GGD-99-139], 25. 

[43] The Project Management Institute, The Standard for Program 
Management © (2006). 

[44] See enclosure III for a detailed description of OJJDP grant 
funding. 

[45] [hyperlink, http://www.gao.gov/products/GAO-08-265], 43. 

[46] PART assessments are intended to assess and improve program 
performance so that the federal government can achieve better results. 

[47] The other two action items were (1) make juvenile justice 
programs' performance results available to the public through program 
publications and the Internet, and (2) develop a comprehensive 
evaluation plan for the juvenile justice programs to obtain better 
information on the programs' effects. 

[48] These standards are articulated in the OJP Grant Manager's Manual. 
According to OJP officials, as of June 2009, OJP was in the process of 
revising the current version of this manual, which it expects to 
complete by the end of fiscal year 2009. 

[49] According to OJP officials, in fiscal year 2009 OJJDP's leadership 
required Grant Assessment Tool assessments for approximately 60 percent 
of the bureau's open awards during the grant assessment period at the 
beginning of the year. 

[50] These criteria include the dollar value of the award; past 
performance of the grantee, if applicable; and complexity of the grant 
program. 

[51] GPRA requires DOJ to collect and report performance measurement 
data as part of its annual budget process. 31 U.S.C. §§1115-16. 
Although GPRA only requires collecting and reporting of performance 
data at the executive agency level, according to OJJDP, as a component 
of DOJ it assumes responsibility for collecting and reporting on 
performance data to meet DOJ GPRA requirements. 

[52] Due to the volume of OJJDP performance measures--more than 600 in 
total--we did not assess the quality of these metrics. 

[End of section] 

GAO's Mission: 

The Government Accountability Office, the audit, evaluation and 
investigative arm of Congress, exists to support Congress in meeting 
its constitutional responsibilities and to help improve the performance 
and accountability of the federal government for the American people. 
GAO examines the use of public funds; evaluates federal programs and 
policies; and provides analyses, recommendations, and other assistance 
to help Congress make informed oversight, policy, and funding 
decisions. GAO's commitment to good government is reflected in its core 
values of accountability, integrity, and reliability. 

Obtaining Copies of GAO Reports and Testimony: 

The fastest and easiest way to obtain copies of GAO documents at no 
cost is through GAO's Web site [hyperlink, http://www.gao.gov]. Each 
weekday, GAO posts newly released reports, testimony, and 
correspondence on its Web site. To have GAO e-mail you a list of newly 
posted products every afternoon, go to [hyperlink, http://www.gao.gov] 
and select "E-mail Updates." 

Order by Phone: 

The price of each GAO publication reflects GAO's actual cost of 
production and distribution and depends on the number of pages in the 
publication and whether the publication is printed in color or black 
and white. Pricing and ordering information is posted on GAO's Web 
site, 
[hyperlink, http://www.gao.gov/ordering.htm]. 

Place orders by calling (202) 512-6000, toll free (866) 801-7077, or
TDD (202) 512-2537. 

Orders may be paid for using American Express, Discover Card,
MasterCard, Visa, check, or money order. Call for additional 
information. 

To Report Fraud, Waste, and Abuse in Federal Programs: 

Contact: 

Web site: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm]: 
E-mail: fraudnet@gao.gov: 
Automated answering system: (800) 424-5454 or (202) 512-7470: 

Congressional Relations: 

Ralph Dawn, Managing Director, dawnr@gao.gov: 
(202) 512-4400: 
U.S. Government Accountability Office: 
441 G Street NW, Room 7125: 
Washington, D.C. 20548: 

Public Affairs: 

Chuck Young, Managing Director, youngc1@gao.gov: 
(202) 512-4800: 
U.S. Government Accountability Office: 
441 G Street NW, Room 7149: 
Washington, D.C. 20548: