This is the accessible text file for GAO report number GAO-04-198 
entitled 'Law Enforcement: Better Performance Measures Needed to Assess 
Results of Justice's Office of Science and Technology' which was 
released on December 09, 2003.

This text file was formatted by the U.S. General Accounting Office 
(GAO) to be accessible to users with visual impairments, as part of a 
longer term project to improve GAO products' accessibility. Every 
attempt has been made to maintain the structural and data integrity of 
the original printed product. Accessibility features, such as text 
descriptions of tables, consecutively numbered footnotes placed at the 
end of the file, and the text of agency comment letters, are provided 
but may not exactly duplicate the presentation or format of the printed 
version. The portable document format (PDF) file is an exact electronic 
replica of the printed version. We welcome your feedback. Please E-mail 
your comments regarding the contents or accessibility features of this 
document to Webmaster@gao.gov.

This is a work of the U.S. government and is not subject to copyright 
protection in the United States. It may be reproduced and distributed 
in its entirety without further permission from GAO. Because this work 
may contain copyrighted images or other material, permission from the 
copyright holder may be necessary if you wish to reproduce this 
material separately.

Report to the Honorable Jane Harman, House of Representatives:

United States General Accounting Office:

GAO:

November 2003:

Law Enforcement:

Better Performance Measures Needed to Assess Results of Justice's 
Office of Science and Technology:

GAO-04-198:

GAO Highlights:

Highlights of GAO-04-198, a report to the Honorable Jane Harman, House 
of Representatives 

Why GAO Did This Study:

The mission of the Office of Science & Technology (OST), within the 
Department of Justice’s National Institute of Justice (NIJ), is to 
improve the safety and effectiveness of technology used by federal, 
state, and local law enforcement and other public safety agencies. 
Through NIJ, OST funds programs in forensic sciences, crime 
prevention, and standards and testing. To support these programs, 
Congress increased funding for OST from $13.2 million in 1995 to 
$204.2 million in 2003 (in constant 2002 dollars). GAO reviewed (1) 
the growth in OST’s budgetary resources and the changes in OST’s 
program responsibilities; (2) the types of products OST delivers and 
the methods used for delivering them; and (3) how well OST’s efforts 
to measure the success of its programs in achieving intended results 
meet applicable requirements. 

What GAO Found:

OST's budgetary resources grew significantly in recent years, along 
with the range of its program responsibilities. From fiscal year 1995 
through fiscal year 2003, OST received over $1 billion through 
Department of Justice appropriations and the reimbursement of funds 
from other federal agencies in exchange for OST’s agreement to 
administer these agencies' projects. Of the over $1 billion that OST 
received, approximately $749 million, or 72 percent, was either 
directed to specific recipients or projects by public law, subject to 
guidance in congressional committee reports, or directed through 
reimbursable agreements. At the same time that spending has expanded, 
OST’s program responsibilities have changed—from primarily law 
enforcement and corrections to broader public safety technology.

OST delivers three groups of products through various methods. The 
three groups include (1) information dissemination and technical 
assistance; (2) the application, evaluation, and demonstration of 
existing and new technologies for field users; and (3) technology 
research and development. According to OST, as of April 2003, it had 
delivered 945 products since its inception. Furthermore, OST 
identified an additional 500 products associated with ongoing awards. 
OST makes its products available through a variety of methods, such as 
posting information on its Web site and providing research prototypes 
to field users for testing and evaluation.

OST has been unable to fully assess its performance in achieving its 
goals as required by applicable criteria because it does not use 
outcome measures to assess the extent to which it achieves the 
intended results of its programs. OST’s current measures primarily 
track outputs (the goods and services produced); in some cases, OST 
uses intermediate measures, a step toward developing outcome 
measures. The Government Performance and Results Act of 1993 provides 
that federal agencies measure or assess the results of each program 
activity. While developing outcome measures for the types of 
activities undertaken by OST is difficult, we have previously reported 
on various strategies that can be used to develop outcome measures,
or at least intermediate measures, for similar types of activities.

What GAO Recommends:

GAO recommends that the Director of NIJ reassess the measures used to 
evaluate OST’s progress toward achieving its goals and better focus 
on outcome measures to assess results where possible. In those cases 
where measuring outcomes is, after careful consideration, deemed 
infeasible, we recommend developing appropriate intermediate measures 
that will help to discern program effectiveness.

www.gao.gov/cgi-bin/getrpt?GAO-04-198.

To view the full product, including the scope and methodology, click 
on the link above. For more information, contact Laurie Ekstrand at 
(202) 512-8777 or Ekstrandl@gao.gov.

[End of section]

Contents:

Letter:

Results in Brief:

Background:

OST's Budgetary Resources Have Grown and Program Responsibilities Have 
Changed:

OST Delivers Three Groups of Products Through Various Methods:

OST's Performance Measurement Efforts Do Not Fully Meet Requirements:

Conclusions:

Recommendation:

Agency Comments and Our Evaluation:

Appendix I: Scope and Methodology:

Appendix II: Budgetary Resources for OST's Programs in Current Year 
Dollars:

Appendix III: OST's 10 Categories of Products:

Appendix IV: OST's Portfolio Areas:

Appendix V: OST's Operations:

Appendix VI: OST's Goals in its Fiscal Year 2004 Performance Plan and 
GAO's Assessment:

Appendix VII: Comments from the Department of Justice:

Appendix VIII: GAO Contacts and Staff Acknowledgments:

GAO Contacts:

Staff Acknowledgments:

Tables:

Table 1: Flow of Budgetary Resources to OST's Programs:

Table 2: Budgetary Resources in Constant 2002 Dollars for OST's 
Programs by NIJ Allocation, Fiscal Years 1995-2003:

Table 3: Budgetary Resources in Constant 2002 Dollars for OST's 
Investigative and Forensic Sciences by NIJ Allocation, Fiscal Years 
1995-2003:

Table 4: GAO's Assessment of the 42 Measures OST Developed for 11 of 
Its Initiatives:

Table 5: OST's Outside Studies of Its Initiatives:

Table 6: Budgetary Resources in Current Dollars for OST's Programs by 
NIJ Allocation, Fiscal Years 1995-2003:

Table 7: GAO's Groupings of OST's Categories of Products and Examples 
of Each Category:

Table 8: Total Funds Awarded for the Operations, Maintenance, and 
Technical Support of OST's 10 Technology Centers, Fiscal Years 1995-
2003:

Table 9: OST's Technology Centers, Their Affiliated Partners, and the 
Amounts Awarded to Support the Centers:

Table 10: OST's Performance Goals, Initiatives, and Measures for Fiscal 
Year 2004, and GAO's Assessment:

Figures:

Figure 1: OST's Budgetary Resources in Constant 2002 Dollars, Fiscal 
Years 1995-2003:

Figure 2: GAO's Grouping of OST's 945 Delivered Products, as of April 
2003:

Figure 3: OST's Organizational Structure:

Figure 4: OST's 10 Technology Centers and the Regions They Serve:

Figure 5: Stakeholders and Customers that Contribute to the Setting of 
OST's Priorities:

Abbreviations:

AAG: Assistant Attorney General:

CITA: Crime Identification Technology Act:

CLIP: Crime Lab Improvement Program:

CODIS: Combined DNA Index System:

COPS: Community-Oriented Policing Services:

DNA: deoxyribonucleic acid:

DOD: Department of Defense:

FBI: Federal Bureau of Investigation:

GAO: General Accounting Office:

GPRA: Government Performance and Results Act:

LLEBG: Local Law Enforcement Block Grant:

NFSIA: Paul Coverdell National Forensic Sciences Improvement Act:

NIJ: National Institute of Justice:

NLECTC: National Law Enforcement and Corrections Technology Centers:

OJP: Office of Justice Programs:

OMB: Office of Management and Budget:

OST: Office of Science and Technology: 

R&D: research and development: 
 
SLLEA: State and Local Law Enforcement Assistance:

United States General Accounting Office:

Washington, DC 20548:

November 14, 2003:

The Honorable Jane Harman: 
House of Representatives:

Dear Ms. Harman:

To enhance public safety and bring criminals to justice, it is 
important for law enforcement officials to benefit from the latest 
advances in science and technology. The mission of the Office of 
Science and Technology (OST), within the Department of Justice's 
National Institute of Justice (NIJ), is to improve the safety and 
effectiveness of technology used by federal, state, and local law 
enforcement, corrections, and other public safety agencies. OST awards 
funds to research and develop more effective technology and improve 
access to technology in a wide range of areas. For example, OST funds 
programs in the areas of crime prevention technologies, investigative 
and forensic sciences, and electronic crime. Examples of products 
resulting from OST's programs include a guide on school safety, an 
evaluation of police protective gear, a prototype for ground-
penetrating radar, and a report on gunshot residue detection and 
interpretation. To support OST's programs, Congress has significantly 
increased its funding, from $13.2 million in fiscal year 1995 to $204.2 
million in fiscal year 2003 (in constant 2002 dollars).

In response to your interest in whether OST's programs are achieving 
their intended results, we reviewed certain aspects of OST's 
operations. Specifically, this report assesses (1) the growth in OST's 
budgetary resources, from fiscal year 1995 to fiscal year 2003, and 
changes in OST's program responsibilities; (2) what types of products 
OST delivers and the methods used to deliver these products to public 
safety agencies; and (3) how well OST's efforts to measure the success 
of its programs in achieving intended results meet applicable 
requirements.

To address our objectives, we collected and analyzed relevant data and 
reports and interviewed OST officials and NIJ officials, including NIJ 
executive staff and the Assistant NIJ Director for OST, division 
chiefs, and managers. We also collected data and interviewed officials 
at OST technology centers in Rockville, Maryland; and El Segundo and 
San Diego, California. Appendix I contains detailed information on the 
scope and methodology we used for this assessment. We conducted this 
engagement in accordance with generally accepted government auditing 
standards.

Results in Brief:

OST has grown in terms of both budgetary resources and the range of 
programs it operates.[Footnote 1] From fiscal year 1995 through fiscal 
year 2003, OST received over $1 billion through several Department of 
Justice (Justice) appropriations accounts as well as the reimbursement 
of funds from other federal agencies in exchange for OST's agreement to 
administer these agencies' projects. Of the over $1 billion that OST 
has received, approximately $749.7 million, or about 72 percent, was 
either directed for specific recipients or projects by public law, 
subject to guidance in congressional committee reports designating 
specific recipients or projects, or directed from reimbursable 
agreements with other federal agencies for OST to manage their 
projects. At the same time that spending has expanded, OST's program 
responsibilities have changed--from primarily law enforcement and 
corrections technologies to broader public safety technologies, 
including safe school initiatives.

OST delivers three groups of products through various methods. The 
three groups include (1) information dissemination and technical 
assistance; (2) the application, evaluation, and demonstration of 
existing and new technologies for field users; and (3) technology 
research and development (R&D). According to OST, as of April 2003, it 
had delivered 945 products since its inception. Furthermore, OST 
identified an additional 500 products associated with ongoing awards. 
Depending on its research agenda, OST makes its products available 
through a variety of methods, such as posting information on its Web 
site and providing research prototypes to field users for testing and 
evaluation. While OST does not directly commercialize the results of 
its technology R&D, it does help link prototypes with potential 
developers.

OST has been unable to fully assess its performance in achieving its 
goals because it does not measure the extent to which it achieves the 
intended outcomes of its programs. OST's current measures primarily 
track outputs (goods and services produced). In some cases OST uses 
intermediate measures--a step closer to developing outcome measures--
but has not taken this step toward better measurement in many cases 
where it may be possible to do so. The Government Performance and 
Results Act of 1993 (GPRA) provides, among other things, that federal 
agencies establish performance measures, including the assessment of 
relevant outputs and outcomes of each program activity. Office of 
Management and Budget (OMB) guidance suggests that, to the extent 
possible, federal agencies measure or assess the extent to which they 
are achieving the intended outcomes of their programs. As part of 
Justice's efforts to comply with GPRA, OST established goals and 
developed output, and some intermediate, measures to track its 
progress. While developing outcome measures for the types of activities 
undertaken by OST is difficult, we have previously reported on various 
strategies that can be used to develop outcome measures or at least 
intermediate measures for activities that are similar to those in OST's 
portfolio of programs.

So that OST does all that is possible to assess whether its programs 
are achieving their intended results, we are recommending that the 
Attorney General instruct the Director of NIJ to reassess OST's 
performance measures to better focus on outcome measures. In commenting 
on a draft of this report, the Assistant Attorney General (AAG) for 
Justice's Office of Justice Programs (OJP) agreed with our 
recommendation. The AAG made additional comments concerning the 
challenge of developing outcome measures for R&D activities, OST's 
overall performance record, and the amount of OST's funds that are 
directed for specific recipients and projects. We respond to these 
comments in the Agency Comments and Evaluation section of the report. 
OJP also provided technical comments, which have been incorporated in 
this report where appropriate.

Background:

The Office of Science and Technology (OST) was created in fiscal year 
1995 following a long history of science and technology efforts within 
the National Institute of Justice (NIJ).[Footnote 2] NIJ is a component 
of the Office of Justice Programs (OJP), a Justice agency that, among 
other things, provides assistance to state, tribal, and local 
governments. In establishing OST's objectives and allocating funds for 
OST's programs, the NIJ Director considers the priorities of many 
stakeholders, including the President, Congress, Justice, and state and 
local law enforcement and public safety agencies.

OST Established in Statute by the Homeland Security Act of 2002:

In November 2002, Congress established OST and its mission and duties 
in statute as part of the Homeland Security Act of 2002 (the 
Act).[Footnote 3] The Act specified OST's mission "to serve as the 
national focal point for work on law enforcement technology; and to 
carry out programs that, through the provision of equipment, training, 
and technical assistance, improve the safety and effectiveness of law 
enforcement technology and improve access to such technology by 
federal, state, and local law enforcement agencies." The Act defined 
the term "law enforcement technology" to include "investigative and 
forensic technologies, corrections technologies, and technologies that 
support the judicial process."[Footnote 4] The Act also specified OST's 
duties to include the following, among others:

* establishing and maintaining advisory groups to assess federal, 
state, and local technology needs;

* establishing and maintaining performance standards, and testing, 
evaluating, certifying, validating, and marketing products that conform 
to those standards;

* carrying out research, development, testing, evaluation, and cost-
benefit analysis of certain technologies; and:

* developing and disseminating technical assistance and training 
materials.

OST's Operations:

OST's operations have multiple levels of internal organization and 
multiple kinds of external partners. (For a more detailed description 
of OST's operations, see app. V.) OST's multiple levels of organization 
include a Washington, D.C., office and a network of 10 technology 
centers that provide technical assistance to OST's customers around the 
country.[Footnote 5] To fulfill its mission, OST also collaborates with 
entities such as the Departments of Defense and Energy and public and 
private laboratories to take advantage of established technical 
expertise and resources.

NIJ has three main types of awards for funding OST's programs: grants, 
interagency agreements, and cooperative agreements.[Footnote 6]

* Grants are generally awarded annually by NIJ to state and local 
agencies or private organizations for a specific product and amount.

* Interagency agreements are used by NIJ for creating partnerships with 
federal agencies.

* Cooperative agreements are a type of NIJ grant to nonfederal entities 
that prescribes a higher level of monitoring and federal involvement.

NIJ also uses memorandums of understanding (MOUs) to coordinate programs 
and projects between agencies. The MOUs specify the roles, 
responsibilities, and funding amounts to be provided by participating 
agencies. Through NIJ, OST can provide supplemental funding to 
interagency and cooperative agreements that may be used to contract for 
special projects.

OST awards are administered by managers at its Washington, D.C., office 
who have final oversight and management responsibility. These managers 
may delegate some responsibility to another federal R&D agency 
receiving the award. In March 2003, 21 managers were responsible for 
overseeing 336 active awards totaling $636 million.

Guidance has been established for measuring the performance of 
government operations. To help Justice comply with the Government 
Performance and Results Act of 1993 (GPRA),[Footnote 7] OST establishes 
goals and develops performance measures to track its progress. In 
addition, in May 2002, the White House Office of Management and Budget 
(OMB) and Office of Science and Technology Policy issued a memorandum 
setting forth R&D investment criteria that departments and agencies 
should implement. The investment criteria require an explanation of why 
the investment is important, how funds will be allocated to ensure 
quality, and how well the investment is performing. According to the 
memorandum, program managers must define appropriate outcome measures 
and milestones that can be used to track progress toward goals and 
assess whether funding should be enhanced or redirected. The memorandum 
encourages federal R&D agencies to make the processes they use to 
satisfy GPRA consistent with these criteria.

OST's Budgetary Resources Have Grown and Program Responsibilities Have 
Changed:

OST's budgetary resources have grown and the range of program 
responsibilities has changed. Budgetary resources for OST increased 
significantly, from $13.2 million in fiscal year 1995 to $204.2 million 
in fiscal year 2003 (in constant 2002 dollars), totaling over $1 
billion.[Footnote 8] This increase can be attributed to the 
introduction of new allocations and large increases for existing ones. 
The NIJ Director decides how to allocate certain appropriated funds to 
the various NIJ components, including OST. About $749.7 million, or 72 
percent, of OST's total budgetary resources was either directed to 
specific recipients or projects by public law, subject to congressional 
committee report guidance designating specific recipients or projects, 
or directed from the reimbursements from other Justice and federal 
agencies in exchange for OST managing their projects. Corresponding 
with the designation of spending for specific recipients and projects, 
the range of OST's programs changed, from primarily law enforcement and 
corrections to include broader public safety technology R&D, such as 
for improving school safety and combating terrorism.

Budgetary Resources for OST's Programs:

OST's budgetary resources[Footnote 9] include both funding received via 
Justice appropriations accounts and reimbursements from other 
Justice and federal agencies. First, OST receives funding via three 
appropriations accounts enacted in the appropriations law for the 
Justice Department. From these appropriations accounts, OJP allocates 
amounts to NIJ. The NIJ Director suballocates part of the NIJ funds for 
OST programs. In addition, OST receives reimbursements from other 
Justice and federal agencies in exchange for OST's management of 
specific projects of those agencies, such as ballistic imaging 
evaluation for the FBI. Table 1 lists NIJ allocations from the Justice 
appropriations accounts that go toward funding OST programs.

Table 1: Flow of Budgetary Resources to OST's Programs:

Justice appropriation accounts: Justice Assistance; NIJ's allocations 
to OST programs: NIJ Base: NIJ uses base funds for research, 
development, demonstration, and dissemination activities.

NIJ's allocations to OST programs: Counterterrorism R&D:[A] 
NIJ sponsors research, development, and evaluations and tools to help 
criminal justice and public safety agencies deal with critical 
incidents, including terrorist acts.

Justice appropriation accounts: State and Local Law Enforcement 
Assistance (SLLEA); NIJ's allocations to OST programs: Local Law 
Enforcement Block Grant (LLEBG): NIJ allots its R&D portion of LLEBG 
funds to OST to assist local units of government to identify, select, 
develop, modernize, and purchase new technologies for law enforcement 
use.

Justice appropriation accounts: Community Oriented Policing Services 
(COPS); NIJ's allocations to OST programs: Crime Identification 
Technology Act (CITA): CITA activities include upgrading and 
integrating national, state, and local criminal justice record and 
identification systems; funding multi-jurisdictional, multi-agency 
communications systems; and improving forensic science capabilities, 
including DNA analysis.

NIJ's allocations to OST programs: Safe 
Schools Technology R&D: OST's Safe Schools Technology R&D program uses 
three methods for improving school safety: needs assessments and 
development of technical partners, technology R&D, and technical 
assistance.

NIJ's allocations to OST programs: 
Crime Lab Improvement Program (CLIP): CLIP activities include providing 
equipment, supplies, training, and technical assistance to state and 
local crime laboratories to establish or expand their capabilities and 
capacities to perform various types of forensic analyses.

NIJ's allocations to OST programs: DNA 
Backlog Reduction: This program seeks to eliminate public crime 
laboratories' backlogs of DNA evidence as soon as possible.

NIJ's allocations to OST programs: Paul 
Coverdell National Forensic Sciences Improvement Act (NFSIA): This 
provides funding to state and local laboratories to improve the 
quality, timeliness, and credibility of forensic science services for 
criminal justice purposes.

NIJ's allocations to OST programs: Reimbursements of funds from other 
Justice Department and federal agencies' accounts: Reimbursable 
activities have included ballistic imaging evaluation from the FBI, a
study of communications interoperability (the ability to communicate 
across different public safety agencies and jurisdictions) 
requirements from the Defense Advanced Research Projects Agency, and 
death investigator guidelines from the Centers for Disease Control and 
Prevention.

Source: GAO analysis of OST data.

[A] In fiscal year 1999, OST's counterterrorism R&D programs received 
funding through the Justice Department's Counterterrorism Fund 
appropriation account.

[End of table]

OST's budgetary resources almost quadrupled from fiscal year 1995 to 
1996, increased 70 percent from fiscal year 1999 to 2000, and increased 
63 percent from fiscal year 2001 to 2002. While resources decreased 24 
percent from fiscal year 2002 to 2003, OST's fiscal year 2003 level 
still represents a 157 percent increase over the fiscal year 1999 
level.
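
These growth rates can be recomputed directly from the annual totals 
in table 2. The following is an illustrative sketch only, not part of 
GAO's methodology, using Python to restate the arithmetic behind the 
percentages cited above (dollars in millions, constant 2002 dollars):

    # Total budgetary resources from table 2 (millions of constant
    # 2002 dollars), keyed by fiscal year.
    totals = {1995: 13.2, 1996: 52.6, 1999: 79.5, 2000: 135.1,
              2001: 164.6, 2002: 268.0, 2003: 204.2}

    def percent_change(start_year, end_year):
        """Percent change in total resources between two fiscal years."""
        return (totals[end_year] - totals[start_year]) / totals[start_year] * 100

    print(round(percent_change(1995, 1996)))  # 298 (almost quadrupled)
    print(round(percent_change(1999, 2000)))  # 70
    print(round(percent_change(2001, 2002)))  # 63
    print(round(percent_change(2002, 2003)))  # -24 (a 24 percent decrease)
    print(round(percent_change(1999, 2003)))  # 157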

Figure 1: OST's Budgetary Resources in Constant 2002 Dollars, Fiscal 
Years 1995-2003:

[See PDF for image]

Notes: Figures do not include funding for management and administration 
expenses, such as salaries.

The $103.4 million increase from fiscal year 2001 to 2002 is largely 
attributable to increases of $55.6 million in reimbursable agreements, 
$24.3 million in DNA Backlog Reduction allocation, and $15.4 million in 
the Crime Lab Improvement Program allocation.

The sharp decrease in OST's budgetary resources from fiscal years 2002 
to 2003 is largely attributed to the elimination of counterterrorism 
R&D allocation (from $45.3 million in fiscal year 2002), which moved to 
the Department of Homeland Security, and a decrease of $26.2 million 
from reimbursable agreements.

[End of figure]

Certain Allocations Contributed to the Increase in Budgetary Resources 
Since 1995:

Our analysis of OST's yearly budgetary resources from fiscal year 1995 
to fiscal year 2003 showed that the overall increase can be attributed 
to the introduction of new NIJ allocations and large increases for 
existing ones. The NIJ allocations that contributed to the overall 
increase in OST's budgetary resources are most notably the Crime Lab 
Improvement Program, DNA Backlog Reduction, Safe Schools Technology 
R&D, and Counterterrorism R&D allocations. Table 2 shows figures for 
all years in constant 2002 dollars.

All dollar figures used in this narrative are in constant 2002 dollars, 
except as noted otherwise.

Fiscal years 1995-1996: The $39.4 million (298 percent) increase from 
$13.2 million to $52.6 million primarily came from two NIJ allocations 
totaling $35.4 million.

* Local Law Enforcement Block Grant (LLEBG) initiated with $22.2 
million.

* Reimbursement of funds increased by $13.2 million (471 percent) from 
$2.8 million to $16.0 million.

Fiscal years 1999-2000: The $55.6 million (70 percent) increase from 
$79.5 million to $135.1 million primarily came from three NIJ 
allocations totaling $51.7 million.

* DNA Backlog Reduction initiated with $15.6 million.

* Safe Schools Technology R&D allocation initiated with $15.6 
million.[Footnote 10]

* Counterterrorism R&D increased by $20.5 million (193 percent) from 
$10.6 million to $31.1 million.

Fiscal years 2001-2002: The $103.4 million (63 percent) increase from 
$164.6 million to $268.0 million primarily came from three NIJ 
allocations totaling $95.3 million.

* Reimbursement of funds increased by $55.6 million (209 percent) from 
$26.6 million to $82.2 million.

* DNA Backlog Reduction increased by $24.3 million (227 percent) from 
$10.7 million to $35 million.

* Crime Lab Improvement Program increased by $15.4 million (79 percent) 
from $19.6 million to $35 million.

To be consistent with the report narrative and to show trends, figures 
in table 2 are in constant 2002 dollars. A table with the figures in 
current dollars can be found in appendix II.
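
For readers relating the constant-dollar figures in table 2 to the 
current-dollar figures in appendix II (table 6), the following is a 
minimal Python sketch of the kind of conversion involved. The price 
index values below are hypothetical placeholders; this report does 
not state which deflator series GAO used.

    # Restate a current-year dollar amount in constant 2002 dollars.
    # Index values are hypothetical placeholders (2002 = 1.00), not
    # the deflator actually used in this report.
    price_index = {1995: 0.88, 2002: 1.00, 2003: 1.02}

    def to_constant_2002(current_dollars, fiscal_year):
        """Convert current-year dollars to constant 2002 dollars."""
        return current_dollars * price_index[2002] / price_index[fiscal_year]

    # For example, a hypothetical $11.6 million in current 1995 dollars:
    print(round(to_constant_2002(11.6, 1995), 1))  # about 13.2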

Table 2: Budgetary Resources in Constant 2002 Dollars for OST's 
Programs by NIJ Allocation, Fiscal Years 1995-2003:

Dollars in millions: 

NIJ allocations for OST programs: NIJ Base; 1995: 
10.4; 1996: 13.3; 1997: 12.7; 1998: 14.8; 1999: 20.3; 2000: 19.1; 2001: 
29.0; 2002: 27.1; 2003: 32.3; Total[A]: 179.1.

NIJ allocations for OST programs: Local Law 
Enforcement Block Grant (LLEBG); 1995: 0; 1996: 22.2; 1997: 21.7; 1998: 
21.4; 1999: 21.2; 2000: 20.8; 2001: 20.2; 2002: 20.0; 2003: 19.6; 
Total[A]: 167.1.

NIJ allocations for OST programs: Crime 
Identification Technology Act (CITA); 1995: 0; 1996: 0; 1997: 0; 1998: 
0; 1999: 0; 2000: 4.4; 2001: 4.3; 2002: 1.4; 2003: 0; Total[A]: 10.1.

NIJ allocations for OST programs: Safe Schools 
Technology Research and Development; 1995: 0; 1996: 0; 1997: 0; 1998: 
0; 1999: 0; 2000: 15.6; 2001: 17.7; 2002: 17.0; 2003: 16.6; Total[A]: 
66.9.

NIJ allocations for OST programs: Crime Lab 
Improvement Program (CLIP); 1995: 0; 1996: 1.1; 1997: 3.3; 1998: 13.4; 
1999: 15.9; 2000: 15.6; 2001: 19.6; 2002: 35.0; 2003: 39.6; Total[A]: 
143.4.

NIJ allocations for OST programs: DNA Backlog 
Reduction[B]; 1995: 0; 1996: 0; 1997: 0; 1998: 0; 1999: 0; 2000: 15.6; 
2001: 10.7; 2002: 35.0; 2003: 35.2; Total[A]: 96.5.

NIJ allocations for OST programs: Paul Coverdell 
National Forensic Sciences Improvement Act (NFSIA)[B]; 1995: 0; 1996: 
0; 1997: 0; 1998: 0; 1999: 0; 2000: 0; 2001: 0; 2002: 5.0; 2003: 4.9; 
Total[A]: 9.9.

NIJ allocations for OST programs: Counterterrorism 
R&D; 1995: 0; 1996: 0; 1997: 10.9; 1998: 12.9; 1999: 10.6; 2000: 31.1; 
2001: 36.5; 2002: 45.3; 2003: 0; Total[A]: 147.2.

NIJ allocations for OST programs: Reimbursements 
from other Justice and federal agencies; 1995: 2.8; 1996: 16.0; 1997: 
0; 1998: 8.9; 1999: 11.5; 2000: 13.0; 2001: 26.6; 2002: 82.2; 2003: 
56.0; Total[A]: 217.1.

NIJ allocations for OST programs: Total[A]; 1995: 
13.2; 1996: 52.6; 1997: 48.6; 1998: 71.4; 1999: 79.5; 2000: 135.1; 
2001: 164.6; 2002: 268.0; 2003: 204.2; Total[A]: 1037.1.

Source: GAO analysis of OST data.

[A] Totals might not add due to rounding.

[B] In fiscal years 2000 and 2001, DNA Backlog Reduction was funded as 
DNA Combined DNA Index System (CODIS) Backlog Reduction. In fiscal 
years 2002 and 2003, both the DNA Backlog Reduction and Coverdell NFSIA 
allocations were funded within DNA CODIS Backlog Reduction.

[End of table]

OST had a $63.8 million (24 percent) decrease in total budgetary 
resources from fiscal years 2002 to 2003, largely attributed to its not 
receiving fiscal year 2003 Counterterrorism R&D resources, which 
totaled $45.3 million in fiscal year 2002. According to OST, its 
counterterrorism resources were transferred to the Department of 
Homeland Security's Office of Domestic Preparedness. There was also a 
$26.2 million decrease in the reimbursement of funds from other 
agencies. However, OST's fiscal year 2003 level still represents a 157 
percent increase from fiscal year 1999.

Range of OST's Program Responsibilities Has Changed:

The range of OST's program responsibilities has changed over the years 
from primarily law enforcement and corrections to include broader 
public safety technology R&D. This shift occurred as an increasing 
share of OST's budgetary resources was directed to specific 
recipients and projects. Appropriated funds, for example, are sometimes 
designated for specific recipients or projects in public law. In 
addition, guidance on the spending of appropriated funds may be 
provided through congressional committee reports. Of the more than $1 
billion (in constant 2002 dollars) that OST programs received from 
fiscal years 1995 to 2003, $532.6 million, or 51 percent, was 
designated for specific recipients and projects in public law or 
subject to guidance in committee reports designating specific 
recipients or projects.[Footnote 11] Of the $532.6 million, $249.8 
million was designated in public law for specific recipients or 
projects while $282.8 million was specified in committee report 
guidance for specific recipients or projects.[Footnote 12]

In addition to the $532.6 million designated in public law for specific 
recipients or projects or subject to guidance in committee reports for 
specific recipients or projects, another $217.1 million was 
reimbursements from other Justice and federal agencies in exchange for 
OST's management of specific projects of those agencies. Thus, the 
total spending either directed for specific recipients and projects 
through public law, subject to committee report guidance designating 
specific recipients or projects, or received as reimbursements, amounts 
to $749.7 million, or 72 percent, of OST's total budgetary resources.

The range of OST's program responsibilities has changed to include such 
areas as school safety and counterterrorism. In fiscal year 1999, a 
Safe Schools Initiative program was established pursuant to conference 
committee report guidance[Footnote 13] that directed NIJ to use $10 
million[Footnote 14] to develop school safety technologies. In another 
example, OST's counterterrorism R&D program, initially funded by public 
law in fiscal year 1997,[Footnote 15] received $147.3 million through 
fiscal year 2002, $96.6 million of which was specified in conference 
report guidance for three recipients from fiscal years 2000 to 
2002[Footnote 16]--Oklahoma City National Memorial Institute for the 
Prevention of Terrorism ($37.8 million), Dartmouth College's Institute 
for Security Technology Studies ($51.8 million), and New York 
University's Center for Catastrophe Preparedness and Response ($7 
million).

OST's program responsibilities have also changed to expand the focus on 
investigative and forensic sciences. Our review of OST's budgetary 
resources for fiscal years 1995 through 2003 shows that budgetary 
resources for investigative and forensic sciences totaled at least 
$342.1 million in constant fiscal year 2002 dollars,[Footnote 17] or 
about one-third of its $1 billion in budgetary resources, as shown in 
table 3. The proportion of investigative and forensic sciences annual 
funding to total OST funding rose from 6 percent ($800,000) in fiscal 
year 1995 to 52 percent ($106.0 million) in fiscal year 2003.

Table 3: Budgetary Resources in Constant 2002 Dollars for OST's 
Investigative and Forensic Sciences by NIJ Allocation, Fiscal Years 
1995-2003:

Dollars in millions: 

NIJ allocation containing funds for investigative 
and forensic sciences: NIJ Base; 1995: 0.6; 1996: 0.6; 1997: 0.4; 1998: 
1.5; 1999: 6.2; 2000: 5.6; 2001: 5.5; 2002: 5.0; 2003: 4.3; Total[A]: 
29.6.

NIJ allocation containing funds for investigative 
and forensic sciences: LLEBG; 1995: 0; 1996: 0; 1997: 0; 1998: 0; 1999: 
0; 2000: 1.1; 2001: 0; 2002: 0; 2003: 0; Total[A]: 1.1.

NIJ allocation containing funds for investigative 
and forensic sciences: CITA; 1995: 0; 1996: 0; 1997: 0; 1998: 0; 1999: 
0; 2000: 0.8; 2001: 1.3; 2002: 0; 2003: 0; Total[A]: 2.0.

NIJ allocation containing funds for investigative 
and forensic sciences: Safe Schools Technology R&D; 1995: 0; 1996: 0; 
1997: 0; 1998: 0; 1999: 0; 2000: 0; 2001: 0; 2002: 0; 2003: 0; 
Total[A]: 0.

NIJ allocation containing funds for investigative 
and forensic sciences: CLIP; 1995: 0; 1996: 1.1; 1997: 3.3; 1998: 13.4; 
1999: 15.9; 2000: 15.6; 2001: 19.6; 2002: 35.0; 2003: 39.6; Total[A]: 
143.4.

NIJ allocation containing funds for investigative 
and forensic sciences: DNA Backlog Reduction; 1995: 0; 1996: 0; 1997: 
0; 1998: 0; 1999: 0; 2000: 15.6; 2001: 10.7; 2002: 35.0; 2003: 35.2; 
Total[A]: 96.5.

NIJ allocation containing funds for investigative 
and forensic sciences: Coverdell NFSIA; 1995: 0; 1996: 0; 1997: 0; 
1998: 0; 1999: 0; 2000: 0; 2001: 0; 2002: 5.0; 2003: 4.9; Total[A]: 
9.9.

NIJ allocation containing funds for investigative 
and forensic sciences: Counterterrorism R&D; 1995: 0; 1996: 0; 1997: 0; 
1998: 0; 1999: 0; 2000: 0; 2001: 0; 2002: 0; 2003: 0; Total[A]: 0.

NIJ allocation containing funds for investigative 
and forensic sciences: Reimbursement of funds from other agencies; 
1995: 0.2; 1996: 8.9; 1997: 0; 1998: 0; 1999: 0; 2000: 1.6; 2001: 1.1; 
2002: 25.4; 2003: 22.0; Total[A]: 59.1.

NIJ allocation containing funds for investigative 
and forensic sciences: Total[A]; 1995: 0.8; 1996: 10.5; 1997: 3.6; 
1998: 14.9; 1999: 22.1; 2000: 40.2; 2001: 38.5; 2002: 105.4; 2003: 
106.0; Total[A]: 342.1.

Source: GAO analysis of OST data.

[A] Totals might not add due to rounding.

[End of table]

OST Delivers Three Groups of Products Through Various Methods:

OST delivers many products, which we categorized into three groups, and 
uses various methods to deliver them. These three groups are (1) 
information dissemination and technical assistance; (2) the 
application, evaluation, and demonstration of existing and new 
technologies for field users; and (3) technology R&D. According to OST, 
as of April 2003, it had delivered 945 products since its 
inception.[Footnote 18] Furthermore, OST identified an additional 500 
products expected from ongoing awards. Figure 2 shows our distribution 
of OST's delivered products by group. We recognize, as OST officials 
told us, that the groups overlap and there is not a clean division 
between them. For example, while reports are associated with 
information dissemination, they may also result from the technology R&D 
group. OST has reviewed our classification of products and agrees that 
it is generally accurate. Because classification of some products is 
based on a judgment call, the proportions of products in each group 
should be considered approximations.

OST's Range of Products:

The following examples, while not exhaustive, indicate the wide range 
of OST's products.

* Reports on topics such as analysis of DNA typing data, linguistic 
methods for determining document authorship, a pepper spray projectile 
and disperser, and gunshot residue detection and interpretation.

* Prototypes of products including ground-penetrating radar, ballistics 
matching using 3-dimensional images of bullets and cartridge cases, and 
an optical recognition system to identify and track stolen vehicles.

* Evaluations of technology including prison telemedicine networks, 
police vehicles, and protective gear.

* Guides on topics such as electronic crime scene investigation, use of 
security technologies in schools, and antennas for radio 
communications.

For a more detailed description of OST's products and further examples, 
see appendix III.

Figure 2: GAO's Grouping of OST's 945 Delivered Products, as of April 
2003:

[See PDF for image]

Notes: See appendix III, table 7 for examples of the products within 
each group. Proportions should be considered approximations because 
some products overlap categories.

[End of figure]

Information Dissemination and Technical Assistance:

Information dissemination and technical assistance represents about 63 
percent of OST's delivered products. OST provides information to its 
customers in a variety of ways. For example, OST provides guidance to 
R&D laboratories on the needs of public safety practitioners. To public 
safety practitioners, OST recommends certain public safety practices, 
tools, and technologies. Through its Office of Law Enforcement 
Standards, OST develops performance standards to ensure that 
commercially available public safety equipment, such as handheld and 
walk-through metal detectors, meets minimum performance requirements. 
OST also helps its customers enhance their technical capacities by 
providing them with training and technical assistance through its Crime 
Lab Improvement Program (which also provides supplies and equipment), 
DNA Backlog Reduction Program, and network of technology centers. OST 
also uses the R&D expertise and experience of already established 
laboratories and other R&D organizations to provide additional guidance 
for managing specialized technology projects. Further, OST helps its 
customers receive surplus federal equipment by acting as their liaison 
to the equipment transfer program of the Department of Defense. 
Equipment transferred has ranged from armored vehicles to boots and 
uniforms.

In addition, OST sponsors conferences, workshops, and forums that bring 
together its customers, technologists, and policymakers. For example, 
it sponsors the Mock Prison Riot, an annual event demonstrating 
emerging technologies in riot training scenarios held at the former 
West Virginia Penitentiary in Moundsville, West Virginia. This event 
brings together corrections officers and vendors for technology 
showcases and training exercises. Also, OST sponsors the Innovative 
Technologies for Community Corrections Annual Conference, among others.

Application, Evaluation, and Demonstration of New and Existing 
Technologies:

Another OST product group is the application, evaluation, and 
demonstration of new and existing technologies, which represents about 
20 percent of OST's delivered products. Some of OST's programs apply 
existing technology solutions in new ways to assist public safety 
operations. Examples of the application of new and existing 
technologies include methods for the collection and analysis of 
chemical trace evidence left by explosives and a handheld computer 
device that lets bomb technicians access bomb data at the scene of an 
incident. In addition, OST tests commercially available products 
through NIJ-certified laboratories to determine whether they conform 
to national performance standards. Examples of 
products evaluated against standards include body armor, handcuffs, and 
semiautomatic pistols. OST's evaluations also include conducting field 
tests to compare different commercially available products of the same 
type to allow users to select the product that best suits their needs. 
OST also demonstrates technology resulting from R&D directly to its 
customers through OST-sponsored events. For example, the Critical 
Incident Response Technology Seminar, formerly known as Operation 
America, demonstrates live-fire simulation for bomb technicians. The 
annual Mock Prison Riot demonstrates emerging technologies for use by 
corrections officers and tactical team members.

Technology R&D:

About 17 percent of OST's delivered products were related to technology 
R&D, which involves the development of prototype devices, among other 
efforts.[Footnote 19] According to OST, R&D in its early stages 
includes the development of prototypes and proof-of-principle 
demonstrations. Applied R&D, which also involves the development of 
prototypes, includes technologies that are made available to public 
safety agencies, generally through OST-assisted commercialization. 
Examples of products resulting from OST's applied R&D include a bomb 
threat training simulator, facial recognition technology for Internet-
based gang tracking, and a personal alarm and location monitoring 
system for corrections officers.

According to OST, R&D in its early stages begins with testing 
technology concepts, exploring solutions, and deciding whether 
continued development is warranted. If OST decides to support product 
development and if it has available funds, it awards funding to 
develop, demonstrate, and evaluate an experimental prototype, which is 
then further developed into an initial engineering prototype, and then 
demonstrated and evaluated. If the prototype proves successful, OST 
demonstrates a "near commercial" model to its customers for their 
evaluation.

While OST does not directly commercialize the results of its technology 
R&D, it does provide prototypes to local users for field-testing and 
assists in linking prototypes with potential commercial developers. OST 
officials believe it would be a conflict of interest and therefore 
inappropriate for them to promote one vendor or technology over another 
or try to dictate what equipment their customers should purchase. OST's 
role in commercialization is to bring technologies and potential 
manufacturers together so that the manufacturers can determine the 
feasibility of commercializing the technologies.

OST's Methods for Delivering its Products:

OST delivers its products to its customers through a variety of 
methods. (We recognize that products are sometimes delivery methods. 
For example, a publication can be both a product resulting from 
research and a method of information dissemination.) Besides 
publications, OST's methods for delivering information and technical 
assistance include mass mailings; downloadable material from its Web 
site; panels, boards, and working groups; training, support, and 
presentations; and programs to enhance the capacity of public safety 
agencies.

OST also delivers its products related to application, evaluation, and 
demonstration through various means. For example, private industry 
provides new and existing technologies to OST; in turn, OST informs its 
customers of the results of using these technologies in new ways. OST 
publishes user guides and the test results of its evaluations of 
commercially available equipment (both standards-based and comparison-
based). Seeking to further educate its customers, OST demonstrates new 
technology at technology fairs, providing "hands on" opportunities to 
use it.

For its R&D products, OST may test "near commercial" prototypes in 
particular settings. For example, OST may install in a police agency a 
prototype technology that facilitates communications among public 
safety agencies and across jurisdictions. If the technology is 
effective, the police agency may incorporate the technology directly 
into its operations, before the technology has become a commercial 
product.

OST's Performance Measurement Efforts Do Not Fully Meet Requirements:

OST's efforts to measure its performance results, including the 
usefulness and effectiveness of its products, do not fully meet 
applicable requirements. To help Justice comply with the Government 
Performance and Results Act of 1993 (GPRA), OST establishes goals and 
develops performance measures to track its progress. GPRA, which 
mandates performance measurements by federal agencies, requires, among 
other things, that each agency measure or assess relevant outputs and 
outcomes of each program activity.[Footnote 20] According to GPRA, the 
Office of Management and Budget (OMB), and GAO, outcome measures 
assess actual results against the intended results or consequences of 
carrying out a program or activity. Output measures count the goods 
and services produced by a program or organization. Intermediate 
measures can be used to show progress toward achieving intended 
results. 
Subsequent OMB and committee report guidance on GPRA and previous GAO 
reports[Footnote 21] recognize that output measures can provide 
important information in managing programs. However, committee report 
guidance emphasizes using outcome measures to aid policymakers because 
such measures are key to assessing intended results.

OST Performance Measures Do Not Measure Results:

The performance measures that OST has developed do not measure results. 
According to the NIJ Director, the Assistant Attorney General (AAG) in 
April 2002 issued a memorandum requiring NIJ, including OST, to develop 
outcome measures for fiscal year 2004. In August 2002, the NIJ Director 
responded by stating that OST had indeed developed outcome measures for 
its programs. In its fiscal year 2004 performance plan,[Footnote 22] 
OST established goals for 11 of its initiatives[Footnote 23] and 
developed 42 measures for assessing the achievement of those goals. 
However, based on our review of OST's performance plan, OMB guidance on 
GPRA, and GAO definitions of outcome, output, and intermediate 
measures, we determined that of the 42 measures, none were outcome-
oriented, 28 were output-oriented, and 14 were intermediate. See table 
4 for GAO's determination of the measures and appendix VI for further 
details of our results.

Table 4: GAO's Assessment of the 42 Measures OST Developed for 11 of 
Its Initiatives:

OST's initiatives: 1. Convicted Offender DNA Backlog Reduction Program; 
Type of measure: Output: 0; Type of measure: Intermediate: 3; Type of 
measure: Outcome: 0.

OST's initiatives: 2. No Suspect DNA Backlog Reduction Program; Type of 
measure: Output: 0; Type of measure: Intermediate: 1; Type of measure: 
Outcome: 0.

OST's initiatives: 3. Paul Coverdell National Forensic Sciences 
Improvement Grants Program; Type of measure: Output: 0; Type of 
measure: Intermediate: 1; Type of measure: Outcome: 0.

OST's initiatives: 4. Critical Incident Response Technology Initiative; 
Type of measure: Output: 4; Type of measure: Intermediate: 1; Type of 
measure: Outcome: 0.

OST's initiatives: 5. DNA Research & Development; Type of measure: 
Output: 4; Type of measure: Intermediate: 0; Type of measure: Outcome: 
0.

OST's initiatives: 6. Law Enforcement Technology Research and 
Development; Type of measure: Output: 4; Type of measure: Intermediate: 
1; Type of measure: Outcome: 0.

OST's initiatives: 7. School Safety Technology; Type of measure: 
Output: 3; Type of measure: Intermediate: 0; Type of measure: Outcome: 
0.

OST's initiatives: 8. Crime Lab Improvement Program; Type of measure: 
Output: 4; Type of measure: Intermediate: 6; Type of measure: Outcome: 
0.

OST's initiatives: 9. Office for Law Enforcement Standards; Type of 
measure: Output: 3; Type of measure: Intermediate: 0; Type of measure: 
Outcome: 0.

OST's initiatives: 10. Smart Gun; Type of measure: Output: 4; Type of 
measure: Intermediate: 0; Type of measure: Outcome: 0.

OST's initiatives: 11. OST's network of regional centers (known as the 
National Law Enforcement and Corrections Technology Center system); 
Type of measure: Output: 2; Type of measure: Intermediate: 1; Type of 
measure: Outcome: 0.

Source: GAO analysis of OST data.

[End of table]
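
The distribution in table 4 can be tallied to confirm the counts 
cited above (28 output measures, 14 intermediate measures, and no 
outcome measures, for 42 in all). A minimal illustrative Python 
sketch, with the table's rows transcribed by hand as (output, 
intermediate, outcome) tuples:

    # Rows of table 4, one tuple per initiative:
    # (output, intermediate, outcome) counts.
    rows = [(0, 3, 0), (0, 1, 0), (0, 1, 0), (4, 1, 0), (4, 0, 0),
            (4, 1, 0), (3, 0, 0), (4, 6, 0), (3, 0, 0), (4, 0, 0),
            (2, 1, 0)]
    output, intermediate, outcome = (sum(col) for col in zip(*rows))
    print(output, intermediate, outcome)    # 28 14 0
    print(output + intermediate + outcome)  # 42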

According to Justice officials, R&D activities present measurement 
challenges because outcomes are difficult or costly to measure. As the 
NIJ Director pointed out, a May 2002 White House OMB and Office of 
Science and Technology Policy memorandum concluded that agencies should 
not have the same expectations for measuring the results of basic R&D 
as they do for applied R&D.[Footnote 24] According to NIJ, relatively 
little of OST's work is basic R&D. As shown earlier, most of OST's 
products are related to information dissemination and technical 
assistance and the application, evaluation, and demonstration of 
existing and new technologies for field users.

We recognize that OST's task in relation to measuring the results of 
even non-basic research is complex in part because of the wide array of 
activities it sponsors, and because of inherently difficult measurement 
challenges involved in assessing the types of programs it undertakes. 
For example, programs that are intended to deter crime face measurement 
issues in assessing the extent to which something (crime) does not 
happen. Nevertheless, improvement in measurement of program results is 
important to help OST ensure it is doing all that is possible to 
achieve its goals. Notably, the NIJ Director discussed an outcome 
measure for one OST program in a May 2002 statement to Congress. In 
this statement, the Director provided an example of an outcome from 
the Convicted Offender DNA Backlog Reduction Program. The Director 
stated that as a direct result of the program, approximately 400,000 
convicted offender samples and almost 11,000 cases with no suspect 
were analyzed. According to the NIJ Director, as of May 14, 2002, more 
than 900 "hits" had been made on the FBI's Combined DNA Index System 
(CODIS) database as a direct result of the program; that is, more than 
900 previously unsolved cases had been reopened. This information 
indicates how the program is achieving its intended results in 
addressing unsolved cases. Although this example seems to be a 
credible outcome measure, it is not included in OST's fiscal year 2004 
performance plan.

Limitations in OST's Efforts to Measure Effectiveness of Information 
Dissemination:

OST's efforts to measure the effectiveness of its information 
dissemination have been limited. One of the purposes of GPRA is to 
improve federal program 
effectiveness and public accountability by promoting a new focus on 
results, service quality, and customer satisfaction. Surveys to gauge 
customer satisfaction represent one step toward finding out whether 
customers have received information and whether they deem it of value. 
However, these surveys have limitations in determining the extent to 
which the information has been acted upon and resulted in intended 
improvements. Thus, surveys such as these are more likely to be 
intermediate measures (Did information get transferred?) than outcome 
measures (Did information get transferred, acted upon, and achieve a 
result?).

In 1998, NIJ initiated an effort to report the results of surveys to 
measure the satisfaction of participants at all conferences, workshops, 
and seminar series.[Footnote 25] OST reported on the "grantee level of 
satisfaction with NIJ conferences" for fiscal years 1998-2000. However, 
in the fiscal years 2001-2004 GPRA performance plans, OST discontinued 
tracking the surveys because OJP and NIJ had ceased tracking these data 
as a performance measure.

In fiscal year 2001, OST attempted to evaluate the effectiveness and 
value of its TECHbeat newsletter. The survey sample of 5,500 was taken 
from a distribution of major readership groups on TECHbeat's mailing 
list of 20,674. According to OST, the response rate for the survey was 
too low to produce statistically valid results: only 124 completed or 
substantially completed responses were collected. The surveyors also 
experienced a very low return on follow-up phone queries. According to 
the study, the primary reason for the exceedingly low response rate was 
that so many individuals on the mailing list had either changed jobs or 
were completely unfamiliar with TECHbeat. Given these results, OST is 
trying to improve the management and distribution of TECHbeat.[Footnote 
26]

In fiscal year 2001, OST attempted to launch another effort to measure 
program results, service quality, and customer satisfaction, but 
funding for the effort was not provided. OST requested funding for an 
evaluation to measure the success of its outreach efforts, including 
those by its technology centers. The evaluation was to determine 
customer satisfaction with its strategies for outreach and 
communication and with its products. Specifically, OST planned to 
measure user satisfaction with the content, format, and delivery 
mechanisms of its efforts, such as technology information and 
assistance.

Most Studies of Other OST Initiatives Have Focused Primarily on 
Process:

In fiscal years 1998 and 1999, OST funded eight outside studies of some 
of its science and technology initiatives (see table 5).[Footnote 27] 
Our review of these studies showed that seven of the eight studies 
focused on management and organizational processes, and one was 
outcome-oriented.[Footnote 28] Management and process evaluations can 
be useful tools for examining how a program is operating and can offer 
insights into best practices. They do not assess whether a program is 
achieving its intended results.

Table 5: OST's Outside Studies of Its Initiatives:

Outside study topics: 1. National Law Enforcement and Corrections 
Technology Centers (NLECTC) Program[A]; Focus of study: Management, 
oversight, structure, organization, and operations; Type of study: 
Process; Date completed: October 1998.

Outside study topics: 2. Counterterrorism Technology Portfolio; Focus 
of study: Organization, funding, program process; Type of study: 
Process; Date completed: June 1999.

Outside study topics: 3. Investigative and Forensic Sciences Technology 
Portfolio; Focus of study: Program and structure, management, policies, 
procedures, lines of control, and funding; Type of study: Process; Date 
completed: August 1999.

Outside study topics: 4. Less-Than-Lethal Technology Portfolio; Focus 
of study: Management, processes, and organization; Type of study: 
Process; Date completed: September 1999.

Outside study topics: 5. Southwest Border States Antidrug Information 
System; Focus of study: Program efficacy, including awareness of the 
program, and its value and usefulness or benefits to customers; Type of 
study: Outcome; Date completed: October 1999.

Outside study topics: 6. Law Enforcement and Corrections Technology 
Advisory Council Priorities and Technology Portfolio Interaction; Focus 
of study: Management and coordination, processes, organizational 
challenges; Type of study: Process; Date completed: February 2000.

Outside study topics: 7. Critical Incident Response and Management 
Crime fighting Technology Program for State and Local First Responder 
Teams; Focus of study: Options for planning, organization, mission, 
management, budget, and recommendations; Type of study: Process; Date 
completed: September 2000.

Outside study topics: 8. Standards Initiative; Focus of study: 
Recommendations for the planning, organization, and management of the 
proposed initiative expected to be a part of #7 above; Type of study: 
Process; Date completed: September 2000.

Source: OST.

[A] In this report we refer to the National Law Enforcement and 
Corrections Technology Centers as OST's network of technology centers.

[End of table]

Efforts Are Under Way to Address Performance Measurement of Technology 
Centers:

The Homeland Security Act of 2002 requires NIJ[Footnote 29] to transmit 
to Congress by late November 2003 a report assessing the effectiveness 
of OST's existing system of technology centers and to identify the 
number of centers necessary to meet the technology needs of federal, 
state, and local law enforcement in the United States. According to 
NIJ, in response to the Homeland Security Act requirement, it has 
initiated a study to assess the impact and effectiveness of the 
technology center system and how it can be enhanced to meet the 
evolving research and technology needs of the
state and local public safety community. NIJ also stated that the 
report would address the functions that the technology center system 
must provide to transfer NIJ's research and development results to 
practice in the criminal justice system. NIJ and OST did not provide us 
with information detailing the study's methodology, so we cannot comment 
on the likelihood that the study will produce the
information sought by Congress. Additionally, according to OJP, the 
technology centers are in the process of developing outcome measures to 
demonstrate the impact of their activities.

According to NIJ, OJP has implemented additional performance measures 
developed in May 2003 that will apply to NIJ, including OST. However, 
OJP said it would defer implementing the measures related to the 
technology centers until the results of the technology center study are 
known and NIJ has a chance to take action, if warranted.

Measuring Results Is Difficult but Feasible:

We acknowledge that measuring results using outcome measures is 
difficult, and may be especially so in relation to some of the types of 
activities undertaken by OST. Indeed, given the types and wide range of 
program goals for OST efforts--solving old crimes, saving lives, and 
reducing property loss--it may be the case that for some programs 
intermediate measures represent the best feasible measure of results. 
We note that approximately 63 percent of OST's products fall into the 
category of information dissemination and technical assistance, aimed 
at informing customers and ultimately encouraging adoption of research 
results that lead to increased efficiency and effectiveness. Other 
federal agencies have used strategies to begin assessing the 
effectiveness of information dissemination and technical assistance 
efforts. For example, a recent GAO report[Footnote 30] outlines various 
strategies to assess media campaigns and informational seminars, 
including immediate post-workshop surveys, follow-up surveys, and the 
use of logic models to define measures of a program's progress toward 
intended results and long-term goals.
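To illustrate the logic-model strategy mentioned above, the following is 
a minimal sketch, in Python, of a logic model for a hypothetical 
information-dissemination program. The stages and measures shown are 
illustrative assumptions, not OST's actual model.

# A logic model chains inputs through activities and outputs to
# intermediate and long-term outcomes; each stage suggests measures
# of progress. All entries below are illustrative.
logic_model = {
    "inputs": ["funding", "staff", "technology research results"],
    "activities": ["publish newsletters", "hold workshops and seminars"],
    "outputs": ["documents distributed", "practitioners trained"],
    "intermediate outcomes": ["customers adopt recommended technologies"],
    "long-term outcomes": ["crimes solved", "lives saved",
                           "property loss reduced"],
}

for stage, measures in logic_model.items():
    print(f"{stage}: {', '.join(measures)}")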

Conclusions:

Given the wide range of its products, OST has the potential to 
significantly improve the technological capabilities of federal, state, 
and local public safety agencies. However, the lack of information 
about the results of program efforts, or the assessment of progress 
toward goals, means that little is known about their effectiveness. 
While developing outcome measurements in many research-related programs 
is extremely difficult, there are various performance measurement 
strategies that other federal programs have used for assessing 
information dissemination, technical assistance, and other R&D 
activities that might be applied to OST's programs. It is important to 
develop outcome measurements where feasible, or intermediate 
measurements where appropriate, to assist Congress, OST and NIJ 
management, and OST's customers in better assessing whether investment in
OST's programs is paying off with improved law enforcement and public 
safety technology.

Recommendation:

To help ensure that OST does all that is possible to measure its 
progress in achieving goals through outcome-oriented measures, we 
recommend that the Attorney General instruct the Director of NIJ to 
reassess the measures OST uses to evaluate its progress toward 
achieving its goals and to better focus on outcome measures to assess 
results where possible. In those cases where measuring outcomes is, 
after careful consideration, deemed infeasible, we recommend developing 
appropriate intermediate measures that will help to discern program 
effectiveness.

Agency Comments and Our Evaluation:

We provided a copy of a draft of this report to the Attorney General of 
the United States for review and comment. In an October 30, 2003, 
letter, the Assistant Attorney General (AAG) for OJP commented on the 
draft. Her comments are summarized below and presented in their 
entirety in appendix VII. OJP also provided technical comments, which 
have been incorporated into this report where appropriate.

In the AAG's written response, the Justice Department concurred with 
our recommendation that NIJ reassess the measures OST uses to assess 
program outcomes. In response to our recommendation, the AAG reported 
that she has directed the NIJ Director to reassess NIJ's performance 
measures for OST and to refine them, where possible, in order to focus 
them more toward measuring outcomes.

While the AAG agreed with our recommendation, she also made several 
other comments. First, she commented that developing numerical outcome 
measures like those urged by GAO is a particular challenge for R&D 
activities. As stated in our report, we recognize that measuring 
results using outcome measures is difficult and may be especially so in 
relation to some of the types of activities undertaken by OST. Our 
reference to a numerical measure is meant only as an example of how one 
of OST's program activities can be linked to intended results. We 
believe that further consideration of measures, both quantitative and 
qualitative, could improve the assessment of results for R&D as well as 
other OST programs. Our report also notes that relatively little of 
OST's work is R&D. The majority of OST's products are in the category 
of information dissemination and technical assistance.

Second, the AAG noted that GAO did not reach any conclusions in its 
discussion of OST's growth in budgetary resources, changes in program 
responsibilities, management of programs, and delivery of its products. 
The AAG noted that Justice believed that OST's record is outstanding. 
Neither OST nor we can determine whether OST's efforts in these areas 
are successful or otherwise, given that OST has not developed measures 
to assess their outcomes. Therefore, it is not possible to draw 
conclusions.

Third, the AAG indicated that GAO did not discuss in detail that over 
one-half of OST's funds were designated by Congress for specific 
recipients and projects. She noted that GAO missed an opportunity to 
inform the requester of the impact of Congress' recent decisions 
regarding OST. We reported that 51 percent of OST's budgetary resources 
were designated for specific recipients and projects in public law or 
subject to guidance in committee reports.

As agreed with your office, unless you publicly announce its contents 
earlier, we plan no further distribution of this report until 10 days 
from its issue date. At that time, we will send copies of the report to 
the Attorney General, appropriate congressional committees and other 
interested parties. We will also make copies available to others upon 
request. In addition, the report will be available at no charge on 
GAO's Web site at http://www.gao.gov. Major contributors to this report 
are listed in appendix VIII.

If you or your staff have any questions concerning this report, contact 
me on (202) 512-8777.

Sincerely yours,

Laurie E. Ekstrand: 
Director, Homeland Security and Justice Issues:

Signed by Laurie E. Ekstrand: 

[End of section]

Appendix I: Scope and Methodology:

To answer our objectives, we interviewed National Institute of Justice 
(NIJ) and Office of Science and Technology (OST) officials and 
collected documents at OST's office in Washington, D.C., and at three 
of OST's technology centers--the National center in Rockville, 
Maryland; West center in El Segundo, California; and Border Research 
and Technology Center in San Diego, California. We selected the 
Rockville center because of its proximity to Washington, D.C., and the 
other two centers because of their locations and particular areas of 
technology and technical concentrations. We also interviewed a small 
group of OST's customers--federal, state, and local law enforcement, 
and corrections and public safety officials--who were selected by 
officials at the El Segundo and San Diego centers. In addition, we 
analyzed information that is available on the National Institute of 
Justice's public Web site.

To determine OST's budgetary resources and amounts from fiscal year 
1995 to fiscal year 2003 and the changes in OST program 
responsibilities, we reviewed NIJ and OST budget documents, interviewed 
officials in OST's Technology Management and OJP's Office of Budget 
and Management Services, and reviewed pertinent appropriations laws and 
committee reports covering that period. To determine the amount of OST 
budgetary resources that were directed to specific recipients and 
projects, we compared OST's budget documents that listed individual 
recipients and projects with the public laws and reports. We defined 
directed spending as spending for specific recipients and projects 
designated in appropriations laws or subject to congressional committee 
report guidance designating specific recipients or projects. We did not 
determine the amount of reimbursable projects designated in public laws 
or specified in committee reports because those projects were not 
originally allocated to OST. Instead, we considered all the 
reimbursable projects to be specific projects on which OST was directed 
to spend its reimbursable funds pursuant to its agreements with other 
agencies.

To determine the changes in OST's program responsibilities, we analyzed 
the year-to-year changes in its budget and program scope. To determine 
the amount of OST's budgetary resources used for investigative and 
forensic sciences for fiscal years 1995-2003, we compared OST's 
portfolio description and NIJ's definition of forensic sciences with 
the individual budget program and project items listed in OST's budget 
documents for each fiscal year. However, while we recognize that OST's 
technology centers and their technical partners include investigative 
and forensic sciences in their provision of technical assistance, we 
did not attempt to determine the amount of center funds associated with 
investigative and forensic sciences because the budget documents we 
received from OST did not break out such amounts within the funding 
awarded to the centers. Therefore, our determination that $342.1 
million of OST's total funding supported investigative and forensic 
sciences did not include such amounts.

To determine the amounts of funding awarded to the technology centers, 
we analyzed databases on all of the products OST has produced through 
April 2003 and the associated grants, interagency agreements, and 
cooperative agreements and their amounts.

To determine the composition of OST's products and how OST delivers the 
products to its customers, we analyzed OST documents and a database of 
all the products associated with its past and ongoing awards, from 
inception through April 2003, that were delivered or anticipated to be 
delivered. While the database included the award amounts associated 
with the products, it was not possible to reliably associate award 
amounts with each product or type of product because multiple types of
products could result from individual awards. We also conducted 
interviews with the parties mentioned above.

For the budget and product data that OST provided us, we assessed the 
reliability of these data by examining relevant fields for missing 
data, conducting range checks to identify any potentially erroneous 
outliers, and inspecting a subset of selected data elements that were 
common to two or more data sets. In addition, we independently verified 
selected budget data back to appropriations legislation and committee 
reports. In conducting our analyses, we identified some potential data 
errors or reliability problems. When this occurred, we contacted agency 
officials to address and resolve these matters. However, we did not 
otherwise verify the budget or product data back to source materials. 
Overall, we determined that the budget and product data provided to us 
are adequate for the descriptive purposes for which they are used in 
this report.
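The kinds of reliability checks described above can be illustrated 
programmatically. The following is a minimal sketch in Python; the file 
names, column names, and valid range are hypothetical assumptions, not 
the actual data elements or thresholds we used.

# Reliability checks on hypothetical budget and product data sets.
import pandas as pd

budget = pd.read_csv("ost_budget.csv")      # hypothetical file name
products = pd.read_csv("ost_products.csv")  # hypothetical file name

# 1. Examine relevant fields for missing data.
print(budget[["award_id", "fiscal_year", "amount_millions"]].isna().sum())

# 2. Range checks to identify potentially erroneous outliers
#    (the valid range here is a hypothetical example).
outliers = budget[(budget["amount_millions"] < 0) |
                  (budget["amount_millions"] > 300)]
print(outliers)

# 3. Inspect data elements common to two or more data sets.
merged = budget.merge(products, on="award_id",
                      suffixes=("_budget", "_product"))
mismatches = merged[merged["amount_millions_budget"]
                    != merged["amount_millions_product"]]
print(mismatches[["award_id", "amount_millions_budget",
                  "amount_millions_product"]])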

We examined OST's efforts to measure performance by interviewing 
officials on this matter at OJP, NIJ, and OST in Washington, D.C., 
along with officials and staff at the technology centers and current 
and previous Advisory Council officials. We also reviewed related 
agency documents, such as the OJP mission statement and performance 
plans; NIJ strategic planning documents, Web site pages, annual 
performance plans and performance reports, and GPRA documents; and 
policies and procedures, Department of Justice memoranda, OST internal 
planning and reporting documents, program descriptions and 
documentation, and other related documents.

As part of our examination, we reviewed OST's fiscal year 1997 to 2004 
goals and measures as presented in OST's GPRA performance 
plans.[Footnote 31] We focused our review on OST's fiscal year 2004 
performance plan and measures. As part of our review of these goals and 
measures, we determined whether each performance measure was output-, 
outcome-, or intermediate-oriented. To make this determination about 
the types of performance measures contained in OST's performance plans, 
we compared the measures used in the plans with the requirements of 
GPRA and its accompanying committee report, OMB's guidance on 
performance measurement challenges (Circular A-11), Justice's guidance 
to its components for preparing performance measures, and previous GAO 
work on GPRA.[Footnote 32]
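Once each measure is classified, the determination reduces to a simple 
tally. The following minimal sketch in Python shows only the tallying 
step, using a hypothetical excerpt of measures; the classifications 
themselves were made against the GPRA and OMB criteria cited above.

from collections import Counter

# Hypothetical excerpt of (measure, classification) pairs.
classified_measures = [
    ("Number of prototype technologies developed", "output"),
    ("Number of labs establishing new forensic capabilities",
     "intermediate"),
    ("Number of technology information documents distributed", "output"),
]

tally = Counter(kind for _, kind in classified_measures)
print(tally)  # e.g., Counter({'output': 2, 'intermediate': 1})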

Also included in our examination of OST performance measurement efforts 
were studies prepared by external parties under grants from OST that 
reviewed selected OST initiatives such as portfolio areas, projects, 
and programs. In response to our request for all of OST's efforts to 
assess its programs, OST provided eight outside studies funded from 
fiscal years 1998 to 1999. For example, the Pymatuning Group, Inc., 
conducted an "Assessment of the National Law Enforcement and 
Corrections Technology Center (NLECTC) Program," which described the 
operations of OST's regional technology center network. We reviewed all 
eight of the outside studies for performance information on the OST 
initiatives they examined. We examined the
studies to determine whether they provided information that would be 
considered consistent with an outcome-oriented evaluation as defined by 
our criteria.[Footnote 33]

The scope of this review was limited to OST, and therefore we cannot 
compare OST's efforts to measure the performance of its programs or the 
amount of funding directed to specific recipients and projects with the 
efforts and funding of any other federal R&D agencies. We performed our 
audit work from September 2002 to September 2003 in Washington, D.C., 
and other cited locations in accordance with generally accepted 
government auditing standards.

[End of section]

Appendix II: Budgetary Resources for OST's Programs in Current Year 
Dollars:

Table 6: Budgetary Resources in Current Dollars for OST's Programs by 
NIJ Allocation, Fiscal Years 1995-2003:

Dollars in millions: 

NIJ allocations for OST programs: NIJ Base; 1995: 
9.2; 1996: 12.0; 1997: 11.7; 1998: 13.8; 1999: 19.2; 2000: 18.4; 2001: 
28.6; 2002: 27.1; 2003: 32.8; Total[A]: 172.9.

NIJ allocations for OST programs: Local Law 
Enforcement Block Grant (LLEBG); 1995: 0; 1996: 20.0; 1997: 20.0; 1998: 
20.0; 1999: 20.0; 2000: 20.0; 2001: 20.0; 2002: 20.0; 2003: 19.9; 
Total[A]: 159.8.

NIJ allocations for OST programs: Crime 
Identification Technology Act (CITA); 1995: 0; 1996: 0; 1997: 0; 1998: 
0; 1999: 0; 2000: 4.2; 2001: 4.2; 2002: 1.4; 2003: 0; Total[A]: 9.9.

NIJ allocations for OST programs: Safe Schools 
Technology Research and Development; 1995: 0; 1996: 0; 1997: 0; 1998: 
0; 1999: 0; 2000: 15.0; 2001: 17.5; 2002: 17.0; 2003: 16.9; Total[A]: 
66.4.

NIJ allocations for OST programs: Crime Lab 
Improvement Program (CLIP); 1995: 0; 1996: 1.0; 1997: 3.0; 1998: 12.5; 
1999: 15.0; 2000: 15.0; 2001: 19.4; 2002: 35.0; 2003: 40.3; Total[A]: 
141.1.

NIJ allocations for OST programs: DNA Backlog 
Reduction[B]; 1995: 0; 1996: 0; 1997: 0; 1998: 0; 1999: 0; 2000: 15.0; 
2001: 10.6; 2002: 35.0; 2003: 35.8; Total[A]: 96.3.

NIJ allocations for OST programs: Paul Coverdell 
National Forensic Sciences Improvement Act (NFSIA)[B]; 1995: 0; 1996: 
0; 1997: 0; 1998: 0; 1999: 0; 2000: 0; 2001: 0; 2002: 5.0; 2003: 5.0; 
Total[A]: 10.0.

NIJ allocations for OST programs: Counterterrorism 
R&D; 1995: 0; 1996: 0; 1997: 10.0; 1998: 12.0; 1999: 10.0; 2000: 30.0; 
2001: 36.0; 2002: 45.3; 2003: 0; Total[A]: 143.3.

NIJ allocations for OST programs: Reimbursements 
from other Justice and federal agencies; 1995: 2.5; 1996: 14.5; 1997: 
0; 1998: 8.3; 1999: 10.9; 2000: 12.5; 2001: 26.3; 2002: 82.2; 2003: 
56.9; Total[A]: 214.1.

NIJ allocations for OST programs: Total[A]; 1995: 
11.7; 1996: 47.5; 1997: 44.7; 1998: 66.6; 1999: 75.1; 2000: 130.2; 
2001: 162.6; 2002: 268.0; 2003: 207.6; Total[A]: 1,013.8.

Source: GAO analysis of OST data.

[A] Totals might not add due to rounding:

[B] In fiscal years 2000 and 2001, the DNA Backlog Reduction allocation 
was funded as DNA CODIS Backlog Reduction. In fiscal years 2002 and 
2003, both the DNA Backlog Reduction and Coverdell NFSIA allocations 
were funded within DNA CODIS Backlog Reduction.

[End of table]
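The amounts in table 6 are in current dollars, while figures cited 
elsewhere in this report are in constant 2002 dollars. The conversion 
divides each current-dollar amount by a price index for its year 
relative to 2002. The following minimal sketch in Python uses 
placeholder deflators chosen only so that the example roughly 
reproduces the constant-dollar totals cited in this report; they are 
illustrative, not official indexes.

# Restating current dollars in constant 2002 dollars.
# Placeholder price indexes with 2002 = 1.000; illustrative only.
deflators = {1995: 0.886, 2002: 1.000, 2003: 1.017}

def to_constant_2002(amount_millions: float, year: int) -> float:
    """Divide by the year's price index relative to 2002."""
    return amount_millions / deflators[year]

# Example: the 1995 total of $11.7 million in current dollars.
print(round(to_constant_2002(11.7, 1995), 1))  # about 13.2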

[End of section]

Appendix III: OST's 10 Categories of Products:

While we divided OST's products into three groups for our reporting 
purposes, OST divides them into 10 categories. (See table 7 for GAO's 3 
groupings of OST's 10 categories.) In regrouping OST's 10 categories, 
we recognized, as OST officials told us, that the 10 categories overlap 
and there is not a clean division between them. We also recognized that 
many of OST's products could themselves be considered delivery methods. For 
example, publications, such as the TECHbeat newsletter, are OST 
products that can also represent a method of delivery for OST 
technology information. OST has reviewed our classification of products 
and agrees that it is generally accurate.

Table 7: GAO's Groupings of OST's Categories of Products and Examples 
of Each Category:

GAO's 3 groupings of OST's 10 categories: 1. Technology R&D; 

OST's 10 categories: 1. Results of the early stages of technology R&D 
include the development of prototypes and demonstrations that a 
principle or concept is sound; Examples of products: Results of 
investigating forensic techniques, studying potential less-than-lethal 
incapacitation technologies, and researching advanced weapons 
detection.

OST's 10 categories: 2. New applied technologies made available to 
public safety agencies, generally through commercialization; Examples 
of products: Improved bomb robots and electromagnetic concealed 
weapons detection.

GAO's 3 groupings of OST's 10 categories: 2. Application, evaluation, 
and demonstration of new and existing technologies for field users; 

OST's 10 categories: 3. Existing technologies applied to new 
situations; Examples of products: Communications interoperability (the 
ability to communicate across different public safety agencies and 
jurisdictions), handheld computer devices for bomb investigators, and 
software tools to measure levels of school safety.

OST's 10 categories:  4. Product evaluations based on voluntary 
national performance standards and comparisons with like products; 
Examples of products:  Ballistic and stab-resistant body armor, 
handcuffs, semi-automatic pistols, walk-through metal detectors; 
patrol vehicles, patrol vehicle tires, and replacement brake pads; 
cut-, puncture-, and pathogen-resistant protective gloves.

OST's 10 categories: 5. Technology demonstrations; Examples of 
products: The annual Mock Prison Riot meeting, which demonstrates 
emerging technologies for use in hands-on riot training scenarios, and 
the annual Critical Incident Response Technology seminar (formerly 
called Operation America), in which bomb technicians practice live-fire 
simulations.

GAO's 3 groupings of OST's 10 categories: 3. Information dissemination 
and technical assistance; 

OST's 10 categories: 6. Information and guidance for public safety 
practitioners and those in R&D; Examples of products: Needs 
assessments of what public safety practitioners require, such as for 
combating electronic crime and terrorism; funding requirements for 
forensic science; investigative, selection, and application guides; 
and technology and training for small agencies.

OST's 10 categories:  7. Standards to ensure that commercially 
available public safety equipment meets minimum performance; Examples 
of products:  Ballistic resistance of personal body armor and handheld 
and walk-through metal detectors.

OST's 10 categories:  8. Enhanced capacity that gives agencies access 
to technologies and tools they otherwise might not have had funding 
for or access to; Examples of products:  Technology assistance 
provided to OST's customers by its regional centers; Crime Lab 
Improvement Program for establishing or expanding laboratories' 
capacities for forensic analysis; the DNA Backlog Reduction Program 
for helping to eliminate DNA backlog, leading to the resolution of 
unsolved crimes.

OST's 10 categories:  9. Conferences, forums, and workshops that bring 
together practitioners, technologists, and policymakers to form 
partnerships, communicate needs, and educate participants; Examples of 
products:  Technical working groups of experienced practitioners and 
researchers working to improve investigation techniques and issue 
procedural guides. Panels and councils of public safety leaders, 
experts, and policymakers assisting OST and its regional centers in 
setting development priorities, launching technologies, identifying 
equipment problems, and enhancing understanding of technological 
issues and advances. Commercialization planning workshops involving 
developers and entrepreneurs interested in commercializing public 
safety technologies.

OST's 10 categories: 10. Technical expertise and oversight that provide 
additional guidance for technology projects; 
Examples of products: Space and Naval Warfare Systems Command 
providing oversight, contracting, and administrative support for the 
NIJ User Centric Information Technology Program and Critical Incident 
Management System Testbed; the U.S. Air Force Research Laboratory 
providing oversight and administrative support to the NIJ Concealed 
Weapons Detection and Personnel Location Technology Programs and 
hosting the NIJ-sponsored National Cyberscience Laboratory. 

Source: GAO analysis of OST data.

[End of table]

[End of section]

Appendix IV: OST's Portfolio Areas:

OST has organized its individual projects to develop, improve, and 
implement technology for public safety agencies into nine portfolio 
areas. As of April 2003, these portfolio areas included:

* critical incident technology, for first responders and investigators 
protecting the public in the event of critical incidents such as 
natural disasters, industrial accidents, or terrorist acts;

* communications interoperability[Footnote 34] and information 
sharing, enhancing communication among public safety agencies through 
wired links, wireless radios, and information networks, even when 
disparate systems are involved;

* electronic crime, supporting computer forensic laboratories, 
publishing guides for handling electronic evidence, and developing 
computer forensic tools;

* investigative and forensic sciences, funding at the state and local 
levels for DNA-typing of convicted offenders and use of DNA-typing in 
the investigation of unsolved cases, and developing tools for forensic 
casework;

* crime prevention technologies, including contraband detectors, 
sensors and surveillance systems, and biometric technologies;

* protective systems technologies, including body armor; "smart" 
handguns, which fire only upon recognition of, for example, a certain 
handprint or password; puncture-resistant gloves; better handcuffs; 
better concealed weapon detection; and personnel tracking and location 
technologies;

* less-than-lethal technologies, developing alternatives to lethal 
force, including technologies involving electrical or chemical effects, 
light barriers, vehicle stopping, and blunt trauma, and evaluating and 
modeling the effects of these technologies;

* learning technologies, developing technology tools for agencies to 
use in training their personnel, including use of the internet, CD-
ROMs, and video-based and interactive simulations; and:

* standards and testing, ensuring that the equipment public safety 
agencies buy is safe, dependable, and effective.

[End of section]

Appendix V: OST's Operations:

As with other federal agencies, OST's operations involve multiple 
levels of internal organization and multiple kinds of external 
partners. OST's multiple levels of organization include a Washington, 
D.C., office that manages its technology programs and a network of 
technology centers around the country that provide technical assistance 
to OST's regional customers. OST also collaborates with other R&D 
entities, such as those in the Departments of Defense and Energy and 
public and private laboratories, by forming technical partnerships in 
order to leverage already established technical expertise and resources 
to support its program efforts. Another aspect of OST's complex 
operations is the need to determine OST's own priorities and the 
priorities of its customers, which involves Washington and regional 
center staff collaborating formally and informally with a myriad of 
federal, state, and local officials, as well as with one another.

OST Has Multiple Levels of Organization:

OST's multiple levels of organization include a Washington, D.C., 
office and technology centers, as well as technical partnerships with 
governmental, public, and private R&D and public safety organizations. 
As of September 2003, OST's Washington office consisted of 25 full-
time-equivalent Justice staff divided into three divisions under the 
Assistant NIJ Director for OST.[Footnote 35] Responsibility for 
managing these programs is divided among the three divisions. (See 
figure 3 for OST's organizational structure.):

Figure 3: OST's Organizational Structure:

[See PDF for image]

[End of figure]

* Research and Technology Development Division manages electronic crime 
(including cybercrime), critical incidents and counterterrorism, 
communications interoperability and information sharing, crime 
prevention, learning technology tools, less-than-lethal technologies, 
standards development, school safety, and corrections technologies.

* Investigative and Forensic Sciences Division manages DNA-related R&D 
and other investigative and forensic sciences, such as fingerprint 
analysis, and includes the Crime Laboratory Improvement Program 
projects, DNA Backlog Reduction projects, and DNA research and 
development projects.

* Technology Assistance Division, through the technology center 
network, provides training and technical advice to, and identifies 
technologies for, OST's customers, and oversees OST's network of 10 
technology centers (see figure 4).

Figure 4: OST's 10 Technology Centers and the Regions They Serve:

[See PDF for image]

[End of figure]

OST's Technology Centers:

OST's network of 10 technology centers provides technical assistance, 
among other things, to OST's customers. From fiscal year 1995 to fiscal 
year 2003 (as of July 2003), funding support for the centers totaled 
$171.7 million. (See table 8 for funding by center.) The technology 
centers comprise six regional centers and four specialty centers. While 
the regional centers assist OST's customers by region--Northwest, West, 
Rocky Mountain, Northeast, Southeast, and National--they are expected 
to coordinate and collaborate among one another regardless of where the 
resources and capabilities are located. Each of these six centers works 
with a regional advisory council comprising state and local law 
enforcement, corrections, and public safety representatives.

As described below, the four specialty centers provide specialized 
expertise and services.

* The Office of Law Enforcement Standards tests commercially available 
equipment and develops minimum performance standards for such 
equipment.

* The Office of Law Enforcement Technology Commercialization, Inc., 
assists inventors and developers, among others, in commercializing 
technologies.

* The Border Research and Technology Center aids in the development of 
technologies for agencies concerned with law enforcement at the 
northern and southern borders.

* The Rural Law Enforcement Technology Center aids rural and small-
community law enforcement and corrections agencies.

Table 8: Total Funds Awarded for the Operations, Maintenance, and 
Technical Support of OST's 10 Technology Centers, Fiscal Years 1995-
2003:

Dollars in millions: 

Regional centers: National, Rockville, Md; Total funding: 20.4.

Regional centers: Northeast, Rome, N.Y; Total funding: 11.7.

Regional centers: Southeast, North Charleston, S.C; Total funding: 
23.5.

Regional centers: Northwest, Anchorage, Alaska; Total funding: 2.8.

Regional centers: Rocky Mountain, Denver, Colo; Total funding: 16.2.

Regional centers: West, El Segundo, Calif; Total funding: 12.7.

Regional centers: Specialty centers: 

Regional centers: Border Research Technology Center, San Diego, 
Calif; Total funding: 8.2.

Regional centers: Office of Law Enforcement Standards, Gaithersburg, 
Md; Total funding: 53.6.

Regional centers: Office of Law Enforcement Technology 
Commercialization, Wheeling, W.Va; Total funding: 19.6.

Regional centers: Rural Law Enforcement Technology Center, 
Hazard, Ky; Total funding: 3.0.

Regional centers: Total funding; Total funding: $171.7.

Source: OST.

Notes: Figures are based on the current year values of each award. 
According to OST documents, the first award year for the Office of Law 
Enforcement Standards in support of OST efforts was 1994. All of the 
centers had award years of 1995 or later.

[End of table]

OST's Technical Partnerships for Long-Term Support:

In addition to forming divisions and technology centers, OST has also 
formed partnerships with governmental, public and private R&D 
organizations, agencies, and working groups. According to OST 
officials, an advantage of these partnerships is that OST can leverage 
the expertise and resources of already established R&D facilities 
without having to create its own. These partners have included:

* corporations, such as Georgia Tech Research Corporation and L-3 
Communications Analytics Corporation;

* state and local agencies, such as the Houston Police Department and 
the Washington Metropolitan Area Transit Authority;

* academic institutions, such as the University of Virginia and 
Syracuse University;

* other federal government agencies, such as the Department of 
Defense's Army Training and Doctrine Command, and the Department of 
Transportation's Federal Aviation Administration; and:

* foreign government organizations, such as the Royal Canadian Mounted 
Police, the United Kingdom Police Scientific Development Branch, and 
the government of Israel.

Each of OST's technology centers is affiliated with one or more of 
OST's technical partners. These technical partners are awarded funding 
in exchange for providing staff and facilities to the technology 
centers. Table 9 lists OST's partners and their affiliations, and 
funding they received to support the centers through June of fiscal 
year 2003.

Table 9: OST's Technology Centers, Their Affiliated Partners, and the 
Amounts Awarded to Support the Centers:

Dollars in millions: 

Technology centers: National, Rockville, Md; 
Affiliated OST partner: Aspen Systems Corporation, Rockville, Md; 
Amount awarded to support center: 20.4.

Technology centers: Northeast, Rome, N.Y; 
Affiliated OST partner: Air Force Research Laboratory, U.S. Air Force, 
Rome, N.Y; Amount awarded to support center: 11.7.

Technology centers: Southeast, North Charleston, 
S.C; Affiliated OST partner: South Carolina Research Authority, North 
Charleston, S.C; Amount awarded to support center: 21.3.

Affiliated OST partner: Space and Naval Warfare 
Systems Center, U.S. Navy, Columbia, S.C; Amount awarded to support 
center: 0.6.

Affiliated OST partner: Oak Ridge National 
Laboratory, U.S. Department of Energy, Oak Ridge, Tenn; Amount awarded 
to support center: 0.3.

Affiliated OST partner: Savannah River Site, Department of Energy, 
Aiken, S.C; Amount awarded to support center: 1.3.

Technology centers: Northwest, Anchorage, Alaska; 
Affiliated OST partner: Chenega Technology Services Corporation, and 
National Business Center, U.S. Department of Interior, Anchorage, 
Alaska; Amount awarded to support center: 2.8.

Technology centers: Rocky Mountain, Denver, Colo; 
Affiliated OST partner: University of Denver - Colorado Seminary, 
Denver, Colo; Amount awarded to support center: 16.2.

Technology centers: West, El Segundo, Calif; 
Affiliated OST partner: Aerospace Corporation, El Segundo, Calif; 
Amount awarded to support center: 12.7.

Technology centers: Specialty centers: 

Technology centers: Border Research Technology 
Center, San Diego, Calif; Affiliated OST partner: Aerospace 
Corporation, El Segundo, Calif; Amount awarded to support center: 1.4.

Affiliated OST partner: Space and Naval Warfare 
Systems Center, U.S. Navy, San Diego, Calif; Amount awarded to support 
center: 1.7.

Affiliated OST partner: Sandia National 
Laboratories, U.S. Department of Energy, Albuquerque, N. Mex; Amount 
awarded to support center: 5.1.

Affiliated OST partner: U.S. Attorney's Office, Southern District 
of California, Department of Justice, San Diego, Calif; Amount awarded 
to support center: 0.0[A].

Technology centers: Office of Law Enforcement 
Standards, Gaithersburg, Md; Affiliated OST partner: National 
Institute of Standards and Technology, U.S. Department of Commerce, 
Gaithersburg, Md; Amount awarded to support center: 53.6.

Technology centers: Office of Law Enforcement 
Technology Commercialization, Wheeling, W.Va; Affiliated OST partner: 
OLETC, Inc., Wheeling, W.Va; Amount awarded to support center: 2.8.

Affiliated OST partner: Wheeling Jesuit 
University, Wheeling, W.Va; Amount awarded to support center: 14.0.

Affiliated OST partner: National Aeronautics and Space Administration; 
Amount awarded to support center: 2.8.

Technology centers: Rural Law Enforcement 
Technology Center, Hazard, Ky; Affiliated OST partner: Eastern 
Kentucky University, Hazard, Ky; Amount awarded to support center: 
3.0.

Technology centers: Total funding[B]; Amount awarded to support center: 
$171.7.

Source: OST.

Note: Figures are based on the current year values of each award. Award 
amounts are for the operations, maintenance and technical support of 
the centers.

[A] Actual amount is $25,000.

[B] Total might not add due to rounding.

[End of table]

OST Collaborates with Many Customers and Partners to Determine Program 
Priorities:

To determine its program priorities, OST collaborates with its many 
customers and partners. Staff at both OST's Washington, D.C., office 
and its technology centers are involved in helping OST to set program 
priorities. The staff report the results of their collaboration through 
formal meetings, periodic reports, and informal communication. Input is 
exchanged continually between OST's customers and staff and within its 
multiple levels of organization. Using their input, the NIJ Director 
determines OST's program priorities. (See figure 5 for the 
stakeholders, partners, and customers that contribute to the setting of 
OST's priorities.):

Figure 5: Stakeholders and Customers that Contribute to the Setting of 
OST's Priorities:

[See PDF for image]

[End of figure]

OST Collaborates with Government Agencies, Research and Professional 
Communities, and Centers:

OST's three divisions collaborate with other U.S. government agencies, 
the research and professional communities, and its technology centers 
to solicit input for setting priorities. Also, the divisions work with 
public safety practitioners at the state and local levels by, for 
example, meeting with grantees and assessing their needs.

* The Investigative and Forensic Sciences Division collaborates with, 
and receives input from, researchers, academia, and the forensic 
laboratory community to help set program priorities. It also 
collaborates with, for example, the FBI and the interagency Technical 
Support Working Group.

* The Research and Technology Development Division receives input 
through its collaboration with other federal agencies, such as the FBI, 
Drug Enforcement Administration, U.S. Secret Service, and White House 
Office of National Drug Control Policy. The division also participates 
in interagency working groups, such as for school safety and the 
Technical Support Working Group. Through these collaborations, OST can 
develop and share technologies used by both its customers and other 
agencies. For example, OST works with the Department of Defense to 
conduct less-than-lethal weapons R&D for law enforcement.

* The Technology Assistance Division is primarily responsible for 
receiving input from OST's technology centers. The centers solicit 
input from customers through their outreach efforts, such as technical 
assistance, e-mail exchanges, and telephone calls. The centers are also 
required to use OST's Web-based reporting system to record information 
on their customers' requests for technical assistance. In addition, the 
centers are required to submit monthly reports on their activities and 
finances.

Advisory Councils and Federal, State, and Local Public Safety Agencies 
Collaborate with OST's Technology Centers:

OST's technology centers solicit input from the national and regional 
advisory councils that OST created to determine and advocate for the 
particular needs of its customers. Members of the national advisory 
council are selected by the technology centers and represent federal, 
state, and local public safety agencies, as well as international 
criminal justice organizations. Among its duties, this national 
advisory council identifies the present and future equipment and 
technology needs of OST's customers and reviews the programs of the 
technology centers. In addition, the national advisory council 
recommends (1) ways to make the technology centers' programs more 
relevant to the needs of the centers' customers and (2) broad 
priorities for the technology center network and OST that are 
consistent with the needs of their customers.

Each technology center has a regional advisory council. The regional 
advisory councils consist of a cross-section of law enforcement and 
other public safety officials who represent the interests of state and 
local agencies. The regional advisory councils solicit input from the 
state and local agencies served in their regions, advise and support 
their respective center directors on their customers' problems and 
needs, and advocate for resource support and improvements required by 
their customers. Through this method of sharing information, OST can 
better understand the needs of its customers. For example, OST's 
regional councils can represent the unique needs of their customers 
that the national advisory council or the technology centers might not 
be aware of.

[End of section]

Appendix VI: OST's Goals in its Fiscal Year 2004 Performance Plan and 
GAO's Assessment:

Table 10: OST's Performance Goals, Initiatives, and Measures for Fiscal 
Year 2004, and GAO's Assessment:

OST's initiatives: 1. Convicted Offender DNA Backlog Reduction Program; 
Goals for initiatives: Reduce DNA backlog and support a functioning, 
active system, which can solve old crimes and prevent new ones from 
occurring; Measures for assessing achievement of goals: 2. Number of 
labs demonstrating improved access to external capabilities and 
increased lab capabilities; Type of measure: Output: No; Type of 
measure: Intermediate: Yes; Type of measure: Outcome: No.

Measures for assessing achievement of goals: 3. Number of samples (1) 
analyzed using the selected DNA markers that are required by the FBI's 
national Combined DNA Index System (CODIS) database, and (2) made 
available for CODIS; Type of measure: Output: No; Type of 
measure: Intermediate: Yes; Type of measure: Outcome: No.

Measures for assessing achievement of goals: 4. Number of states that 
have experienced an increase in the number of samples they have 
contributed to the national database; Type of measure: Output: 
No; Type of measure: Intermediate: Yes; Type of measure: Outcome: 
No.

OST's initiatives: 5. No Suspect DNA Backlog Reduction Program; Goals 
for initiatives: Reduce DNA backlog and support a functioning, active 
system, which can solve old crimes and prevent new ones from 
occurring; Measures for assessing achievement of goals: 6. Number of 
DNA samples from cases where there is no known suspect; Type of 
measure: Output: No; Type of measure: Intermediate: Yes; Type of 
measure: Outcome: No.

OST's initiatives: 7. Paul Coverdell National Forensic Sciences 
Improvement Grants Program; Goals for initiatives: Improve quality, 
timeliness, and credibility of forensic science services; Measures for 
assessing achievement of goals: 8. Number of forensic labs with 
improved analytic and technological resources; Type of measure: 
Output: No; Type of measure: Intermediate: Yes; Type of measure: 
Outcome: No.

OST's initiatives: 9. Critical Incident Response Technology Initiative; 
Goals for initiatives: Improve the ability of public safety responders, 
including law enforcement and corrections officers, to deal with 
critical incidents, save lives, and reduce property loss; Measures for 
assessing achievement of goals: 10. Number of technology demonstrations 
and test indicators that describe the goods and services produced; 
Type of measure: Output: Yes; Type of measure: Intermediate: No; 
Type of measure: Outcome: No.

Measures for assessing achievement of goals: 11. Number of prototype 
technologies developed; Type of measure: Output: Yes; Type of measure: 
Intermediate: No; Type of measure: Outcome: No.

Measures for assessing achievement of goals: 12. Number of guides, 
standards, and assessments in progress; Type of measure: Output: Yes; 
Type of measure: Intermediate: No; Type of measure: Outcome: 
No.

Measures for assessing achievement of goals: 13. Number of guides, 
standards, and assessments completed; Type of measure: Output: Yes; 
Type 
of measure: Intermediate: No; Type of measure: Outcome: No.

Measures for assessing achievement of goals: 14. Number of technologies 
introduced in law enforcement and corrections agencies; Type of 
measure: Output: No; Type of measure: Intermediate: Yes; Type of 
measure: Outcome: No.

OST's initiatives: 15. DNA Research & Development; Goals for 
initiatives: Develop faster and more powerful tools and techniques for 
the analysis of DNA evidence. These new tools and techniques will 
result in more crimes prevented and solved and more perpetrators 
brought to justice; Measures for assessing achievement of goals: 16. 
Number of projects researching new forensic DNA markers; Type of 
measure: Output: Yes; Type of measure: Intermediate: No; Type of 
measure: Outcome: No.

Measures for assessing achievement of goals: 17. Number of development/
validation studies for forensic DNA techniques; Type of measure: 
Output: Yes; Type of measure: Intermediate: No; Type of measure: 
Outcome: No.

Measures for assessing achievement of goals: 18. Number of computer 
programs developed for forensic DNA analysis; Type of measure: Output: 
Yes; Type of measure: Intermediate: No; Type of measure: Outcome: 
No.

Measures for assessing achievement of goals: 19. Number of prototypes 
and tools for forensic DNA analysis; Type of measure: Output: Yes; 
Type of measure: Intermediate: No; Type of measure: Outcome: No.

OST's initiatives: 20. Law Enforcement Technology Research and 
Development; Goals for initiatives: Assist in applying technology to 
reduce the vulnerability of critical infrastructure; detect weapons and 
other contraband; improve technologies to locate and differentiate 
between individuals in structures; leverage information technology to 
enhance the responder community's ability to anticipate and deal with 
critical incidents; identify and respond to terrorist attacks involving 
chemical, biological, and other unconventional weapons; and develop 
needed standards.[A]; Measures for assessing achievement of goals: 21. 
Number of technology demonstrations and tests; Type of measure: 
Output: Yes; Type of measure: Intermediate: No; Type of measure: 
Outcome: No.

Measures for assessing achievement of goals: 22. Number of prototype 
technologies developed; Type of measure: Output: Yes; Type of measure: 
Intermediate: No; Type of measure: Outcome: No.

Measures for assessing achievement of goals: 23. Number of guides, 
standards, and assessments in progress; Type of measure: Output: Yes; 
Type of measure: Intermediate: No; Type of measure: Outcome: 
No.

Measures for assessing achievement of goals: 24. Number of guides, 
standards, and assessments completed; Type of measure: Output: Yes; 
Type of measure: Intermediate: No; Type of measure: Outcome: No.

Measures for assessing achievement of goals: 25. Number of technologies 
introduced in law enforcement and corrections agencies; Type of 
measure: Output: No; Type of measure: Intermediate: Yes; Type of 
measure: Outcome: No.

OST's initiatives: 26. School Safety Technology; Goals for initiatives: 
Assist school administrators and law enforcement in creating a safer 
and more productive learning environment. Safe, effective, appropriate, 
and affordable technologies can affect the perception and reality of 
safe schools; Measures for assessing achievement of goals: 27. Number 
of technology demonstrations; Type of measure: Output: Yes; Type of 
measure: Intermediate: No; Type of measure: Outcome: No.

Measures for assessing achievement of goals: 28. Number of conferences 
and forums; Type of measure: Output: Yes; Type of measure: 
Intermediate: No; Type of measure: Outcome: No.

Measures for assessing achievement of goals: 29. Number of school 
safety technology products; Type of measure: Output: Yes; Type of 
measure: Intermediate: No; Type of measure: Outcome: No.

OST's initiatives: 30. Crime Lab Improvement Program; Goals for 
initiatives: Provide immediate results in solving more crimes, bringing 
to justice more criminals, and improving administration of justice 
through the presentation of strong, reliable forensic evidence at 
trial; Measures for assessing achievement of goals: 31. Number of 
crime labs receiving specialized forensic services; Type of measure: 
Output: Yes; Type of measure: Intermediate: No; Type of measure: 
Outcome: No.

Measures for assessing achievement of goals: 32. Number of capacity-
building forensic R&D and validation projects funded; Type of measure: 
Output: Yes; Type of measure: Intermediate: No; Type of measure: 
Outcome: No.

Measures for assessing achievement of goals: 33. Number of forensic 
technology training tools developed and distributed; Type of measure: 
Output: Yes; Type of measure: Intermediate: No; Type of measure: 
Outcome: No.

Measures for assessing achievement of goals: 34. Number of labs 
providing continuing education or advanced training to crime analysts; 
Type of measure: Output: Yes; Type of measure: Intermediate: No; 
Type of measure: Outcome: No.

Measures for assessing achievement of goals: 35. Number of crime labs 
with increased capacity for implementation of new forensic capabilities 
(including DNA analysis); Type of measure: Output: No; Type of 
measure: Intermediate: Yes; Type of measure: Outcome: No.

Measures for assessing achievement of goals: 36. Number of capacity-
building forensic R&D and validation projects completed and impacting 
crime labs; Type of measure: Output: No; Type of measure: 
Intermediate: Yes; Type of measure: Outcome: No.

Measures for assessing achievement of goals: 37. Number of labs 
establishing new forensic capabilities; Type of measure: Output: 
No; Type of measure: Intermediate: Yes; Type of measure: Outcome: 
No.

Measures for assessing achievement of goals: 38. Number of labs 
expanding current forensic capabilities; Type of measure: Output: 
No; Type of measure: Intermediate: Yes; Type of measure: Outcome: 
No.

Measures for assessing achievement of goals: 39. Number of labs 
experiencing a reduction in time needed for evidence analysis; Type of 
measure: Output: No; Type of measure: Intermediate: Yes; Type of 
measure: Outcome: No.

Measures for assessing achievement of goals: 40. Number of labs 
experiencing a reduction in backlogged evidentiary sample analysis; 
Type of measure: Output: No; Type of measure: Intermediate: Yes; 
Type of measure: Outcome: No.

OST's initiatives: 41. Office for Law Enforcement Standards; Goals for 
initiatives: Help the public safety community make informed decisions 
about products being marketed for public safety personnel; Measures 
for assessing achievement of goals: 42. Number of methods for examining 
evidentiary materials developed; Type of measure: Output: Yes; Type of 
measure: Intermediate: No; Type of measure: Outcome: No.

Measures for assessing achievement of goals: 43. Number of standards 
for equipment and operating procedures developed; Type of measure: 
Output: Yes; Type of measure: Intermediate: No; Type of measure: 
Outcome: No.

Measures for assessing achievement of goals: 44. Law enforcement 
technology deliverables (standards, product performance evaluations, 
product guides); Type of measure: Output: Yes; Type of measure: 
Intermediate: No; Type of measure: Outcome: No.

OST's initiatives: 45. Smart Gun; Goals for initiatives: Develop a 
firearm that could save the lives of law enforcement officers and 
members of the public that they encounter while performing their 
duties; Measures for assessing achievement of goals: 46. Successful 
demonstration of prototype recognition system for smart gun; Type of 
measure: Output: Yes; Type of measure: Intermediate: No; Type of 
measure: Outcome: No.

Measures for assessing achievement of goals: 47. Failure mode analysis 
for prototype recognition system for smart gun; Type of measure: 
Output: Yes; Type of measure: Intermediate: No; Type of measure: 
Outcome: No.

Measures for assessing achievement of goals: 48. Incorporation and 
demonstration of recognition system into firearm (where applicable); 
Type of measure: Output: Yes; Type of measure: Intermediate: No; 
Type of measure: Outcome: No.

Measures for assessing achievement of goals: 49. Identification of 
appropriate biometric solutions for recognition system (where 
applicable); Type of measure: Output: Yes; Type of measure: 
Intermediate: No; Type of measure: Outcome: No.

OST's initiatives: 50. OST's network of technology centers (known as 
the National Law Enforcement and Corrections Technology Center system); 
Goals for initiatives: Help state and local law enforcement, 
corrections, and public safety personnel do their jobs more safely and 
efficiently, thereby leading to greater administrative efficiencies, 
more crimes solved, and more lives saved; Measures for assessing 
achievement of goals: 51. Number of technology information documents 
distributed; Type of measure: Output: Yes; Type of measure: 
Intermediate: No; Type of measure: Outcome: No.

Measures for assessing achievement of goals: 52. Number of 
practitioners trained through the Crime Mapping Program; Type of 
measure: Output: Yes; Type of measure: Intermediate: No; Type of 
measure: Outcome: No.

Measures for assessing achievement of goals: 53. Savings to criminal 
justice agencies through the DOD's Section 1033 Military Surplus 
Program. Section 1033 of the National Defense Authorization Act for 
Fiscal Year 1997[B] authorizes DOD to transfer excess military property 
to federal and state agencies to support law enforcement activities 
including counterdrug and counterterrorism activities; Type of 
measure: Output: No; Type of measure: Intermediate: Yes; Type of 
measure: Outcome: No.

Source: OST.

[A] Because the goal for this initiative was not outcome-oriented 
according to our criteria, we used the initiative's mission statement 
as the goal.

[B] P.L. 104-201, 110 Stat. 2422 (1996).

[End of table]

[End of section]

Appendix VII: Comments from the Department of Justice:

U.S. Department of Justice: 
Office of Justice Programs:
Office of the Assistant Attorney General: 
Washington, D. C. 20531:

Laurie E. Ekstrand:

Director, Homeland Security and Justice Issues: 
General Accounting Office:

441 G Street, N.W., Mail Stop 2440A: 
Washington, DC 20548:

OCT 30 2003:

Dear Ms. Ekstrand:

This letter responds to the General Accounting Office (GAO) draft 
report entitled "LAW ENFORCEMENT: Better Performance Measures Needed to 
Assess Results of Justice's Office of Science and Technology" (GAO-04-
198).

In the draft report, GAO recommended that the National Institute of 
Justice (NIJ), Office of Science and Technology (OST), reassess the 
measures that OST uses to evaluate its progress toward achieving its 
goals and to better focus on outcome measures to assess results where 
possible. In cases where measuring outcome is, after careful 
consideration, deemed infeasible, GAO recommended that OST develop 
appropriate intermediate measures that will help to discern program 
effectiveness.

We agree with the GAO's recommendation. The NIJ is participating in 
extensive planning processes to create and refine appropriate feasible 
measures for OST. In response to the Government Performance and Results 
Act (GPRA) and the President's Management Agenda, in Fiscal Year 2002, 
the Office of Justice Programs (OJP) initiated several agency-wide 
activities aimed at improving OJP's ability to more effectively 
identify, collect, analyze, and report program performance. As part of 
these processes, OJP:

* developed the agency's first Strategic Plan;

* restructured its budget to align programs, strategies, and goals with 
funding to support performance-based budgeting;

* clarified overarching goals, strategies, and measures for all major 
program areas, including those areas with OST involvement; and:

* developed internal operational performance measures to track, on a 
quarterly basis, progress of those activities supportive of program 
success.

We are now in the process of implementing an agency-wide plan to 
collect and analyze the data necessary under the approved OJP 
performance measures. This baseline data will inform OJP, NIJ, and OST 
as to future performance goals that should be established to help 
assess the relevance, quality, and performance of OST's activities.

As NIJ's mission statement notes, the ultimate outcome that it seeks is 
"... to enhance the administration of justice and public safety" which 
most closely falls under the part of the Department of Justice's 
mission that states, ". . . to provide federal leadership in preventing 
and controlling crime . . . ." With respect to measuring such outcomes, 
the Department has stated, "Measuring law enforcement performance 
presents unique challenges. Success for the Department is highlighted 
when justice is served fairly and impartially and the public is 
protected. In many areas, our efforts cannot be reduced to simplistic 
numerical counts of activities such as convictions. Therefore, although 
the Department provides retrospective data on a limited number of these 
activities, it does not target levels of performance. The Department is 
concerned that doing so would lead to unintended and potentially 
adverse consequences. Additionally, it is extremely difficult to 
isolate the effects of our work from other factors that affect outcomes 
over which the Department of Justice has little or no control. Although 
during the last 7 years the annual violent crime rate has decreased by 
about 50 percent, the Department does not rely on this macro-level 
indicator in measuring its performance. Many factors contribute to the 
rise and fall of the crime rates, including federal, state, local, and 
tribal law enforcement activities and sociological, economic, and other 
factors.":

Developing numerical outcome measures like those urged by GAO is a 
particular challenge for research and development (R&D) activities, as 
acknowledged in this draft report. Joint guidance issued in May 2002 by 
the Directors of the Office of Management and Budget (OMB) and the 
Office of Science and Technology Policy (OSTP) acknowledges the view 
that ultimate socially desired outcomes, such as reduced crime, are not 
appropriate outcome measures for support organizations like research 
agencies, and that descriptions of performance should not be limited 
only to quantitative measures. The May 2002 guidance outlined the 
criteria to be used in evaluating Federal R&D programs: (1) the 
relevance of the R&D, (2) the quality of the R&D, and (3) the 
performance of the R&D (i.e., research management).

In developing measures of the performance of its offices, including 
OST, NIJ benchmarks its activities against those of other Federal R&D 
agencies. All of the performance plans of Federal R&D agencies that NIJ 
reviewed[NOTE 1] use output measures more frequently than outcome 
measures to evaluate their performance. Further, OMB approved a 
qualitative scale initially used by the National Science Foundation 
that has since been implemented by other agencies, such as the 
National Institutes of Health, an agency that reports its performance 
utilizing descriptive criteria allowed under the "alternative form" 
provisions of GPRA.

In the draft report, GAO described OST's budgetary resources and how 
funding from multiple sources has grown from $13.2 million in FY 1995 
to $204.2 million in FY 2003 (in constant 2002 dollars). In addition, 
GAO described how program responsibilities have changed since OST was 
created in 1995, growing broader and more complex with the inclusion of 
investigative and forensic sciences, school safety, and 
counterterrorism programs. Also, GAO described OST's nine research and 
development focus areas ("portfolios"), which, in addition to OST's 
capacity building and technology assistance programs for the field, 
are managed and administered by 25 Federal positions. Further, GAO 
listed OST's principal products and characterized how nearly 1,000 OST 
products have been delivered to a wide variety of customers using 
various dissemination methods.

We believe this record is an outstanding one and worthy of comment, 
even praise, from the GAO. However, although this material falls 
squarely within the purpose of the GAO study, GAO did not reach any 
conclusions about it. Additionally, the GAO noted that over half of 
OST's funds were designated by Congress for specific recipients and 
projects (i.e., "earmarks") outside of the agency's normal nationwide 
peer-reviewed competitive process. This point, which also falls within 
the scope of the GAO review, was not discussed in any detail in the 
report, which we view as a missed opportunity to inform the requester 
of this report as to the impact of Congress's recent decision making 
with respect to OST.

In response to GAO's comments, I have directed the NIJ Director, Sarah 
Hart, to work to reassess NIJ's performance measures for OST and to 
refine them, where possible, in order to focus them more toward 
measuring outcomes. The Office of Justice Programs appreciates the 
opportunity to comment on the draft report.

Sincerely,

Deborah J. Daniels: 
Assistant Attorney General:

Signed by Deborah J. Daniels:

cc:	Sarah V. Hart, Director: National Institute of Justice:

Cynthia J. Schwimer: Comptroller, OR:

LeToya A. Johnson: Audit Liaison, OR:

Vickie L. Sloan: Audit Liaison, DOJ:

OAAG Executive Secretariat: Control Number 20032121:

NOTES:

[1] The NIJ reviewed the performance goals and measures of 12 Federal 
R&D agencies, including those within the Departments of Agriculture, 
Education, Energy, Health and Human Services, and Transportation; the 
Food and Drug Administration; the Environmental Protection Agency; and 
the National Science Foundation.

[End of section]

Appendix VIII: GAO Contacts and Staff Acknowledgments:

GAO Contacts:

Laurie Ekstrand, (202) 512-8777:
Weldon McPhail, (202) 512-8644:

Staff Acknowledgments:

In addition to those named above, the following individuals contributed 
to this report: Samuel L. Hinojosa, Debra L. Picozzi, Katherine M. 
Davis, Richard Hung, Geoffrey R. Hamilton, Denise M. Fantone, Kristeen 
McLain, Elizabeth H. Curda, Rebecka Derr, Thomas M. Beall, and Leo M. 
Barbour.

FOOTNOTES

[1] We are using "programs" to indicate the broad categories of OST's 
individual projects. NIJ and OST have referred to these categories as 
both portfolio areas and programs. Our use of the term "programs" 
encompasses "portfolio areas" (see app. IV for OST's portfolio areas) 
and the safe school technology, counterterrorism technology, and 
corrections technology programs. NIJ's and OST's delineations between 
the various programs and portfolio areas are flexible. For example, 
some of the projects to develop metal detectors and personnel locator 
devices would apply to both the school safety and corrections 
technology programs and therefore could be placed in different 
portfolio areas.

[2] NIJ was established in statute by the Justice System Improvement 
Act of 1979 (P.L. 96-157, 93 Stat. 1167 (1979)), which, among other 
things, amended the Omnibus Crime Control and Safe Streets Act of 1968 
(P.L. 90-351, 82 Stat. 197 (1968)).

[3] P.L. 107-296, 116 Stat. 2135, 2159 (2002). This mission and these 
duties are not unlike those OST had been carrying out previously; the 
Act codified them in statute. 

[4] According to NIJ, forensic science is the application of 
established scientific techniques to the identification, collection, 
and examination of evidence from crime scenes; the interpretation of 
laboratory findings; and the presentation of reported findings in 
judicial proceedings.

[5] These 10 technology centers are OST's National Law Enforcement and 
Corrections Technology Center (NLECTC) system.

[6] We did not include contracts because NIJ uses them for the purchase 
of goods and services rather than for awarding funds for carrying out 
OST programs and projects. 

[7] P.L.103-62, 107 Stat. 285 (1993).

[8] Figures do not include funding for management and administration 
expenses, salaries, and unobligated balances carried from one year to 
the next.

[9] For the purposes of this report, we refer to both the funds OST 
receives as NIJ allocations via several Justice appropriations accounts 
and the reimbursements it receives collectively as OST's budgetary 
resources.

[10] In fiscal year 1999, NIJ used the LLEBG allocation to meet 
congressional guidance to spend $10 million on a new Safe School 
Initiative. The following year, NIJ introduced Safe Schools Technology 
R&D funding of $15 million. OST funding was not reduced as a result of 
the $15 million increase for Safe Schools Technology R&D.

[11] We separated reimbursements from this total because they included 
projects that were not originally allocated to OST, although those 
projects also may have been specified in public law and committee 
reports. 

[12] Included in the $249.8 million was $143.5 million for the CLIP 
project. Committee report guidance further designated $107.0 million of 
that $143.5 million for specific recipients. Given that we have 
included the $107.0 million in the amounts designated in public law for 
specific recipients or projects, we excluded it from the committee 
report guidance category to avoid double counting.

[13] H.R. Conf. Rep. No. 105-825, at 1020-21 (1998).

[14] For this effort, NIJ initially allocated Local Law Enforcement 
Block Grant funds to OST.

[15] P.L. 104-208, 110 Stat. 3009, 3009-13 (1996).

[16] H.R. Conf. Rep. No. 106-479, at 161 (1999); H.R. Conf. Rep. No. 
106-1005, at 226 (2000); and H.R. Conf. Rep. No. 107-278, at 86-87 
(2001).

[17] The total amount of budgetary resources for investigative and 
forensic sciences is likely to be larger. However, because of the 
limited detail in the budget documents we received from OST, we 
could not determine the amount of funding for investigative and 
forensic sciences within certain NIJ Base and LLEBG projects, such as 
within OST's technology center network and unspecified NIJ-directed 
projects. 

[18] Because NIJ's science and technology efforts predate OST's 
establishment in fiscal year 1995, some of the products listed as 
delivered have award years prior to 1995. The earliest listed is 1983.

[19] While some of the products resulting from technology R&D are 
similar to those of the application, demonstration, and evaluation of 
new and existing technologies group, the primary distinction is that 
the former includes the development of prototypes and the latter 
generally does not.

[20] Performance measures are to be included in the agency performance 
plan covering each program activity set forth in the budget of such 
agency. Program activity, in this case, refers to projects and 
activities that are listed in program and financing schedules of the 
annual Budget of the United States Government.

[21] U.S. General Accounting Office, Managing for Results: An Agenda to 
Improve the Usefulness of Agencies' Annual Performance Plans, GAO/GGD/
AIMD-98-228 (Washington, D.C.: Sept. 8, 1998).

[22] Annual performance plans describe a department component's goals 
and performance targets in support of the department's long-term 
strategic goals and targets. In its fiscal year 2004 performance plan, 
OST reported actual performance data for fiscal year 2002, enacted 
plans for fiscal year 2003, and performance plans for fiscal year 2004.

[23] Initiatives in this sense encompass portfolio areas, programs, and 
projects.

[24] According to the OMB document, Budget of the United States 
Government (Analytical Perspectives) for fiscal year 2004, basic R&D is 
defined as systematic study directed toward greater knowledge or 
understanding of fundamental aspects of phenomena and of observable 
facts without specific applications toward processes or products in 
mind. Applied R&D is defined as systematic study to gain knowledge or 
understanding necessary to determine the means by which a recognized 
and specific need may be met.

[25] The surveys were conducted to determine whether participants were 
satisfied with the conference as a vehicle for information 
dissemination. 

[26] To address issues with the mailing lists, the technology centers 
have shipped a larger portion of copies to agencies, in bulk, and to 
individuals who have actively requested copies and supplied their 
addresses; continued to purchase the most current version of the 
National Directory of Law Enforcement Administrators, Correctional 
Institutions and Related Agencies to update their mailing list; and 
modified mailing labels to include the addressee and "...or Training 
Officer" in case the addressee is no longer with that agency.

[27] Initiatives in this sense encompass portfolio areas, programs, and 
projects.

[28] GPRA establishes two approaches for assessing an agency's 
performance: annual measurement of program performance against goals 
outlined in a performance plan and program evaluations to be conducted 
by the agency as needed. Evaluations can play a critical role in 
helping to address measurement and analysis challenges. Performance 
measurement is the ongoing monitoring and reporting of program 
accomplishments, particularly progress toward established goals. 
Program evaluations are individual systematic studies conducted 
periodically or on an ad hoc basis to assess how well a program is 
working. See U.S. General Accounting Office, Performance Measurement 
and Evaluation: Definitions and Relationships, GAO/GGD-98-26 
(Washington, D.C.: April 1998).

[29] The Homeland Security Act actually directs the "Director" of OST 
to transmit the report. Following a reorganization in early 2003, NIJ 
calls this position the assistant NIJ director for science and 
technology.

[30] U.S. General Accounting Office, Program Evaluation: Strategies for 
Assessing How Information Dissemination Contributes to Agency Goals, 
GAO-02-923 (Washington, D.C.: Sept. 30, 2002).

[31] To determine the goal for each OST program included in the plan, 
we used the public benefit statement provided in the plan, except for 
the Law Enforcement Technology R&D program.

[32] See U.S. General Accounting Office, Agency Performance Plans: 
Examples of Practices That Can Improve Usefulness to Decisionmakers, 
GAO/GGD/AIMD-99-69 (Washington, D.C.: Feb. 26, 1999) for our guidance 
concerning intermediate-oriented measures and Managing for Results: 
Critical Issues for Improving Federal Agencies' Strategic Plans, GAO/
GGD-97-180 (Washington, D.C.: Sept. 16, 1997).

[33] See U.S. General Accounting Office, Performance Measurement and 
Evaluation: Definitions and Relationships, GAO/GGD-98-26 (Washington, 
D.C.: April 1998). 

[34] Interoperability of communications is the ability to communicate 
across different public safety agencies and jurisdictions. 

[35] In addition, there were 2 federal detailees, 2 visiting 
scientists, and 32 on-site contractors supporting OST.

GAO's Mission:

The General Accounting Office, the investigative arm of Congress, 
exists to support Congress in meeting its constitutional 
responsibilities and to help improve the performance and accountability 
of the federal government for the American people. GAO examines the use 
of public funds; evaluates federal programs and policies; and provides 
analyses, recommendations, and other assistance to help Congress make 
informed oversight, policy, and funding decisions. GAO's commitment to 
good government is reflected in its core values of accountability, 
integrity, and reliability.

Obtaining Copies of GAO Reports and Testimony:

The fastest and easiest way to obtain copies of GAO documents at no 
cost is through the Internet. GAO's Web site (www.gao.gov) contains 
abstracts and full-text files of current reports and testimony and an 
expanding archive of older products. The Web site features a search 
engine to help you locate documents using key words and phrases. You 
can print these documents in their entirety, including charts and other 
graphics.

Each day, GAO issues a list of newly released reports, testimony, and 
correspondence. GAO posts this list, known as "Today's Reports," on its 
Web site daily. The list contains links to the full-text document 
files. To have GAO e-mail this list to you every afternoon, go to 
www.gao.gov and select "Subscribe to e-mail alerts" under the "Order 
GAO Products" heading.

Order by Mail or Phone:

The first copy of each printed report is free. Additional copies are $2 
each. A check or money order should be made out to the Superintendent 
of Documents. GAO also accepts VISA and Mastercard. Orders for 100 or 
more copies mailed to a single address are discounted 25 percent. 
Orders should be sent to:

U.S. General Accounting Office:
441 G Street NW, Room LM:
Washington, D.C. 20548:

To order by Phone: 	

	Voice: (202) 512-6000:

	TDD: (202) 512-2537:

	Fax: (202) 512-6061:

To Report Fraud, Waste, and Abuse in Federal Programs:

Contact:

Web site: www.gao.gov/fraudnet/fraudnet.htm:
E-mail: fraudnet@gao.gov:

Automated answering system: (800) 424-5454 or (202) 512-7470:

Public Affairs:

Jeff Nelligan, Managing Director, NelliganJ@gao.gov, (202) 512-4800:
U.S. General Accounting Office, 441 G Street NW, Room 7149:
Washington, D.C. 20548: