This is the accessible text file for GAO report number GAO-05-24 
entitled 'Highway Safety: Improved Monitoring and Oversight of Traffic 
Safety Data Program Are Needed' which was released on November 04, 
2004.

This text file was formatted by the U.S. Government Accountability 
Office (GAO) to be accessible to users with visual impairments, as part 
of a longer term project to improve GAO products' accessibility. Every 
attempt has been made to maintain the structural and data integrity of 
the original printed product. Accessibility features, such as text 
descriptions of tables, consecutively numbered footnotes placed at the 
end of the file, and the text of agency comment letters, are provided 
but may not exactly duplicate the presentation or format of the printed 
version. The portable document format (PDF) file is an exact electronic 
replica of the printed version. We welcome your feedback. Please E-mail 
your comments regarding the contents or accessibility features of this 
document to Webmaster@gao.gov.

This is a work of the U.S. government and is not subject to copyright 
protection in the United States. It may be reproduced and distributed 
in its entirety without further permission from GAO. Because this work 
may contain copyrighted images or other material, permission from the 
copyright holder may be necessary if you wish to reproduce this 
material separately.

Report to Congressional Committees:

November 2004:

HIGHWAY SAFETY:

Improved Monitoring and Oversight of Traffic Safety Data Program Are 
Needed:

GAO-05-24:

GAO Highlights:

Highlights of GAO-05-24, a report to congressional committees.

Why GAO Did This Study:

Auto crashes kill or injure millions of people each year. Information 
about where and why such crashes occur is important in reducing this 
toll, both for identifying particular hazards and for planning safety 
efforts at the state and federal levels. The quality of traffic data 
varies from state to state, however, and these differences affect the 
usability of the data for these purposes. The National Highway Traffic 
Safety 
Administration (NHTSA) administers a grant program to help states 
improve the safety data systems that collect and analyze crash data 
from police and sheriff’s offices and other agencies, and the Congress 
is considering whether to reauthorize and expand the program. The 
Senate Appropriations Committee directed GAO to study state systems 
and the grant program. Accordingly, GAO examined (1) the quality of 
state crash information, (2) the activities states undertook to improve 
their traffic records systems and any progress made, and (3) NHTSA’s 
oversight of the grant program.

What GAO Found:

States vary considerably in the extent to which their traffic safety 
data systems meet recommended criteria used by NHTSA to assess the 
quality of crash information. These criteria relate to whether the 
information is timely, consistent, complete, and accurate, as well as 
to whether it is available to users and integrated with other relevant 
information, such as that in the driver history files. GAO reviewed 
systems in 9 states and found, for example, that some states entered 
crash information into their systems in a matter of weeks, while others 
took a year or more. While some systems were better than others, all 
had opportunities for improvement.

States reported carrying out a range of activities to improve their 
traffic safety data systems with the grants they received from NHTSA. 
Relatively little is known about the extent to which these activities 
improved the systems, largely because the documents submitted to NHTSA 
contained little or no information about what the activities 
accomplished. The states GAO reviewed used their grant funds for a 
variety of projects and showed varying degrees of progress. These 
efforts included completing strategic plans, hiring consultants, and 
buying equipment to facilitate data collection.

NHTSA officials said their oversight of the grant program complied 
with the statutory requirements, but for two main reasons, it did not 
provide a useful picture of what states were accomplishing. First, the 
agency did not provide adequate guidance to ensure that states provided 
accurate and complete data on what they were accomplishing with their 
grants. Second, it did not have an effective process for monitoring 
progress. The agency has begun to take some actions to strengthen 
oversight of all its grant programs. If the Congress decides to 
reauthorize the program, however, additional steps are needed to 
provide effective oversight of this particular program. GAO also noted 
that in proposing legislation to reauthorize the program, one 
requirement was omitted that may be helpful in assessing progress--the 
requirement for states to have an up-to-date assessment of their 
traffic data systems. 

What GAO Recommends:

The Congress may want to consider incorporating into legislation a 
requirement that states have their traffic safety data systems assessed 
at least every 5 years. Further, we are recommending that NHTSA improve 
its management of grant documentation as well as its monitoring and 
oversight of grant funds. The Department of Transportation agreed with 
the recommendations in this report.

www.gao.gov/cgi-bin/getrpt?GAO-05-24.

To view the full product, including the scope and methodology, click on 
the link above. For more information, contact Katherine Siggerud at 
(202) 512-6570 or SiggerudK@gao.gov.

[End of section]

Contents:

Letter:

Results in Brief:

Background:

Quality of Data Systems Varied Greatly, with All States Examined 
Showing a Need for Improvement:

States Carried Out Various Activities Using 411 Grant Funds, but Little 
Is Known about Progress:

NHTSA's Limited Oversight of the 411 Grant Program Contributed to 
Incomplete Knowledge of How Funds Were Used:

Conclusions:

Matter for Congressional Consideration:

Recommendations for Executive Action:

Agency Comments and Our Evaluation:

Appendixes:

Appendix I: Objectives, Scope, and Methodology:

Appendix II: Additional Analysis of Data Quality in NHTSA's State Data 
System:

Variations in Reporting Thresholds Impact the Usefulness of Data in the 
State Data System:

Variations in Reporting Alcohol and Drug Data:

Researchers' Use of Another Database Omits Data on Nonfatal Crashes:

Appendix III: Examples of Federal and Other Efforts at Improving 
Traffic Safety Data:

Appendix IV: GAO Contacts and Staff Acknowledgments:

GAO Contacts:

Staff Acknowledgments:

Tables:

Table 1: NHTSA Recommended Criteria for Assessing Quality of Crash 
Information:

Table 2: Extent of Data Integration with Crash Files in the 9 Case-
study States:

Table 3: Status of States in the 411 Grant Program, 1st and 4th Year:

Table 4: Examples of Reported Activities Drawn from Available Documents 
of Participating States:

Table 5: Comparison of MMUCC Guidelines and Crash Information Provided 
to NHTSA by 5 States Regarding Alcohol- and Drug-Impaired Driving:

Figures:

Figure 1: Model State Traffic Safety Data System:

Figure 2: Year of State Traffic Safety Data That Were Used to Develop 
2005 State Highway Safety Plans:

Figure 3: Extent to Which Vehicle Identification Numbers Were Included 
in Data Reported by 17 States in the SDS Program:

Figure 4: Examples of Reported Activities Drawn from Available 
Documents of Participating States:

Figure 5: Types of Expenditures under the 411 Grant Program in 8 Case-
study States:

Figure 6: State Criteria for Filing a Police Crash Report for Property-
Damage-Only Crashes:

Figure 7: Extent to Which States Collected Information about Uninjured 
Passengers:

Figure 8: Percentage of Alcohol Test Results That Were Coded as Missing 
for 1998 and 2002:

Figure 9: Percentage of Alcohol Test Results That Were Coded as Unknown 
for 1998 and 2002:

Abbreviations:

CODES: Crash Outcome Data Evaluation System:

DOT: Department of Transportation:

FARS: Fatality Analysis Reporting System:

FHWA: Federal Highway Administration:

FMCSA: Federal Motor Carrier Safety Administration:

HSIS: Highway Safety Information System:

MMUCC: Model Minimum Uniform Crash Criteria:

NCSA: National Center for Statistics and Analysis:

NHTSA: National Highway Traffic Safety Administration:

SDS: State Data System:

TEA-21: Transportation Equity Act for the 21st Century:

TraCS: Traffic and Criminal Software program:

TSIMS: Transportation Safety Information Management System:

VIN: vehicle identification number:

Letter November 4, 2004:

The Honorable Richard C. Shelby: 
Chairman:
The Honorable Patty Murray: 
Ranking Minority Member: 
Subcommittee on Transportation/Treasury and General Government: 
Committee on Appropriations: 
United States Senate:

The Honorable Ernest J. Istook: 
Chairman: 
The Honorable John W. Olver: 
Ranking Minority Member: 
Subcommittee on Transportation and Treasury, and Independent Agencies: 
Committee on Appropriations: 
House of Representatives:

Automobile crashes exact an enormous personal and economic toll on this 
country. In 2003, 42,643 people died in automobile crashes in the 
United States, and nearly 2.9 million more were seriously injured. In 
2000, the most recent year for which cost estimates are available, the 
economic cost of fatalities and injuries from crashes totaled almost 
$231 billion.[Footnote 1] Reducing this toll requires making informed 
decisions about safety problems. Traffic safety data, which are 
compiled from police accident reports completed at the scene of crashes 
and assembled largely at the state level, are key to making these 
decisions. The states and the federal government use data from state-
level crash data systems to make many roadway-related spending and 
policy decisions, ranging from deciding to fix particular roadways to 
launching major national safety campaigns, such as preventing alcohol-
impaired driving or increasing seat belt use. However, the quality of 
information in state traffic data systems is known to vary, and these 
differences affect the usefulness of the data for these purposes.

When the Congress was considering reauthorizing various transit and 
highway programs earlier this year, both the House and Senate proposed 
bills that would have expanded the "section 411 grant program," which 
was initially authorized in the Transportation Equity Act for the 21st 
Century (TEA-21).[Footnote 2] The 411 grant program was designed 
specifically to help states improve their traffic safety data systems 
and provided states with $36 million over 6 years.[Footnote 3] The 
House and Senate proposals would have authorized a similar grant 
program with a budget authority of up to $270 million over 6 
years.[Footnote 4] However, an 8-month extension of TEA-21 was passed 
on September 30, 2004, extending current highway and transit programs 
through May 2005, when both bills may be reintroduced.

The Senate Appropriations Committee Report[Footnote 5] accompanying the 
Department of Transportation Appropriations bill for fiscal year 2004 
(S. 1589) directed us to conduct a survey of state traffic safety data 
systems. The committee also asked us to report on the extent to which 
the 411 grant program has led to improvements in these systems. The 
grant program is overseen by the Department of Transportation's (DOT) 
National Highway Traffic Safety Administration (NHTSA), which has 
established six criteria for assessing such systems when states 
request its guidance. Four criteria relate to the information itself 
(timeliness, consistency, completeness, and accuracy); two relate to 
the use of the information (accessibility to users and links to other 
related data). Accordingly, this report examines (1) the quality of 
state crash information; (2) the activities states undertook using 411 
grant funds to improve their traffic safety data systems, and the 
progress they made using the grant funds; and (3) NHTSA's oversight of 
the grant program, including what changes in oversight, if any, might 
help encourage states to improve their traffic data systems and ensure 
accountability under a reauthorized program.

To provide information on the quality of state crash data and state 
efforts to improve these data, we conducted site visits, analyzed 
available traffic safety data, and reviewed grant documentation. Using 
several criteria, we selected 9 states to visit for detailed reviews 
and assessed the status of their data systems on the basis of NHTSA's 
six quality criteria for crash information. Eight of these 9 states had 
participated in the 411 grant program.[Footnote 6] To identify 
variations in data structure and quality, we also analyzed crash data 
for 17 states that currently participate in NHTSA's State Data System 
(SDS) program.[Footnote 7] Finally, we reviewed the grant documentation 
submitted by the 48 states that participated in the 411 grant program, 
including grant applications, traffic records assessments, strategic 
plans, progress reports, and highway safety plan annual evaluation 
reports. To provide information about NHTSA's oversight of the program, 
we interviewed NHTSA officials responsible for oversight and 
administration and reviewed NHTSA guidance and policy. We conducted our 
review from January 2004 through October 2004 in accordance with 
generally accepted government auditing standards. See appendix I for 
more details regarding our objectives, scope, and methodology. Because 
an examination of data quality was one of the objectives of this 
report, we also conducted an assessment of data reliability. A more 
complete discussion of data reliability can be found in appendix II.

Results in Brief:

The 9 state traffic safety data systems we reviewed varied widely in 
the degree to which they met NHTSA's six recommended quality criteria 
for crash information.[Footnote 8] None of the state data systems we 
reviewed appeared to meet all of the criteria, which affected the 
usefulness and reliability of their data. For example, while NHTSA's 
timeliness criterion calls for data to be available to users 
preferably within 90 days, the states we visited made data available 
in periods ranging from less than 1 month to 18 months. Delayed access 
to crash data diminishes the ability 
to identify current and emerging roadway hazards or other safety 
problems and to carry out effective planning efforts, such as the 
development of annual state highway safety plans. Likewise, while 4 
states completed or checked crash report data for accuracy by linking 
the data to information in driver or vehicle licensing files, 3 states 
had no accuracy checks at all. Since crash data are used mainly by 
states for highway safety planning, allocating resources, and measuring 
efforts toward safety goals, the information states collected varied--
reflecting individual state needs. In addition, some states did not 
collect information such as vehicle identification numbers (VIN), which 
are particularly useful in identifying automobile safety concerns. 
Variations such as these can affect DOT's ability to make the state-to-
state comparisons that are necessary to evaluate past safety problems 
and develop future policy. For example, a recent DOT-funded national 
analysis of vehicle braking performance was based on data from only 5 
states, because only these states had the required information for the 
analysis.

States met the requirements of the grant and also undertook a range of 
activities using 411 grant funds. However, our review showed that 
little is known about the progress resulting from these activities. 
Entering 2002, the final year for which funding for the 411 grant 
program was available, 43 of the 44 states that received grants 
reported that the program's basic requirements--an assessment of the 
current data system, a strategic plan for making improvements, and a 
state-level committee to coordinate the effort--were in place.[Footnote 
9] Both the assessments and the strategic plans appeared generally 
helpful in establishing baselines and priorities for improvement. 
Beyond meeting these basic requirements, states were given broad 
flexibility in implementing activities to improve their traffic data 
systems. Although the funding available under the 411 grant program was 
small in comparison with other federal safety grants, states reported 
pursuing a variety of activities, such as contracting with companies to 
help decrease crash report backlogs, redesigning data forms to better 
adhere to recommended guidelines, and attending traffic records 
conferences. However, the documents that states filed with NHTSA 
concerning these activities provided little or no information on how 
far these efforts progressed or what they accomplished. In the states 
we visited, officials were able to provide examples of how their 
efforts had improved data systems. Some of these states spent nearly 
their entire grant on a single project, such as creating a crash 
database, while others spread the money among multiple activities. In 
states where the coordinating committee had broad representation and 
the ability to commit financial resources, projects more fully 
addressed the needs of a broad cross-section of users.

NHTSA conducted limited oversight of the 411 grant program. While the 
grant program's statutory requirements were not explicit about the 
scope of oversight that NHTSA should have undertaken, NHTSA did not 
conduct adequate oversight of the states participating in the program, 
which resulted in a poor and uneven picture of what the states were 
doing--or accomplishing--with their grants. Two key oversight elements 
were missing. First, NHTSA's regulatory requirements and guidelines 
required states to submit progress reports under the 411 grant program, 
but were unclear about what information states should report to 
document activities and progress made improving traffic data systems. 
As a result, states were not consistent in the scope and detail of the 
information they reported. Second, NHTSA did not have an effective 
process for monitoring state progress. For example, NHTSA was not able 
to provide us with complete grant documentation (grant applications 
with progress reports, strategic plans, and traffic data systems 
assessments) for about half of the states that had participated in the 
grant program. Hence, NHTSA has limited knowledge of the extent to 
which states improved their traffic data systems through reported 
activities and whether states expended grant monies for intended 
activities. We found, for example, that 2 of the 8 states we visited 
were not accurately reporting on 411-funded activities. NHTSA has 
planned actions to correct some oversight shortcomings across its many 
grant programs, but it is too soon to determine the extent to which 
these actions will help ensure accountability if this grant program is 
renewed. These actions also may not be specific enough to address the 
weaknesses in this one program. In addition, the proposed 
reauthorization bills that the Congress considered in 2004, which 
included a follow-on program for the 411 grant program, contained 
requirements for documenting the use of grant funds and demonstrating 
measurable progress; these requirements could result in clearer 
expectations for state reporting and NHTSA oversight.

This report contains one matter for consideration by the Congress 
concerning a requirement that state traffic safety data systems be 
assessed at least once every 5 years. Under the original 411 grant 
program, states were required to have an assessment of their traffic 
data systems that was no more than 5 years old; this requirement was 
not included in the reauthorization proposals. The Congress may wish 
to add this requirement as it 
considers the legislation. We also recommend that, if the Congress 
reauthorizes the traffic safety incentive grant program, the Secretary 
of Transportation direct NHTSA to ensure better accountability and 
management of grant documentation and improved monitoring and oversight 
of 411 grant funds.

Background:

Traffic safety data are the primary source of knowledge about crashes 
and how they are related to the traffic safety environment, human 
behavior, and vehicle performance. Most states have developed traffic 
safety data systems and manage these data from the initial reporting 
of crashes by law enforcement officers through data entry and analysis. 
Figure 1, which is based on NHTSA's Traffic Records Highway Safety 
Program Advisory,[Footnote 10] depicts a model state traffic safety data 
system, including the collection and submission of data, the processing 
of these data into state safety data systems, and the potential uses 
for quality crash information. These data are often not housed in a 
single file or on just one computer system; however, users should have 
access to crash information in a useful form and of sufficient quality 
to support the intended use.

Figure 1: Model State Traffic Safety Data System:

[See PDF for image]

[A] States are required to include highway safety plans in their 
applications for federal funding of their highway safety programs. 
These plans describe the projects and activities that states plan to 
implement to reach goals identified in their performance plans.

[End of figure]

At the state level, state agencies use traffic safety data to make 
highway safety planning decisions and to evaluate the effectiveness of 
programs, among other uses. In those states where quality crash data on 
a range of crashes are not available, officials use federal data such 
as those from NHTSA's Fatality Analysis Reporting System (FARS) to make 
programming decisions.[Footnote 11] FARS data, while useful for some 
purposes, are limited because they only include information about fatal 
crashes, thus preventing decision making based on a range of crash 
severity or the entirety of a state's crash situation. At the federal 
level, NHTSA provides guidelines, recommendations, and technical 
assistance to help states improve their crash data systems and is 
responsible for overseeing state highway safety programs.[Footnote 12] 
Under TEA-21, NHTSA awarded $935.6 million in highway safety incentive 
grants to improve safety. In 2003, NHTSA made improving traffic safety 
data one of the agency's highest priorities.

Since the early 1980s, NHTSA has been obtaining crash data files from 
states, which in turn have been deriving the data from police crash 
reports. These statewide crash data files are referred to as the SDS 
program. Participation by states is voluntary, with 27 states currently 
participating. These data include some of the basic information for the 
analyses and data collection programs that support the NHTSA mission of 
identifying and monitoring traffic safety problems.

One of NHTSA's grant programs was specifically aimed at improving 
traffic safety data. Administered through its 10 regional offices 
around the country, the program provided about $36 million to states 
for improving their crash data systems. This grant program was 
authorized under TEA-21 and was known as the "411 grant program" after 
the relevant section of the U.S. Code.[Footnote 13] NHTSA administers a 
number of other grant programs besides the 411 grant program; however, 
it was the only incentive grant program that was specifically directed 
at improving state traffic safety data systems.[Footnote 14] The grant 
program required states to establish a foundation for improving their 
traffic safety data systems by first completing three activities:

* Establish a coordinating committee of stakeholders to help guide and 
make decisions about traffic safety data: The committee would ideally 
include stakeholders from agencies that manage the various data files 
(e.g., representatives from the state department of transportation 
responsible for roadway information, and from the state department of 
motor vehicles responsible for the management of vehicle licensing 
information).

* Conduct an assessment of the current system: The assessment would 
evaluate a state's system by identifying strengths and weaknesses and 
providing a baseline from which the state could develop its strategic 
plan to address data system needs.

* Develop a strategic plan that prioritizes traffic safety data system 
needs and identifies goals: The strategic plan is to provide the "map" 
specifying which activities should be implemented in order to achieve 
these goals. As with the assessment, the focal point for developing the 
strategic plan, if a state did not already have one, would be the 
coordinating committee.

The level of funding available to a state depended on whether the 
state had already put these requirements in place. Additionally, 
states were required to contribute matching funds of between 25 and 75 
percent, depending on the year of the grant.[Footnote 15] Three types 
of grants were awarded:

* A state received a start-up grant if it had none of the three 
requirements in place. This was a one-time grant of $25,000.

* A state received an initiation grant if it had established a 
coordinating committee, had completed or updated an assessment within 
the previous 5 years, and had begun to develop a strategic plan. This 
grant was a one-time grant of $125,000, if funds were available.

* A state received an implementation grant if it had all three 
requirements in place and was positioned to make specific improvements 
as indicated in its strategic plan. This grant was at least $250,000 in 
the first year and $225,000 in subsequent years, if funds were 
available.[Footnote 16]

The Congress has extended TEA-21 until May 2005, and new House and 
Senate bills will likely be introduced during the next congressional 
session. The most recent House and Senate bills under 
consideration,[Footnote 17] which were not passed in the 2004 session, 
included proposals to reauthorize the 411 grant program in a similar, 
but not identical, form to the original program. The proposals included 
funding up to $270 million, which is over six times the original 
funding amount. They also included (1) additional requirements for 
documentation from states describing how grant funds would be used to 
address needs and goals in state strategic plans and (2) a requirement 
that states demonstrate measurable progress toward achieving their 
goals. The proposals, however, did not include one of the original 
program requirements--that states have an assessment of their traffic 
safety data systems that is no more than 5 years old when they apply 
for the grant.

Quality of Data Systems Varied Greatly, with All States Examined 
Showing a Need for Improvement:

The 9 states we examined in detail varied considerably in the extent to 
which their traffic safety data systems met NHTSA's recommended 
criteria for the quality of crash information. NHTSA's six criteria 
(shown in table 1 below, along with an explanation of each criterion's 
significance) appear in the agency's Traffic Records Highway Safety 
Program Advisory, the guide used by NHTSA when it carries out traffic 
records assessments at the request of state officials. These 
assessments are a technical assistance tool offered to state officials 
to document state traffic safety data activities, note strengths and 
accomplishments, and offer suggestions for improvement. In addition, 
NHTSA released the report Initiatives to Address Improvement of Traffic 
Safety Data in July 2004, which emphasized these data quality criteria 
and provided recommendations to states. We examined all six criteria 
for the 9 case-study states, and our review of 17 states that 
participated in NHTSA's SDS program provided additional information for 
three of these six criteria.[Footnote 18] None of the 9 states in our 
case-study review met all six criteria, and most had opportunities for 
improvement in many of the criteria. The sections below discuss each 
criterion.

Table 1: NHTSA Recommended Criteria for Assessing Quality of Crash 
Information:

Criteria: Timeliness: Crash information should be available for 
analytical purposes within a useful time frame for identifying crash 
problems within a state--preferably within 90 days of a crash; 
Significance: Timely crash data allow for the use of up-to-date 
information to identify safety problems, for policy making, and for 
resource allocation, among other uses.

Criteria: Consistency: Crash information should be consistent among 
reporting jurisdictions within a state. It should also be consistent 
with nationally accepted and published guidelines and standards, such 
as the Model Minimum Uniform Crash Criteria; 
Significance: Uniform data within a state allow for the timely merging 
of data sets and the identification of traffic safety problems as they 
arise. In addition, states benefit by being able to compare their 
results nationally and with one another to identify traffic safety 
problems and manage and monitor progress toward fixing them. Finally, 
consistent state standards for determining which crashes to report 
allow for national comparisons.

Criteria: Completeness: Data should be collected for all reportable 
crashes in the state and on all appropriate crash variables; 
Significance: Adherence to state reporting requirements permits 
evaluation of the effectiveness of countermeasures initiated by the 
state. Complete data also generate a picture of safety performance 
useful for states to qualify for highway safety incentive funding.

Criteria: Accuracy: Quality control methods should be employed to 
ensure accurate and reliable crash information for both individual 
crashes and aggregate crash information; 
Significance: Comprehensive information is necessary to understand 
what makes a difference and what has a direct impact on reducing 
deaths, injuries, injury severity, and costs.

Criteria: Accessibility: Crash information should be readily and 
easily accessible to the principal users of such data. This applies 
both to direct access to crash information from the appropriate crash 
databases and to standard reports generated from crash data; 
Significance: Accessible data enable the identification of safety 
problems, the allocation of resources, the quick evaluation of recent 
traffic safety initiatives, the use of data for reporting 
requirements, and the ability to respond to inquiries and requests 
from state legislative and executive branches, among others.

Criteria: Data integration: Crash information should be capable of 
linking to other information sources. Such linking could be 
accomplished through the use of common identifiers or probabilistic 
data-matching methods; 
Significance: Links make it possible to evaluate the relationship 
between specific roadway, crash, vehicle, and human factors at the 
time of a crash. They also permit these factors to be linked to health 
outcome data to determine their association with specific medical and 
financial consequences, which facilitates choosing safety priorities 
that have the most impact on reducing death and disability.

Source: GAO analysis based on NHTSA's Traffic Records Highway Safety 
Program Advisory.

[End of table]

Timeliness: In Some States, Available Data Were Several Years Old:

Data processing time frames ranged widely in the 9 states we visited. 
Three of the 9 states met NHTSA's 90-day timeliness criterion for 
having useful data available. Data processing times for the 9 states 
ranged from less than 1 month in 2 states to 18 months or more in 2 
others. For example, to develop their 2005 highway safety plans during 
2004, 4 of the 9 states used data from 2000, 2001, or 2002, and the 
remaining 5 states used 2003 data. (See fig. 2.)

Figure 2: Year of State Traffic Safety Data That Were Used to Develop 
2005 State Highway Safety Plans:

[See PDF for image]

[A] One state received an extension on the due date of its safety plan 
to allow for additional processing of 2002 data. Without this 
extension, planning would have been done using 2001 crash information.

[B] One state used preliminary 2003 data because data entry of 
location information had not been completed. 

[End of figure]

For 6 of the 9 states, three factors accounted for their not meeting 
the timeliness criterion: slow data entry, data integration delays, and 
lengthy data edits. As a result, these states' safety plans could not 
take recent crash trends into account. Generally, those 
states submitting data electronically from local law enforcement 
agencies to the state traffic safety data system had much faster entry 
of crash information into centralized databases.[Footnote 19] In 
contrast, states that processed reports manually by keying in 
information from paper forms at the state level had longer data entry 
time frames. The availability of data was also sometimes delayed by 
inefficient data completion processes.[Footnote 20] In states where 
this was not done automatically, crash data and location information 
were often entered manually into the traffic safety data system. In 
addition, checks for accuracy also delayed data availability. For 
example, 1 of the states that had to use data from 2000 to develop its 
highway safety plan had used electronic methods to enter more recent 
data, but detailed edit checks delayed the data's release considerably.

Consistency: Data Were Not Consistent Across States, Even for Basic 
Information:

Seven of the 9 states we visited had crash forms that could be used to 
collect data across all jurisdictions within the state, helping to 
ensure that data collected within the state are consistent. However, no 
state had forms that met all of the consistency criteria recommended in 
the Model Minimum Uniform Crash Criteria (MMUCC) guidelines that were 
developed collaboratively by state and federal authorities. These 
guidelines provide a recommended minimum set of data elements to be 
collected for each crash, including a definition, attributes, and the 
rationale for collecting each element. While variation in the crash 
data collected by states can be attributed to varying information 
needs, guidelines help to improve the reliability of information 
collected and also assist in state-to-state comparisons and national 
analyses. The variation between states can be seen among the 17 states 
we analyzed that contribute to NHTSA's SDS program.[Footnote 21] For 
example, the MMUCC guidelines recommend reporting on whether alcohol 
was a factor in the crash by indicating the presence or absence of an 
alcohol test, the type of test administered, and the test results. 
However, several of the states collected information on impaired 
driving without specifying the presence of an alcohol test, the test 
type, or the test result, thereby making it difficult to determine 
whether alcohol use contributed to the crash. In addition, the states 
were not uniform in collecting and reporting the VIN, another element 
recommended in the MMUCC. A VIN is a unique alphanumeric identifier 
that is applied to each vehicle by the manufacturer. The VIN allows for 
an effective evaluation of vehicle design characteristics such as 
occupant protection systems. As figure 3 shows, VIN data were not 
available from all 17 states for any year between 1998 and 2002. For 
example, although every state had submitted crash data for 1998 and 
1999, the data for 6 of the 17 states did not include VINs.

Figure 3: Extent to Which Vehicle Identification Numbers Were Included 
in Data Reported by 17 States in the SDS Program:

[See PDF for image]

[End of figure]
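
One reason the VIN is useful as a data element is that it can be 
machine-validated: the 9th character of a 17-character VIN is a check 
digit computed from the other characters. The sketch below implements 
that standard North American check-digit computation; the sample VIN 
is a commonly published test value, not data from any state system.

```python
# Validate the check digit (9th character) of a 17-character VIN using
# the standard North American check-digit computation.

# Letters map to numeric values; I, O, and Q are never used in VINs.
TRANSLIT = dict(zip("ABCDEFGHJKLMNPRSTUVWXYZ",
                    [1, 2, 3, 4, 5, 6, 7, 8, 1, 2, 3, 4, 5,
                     7, 9, 2, 3, 4, 5, 6, 7, 8, 9]))
WEIGHTS = [8, 7, 6, 5, 4, 3, 2, 10, 0, 9, 8, 7, 6, 5, 4, 3, 2]

def vin_check_digit_ok(vin):
    """Return True if the VIN's 9th character matches its check digit."""
    vin = vin.upper()
    if len(vin) != 17:
        return False
    try:
        values = [int(c) if c.isdigit() else TRANSLIT[c] for c in vin]
    except KeyError:  # contains I, O, Q, or another illegal character
        return False
    remainder = sum(v * w for v, w in zip(values, WEIGHTS)) % 11
    expected = "X" if remainder == 10 else str(remainder)
    return vin[8] == expected
```

A transposed or miscopied character almost always breaks the check, 
which is why VINs that fail it show up as "invalid" entries in 
aggregate data-quality analyses.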

The lack of consistency limits the use of state crash data for most 
nationwide analyses. For example, in a recent National Center for 
Statistics and Analysis (NCSA)[Footnote 22] Research Note on child 
safety campaigns, only 3 states met the criteria to be included in the 
analysis--far too few to represent the nation statistically.
The criteria necessary for inclusion in the report included collecting 
data on all vehicle occupants, including uninjured occupants, and VINs 
for the relevant years (1995-2001). If state systems matched the MMUCC, 
they would include this information. Similarly, only 5 states qualified 
for use in a NCSA analysis of braking performance as part of the New 
Car Assessment Program because only these states collected VINs and had 
the necessary variables for the years involved in the study.

There is evidence that as states redesign their crash forms, they are 
following the MMUCC guidelines more closely. Remaining differences from 
the suggested guidelines often reflect the needs of individual states. 
Among the 9 states we visited, 5 had redesigned their crash forms since 
1997. All 5 used the guidelines as a baseline, although each of them 
tailored the form to a degree. One state, for example, collected no 
data about the use of seat belts for uninjured passengers, while 
another chose to collect additional state-specific attributes, such as 
describing snow conditions (e.g., blowing or drifting). Among the 
remaining 4 states we visited, 2 states are currently using the MMUCC 
guidelines to redesign their forms.

Completeness: Gaps Existed in the Completeness of Reporting:

One factor affecting completeness is state reporting thresholds--that 
is, the standards that local jurisdictions use to determine whether 
data for a particular crash should be reported.
These thresholds include such things as the presence of fatalities or 
injuries or the extent of property damage. Although all 9 of the states 
we visited had reporting thresholds that included fatalities and 
injuries, the thresholds for property damage varied widely. For 
example, some states set the property damage threshold at $1,000, while 
1 state did not require reporting of property-damage-only crashes. In 
addition, it was not possible to determine the extent to which all 
reportable crashes had been included in the traffic safety data system.
Officer discretion may play a role. For example, capturing complete 
documentation of a crash event is often a low priority when traffic 
safety data are not perceived as relevant to the work of the law 
enforcement officer or other public safety provider. In 1 state, for 
example, the police department of a major metropolitan area only 
reported crashes involving severe injuries or fatalities, although the 
state's reporting threshold included damage of $1,000 or more.

Variation in thresholds among states is not the only factor that 
affects the completeness of crash data. For the crash information that 
does make it into the state database, there are often gaps in the data, 
as we learned from evaluating the records of 17 states participating in 
NHTSA's SDS program. For 5 of these states, we analyzed data coded 
"unknown" and "missing" for 24 data elements. The percentage of data 
coded as unknown or missing was frequent for several key data elements, 
such as the VIN; the results of alcohol or drug testing; and the use of 
seat belts, child car seats, and other restraint devices. For example, 
the percentage of data coded as unknown or missing for the use of seat 
belts and other restraints ranged between 1.5 and 54.8 percent for 4 of 
the 5 states. Such data can be inherently difficult to 
collect.[Footnote 23] For example, when officers arrive at the scene of 
a crash, drivers and passengers may already be outside their vehicles, 
making it impossible to know if they were wearing seat belts. Asked if 
they were wearing a seat belt, those involved in the crash may not tell 
the truth, especially if the state has a law mandating seat belt use.
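
An audit like the one described above reduces to counting, for each 
data element, the share of records coded unknown or missing. A minimal 
sketch, with illustrative element names and sample records rather than 
an actual state schema:

```python
# Sketch of an aggregate completeness check: for each data element,
# compute the percentage of crash records coded "unknown" or "missing".
# The element names and sample records are illustrative.

NOT_REPORTED = {"unknown", "missing", None, ""}

def missing_rates(records, elements):
    """Percent of records whose value for each element was not reported."""
    n = len(records)
    return {
        elem: 100.0 * sum(1 for r in records if r.get(elem) in NOT_REPORTED) / n
        for elem in elements
    }

crashes = [
    {"restraint_use": "seat belt", "alcohol_test": "unknown"},
    {"restraint_use": "missing",   "alcohol_test": "0.00"},
    {"restraint_use": "none used", "alcohol_test": None},
    {"restraint_use": "unknown",   "alcohol_test": "0.12"},
]
rates = missing_rates(crashes, ["restraint_use", "alcohol_test"])
# Half of the sample records lack each element, so both rates are 50.0.
```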

Accuracy: States Varied Greatly in the Extent of Checks for Data 
Accuracy:

Six of the 9 states we visited made use of quality control methods to 
help ensure that individual reports were accurate when they were 
submitted to the traffic safety data system. Of these 6 states, for 
example, 4 linked crash reports to other traffic safety data, including 
driver or vehicle files, to verify or populate information on crash 
reporting forms. Table 2 contains examples of other tools and checks 
that the states used to help ensure accuracy.

Table 2: Examples of Tools and Processes That Were Used to Ensure the 
Accuracy of Individual Crash Reports in 6 Case-study States:

On-scene data verification tools: * Drop-down menus to complete crash 
reports; 
* Scannable bar codes on driver licenses or vehicle registrations; 
* Wireless connections to vehicle or driver files; 
* Global positioning system location equipment; 

Postcrash reporting accuracy checks: 
* Automatic validity checks included in electronic data submission; 
* Links to driver, vehicle, or roadway files to validate or populate 
crash reports using common identifiers, such as license plate numbers, 
driver names, or location identifiers.

Source: GAO analysis of information provided by states.

[End of table]

Four of the 9 states did quality checks at the aggregate level--that 
is, they analyzed crash reports in batches to identify reporting 
abnormalities that may not be apparent in individual reports. Of these 
4 states, for example, 1 had staff analyze 
the reports to identify invalid entries and data miscodings, while 
another conducted edit checks each year to check for invalid vehicle 
types or other problems. Such aggregate-level analysis can be useful to 
identify systematic problems in data collection that may lead to 
erroneous investigation or false conclusions, such as when officers 
report one type of collision as another. For instance, officers in 1 
state were found to be characterizing some car-into-tree crashes as 
head-on collisions. Once identified, such data collection problems can 
often be resolved through officer training.
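
The automatic edit checks described above amount to validating each 
coded field against its set of legal values. A minimal sketch, with 
illustrative code sets not drawn from any state's actual crash form:

```python
# Sketch of an automatic validity (edit) check on coded crash-report
# fields. The sets of legal codes below are illustrative.

VALID_CODES = {
    "vehicle_type": {"passenger car", "pickup", "motorcycle", "bus", "truck"},
    "collision_type": {"head-on", "rear-end", "angle", "sideswipe",
                       "fixed object"},
}

def invalid_fields(report):
    """Return the coded fields whose values are missing or not legal codes."""
    return [
        field for field, legal in VALID_CODES.items()
        if report.get(field) not in legal
    ]

report = {"vehicle_type": "passenger car", "collision_type": "haed-on"}
# The miscoded collision type would be flagged for correction or
# follow-up officer training.
```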

To test data accuracy, we analyzed crash data submitted by the 17 
states to NHTSA and found relatively few instances of data that had 
been coded as "invalid"--generally 3 percent or less. Data classified 
as invalid were most often for elements more likely to be transposed or 
miscopied, such as VINs. However, because we could not observe crash-
scene reporting and did not examine or verify information on source 
documents (such as police accident reports), we cannot assume that the 
other 97 percent of data were accurately reported and entered 
correctly. Invalid data entries are a good starting point for measuring 
the accuracy of a data system, but they are only one indication of the 
accuracy of state traffic safety data.

Accessibility: Crash Data Were Accessible to Users in Varying Ways:

All 9 states produced crash information summaries, although some were 
based on data that were several years old--a factor that limited their 
usefulness. In addition, 8 states provided law enforcement agencies or 
other authorized users with access to crash information within 6 months 
of crashes. Such access was often via the Internet, where analysis 
tools were typically limited to a set of preestablished reports, so 
any in-depth analysis depended on the tools available online. Three 
states had analysts available to provide information or 
complete data queries upon request. In another state, which had the 
capability to conduct data collection electronically, local law 
enforcement agencies had access to analysis tools to use with their own 
data.

If users wanted direct access to completed data for more detailed 
analysis, they often had to wait somewhat longer, given the need for 
additional data entry or the completion of accuracy checks. In 1 state, 
for example, there was a 2- to 3-month delay due to the transfer of 
preliminary crash data from the state police database to the state 
department of transportation where location information was added to 
complete the data.

Data Integration: Only 1 State Integrated Traffic Safety Information 
with All Databases:

Only 1 of the 9 states integrated the full array of potential 
databases--that is, linked the crash file with all five of the files 
typically or potentially available in various state agencies: driver, 
vehicle, roadway, citation/conviction, and medical outcome. All 9 of 
the states we visited integrated crash information with roadway files 
to some degree, but only a few integrated these data with driver or 
vehicle licensing files, or with the conviction files housed in state 
court systems. (See table 3.) In addition, 7 of the 9 states 
participated in NHTSA's Crash Outcome Data Evaluation System (CODES) 
program,[Footnote 24] which links crash data with medical information 
such as emergency and hospital discharge data, trauma registries, and 
death certificates.

Table 3: Extent of Data Integration with Crash Files in the 9 Case-
study States:

Type of file: Driver licensing; 
Reported state links with crash data: Three states reported having 
direct links between the crash database and driver records. Another 
state reported that the state department of public safety pulled 
relevant information directly from crash reports, including insurance 
information.

Type of file: Vehicle licensing; 
Reported state links with crash data: Two states reported links with 
the vehicle file. Each of these transfers was done on at least a 
weekly basis.

Type of file: Roadway; 
Reported state links with crash data: All 9 of the states reported 
integrating crash data with roadway information, although in 3 states, 
the integration was done by the state department of transportation 
with the creation of a new database involving additional data entry or 
file manipulation.

Type of file: Citation/Conviction; 
Reported state links with crash 
data: Two states reported linking citation information with conviction 
data from the state court databases. Another state noted that 20 
percent of citation information was submitted to the department of 
public safety.

Type of file: Medical outcome; 
Reported state links with crash data: Seven states were involved in 
the CODES program, which links crash data with medical files, such as 
emergency medical service, hospital in/outpatient, and death 
certificate information. Six of these states carried out data 
integration using probabilistic linkages, and the 7th was preparing to 
do so.

Source: GAO analysis of information provided by the case-study states.

[End of table]

Technological challenges and the lack of coordination among state 
agencies often posed hurdles to the integration of state data. In 1 
state, for example, crash files were sent from the central traffic 
records database kept by the state department of safety to the state 
department of transportation for manual entry of location information 
from the roadway file. Once the state department of transportation 
completed these records, however, there was no mechanism to export that 
information back into the central database. Also, in some states data 
integration was limited because data were not processed with 
integration in mind. In 1 state, for example, state department of 
transportation officials noted that the new crash system had been 
developed for state police use, and that efforts were still under way 
to develop an interface to bring crash data into the department's 
system. In contrast, a state official in another state noted that the 
housing of several agencies involved in the traffic safety data 
system--including those responsible for the driver, vehicle, and roadway 
files--in the state department of transportation had facilitated the 
direct sharing of information and the full integration of data.

NHTSA Continues to Emphasize Quality Criteria:

In support of these quality criteria and improved traffic safety data 
systems, NHTSA released a report in July 2004 detailing steps that 
could be taken by federal and state stakeholders to improve traffic 
safety data. The report, Initiatives to Address Improvement of Traffic 
Safety Data, was issued by NHTSA and drafted by an Integrated Project 
Team that included representatives from NHTSA, the Bureau of 
Transportation Statistics, the Federal Highway Administration, and the 
Federal Motor Carrier Safety Administration. The report articulates the 
direction and steps needed for traffic safety data to be improved and 
made more useful to data users. It makes recommendations in five 
areas: improving coordination and leadership, improving data quality 
and availability, encouraging states to move to electronic data 
capture and processing, creating greater uniformity in data elements, 
and facilitating data use and access. Along with these 
recommendations, the report also outlines initiatives that NHTSA and 
other stakeholders should implement. For example, under the area of 
data quality and availability, the report indicates that states--under 
the guidance of their coordinating committees--should encourage 
compliance by law enforcement with state regulations for obtaining 
blood-alcohol concentration and drug use information and should also 
strive to capture exact crash locations (using latitude and longitude 
measures) in their traffic safety data systems.

States Carried Out Various Activities Using 411 Grant Funds, but Little 
Is Known about Progress:

States reported carrying out a range of activities with funding made 
available under the 411 grant program. However, relatively little is 
known about the extent to which they made progress in improving their 
traffic safety data systems for the years of the grant. When applying 
for follow-on grants, states were required to report to NHTSA's 
regional offices on the progress they were making in improving their 
traffic safety data systems during the prior year. However, the 
required documents filed with NHTSA yielded little or no information on 
what states had achieved. We were able to discern from the 8 states we 
reviewed in detail that those states had indeed used their grants for a 
variety of projects and showed varying degrees of progress.[Footnote 
25] Regardless of whether states concentrated their grant funds on one 
project or funded a number of activities, the level of progress was 
influenced by the effectiveness of state coordinating committees.

Most States Received Grants for 4 Years and Initiated State Data 
Improvements:

Forty-eight states applied for and received grant awards under the 411 
grant program. As table 4 shows, most states (29) began their 
participation at the implementation grant level--that is, most of them 
already had the three basic requirements in place: a coordinating 
committee, an assessment of their data system, and a strategic plan 
for improvement. Those states receiving start-up or 
initiation grants were expected to put the three requirements in place 
before beginning specific data-related improvement projects. By the 
4th year of the grant, 44 states were still participating, and all but 
1 were at the implementation grant level. The 4 states that were no 
longer participating by the 4th year reported that they discontinued 
participation mainly because they could not meet grant requirements.

Table 4: Status of States in the 411 Grant Program, 1st and 4th Year:

Type of grant: Start-up; 
1st year of grant (1999): Number of states[A]: 7; 
1st year of grant (1999): Size of grants: $25,000; 
4th year of grant (2002): Number of states: 0; 
4th year of grant (2002): Size of grants: N/A.

Type of grant: Initiation; 
1st year of grant (1999): Number of states[A]: 11; 
1st year of grant (1999): Size of grants: $63,100; 
4th year of grant (2002): Number of states: 1; 
4th year of grant (2002): Size of grants: $124,524.

Type of grant: Implementation; 
1st year of grant (1999): Number of states[A]: 29; 
1st year of grant (1999): Size of grants: $126,260; 
4th year of grant (2002): Number of states: 43; 
4th year of grant (2002): Size of grants: $224,151. 

Source: GAO analysis of information provided by NHTSA.

Note: The program was not funded for the final 2 years of TEA-21.

[A] A total of 48 states participated in the grant program between 1999 
and 2002; 47 began in 1999, and 1 state began in 2000.

[End of table]

All three basic program requirements helped states initiate or develop 
improvements in their traffic safety data systems. By 
meeting these grant requirements, states were able to "jump start" 
their efforts and raise the importance of improving state traffic 
safety data systems. The assessments, which were required to be 
conducted within 5 years of the initial grant application, provided 
benchmarks and status reports to NHTSA and state officials and included 
information on how well state systems fared in regard to NHTSA's six 
recommended quality criteria. Officials with whom we spoke generally 
agreed that these assessments were excellent tools for systematically 
identifying needed state improvements. Similarly, strategic plans 
generally appeared to be based on the state assessment findings and 
helped states identify and prioritize their future efforts. The 
establishment of traffic records coordinating committees to guide 
these efforts was also key to initiating improvements, since traffic 
safety data systems involve many departments whose cooperation is 
essential to developing and implementing improvements.

Progress Reports Were Limited and Difficult to Assess:

Documentation of state progress was limited and of little use in 
assessing the effect of traffic safety data improvement efforts. To 
qualify for grants beyond the first year, each state had to (1) certify 
that it had an active coordinating committee and (2) provide 
documentation of its efforts through updated strategic plans, separate 
progress reports, or highway safety annual evaluation reports. We 
reviewed these documents when available and found that they contained a 
variety of activities, ranging from completing the basic requirements 
(such as conducting assessments and developing strategic plans) to 
identifying specific projects (such as outsourcing data entry services 
and redesigning crash forms). Figure 4 lists examples of these types of 
reported activities.

Figure 4: Examples of Reported Activities Drawn from Available 
Documents of Participating States:

[See PDF for image]

[End of figure]

The grant documentation NHTSA received provided few details on the 
quality of the state efforts.[Footnote 26] For example, although states 
certified the existence of a coordinating committee, they were not 
required to report on what the committee did or how well it functioned. 
Also, while states for the most part identified efforts to improve 
their data systems, we found it difficult to assess their progress 
because the reports lacked sufficient detail. For example:

* One state reported using grant funds on alcohol testing devices to 
collect more alcohol impairment data on drivers. However, the progress 
reports did not indicate who received these devices or how data 
collection improved.

* One state used funds to hire data entry staff to reduce the backlog 
of old crash reports. However, the state provided no indication of 
whether the increase in staff had reduced the backlog or how any 
reduction could be sustained in the longer term.

* One state reported using funds on multimillion dollar information 
technology projects, but it is unclear how the grant funds were used in 
these projects.

Case-study States Conducted Activities Ranging from a Single Specific 
Project to a Variety of Efforts:

Our visits to 8 of the states that participated in the 411 grant 
program yielded additional information and documentation about their 
grant activities, the nature of their efforts, and the extent of 
progress made. These states expended funds on a variety of activities, 
ranging from completing the basic requirements of assessments and 
strategic plans to implementing specific projects. As figure 5 shows, 
in the aggregate, these activities translated into two main types of 
expenditures--equipment, such as computer hardware and software, and 
consultant services, such as technical assistance in designing new data 
systems.

Figure 5: Types of Expenditures under the 411 Grant Program in 8 Case-
study States:

[See PDF for image]

[End of figure]

The 8 states either concentrated funding on one large project or used 
funding on a variety of activities, including data entry, salaries, 
training, and travel. Four of the 8 states focused on a single project 
related to improving their data systems mainly by enhancing electronic 
reporting. One state reengineered its files to better integrate them 
with other data systems; 1 piloted an electronic crash data collection 
tool; and the remaining 2 created new electronic data systems, which 
were upgrades from their previous manual systems. These states also 
improved the tools used by law enforcement officers to input data into 
their crash systems, such as software for mapping and graphing traffic 
crashes or laptop computers for patrol cars so that law enforcement 
officers could collect and transmit crash data electronically to 
statewide repositories.

The remaining 4 states used funding on multiple activities, such as 
obtaining technical support, adding capability for more data entry, or 
attending conferences. Some also conducted pilot projects. For example, 
1 state created a project that enabled electronic uploads of traffic 
citation data from local agencies to the state department of motor 
vehicles. According to state officials, this project helped 
considerably with both timeliness and completeness in the uploading of 
conviction information to driver files. In another example, a state 
used funding to pilot a project to capture crash data electronically.

States made improvements under both the single- and multiple-project 
approaches. One state that focused on a single project, for example, 
developed a new statewide electronic crash system that officials said 
had improved data timeliness and completeness. Similarly, of the states 
that spread funding among multiple activities, 1 state used funding for 
a data project on driver convictions--paying for traffic records 
staff's salaries and hiring consultants to map crashes to identify 
roadway issues. As a result, the quality and completeness of crash data 
improved overall, according to a state official.

One factor that affected state progress was the relative effectiveness 
of the state's coordinating committee. In states where the 
coordinating committee did not actively engage all stakeholders or 
where its level of authority was limited, projects did not fully 
address system needs. For example, 1 state established a coordinating 
committee that included few stakeholders outside the state police, and 
this committee decided to concentrate funding on a new electronic crash 
data system. The new system, acknowledged by many stakeholders as 
improving the timeliness and completeness of crash data, gave the 
state police a useful tool for allocating resources and reporting on 
crashes. According to 
officials at the state department of transportation, however, 
improvements in the crash information did not effectively serve to 
facilitate the state's use of crash data to identify unsafe roadways 
because the state department of transportation was not fully engaged in 
the coordinating committee's process.

Similarly, in another state, the coordinating committee lacked the 
authority needed to fully implement its efforts. The coordinating 
committee created two subcommittees--a technical committee and an 
executive committee. While the executive committee was made up of 
higher level managers from various agencies, the coordinating committee 
did not have the legislative authority to compel agencies to 
participate in the process or to even use the newly created statewide 
crash data system. To date, the state does not have all key 
stakeholders participating in the process and is continuing to have 
difficulty persuading the largest municipality in the state to use the 
newly developed statewide electronic reporting system. As a result, the 
municipality continues to lag behind other communities in having its 
crash information entered into the state crash system. In contrast, 
another state's coordinating committee had the authority to approve or 
reject proposals for data system improvements as well as funding. This 
state was able to complete several agreed-upon projects, including 
implementing an electronic driver citation program, which improved the 
completeness and timeliness of the state crash data.

NHTSA's Limited Oversight of the 411 Grant Program Contributed to 
Incomplete Knowledge of How Funds Were Used:

NHTSA did not adopt adequate regulations or guidelines to ensure that 
states receiving 411 grants submitted accurate and complete information 
on the progress they were making to improve their traffic safety data 
systems.
In addition, the agency did not have an effective process for 
monitoring progress and ensuring that grant monies were being spent as 
intended. We found some examples where states did not report their 
progress accurately. NHTSA, while beginning to take some actions to 
strengthen program oversight, must be more proactive in developing an 
effective means of holding states accountable under this program.

Regulatory Requirements and Guidance for 411 Program Activities Were 
Not Specific:

In our previous discussion of activities carried out under the grant 
program, we described how state documentation of progress often 
contained too little detail to determine what progress the funded 
activities had produced.
Reasons for this lack of information, in our view, were NHTSA's limited 
regulatory requirements and inconsistent guidance about what 
information states should submit.

Regulations for the 411 grant program required states to submit an 
updated strategic plan or a progress report, but did not specify how 
progress should be reported. Further, NHTSA's regulations required 
states to report on progress as part of their 411 grant application, 
which in effect meant that states did not have to report specifically 
on 411 activities after fiscal year 2001. According to NHTSA 
regulations, states were to include information on progress through 
their highway safety plans and annual evaluation reports after fiscal 
year 2001, which are part of the reporting for all of NHTSA's highway 
safety grants. However, our analysis of these documents found that they 
lacked the detail needed to adequately assess state activities 
undertaken with 411 funds. Further, while NHTSA officials told us they 
also informally obtained information about progress after fiscal year 
2001, the available information about what the activities actually 
accomplished was limited. These limitations were particularly 
significant given that states spent most of their grant funds after 
fiscal year 2001.

NHTSA regional offices supplemented the regulatory requirements with 
their own guidance to states, but the guidance varied greatly from 
region to region. Some of the regional offices said that their contact 
with states about these requirements was informal, and that their 
primary contact with states (1) was over the telephone or by e-mail and 
(2) generally involved technical assistance, such as training 
or referring states to existing guidelines. Other regional office staff 
said they had additional contact with states through participation in 
meetings of state coordinating committees, where they were able to 
provide additional assistance. However, we found this participation 
occurred most often for states in proximity to NHTSA regional offices. 
Few regional offices provided written guidance to states with specific 
direction on what to include in their progress reports. For the regions 
that did so, the requested information included documentation 
indicating how states intended to use the current year grant funds, a 
list of projects implemented in the past fiscal year, a brief 
description of activities completed, an account of problems 
encountered, and the status of allocated funds.

Without consistent and clear requirements and guidance on the content 
of progress reports, states were left to their own devices. We found 
that even in regions where NHTSA officials outlined the information 
that should be included in the progress reports, states did not 
necessarily provide the level of information needed for NHTSA to 
adequately track state progress. For example, in 1 region, states were 
to provide NHTSA with documentation that included a list of projects 
and a description of progress made. However, 1 state in that region did 
not provide the list of completed projects; it only provided a brief 
description of projects completed during 1 of the 4 years of the grant.

We also found a wide variation in how states reported their activities. 
For example:

* Some states provided brief descriptions of the activities completed 
or under way, while others did not.

* States that provided brief descriptions of their activities did not 
always include the same information. For example, some states indicated 
how they were intending to use the current grant funds but did not list 
projects implemented in the past year. Some states did not indicate the 
status of their allocated funds for ongoing activities.

* None of the states in our review indicated problems that were 
encountered in implementing projects or activities.

Monitoring of State Progress and Activities Was Lacking:

Under the 411 grant program, NHTSA's oversight process for monitoring 
state progress and ensuring that funds were spent in line with program 
intent was limited. In fact, NHTSA was unable to provide copies of many 
of the documents that states were required to submit to qualify for the 
411 grant program. We requested these documents beginning in February 
2004, and NHTSA was only able to provide us with complete documentation 
for half of the states participating in the program.[Footnote 27]

When we visited 8 states that participated in the program, we were able 
to compare expenditure reports obtained from the states with activities 
that were reported to NHTSA. We found instances in which documentation 
of state reported activities provided by NHTSA did not match 
information provided directly to us by the states.

* In documentation submitted to NHTSA, 1 state reported using grant 
funds on alcohol breath test devices. However, documents available at 
the state level indicate that nearly all of the funds were expended on 
a single project to redevelop a crash data system. Officials we spoke 
with also indicated that the money had gone for redeveloping the data 
system.

* In a report to NHTSA, 1 state we visited had reported undertaking 
four projects, but we found that two of them were actually funded by a 
different federal grant.

The degree to which NHTSA monitored state 411-funded activities was 
difficult to determine. NHTSA officials told us that they were not 
required to review state 411-funded activities in detail. A few 
regional office officials told us that they verified state reported 
activities by linking them to objectives identified in state strategic 
plans; however, no documentation of these reviews was provided.

Recent Steps Were Announced for Improving Oversight, but Impact on the 
411 Program Is Unclear:

NHTSA has taken several steps to improve its oversight and assist 
states in improving their traffic safety data systems; however, more 
efforts are needed. As we were completing our work, NHTSA released a 
report, Initiatives to Address Improvement of Traffic Safety Data, that 
provides the status of data systems in five areas: coordination and 
leadership, improving data quality and availability, encouraging states 
to move to electronic capture and processing, creating greater 
uniformity in data elements, and facilitating data use 
and access. It also provides recommendations and initiatives in support 
of NHTSA's efforts to improve state traffic safety data systems. 
Although the report outlines (1) steps to be taken, (2) stakeholder 
responsibilities for each recommendation, and (3) the general outcomes 
expected, the extent to which actions will occur as a result of the 
report is unclear. The report is limited to a description of conditions 
and needs for traffic safety data improvements and does not include an 
implementation plan with milestones or timelines. The report 
acknowledges that due to limited funding, NHTSA will focus primarily on 
recommendations that are feasible given current resources. According to 
NHTSA, the report was issued as a fact-finding status report and, 
therefore, no timelines or milestones were included. However, beginning 
October 2004, a newly created National Traffic Records Coordinating 
Committee is developing an implementation plan for the goals identified 
in the report.

NHTSA also recently enhanced its oversight tools for all safety grants. 
It has mandated management reviews every 3 years and has expanded its 
existing regional planning documents, which covered occupant 
protection and impaired driving, to three additional areas, including 
traffic safety data.[Footnote 28] The first of these regional action 
plans aimed at data improvements are being initiated in fiscal year 2005 
and include goals, objectives, and milestones. Mandating management 
reviews that encompass the broad array of grant programs every 3 years 
is an improvement over the inconsistent application of these reviews in 
the past. Also, by establishing traffic safety data improvements as 
part of the regional action plans, NHTSA will have more uniform 
tracking of state data improvements and also better information on 
state progress. While these newly initiated efforts are positive steps 
to improving oversight, it is too soon to tell how effective they will 
be for monitoring and ensuring accountability under the 411 grant 
program, should the Congress choose to reauthorize it.

Language in Reauthorization Bills Also Enhances Oversight, but Omits 
One Key Step:

NHTSA's oversight of the 411 grant program may be strengthened under 
reauthorized legislation. Proposed reauthorization bills that were 
considered by the Congress in 2004 included additional requirements 
that states (1) demonstrate measurable progress toward achieving goals 
in their strategic plans and (2) specify how they will use grant funds. 
These additional provisions would be important steps in addressing the 
current program's vague reporting requirements and would be 
helpful in addressing congressional and other inquiries about what the 
program is accomplishing.

As the previous proposed bills were drafted, however, they omitted one 
requirement that will be important in tracking state progress--the 
requirement that a state have had its traffic safety data system 
assessed no more than 5 years before participating in the 411 
grant program. Assessments are used mainly to establish the status of 
state efforts, but state and NHTSA officials suggest that updated 
assessments could also help in tracking state progress. During our 
review, we found some assessments submitted by states that were nearly 
10 years old. We also found that assessments based on recent 
information reflected the dynamic and often-changing reality of state 
systems. For example, 1 of our case-study states had conducted an 
assessment in 2002. When we compared that assessment with the 
information we collected during our site visit, we found that much of 
what we observed was reflected in the assessment. Updating these assessments at 
least every 5 years would allow NHTSA to track state progress. 
According to NHTSA officials, these assessments were valuable starting 
points in helping states take stock of the strengths and weaknesses of 
their entire systems. Updated assessments would take into account 
changes made as a result of the new 411 grant program and other efforts 
to improve the system since previous assessments were conducted.

Conclusions:

The states and the federal government base significant roadway-related 
spending and policy decisions on traffic safety data, ranging from 
deciding to repair particular roadways to launching major safety 
campaigns. The quality of such decisions is tied to the quality of 
these data. Our review indicates that there were opportunities for 
states to improve crash data. However, because NHTSA exercised limited 
oversight over the 411 grant program, it is difficult to say what the 
program as a whole specifically accomplished or whether there was a 
general improvement in the quality of these data over the program's 
duration. Nevertheless, information we obtained from the 8 states we 
visited supports the premise that the 411 program did help states 
improve their traffic safety data systems. Based on our work in these 8 
states, we believe that states undertook important improvements in 
their data systems with the federal grant funds. The potential 
reauthorization of the grant program and NHTSA's recent study of state 
safety data provide an opportunity to include assurances that states 
use these grants on effective and worthy projects. Furthermore, the 
reauthorization may provide greater funding and, therefore, greater 
opportunity for states to improve their traffic safety data systems. 
However, a larger program would come with a greater expectation 
regarding what states will accomplish as well as with a need to 
effectively track the progress states are making.

NHTSA's inability to provide key grant documentation and its 
deficiencies in monitoring state progress with 411 grant funds could be 
minimized if NHTSA (1) better managed grant documents, (2) had clearer 
requirements and guidance for the grant program, and (3) had an 
effective oversight process in place to monitor activities and 
progress. Requiring more specific information on the improvements 
states are making in their data systems would begin to address the 
problems we identified with regard to inadequate reporting on the 
program. If the program is reauthorized, NHTSA should develop an 
oversight process that does a better job of (1) tracking state 
activities to their strategic plans and assessments, (2) providing 
information about progress made in improving safety data, and (3) 
ensuring that NHTSA can adequately manage the documentation it is 
requiring. In addition, if NHTSA develops a plan to implement the 
recommendations in its recent Integrated Project Team report on traffic 
safety data systems, it could incorporate these recommendations through 
improved oversight efforts.

Finally, one requirement present in the earlier program--up-to-date 
assessments of state traffic safety data systems--was not included in 
recent proposals to reauthorize the 411 grant program. These 
assessments proved a valuable tool to states in developing and updating 
their strategic plans and activities for the 411 grant program. They 
also provide NHTSA with valuable information, including the current 
status of state traffic safety data systems organized by NHTSA's own 
recommended quality criteria.

Matter for Congressional Consideration:

In considering the reauthorization of the traffic safety incentive 
grant program, the Congress should consider including the requirement 
that states have their traffic safety data system assessed or an update 
of the assessment conducted at least every 5 years.

Recommendations for Executive Action:

If the Congress reauthorizes the traffic safety data incentive grant 
during the next session, we recommend that the Secretary of 
Transportation direct the Administrator, National Highway Traffic 
Safety Administration, to do the following:

* Ensure better accountability and better reporting for the grant 
program by outlining a process for regional offices to manage and 
archive grant documents.

* Establish a formal process for monitoring and overseeing 411-funded 
state activities. Specifically, the process should provide guidance for 
submitting consistent and complete annual reporting on progress for as 
long as funds are being expended. These progress reports should, at a 
minimum, include the status of allocated funds, documentation 
indicating how states intend to use the current year grant funds, a 
list of projects implemented in the past fiscal year, brief 
descriptions of activities completed, and any problems encountered.

* Establish a formal process for ensuring that assessments, strategic 
plans, and progress reports contain the level of detail needed to 
adequately assess progress and are appropriately linked to each other.

Agency Comments and Our Evaluation:

We provided a draft of this report to the Department of Transportation 
for its review and comment. Generally, the department agreed with the 
recommendations in this report. Department officials provided a number 
of technical comments and clarifications, which we incorporated as 
appropriate to ensure the accuracy of our report. These officials 
raised two additional points that bear further comment. First, 
officials voiced concern regarding the use of data quality criteria 
from NHTSA's Traffic Records Highway Safety Program Advisory to review 
the quality of data or the performance of states. The department 
emphasized that these criteria are voluntary and states are not 
required to meet them; therefore, states should not be judged against 
them. We acknowledge that these criteria are voluntary and clarified 
the report to emphasize this point more fully. However, we used the 
criteria as a framework for providing information on the status of 
state systems and view this analysis as appropriate since these 
criteria are used by NHTSA in conducting assessments of state traffic 
safety data systems. Second, department officials noted that their 
oversight of the 411 grant program was in accordance with the statutory 
requirements. Although we recognize that there were minimal 
requirements for the 411 grant program specifically, we believe the 
department should carry out more extensive oversight activities so that 
NHTSA can monitor the progress states are making to improve their 
traffic safety data systems and better ensure that states are spending 
the grant monies as intended.

We will send copies of this report to the interested congressional 
committees, the Secretary of Transportation, and other interested 
parties. We will make copies available to others upon request. In 
addition, the report will be available at no charge on GAO's Web site 
at [Hyperlink, http://www.gao.gov].

If you or your staff have any questions about this report, please call 
me at (202) 512-6570. Key contributors to this report are listed in 
appendix IV.

Signed by: 

Katherine Siggerud: 
Director, Physical Infrastructure Issues:

[End of section]

Appendixes:

Appendix I: Objectives, Scope, and Methodology:

The objectives in this report were to identify (1) the quality of state 
crash information; (2) the activities states undertook using 411 grant 
funds to improve their traffic safety data systems, and progress made 
using the data improvement grants; and (3) the National Highway Traffic 
Safety Administration's (NHTSA) oversight of the grant program, 
including what changes in oversight, if any, might help encourage 
states to improve traffic safety data systems and ensure accountability 
under a reauthorized program. To address these objectives, we conducted 
case-study visits to 9 states, analyzed state crash data, interviewed 
key experts, reviewed 411 grant program documentation, and interviewed 
NHTSA officials regarding their oversight and guidance to states in 
improving their traffic safety data systems.

To provide information on the quality of state crash data and state 
efforts to improve these data, we conducted site visits to 9 states: 
California, Iowa, Kentucky, Louisiana, Maine, Maryland, 
Tennessee, Texas, and Utah. The case-study states were chosen on the 
basis of a variety of criteria, including population, fatality rates, 
participation in the 411 grant program, the level of funding received 
through the program, and participation in the State Data System (SDS) 
program and the Crash Outcome Data Evaluation System (CODES). We 
adopted a case-study methodology for two reasons. First, we were unable 
to determine the status of state systems from our review of 411 
documents. Second, while the results of the case studies cannot be 
projected to the universe of states, the case studies were useful in 
illustrating the uniqueness and variation of state traffic safety data 
systems and the challenges states face in improving them. During our 
case-study visits, we met and discussed the status of state traffic 
data systems with a variety of traffic safety data officials.[Footnote 
29] These discussions included gathering information on NHTSA's 
criteria, state objectives, and the progress made with 411 grant 
funds.[Footnote 30] In addition to these case-study visits, we analyzed 
data for 17 states that currently participate in NHTSA's SDS program to 
identify variations in data structure and quality. We selected a number 
of elements to assess the quality of data as they related to 
completeness, consistency, and accuracy for 5 of the 17 states that 
were part of the SDS program and also part of our case-study visits. We 
based the analysis on data and computer programs provided by NHTSA. We 
reviewed the programs for errors and determined that they were 
sufficiently accurate for our purposes. (See app. II.) Finally, we 
interviewed key experts who use traffic safety data, including 
consultants, highway safety organizations, and researchers.

In order to describe the activities that states undertook to improve 
their traffic safety data systems and the progress made under the data 
improvement grant, we reviewed 411 grant documentation for all 48 
participating states, including 8 of our 9 case-study states.[Footnote 
31] Our review included examining required documents states submitted 
to NHTSA, including their assessments, strategic plans, and grant 
applications and progress reports. We obtained these documents from 
NHTSA regional offices. For the case-study states, we also obtained 
additional documentation, including 411 grant expenditure information, 
in order to (1) describe state activities and progress made and (2) 
compare actual expenditures with the activities states reported to 
NHTSA.

To review NHTSA's oversight of the 411 grant program, we interviewed 
NHTSA officials responsible for oversight and administration of the 
program. Our interviews were conducted with NHTSA program staff at 
headquarters and in all 10 NHTSA regional offices. We also discussed 
program oversight with state officials in 8 of our 9 case-study states. 
We reviewed NHTSA guidance and policy, including regulations for the 
411 grant program and rules issued by NHTSA for the program. We also 
reviewed previous House and Senate bills that were introduced 
reauthorizing the 411 grant program.[Footnote 32] Finally, in order to 
understand NHTSA's broader role in oversight, we spoke with NHTSA staff 
and reviewed NHTSA's response to our recommendations that it improve 
its oversight.

We conducted our review from January 2004 through October 2004 in 
accordance with generally accepted government auditing standards. 
Because an examination of data quality was one of the objectives of 
this report, we also conducted an assessment of data reliability. 
Appendix II contains a more complete discussion of data reliability.

[End of section]

Appendix II: Additional Analysis of Data Quality in NHTSA's State Data 
System:

As part of our work, we examined data quality for 17 states that 
participate in NHTSA's SDS program. The body of our report presents 
several examples of the kinds of limitations we found; this appendix 
contains additional examples. The examples discussed below relate to 
two of NHTSA's quality criteria--data consistency and data 
completeness.

Variations in Reporting Thresholds Impact the Usefulness of Data in the 
State Data System:

The extent to which a state captures information about various data 
elements has much to do with the standards or thresholds it sets for 
what should be reported in crash reports. NHTSA's Model Minimum Uniform 
Crash Criteria (MMUCC) recommends that every state have reporting 
thresholds that include all crashes involving death, personal injury, 
or property damage of $1,000 or more; that reports be computerized 
statewide; and that information be reported for all persons (injured 
and uninjured) involved in the crash.

We found these thresholds differed from state to state. Two thresholds, 
in particular, create variation in the data: (1) criteria for whether a 
crash report must be filed and (2) criteria for whether to report 
information about uninjured occupants.

Determining Which Crashes Require a Crash Report:

The states varied greatly in their policies on when a police report 
must be filed. Fourteen of the 17 states set a property damage 
threshold, but the threshold varied from less than $500 to as much as 
$1,000 (see fig. 6). Among the other 3 states, 1 left the reporting of 
property-damage-only crashes to the officer's discretion, and 2 
stipulated that no report is to be filed unless at least one vehicle 
has to be towed from the scene. Thus, a crash involving $900 of damage 
to an untowed vehicle would be reported in some states but not in 
others.
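In programming terms, the filing rules described above amount to a 
small decision function. The sketch below is purely illustrative: the 
rule names, parameters, and the $500 and $1,000 thresholds are 
assumptions drawn from the range of state policies described here, not 
any particular state's statute.

```python
def report_required(damage_usd, any_injury_or_death, vehicle_towed,
                    rule="threshold", threshold_usd=1000):
    """Decide whether a police crash report must be filed.

    rule: "threshold"  - file if damage meets the state's dollar threshold
          "tow"        - file only if at least one vehicle was towed
          "discretion" - property-damage-only reporting left to the officer
    Death or injury always requires a report.
    """
    if any_injury_or_death:
        return True
    if rule == "threshold":
        return damage_usd >= threshold_usd
    if rule == "tow":
        return bool(vehicle_towed)
    return None  # officer's discretion; no fixed rule applies

# The $900 untowed, no-injury crash from the text is reportable in a
# $500-threshold state but not in a $1,000-threshold or tow-only state.
print(report_required(900, False, False, threshold_usd=500))   # True
print(report_required(900, False, False, threshold_usd=1000))  # False
print(report_required(900, False, False, rule="tow"))          # False
```

The same crash thus enters some states' data files and not others, 
which is why cross-state comparisons of property-damage-only crash 
counts can be misleading.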

Figure 6: State Criteria for Filing a Police Crash Report for Property-
Damage-Only Crashes:

[See PDF for image]

[End of figure]

Reporting Information about Uninjured Passengers:

Similarly, some states did not collect information about uninjured 
passengers involved in crashes. (See fig. 7.) While all 17 states 
collected information about uninjured drivers (such as whether he or 
she was wearing a seat belt), 5 did not collect such information about 
uninjured passengers. Such information could potentially be important, 
for example, in assessing the degree to which seat belt use helped 
prevent injuries from occurring. Even for states that collected 
information about uninjured passengers, the information may be 
incomplete. NHTSA officials said they thought that in these states, 
some officers left seat belt information blank or coded it as 
"unknown," either because reporting officers did not know the 
information or because collecting it was too time-consuming.

Figure 7: Extent to Which States Collected Information about Uninjured 
Passengers:

[See PDF for image]

[End of figure]

Variations in Reporting Alcohol and Drug Data:

Alcohol and drug data also showed state-to-state differences, both in 
consistency and completeness. Alcohol and drug data are important in 
addressing a major safety issue--impaired driving. In 2000, an 
estimated 2 million crashes involved drivers with blood-alcohol levels 
above .08 (recently adopted as the legal impairment threshold in all 50 
states); these crashes killed nearly 14,000 people and 
injured nearly 470,000 others. Alcohol-related crashes in the United 
States that year cost an estimated $114.3 billion.[Footnote 33]

To assess the quality of these data in the SDS program, we selected 5 
states for detailed review. The states, chosen because they were also 
visited as part of our case studies, were California, Kentucky, 
Maryland, Texas, and Utah--although they are not identified by name in 
the results below. We looked at the degree to which they conform to 
guidelines recommended in the MMUCC with regard to the consistency and 
completeness of their data.

Consistency of Traffic Safety Data:

Information collected about alcohol- and drug-impaired driving varied 
from state to state and was not consistent with MMUCC guidelines. Table 
5 provides examples of this variation by comparing crash information 
submitted by states with the recommended guidelines. The table shows 
MMUCC's recommended guidelines for four elements--two elements each for 
alcohol and drugs. One element relates to whether the officer suspects 
alcohol or drug use, and the other to an actual test for alcohol or 
drugs. All 5 states collected some type of information on suspected 
alcohol or drug use, but each state differed from the others to some 
degree. Three states, for example, collected this information as part 
of a broader element that includes suspected alcohol and drug use as 
one attribute in a list of causes that might have contributed to the 
crash. For alcohol and drug testing, 1 state did not report such 
testing at all, and the 4 others differed both from each other and from 
MMUCC guidelines.

Table 5: Comparison of MMUCC Guidelines and Crash Information Provided 
to NHTSA by 5 States Regarding Alcohol- and Drug-Impaired Driving:

Recommended MMUCC element (variable name and definition): Law 
Enforcement Suspects Alcohol Use: Driver or nonmotorist involved in the 
crash suspected by law enforcement to have used alcohol; 
Crash information that 5 states collected (variable name and 
definition): 
* State A: Driver/Pedestrian Drinking: Indicates whether drinking 
impaired the ability of the driver, pedestrian, or bicyclist; 
Attributes include "not stated" and "had been drinking, under 
influence." 
* State B: Suspected Drinking: Indicates if the driver of the vehicle 
was suspected of drinking; 
Attributes include "yes" and "no." 
* State C: Driver/Pedestrian Condition: Indicates the condition of 
each driver/pedestrian at the time of the crash; 
Attributes include "had been drinking" and "had been using drugs." 
* State D: Contributing Factor 2: Describes the driver's actions as the 
second contributing factor in the crash; 
Attributes include "under the influence of alcohol" and "under the 
influence of drugs." 
* State E: Contributing Circumstance: Describes first actions taken by 
the driver that contributed to the crash; 
Attributes include "Had been drinking" and "Under the influence of 
drugs."

Recommended MMUCC element (variable name and definition): Alcohol 
Test: Indication of the presence of alcohol by test, test type (blood, 
breath, etc.), and test results; 
Crash information that 5 states collected (variable name and 
definition): 
* State A: None; 
* State B: Alcohol/Drug, Alcohol Test Results, and Alcohol Test Type: 
Alcohol/Drug: Indicates if the driver was tested for alcohol, drugs, or 
both; 
Alcohol Test Results: Indicates the results of the alcohol/ drug test; 
Alcohol Test Type: Indicates the alcohol test type administered to the 
driver; 
* State C: Alcohol/Drug Use and Alcohol Test Results: Alcohol/Drug Use: 
Indicates the presence and the contribution of controlled substances; 
Alcohol Test Results: Indicates the results of the alcohol test. Coded 
for drivers, pedestrians, and bicyclists; 
* State D: Alcohol/Drug Analysis Test and Alcohol/Drug Test Results: 
Alcohol/Drug Analysis Test: Indicates the type of specimen taken for an 
alcohol and/or drug analysis test. Coded for drivers, pedestrians, and 
bicyclists; 
Alcohol/Drug Test Results: Describes the results of alcohol and/or drug 
test. Coded for drivers, pedestrians, and bicyclists; 
* State E: Alcohol Test Results and Alcohol Test Type: Alcohol Test 
Results: Indicates the results of an alcohol test. Coded for drivers, 
pedestrians, and bicyclists; 
Alcohol Test Type: Describes how the alcohol test was administered. 
Coded for drivers, pedestrians, and bicyclists.

Recommended MMUCC element (variable name and definition): Law 
Enforcement Suspects Drug Use: Driver or nonmotorist involved in the 
crash suspected by law enforcement to have used drugs; 
Crash information that 5 states collected (variable name and 
definition): 
* State A: Driver/Pedestrian Condition: Identifies a physical condition 
of the driver, pedestrian, or bicyclist that may have been a factor in 
the crash; 
Attributes include "under the influence of drugs" and "other physical 
impairment." 
* State B: Human Factors 1 - 3: Indicates up to three factors by humans 
that contributed to the crash; 
Attributes include "if the officer suspects drug involvement in the 
crash." 
* State C: Driver/Pedestrian Condition: Indicates the condition of each 
driver/pedestrian at the time of the crash; 
Attributes include "had been drinking" and "had been using drugs." 
* State D: Contributing Factor 2: Describes the driver's actions as the 
second contributing factor in the crash; 
Attributes include "under the influence of alcohol" and "under the 
influence of drugs." 
* State E: Contributing Circumstance: Describes first actions taken by 
the driver that contributed to the crash; 
Attributes include "had been drinking" and "under the influence of 
drugs."

Recommended MMUCC element (variable name and definition): Drug Test: 
Indication of the presence of drug test, test type, and test results. 
Excludes drugs administered postcrash; 
Crash information that 5 states collected (variable name and 
definition): 
* State A: None; 
* State B: Alcohol/Drug and Alcohol Test Results: Alcohol/Drug: 
Indicates if the driver was tested for alcohol, drugs, or both; 
Alcohol Test Results: Indicates the results of the alcohol/drug test; 
* State C: Alcohol/Drug Use: Indicates the presence and contribution 
of controlled substances; 
* State D: Alcohol/Drug Analysis Test and Alcohol/Drug Test Results: 
Alcohol/Drug Analysis Test: Indicates the type of specimen taken for an 
alcohol and/or drug analysis test. Coded for drivers, pedestrians, and 
bicyclists; 
Alcohol/Drug Test Results: Describes the results of alcohol and/or drug 
test. Coded for drivers, pedestrians, and bicyclists; 
* State E: Alcohol Test Results and Alcohol Test Type: Alcohol Test 
Type: Indicates the results of a drug scan. Coded for drivers, 
pedestrians, and bicyclists; 
Alcohol Test Results: Includes positive and negative results of drug 
scan. 

Source: GAO presentation of NHTSA/NCSA data.

Note: State A uploads alcohol test result data from the Fatality 
Analysis Reporting System at a later date.

[End of table]

Completeness of Traffic Safety Data:

To determine the completeness of state data files regarding impaired 
driving, we looked at alcohol test result data that were coded as 
"missing" or "unknown." Figures 8 and 9 show the results for the first 
and last years we reviewed. The percentage of data recorded as missing 
varied from 0 percent to more than 12 percent, while the percentage of 
data recorded as unknown varied from 0 percent to more than 6 
percent.[Footnote 34] In addition, the 2 states with the most data in 
these two categories were almost mirror images of each other: that is, 
state D showed no data as missing but had the highest percentage of 
data classified as unknown, while state E showed virtually no data as 
unknown but had the highest percentage of data classified as missing. 
These variations could reflect differences in how states classify and 
record information. For example, NHTSA officials said some states may 
code an alcohol test result that comes back indicating no alcohol in 
the driver's bloodstream as missing or unknown, rather than "negative" 
or ".00."
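The completeness check described above reduces to tallying how records 
code the alcohol test result field. The sketch below illustrates the 
idea; the field name, the codes, and the sample records are 
hypothetical, not NHTSA's actual SDS file layout.

```python
from collections import Counter

def completeness_summary(records, field="alcohol_test_result"):
    """Return the percentage of records whose value for `field` is
    reported, coded as missing (blank/None), or coded as "unknown"."""
    tally = Counter()
    for rec in records:
        value = rec.get(field)
        if value is None or value == "":
            tally["missing"] += 1
        elif str(value).lower() == "unknown":
            tally["unknown"] += 1
        else:
            tally["reported"] += 1
    total = sum(tally.values())
    return {k: round(100.0 * v / total, 1) for k, v in tally.items()}

# Illustrative records only; a real crash file has many more fields.
crashes = [
    {"alcohol_test_result": 0.00},       # a true negative result
    {"alcohol_test_result": 0.12},
    {"alcohol_test_result": ""},         # left blank by the officer
    {"alcohol_test_result": "unknown"},
]
print(completeness_summary(crashes))
# prints {'reported': 50.0, 'missing': 25.0, 'unknown': 25.0}
```

Note that a tally like this cannot by itself distinguish a genuinely 
untested driver from a negative (.00) result that a state chose to code 
as missing or unknown, which is exactly the classification ambiguity 
NHTSA officials described.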

Figure 8: Percentage of Alcohol Test Results That Were Coded as Missing 
for 1998 and 2002:

[See PDF for image]

Note: State A did not provide alcohol test results to NHTSA during the 
period under investigation.

[End of figure]

Figure 9: Percentage of Alcohol Test Results That Were Coded as Unknown 
for 1998 and 2002:

[See PDF for image]

Note: State A did not provide alcohol test results to NHTSA during the 
period under investigation.

[End of figure]

Researchers' Use of Another Database Omits Data on Nonfatal Crashes:

Because the alcohol and drug data in SDS are subject to so many 
problems with completeness and consistency, many researchers and state 
policy makers use alcohol and drug data from the Fatality Analysis 
Reporting System (FARS) database instead. This database, which is also 
administered by NHTSA, contains information on crashes involving 
fatalities that occur within 30 days of the crash. FARS is generally 
seen as a reliable data source, with quality control measures and 
personnel who do as much follow-up as possible to fill in data gaps by 
contacting hospitals, medical offices, and coroners' offices to obtain 
accurate and complete information. However, FARS contains information 
only on fatal crashes--about 1 percent of all crashes. Thus, while the 
FARS data may be more complete and consistent for those crashes that 
are included, the vast majority of alcohol- and drug-related crashes are 
not included. Further, NHTSA imputes some of the alcohol information 
because even with follow-up there are often gaps in data.[Footnote 35]

[End of section]

Appendix III: Examples of Federal and Other Efforts at Improving 
Traffic Safety Data:

[See PDF for image]

[End of figure]

[End of section]

Appendix IV: GAO Contacts and Staff Acknowledgments:

GAO Contacts:

Katherine Siggerud (202) 512-6570; 
Randall Williamson (206) 287-4860:

Staff Acknowledgments:

In addition to those individuals named above, Nora Grip, Brandon 
Haller, Molly Laster, Dominic Nadarski, Beverly Ross, Sharon Silas, 
Stan Stenersen, and Stacey Thompson made key contributions to this 
report.

(542032):

FOOTNOTES

[1] The cost estimate was reported by the National Highway Traffic 
Safety Administration.

[2] Public Law 105-178 was enacted in June 1998. The name "411 grant 
program" stems from the authorizing section of the law.

[3] The 411 grant program was in place from fiscal year 1998 through 
fiscal year 2004. However, grants for the program were distributed to 
states only through 2002, when the grant program funds were fully 
disbursed. The 411 grant was originally authorized for $32 million. 
Surplus funds applied from a grant program that provides funds to 
states for alcohol-impaired driving measures raised the total 
authorized 411 funds for all 6 years to $36 million.

[4] The reauthorized program was included in S. 1072: Safe, 
Accountable, Flexible, and Efficient Transportation Equity Act of 2004 
(SAFETEA) and H.R. 3550: Transportation Equity Act - Legacy for Users 
of 2003 (TEA-LU).

[5] S.R. 108-146, page 76.

[6] We visited California, Iowa, Kentucky, Louisiana, Maine, Maryland, 
Tennessee, Texas, and Utah. Texas did not participate in the grant 
program.

[7] The State Data System program is a database of state census crash 
data managed by NHTSA and the National Center for Statistics and 
Analysis. State participation is voluntary, with 27 states currently 
participating. Using data from this program, NHTSA produces its most 
frequently requested publication, the periodic Crash Data Report (most 
recently updated through 1999), and other important traffic safety 
publications. Our analysis included data from 1998 through the most 
recent year available for California, Florida, Illinois, Indiana, 
Kansas, Kentucky, Maryland, Michigan, Missouri, New Mexico, North 
Carolina, Ohio, Pennsylvania, South Carolina, Texas, Utah, and 
Virginia. Five of these 17 states were included in our case-study 
visits: California, Kentucky, Maryland, Texas, and Utah.

[8] NHTSA's six recommended quality criteria are used by NHTSA to 
assess state traffic safety data systems. The assessments identify 
needed data system improvements, and it is up to states to decide if 
and how to address the findings and where to focus their efforts. 

[9] A total of 48 states participated in the 411 grant program; 
however, by 2002, 4 states had discontinued their participation.

[10] NHTSA's Traffic Records Highway Safety Program Advisory 
establishes criteria to guide state development and use of highway 
safety data. 

[11] FARS contains data derived from a census of fatal traffic crashes 
within the United States (including Washington, D.C., and Puerto Rico). 
Fatal crashes are those that involve a motor vehicle traveling on a 
traffic way open to the public and result in the death of a person 
within 30 days of the crash.

[12] Other agencies and associations are also involved in improving 
traffic safety data; see appendix III for examples.

[13] Title 23 U.S.C. Chapter 4.

[14] NHTSA's section 402 grant allows states to use some of their 402 
funding to support their state or local safety records systems.

[15] In the 1st and 2nd fiscal years, the federal share of the costs 
shall not exceed 75 percent. In the 3rd and 4th fiscal years, the 
federal share of the costs shall not exceed 50 percent. In the 5th and 
6th fiscal years, the federal share of the costs shall not exceed 25 
percent.

[16] Although states were required to meet basic requirements, they 
were not required to submit individual projects for approval.

[17] H.R. 3550 and S. 1072.

[18] We looked at data from 17 states participating in NHTSA's SDS 
program to provide additional information about three criteria: 
consistency, completeness, and accuracy. 

[19] Law enforcement officers in several states use computer programs 
to aid in the collection of crash information. One such program 
available to states is the Traffic and Criminal Software (TraCS) 
program, which was developed with federal assistance in Iowa. The TraCS 
program allows officers to enter crash information into local databases 
before it is transmitted to the central state traffic records database. 
Initial data entry can be completed either using computers in officer 
vehicles or using paper forms that are later keyed into the local 
computer system once an officer returns to his or her agency. In 
addition to facilitating the entry of data directly into the state 
database via electronic submission, the TraCS program maintains a local 
database of crash information and provides the local law enforcement 
agency with tools to do simple analysis of this information.

[20] Data completion may involve pulling in additional information from 
other state agencies, such as inputting the crash location from state 
department of transportation files. 

[21] Appendix II contains additional analysis related to the 
consistency and completeness of data for these 17 states. 

[22] NCSA is an office within NHTSA and is responsible for providing a 
wide range of analytical and statistical support to NHTSA and the 
highway safety community at large.

[23] In addition to data collection difficulties, the data entry 
policies of the states varied. For example, some states code occupant 
protection system use correctly for injured passengers, but as 
"unknown" for uninjured persons. Similarly, the practice by some states 
of coding the lack of an alcohol test as ".00" rather than "missing" 
can lead to difficulty in obtaining proper information. 

[24] The CODES program is funded by NHTSA and links existing statewide 
traffic safety data with injury outcome, hospital discharge, and other 
injury-related data. The linked data are used to support highway safety 
decision making at the local, state, and national levels to reduce 
deaths, nonfatal injuries, and health care costs resulting from motor 
vehicle crashes.

[25] Texas, our 9th case-study state, did not participate in the grant 
program.

[26] Documentation included grant applications and progress reports, 
strategic plans, and traffic records assessments.

[27] We received complete documentation (grant applications, state 
assessments, and strategic plans) for 24 of the 48 states that 
participated in the program from 1999 through 2002. Our discussions 
with NHTSA staff showed there was some confusion between NHTSA 
headquarters staff and the regional office staff regarding where the 
411 grant program documents were being held and who was responsible for 
managing them. 

[28] Regional action plans identify, among other things, program goals, 
performance measures, and specific tasks and strategies for the 
upcoming year. The plan for traffic safety data systems will also 
include one or two vital improvements needed in each state's traffic 
records system, such as improving information on blood-alcohol 
concentration testing. 

[29] These officials, in general, included representatives from the 
state traffic coordinating committee, the governor's highway safety 
office, the department of public safety, the department of 
transportation, the department of motor vehicles, the department of 
health, and stakeholders from the medical or injury prevention sector. 

[30] NHTSA's recommended criteria include timeliness, completeness, 
consistency, accuracy, accessibility, and data integration.

[31] Texas did not participate in the 411 grant program.

[32] On September 30, 2004, while we were completing our review, 
current highway and transit programs in the Transportation Equity Act 
for the 21st Century (TEA-21) were extended to May 2005. 

[33] Estimated costs include $51.0 billion in monetary costs and an 
estimated $63.2 billion in quality of life losses.

[34] While these percentages seem low, the actual number of crashes 
represented is sizable. For all 4 states, the number of total crashes 
represented ranged from 97,000 to 599,000. The number of crashes with 
missing or unknown data for alcohol test results ranged from the 
hundreds to the thousands.

[35] Imputation is a statistical inference method used to estimate 
alcohol rates. 
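As a rough illustration of the idea behind imputation, a minimal 
sketch might fill missing blood-alcohol values with the mean of the 
observed ones. This is deliberately simplistic and is not NHTSA's 
actual method, which is a more sophisticated statistical inference; 
the function and data below are hypothetical.

```python
# Illustrative only: a far simpler stand-in for statistical imputation.
# Missing blood-alcohol concentration (BAC) values, represented here
# as None, are replaced with the mean of the observed values.

def mean_impute(values):
    """Replace None entries with the mean of the known entries."""
    known = [v for v in values if v is not None]
    if not known:
        return values  # nothing observed, nothing to infer from
    mean = sum(known) / len(known)
    return [mean if v is None else v for v in values]

bac = [0.00, 0.12, None, 0.06]  # one driver's BAC was never recorded
print(mean_impute(bac))
```

Even this toy version shows why imputed figures must be interpreted 
with care: the filled-in value reflects the observed cases, not the 
missing driver's actual test result.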

GAO's Mission:

The Government Accountability Office, the investigative arm of 
Congress, exists to support Congress in meeting its constitutional 
responsibilities and to help improve the performance and accountability 
of the federal government for the American people. GAO examines the use 
of public funds; evaluates federal programs and policies; and provides 
analyses, recommendations, and other assistance to help Congress make 
informed oversight, policy, and funding decisions. GAO's commitment to 
good government is reflected in its core values of accountability, 
integrity, and reliability.

Obtaining Copies of GAO Reports and Testimony:

The fastest and easiest way to obtain copies of GAO documents at no 
cost is through the Internet. GAO's Web site (www.gao.gov) contains 
abstracts and full-text files of current reports and testimony and an 
expanding archive of older products. The Web site features a search 
engine to help you locate documents using key words and phrases. You 
can print these documents in their entirety, including charts and other 
graphics.

Each day, GAO issues a list of newly released reports, testimony, and 
correspondence. GAO posts this list, known as "Today's Reports," on its 
Web site daily. The list contains links to the full-text document 
files. To have GAO e-mail this list to you every afternoon, go to 
www.gao.gov and select "Subscribe to e-mail alerts" under the "Order 
GAO Products" heading.

Order by Mail or Phone:

The first copy of each printed report is free. Additional copies are $2 
each. A check or money order should be made out to the Superintendent 
of Documents. GAO also accepts VISA and Mastercard. Orders for 100 or 
more copies mailed to a single address are discounted 25 percent. 
Orders should be sent to:

U.S. Government Accountability Office

441 G Street NW, Room LM

Washington, D.C. 20548:

To order by Phone:


Voice: (202) 512-6000:

TDD: (202) 512-2537:

Fax: (202) 512-6061:

To Report Fraud, Waste, and Abuse in Federal Programs:

Contact:

Web site: www.gao.gov/fraudnet/fraudnet.htm

E-mail: fraudnet@gao.gov

Automated answering system: (800) 424-5454 or (202) 512-7470:

Public Affairs:

Jeff Nelligan, managing director,

NelliganJ@gao.gov

(202) 512-4800

U.S. Government Accountability Office,

441 G Street NW, Room 7149

Washington, D.C. 20548: