This is the accessible text file for GAO report number GAO-06-102 
entitled 'Highway Safety: Further Opportunities Exist to Improve Data 
on Crashes Involving Commercial Motor Vehicles' which was released on 
November 18, 2005. 

This text file was formatted by the U.S. Government Accountability 
Office (GAO) to be accessible to users with visual impairments, as part 
of a longer term project to improve GAO products' accessibility. Every 
attempt has been made to maintain the structural and data integrity of 
the original printed product. Accessibility features, such as text 
descriptions of tables, consecutively numbered footnotes placed at the 
end of the file, and the text of agency comment letters, are provided 
but may not exactly duplicate the presentation or format of the printed 
version. The portable document format (PDF) file is an exact electronic 
replica of the printed version. We welcome your feedback. Please E-mail 
your comments regarding the contents or accessibility features of this 
document to Webmaster@gao.gov. 

This is a work of the U.S. government and is not subject to copyright 
protection in the United States. It may be reproduced and distributed 
in its entirety without further permission from GAO. Because this work 
may contain copyrighted images or other material, permission from the 
copyright holder may be necessary if you wish to reproduce this 
material separately. 

Report to Congressional Committees: 

November 2005: 

Highway Safety: 

Further Opportunities Exist to Improve Data on Crashes Involving 
Commercial Motor Vehicles: 

[Hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-06-102] 

GAO Highlights: 

Highlights of GAO-06-102, a report to congressional committees: 

Why GAO Did This Study: 

Large trucks make up 3 percent of the nation’s registered vehicles, but 
they were involved in 11 percent of all fatal crashes in 2003. To 
reduce the fatality rate, the Federal Motor Carrier Safety 
Administration (FMCSA) sets national safety goals and works in 
partnership with states to reach them. Crash data collected by states 
and submitted to FMCSA is key to these efforts, and to be fully useful, 
this data must be complete, timely, accurate, and collected in a 
consistent manner. GAO addressed (1) what is known about the quality of 
commercial motor vehicle crash data, and what states are doing to 
improve it, and (2) the results of FMCSA’s efforts to help states make 
improvements. 

What GAO Found: 

Overall, commercial motor vehicle crash data does not yet meet general 
data quality standards of completeness, timeliness, accuracy, and 
consistency. For example, FMCSA estimates that nearly one-third of 
commercial motor vehicle crashes that states are required to report to 
the federal government were not reported, and those that were reported 
were not always accurate, timely, or consistent. States are undertaking 
four types of activities to improve data quality, including analyzing 
existing data to identify problems and develop plans for addressing 
them, reducing backlogs of data that have not been entered into state-
level databases, developing and implementing electronic data systems, 
and providing training. As a result of these efforts, states have 
recently improved both the timeliness and the number of reportable 
crashes submitted to FMCSA. 

FMCSA has two main efforts to support states in improving their 
reporting of commercial motor vehicle crash information—a commercial 
vehicle crash data improvement program and a data quality rating 
system—and both appear to be beneficial. Through the data improvement 
program, FMCSA has provided nearly $21 million in discretionary grants 
to 34 states from 2002 through 2005. These grants have ranged from 
$2,000 to $2 million and have helped states conduct a variety of data 
improvement activities. GAO did not find problems with FMCSA’s 
oversight of the program, but we did note that FMCSA does not have 
formal guidelines for awarding grants to states. As state participation 
in the program increases, formal guidelines and systems would likely 
assist FMCSA in prioritizing states’ requests and ensuring consistency 
in grant awards. 

FMCSA’s second major effort, a system for rating states’ data quality, 
has also proven to be an important tool for states to use in improving 
their crash data. The ratings are presented in a map that rates each 
state’s data quality as “good,” “fair,” or “poor.” According to both 
FMCSA and state officials, the map and the underlying rating system 
serve as an incentive for states to improve their crash data. While the 
map is useful, GAO identified problems in the methodology used for 
developing ratings. These problems may lead to erroneous conclusions 
about the extent of improvements that have been made and may discourage 
states from continuing to devote attention and resources to areas 
needing improvement. 

FMCSA’s June 2005 data quality map showing each state’s overall data 
quality rating for crash and inspection data: 

[See PDF for image] 

[End of figure] 

What GAO Recommends: 

To ensure uniformity in awarding data improvement funds to states, 
FMCSA should establish specific guidelines for assessing and awarding 
state funding requests. 

Also, to address limitations in its data quality map, FMCSA should 
develop a plan for assessing and improving the map’s methodology, and 
it should provide a crash-specific data rating and disclose the map’s 
limitations on its Web site. 

The Department of Transportation agreed with our findings and 
recommendations in this report. 

www.gao.gov/cgi-bin/getrpt?GAO-06-102. 

To view the full product, including the scope and methodology, click on 
the link above. For more information, contact Katherine Siggerud, (202) 
512-6570, Siggerudk@gao.gov. 

[End of section] 

Contents: 

Letter: 

Results in Brief: 

Background: 

States Continue to Have Problems with CMV Crash Data Quality, but Are 
Pursuing a Variety of Improvement Efforts: 

Data Quality Problems Often Reflect Difficulties in Collection and 
Processing of Crash Reports: 

FMCSA's Efforts Have Contributed to CMV Data Quality Improvements: 

FMCSA's SaDIP Has Supported State Efforts to Improve Data Quality: 

Data Quality Map Has Spurred Improvements, but Limitations Curb Map's 
Continued Usefulness: 

Conclusions: 

Recommendations for Executive Action: 

Agency Comments and Our Evaluation: 

Appendixes: 

Appendix I: Objectives, Scope, and Methodology: 

Appendix II: State Safety Data Quality Map Limitations: 

Completeness: Overall Completeness is Based on Fatal Crashes Only: 

Timeliness: Timeliness is Not Based on All Reported Crashes: 

Accuracy: Accuracy is Based on Only One Variable: 

Appendix III: FMCSA Reportable Crash, CMV, and Carrier Identification 
Visor Cards: 

Appendix IV: SaDIP Grant and Cooperative Agreement Distribution by 
State: 

Appendix V: SaDIP Case Study States: 

Appendix VI: GAO Contact and Staff Acknowledgments: 

Tables: 

Table 1: Examples of Problems with Commercial Vehicle Crash Data 
Quality: 

Table 2: Annual Distribution of SaDIP funds: 

Table 3: Comparison of Data Quality Standards and State Safety Data 
Quality Map Measures: 

Table 4: Overall Data Quality Rating: 

Table 5: State Safety Data Quality Map Measures for CMV Crashes: 

Table 6: Distribution of SaDIP Grants by State: 

Table 7: Distribution of SaDIP Cooperative Agreements by State: 

Table 8: Georgia SaDIP Funding History: 

Table 9: Georgia Crash Data Quality Statistics (Percentages): 

Table 10: Minnesota SaDIP Funding History: 

Table 11: Minnesota Crash Data Quality Statistics (Percentages): 

Table 12: North Carolina SaDIP Funding History: 

Table 13: North Carolina Crash Data Quality Statistics (Percentages): 

Table 14: Ohio SaDIP Funding History: 

Table 15: Ohio Crash Data Quality Statistics (Percentages): 

Table 16: Oklahoma SaDIP Funding History: 

Table 17: Oklahoma Crash Data Quality Statistics (Percentages): 

Table 18: Washington SaDIP Funding History: 

Table 19: Washington Crash Data Quality Statistics (Percentages): 

Figures: 

Figure 1: Criteria for Selecting Truck and Bus Crashes to Report to 
FMCSA: 

Figure 2: State Participation in CMV Crash Data Improvement Efforts 
from Fiscal Year 2002 through Fiscal Year 2005: 

Figure 3: FMCSA Reportable Crashes: 

Figure 4: Reportable Commercial Motor Vehicle Configurations and Cargo 
Body Type: 

Figure 5: Responsible Carrier and Correct DOT Number Identification: 

Abbreviations: 

CMV: commercial motor vehicle: 

CVARS: Commercial Vehicle Analysis Reporting System: 

DOT: Department of Transportation: 

FARS: Fatality Analysis Reporting System: 

FMCSA: Federal Motor Carrier Safety Administration: 

GSA: General Services Administration: 

MCMIS: Motor Carrier Management Information System: 

MCSAP: Motor Carrier Safety Assistance Program: 

NHTSA: National Highway Traffic Safety Administration: 

SaDIP: Safety Data Improvement Program: 

SafeStat: Motor Carrier Safety Status Measurement System: 

SAFETEA-LU: Safe, Accountable, Flexible, Efficient Transportation 
Equity Act: A Legacy for Users: 

UMTRI: University of Michigan Transportation Research Institute: 

[See PDF for image] 

[End of figure] 

Letter November 18, 2005: 

The Honorable Christopher S. Bond: 
Chairman: 
The Honorable Patty Murray: 
Ranking Minority Member: 
Subcommittee on Transportation, Treasury, the Judiciary, Housing and 
Urban Development, and Related Agencies: 
Committee on Appropriations: 
United States Senate: 

The Honorable Joe Knollenberg: 
Chairman: 
The Honorable John W. Olver: 
Ranking Minority Member: 
Subcommittee on Transportation, Treasury, and Housing and Urban 
Development, the Judiciary, District of Columbia: 
Committee on Appropriations: 
House of Representatives: 

Large trucks make up only 3 percent of the nation's registered 
vehicles, but they were involved in 11 percent of all fatal crashes in 
2003, the last year for which complete data is available. That year, 
large trucks were involved in more than 430,000 crashes, killing 
approximately 5,000 people.[Footnote 1] In 1999, Congress established 
the Federal Motor Carrier Safety Administration (FMCSA) within the 
Department of Transportation[Footnote 2] and charged it with reducing 
crashes, injuries, and fatalities involving large trucks and buses. 
Currently, FMCSA has the goal of reducing commercial motor vehicle 
crash fatalities to 1.65 fatalities per 100 million miles of travel by 
2008. As of fiscal year 2003, the commercial motor vehicle fatality 
rate was 2.3 fatalities per 100 million miles traveled, the lowest 
recorded since the Department of Transportation initiated tracking in 
1975--but still 40 percent above the 2008 goal. 

FMCSA works in partnership with states to reach commercial motor 
vehicle safety goals. States are the gatekeepers for the collection and 
reporting of commercial motor vehicle crash information. They receive 
crash reports completed by law enforcement personnel in local 
jurisdictions, compile them, and then submit those reports to FMCSA. At 
the federal level, FMCSA manages a database that provides data 
used in rating motor carriers according to various safety indicators. 
Based on this rating, motor carriers are selected for safety 
inspections and reviews as part of FMCSA's enforcement efforts. While 
the data collected is primarily for federal use, states use the 
information to assist overall crash safety efforts and in setting 
commercial motor vehicle safety goals for themselves. Because the data 
is used in both federal and state decision-making on a variety of 
safety-related issues, it is important that it adequately meets data 
quality standards. 

To be useful to both federal and state decision-makers, crash data must 
be complete, timely, accurate, and collected in a consistent manner. 
However, there have been concerns about the quality of the information 
FMCSA and the states use to direct their efforts. Beginning with the 
Motor Carrier Safety Improvement Act of 1999, Congress directed the 
Department of Transportation to improve the collection and analysis of 
data on commercial motor vehicle crashes. This resulted in the creation 
of a commercial motor vehicle data improvement program.[Footnote 3] The 
program was reauthorized in 2005.[Footnote 4] Since 2002, about $21 
million has been awarded to states to improve their crash data quality, 
but data quality problems have persisted.[Footnote 5] In February 2004, 
the Department of Transportation Inspector General released a report 
discussing limitations of the commercial motor vehicle crash 
data.[Footnote 6] 

In a Senate report accompanying the fiscal year 2005 appropriation for 
the Department of Transportation,[Footnote 7] Congress asked that we 
review FMCSA's program for helping states improve their commercial 
motor vehicle crash data. The report directed us to describe the 
benefits obtained through the program, identify what can be done to 
improve the effectiveness of the program, and address concerns 
regarding crash data raised in the February 2004 Department of 
Transportation Inspector General's report. Accordingly, this report 
examines (1) what is known about the quality of commercial motor 
vehicle crash data and what states are doing to improve it, and (2) the 
results of FMCSA's efforts to facilitate the improvement of the quality 
of commercial motor vehicle crash data sent to the federal government. 

To describe the quality of commercial motor vehicle crash data, we 
reviewed a number of sources, including data reported by FMCSA and 
existing studies on the quality of commercial motor vehicle crash data. 
We interviewed officials from FMCSA, contractors that develop FMCSA 
crash data tools, and commercial vehicle industry researchers and 
public interest organizations to gain their perspective on commercial 
motor vehicle crash data quality. To provide information on states' 
efforts to improve commercial motor vehicle crash data, we reviewed 
grant documentation for 34 states that participated in FMCSA's Safety 
Data Improvement Program (SaDIP) as of September 2005.[Footnote 8] We 
also conducted case studies in six states that participated in the 
program.[Footnote 9] States were chosen based on a wide variety of 
factors including crash data quality and level of crash reporting. 
Additionally, we conducted phone interviews with states that did not 
participate, or are no longer participating, in the program.[Footnote 
10] To provide results of FMCSA efforts to facilitate the improvement 
of commercial motor vehicle crash data quality, we conducted interviews 
with officials from FMCSA on the administration and management of 
SaDIP. We also analyzed the guidance and support FMCSA provided 
to states and assessed FMCSA's role in coordinating commercial motor 
vehicle data quality initiatives. We reviewed studies conducted by the 
University of Michigan Transportation Research Institute and determined 
that the methodologies used in assessing the quality of the data states 
submit to FMCSA were sound and that the studies provided sufficiently 
reliable results for our purposes. Through site visits, a review of 
grant applications, interviews with relevant stakeholders and experts, 
and these studies, we were able to determine shortcomings in the 
reliability of FMCSA's commercial motor vehicle crash data. However, we 
determined that the data was sufficiently reliable for the purpose of 
case study selection. Our work was conducted from February 2005 through 
November 2005 in accordance with generally accepted government auditing 
standards. See appendix I for more details regarding our objectives, 
scope, and methodology. 

Results in Brief: 

Overall, commercial motor vehicle crash data does not yet meet general 
data quality standards of completeness, timeliness, accuracy, and 
consistency. For example, according to FMCSA, as of fiscal year 2004 
nearly one-third of commercial motor vehicle crashes that states are 
required to report to the federal government were not reported, and 
those that were reported were not always accurate, timely, or 
consistent. Data quality problems most often stem from errors or 
omissions either by law enforcement officers at the scene of a crash or 
in the processing of crash reports to a state level database. To 
address data quality problems, a number of states are undertaking four 
major types of data improvement activities: 

* Analyzing existing data to identify where problems are and to develop 
plans for addressing them; 

* Reducing backlogs of data that have not been entered into state-level 
databases in order to create more complete state crash files, through 
steps such as hiring contract employees; 

* Developing and implementing electronic data systems for collecting 
and processing crash information in a more timely, accurate, and 
consistent manner; and: 

* Providing training, such as educating law enforcement officers on the 
definitions and criteria for commercial motor vehicle crashes, to 
create more accurate and consistent data. 

These state efforts are resulting in some progress. Based on our 
analysis of FMCSA data, a greater number of crashes are being reported 
to FMCSA. Overall, the total number of commercial motor vehicle crashes 
reported to FMCSA increased by 59 percent between fiscal year 2000 and 
fiscal year 2004, while the length of time it takes states to report 
these crashes has decreased as well. 

FMCSA has two main efforts to support states in improving their 
reporting of commercial motor vehicle crash information--a commercial 
vehicle crash data improvement program and a data quality rating 
system--and both appear to be beneficial. Through the data improvement 
program, FMCSA has provided nearly $21 million in discretionary grants 
to 34 states between 2002 and 2005. These grants have ranged from 
$2,000 to $2 million and have helped states conduct all four data 
improvement activities previously described. The six states in our case 
studies generally improved their data quality, mainly through projects 
funded in whole or in part through the grant program. While we did not 
find problems with FMCSA's oversight of the program, we did note that 
FMCSA does not have formal guidelines for awarding funds to states. 
Because these grants are discretionary, and because more states are 
expected to participate in the program in the future, having formal 
guidelines and systems in place would likely assist in prioritizing 
states' requests and ensuring consistency in awarding funds. FMCSA's 
second major effort, the State Safety Data Quality map, has proven to 
be an important tool for states to use in improving their crash data as 
well. This map, created by FMCSA and the Volpe National Transportation 
Systems Center, is a color-coded display depicting the overall data 
quality for each state in one of three rating categories--"good" 
(green), "fair" (yellow), or "poor" (red). According to both FMCSA and 
state officials, the map and the underlying rating system serve as an 
incentive for states to make improvements to their crash data. Despite 
the map's utility thus far, we identified potential problems both in 
the methodology used for developing ratings and the risk of drawing 
erroneous conclusions from the map. One example of a problem with the 
current methodology is that the overall ratings combine information 
about crashes with information stemming from FMCSA's inspections of 
motor carriers. Combining ratings for both crash and inspection data 
quality tends to make it difficult to determine how states are doing 
specifically with their crash data. In addition, some states with a 
"good" rating in completeness are not reporting all crashes to FMCSA. 
Rating states as "good" when in fact they have problems may discourage 
states from continuing to devote attention and resources to areas 
needing improvement and possibly misdirect program efforts. FMCSA is 
aware of many of the limitations of the map, but has not yet developed 
and implemented a formal plan to improve it. Addressing these 
limitations will strengthen the data quality map as a tool for 
improving commercial motor vehicle crash data. 

To ensure that FMCSA is able to target limited funds as effectively as 
possible, we are recommending that FMCSA create specific guidelines and 
criteria for awarding commercial motor vehicle crash data improvement 
funding. We are also recommending that FMCSA develop a plan to improve 
the data quality map, including assessing the methodology for 
developing ratings, providing results in greater detail, and 
documenting any limitations associated with the map. These enhancements 
will provide users with a more useful tool to view the condition and 
progress made in states' commercial motor vehicle crash data. 

Background: 

FMCSA was established as a separate administration within the U.S. 
Department of Transportation (DOT) on January 1, 2000, pursuant to the 
Motor Carrier Safety Improvement Act of 1999. FMCSA issues and enforces 
the federal motor carrier safety regulations that govern many aspects 
of specified commercial trucking and bus operations, including the 
interstate operation and maintenance of commercial motor vehicles 
(CMV). Regulations promulgated by FMCSA specify requirements that must 
be met by drivers of these vehicles. FMCSA conducts compliance 
reviews[Footnote 11] of truck and bus companies, and performs safety 
audits of new entrants[Footnote 12] into the industry. In addition, 
FMCSA trains inspectors to conduct safety audits, while states and 
local authorities are responsible for conducting the inspections and 
submitting the results to FMCSA.[Footnote 13] This partnership between 
FMCSA and the states annually results in about 3 million truck and bus 
inspections, 7,000 to 13,000 compliance reviews, and more than 19,000 
new-entrant safety audits. 

CMV crash data is key to FMCSA's efforts. CMV crash data is collected 
by local law enforcement, sent to the state, and then processed and 
uploaded by the state into FMCSA's data system. FMCSA maintains a 
database management system in each state so that states can submit 
crash reports to FMCSA's central data system, the Motor Carrier Management 
Information System (MCMIS). FMCSA uses the information in its Motor 
Carrier Safety Status Measurement System, also known as SafeStat, to 
target carriers for compliance reviews to ensure that they are 
following safety regulations.[Footnote 14] SafeStat uses a variety of 
data to rank carrier safety, but it places the heaviest weight on crash 
data. 

Federal and state data quality guidelines call for CMV crash data to 
meet four basic quality standards:[Footnote 15] 

* Completeness: To support adequate decision-making for identifying 
problems and developing appropriate countermeasures, data should be 
collected for all reportable CMV crashes in the state, and data on all 
appropriate crash variables, such as the carrier's identification 
number, should be submitted to FMCSA. 

* Timeliness: To make decisions about current safety problems, identify 
trends, or target carriers that pose immediate threats to highway 
safety, CMV crash data should be available for state and federal 
analytical purposes within a useful timeframe. 

* Accuracy: To adequately assess CMV crash problems and target the 
appropriate carriers for enforcement, all data within reportable CMV 
crash records should be accurate and reliable. 

* Consistency: To target carriers nationwide and to compare state 
results, CMV crash data should be collected uniformly using the same 
standards and applying the same criteria across jurisdictions. 

FMCSA has provided more specific guidelines and criteria for meeting 
each of these standards. On timeliness, for example, FMCSA calls for 
states to submit all CMV crash data to MCMIS within 90 days of when the 
crash occurs. To facilitate complete reporting of CMV crashes, FMCSA 
recommends data elements, such as the identity of the carrier, vehicle 
configuration, and cargo body type, as the minimum information to be 
collected in order to have complete information on CMV 
crashes.[Footnote 16] FMCSA has also created criteria to assist in 
consistent reporting of crash information. These published criteria are 
used for identifying reportable CMV crashes to be submitted to MCMIS 
(see fig. 1). 

Figure 1: Criteria for Selecting Truck and Bus Crashes to Report to 
FMCSA: 

[See PDF for image] 

Note: Exceptions include crashes that involve: 1) a personally-owned 
truck or passenger vehicle meant for personal use only as the sole 
vehicle meeting the criteria above, or 2) a driver with a disease 
condition (e.g., stroke, heart attack, diabetic coma or epileptic 
seizure) and no other injury or damage occurs, or 3) deliberate intent 
(suicide, self-inflicted injury, homicide, etc.), with no unintentional 
injury or damage. Of the 430,000 CMV crashes that occur each year, 
about 150,000 meet reporting criteria. 

[End of figure] 

With the Motor Carrier Safety Improvement Act of 1999, Congress 
established a program to improve the collection and analysis of CMV 
crash data.[Footnote 17] This resulted in the creation of the 
Commercial Motor Vehicle Analysis Reporting System (CVARS). CVARS, 
which is now known as SaDIP, was originally intended to be a standalone 
data collection system. After determining that a separate system would 
duplicate existing efforts, however, FMCSA decided to use SaDIP as a 
federal funding tool to support state efforts to collect and report CMV 
crash data.[Footnote 18] 

Besides changes in scope, SaDIP has also changed greatly in how it is 
administered. Since fiscal year 2001, FMCSA has received 
funding from the Congress to implement this program. As a new agency, 
FMCSA did not have the appropriate contracting infrastructure in place 
to administer the funds. Therefore it transferred funds and the 
administrative duties of the program to the National Highway Traffic 
Safety Administration (NHTSA), which awarded several grants to states 
in fiscal year 2002. NHTSA also provided some of the funding to the 
General Services Administration (GSA) to enter into cooperative 
agreements with several states to fund multi-year data improvement 
projects. In fiscal year 2003, however, FMCSA assumed responsibility 
for all oversight of SaDIP funding. 

FMCSA provides SaDIP funds to states through two different methods. 
States can receive 1-year grants to fund specific projects, or they can 
enter into multi-year cooperative agreements in order to fund multiple 
efforts that are necessary to identify and reduce data quality 
problems. States can receive both of these types of funds from FMCSA 
and can apply for grant funding multiple times. SaDIP funding is 
discretionary in nature, allowing states to request the amount of 
funding they need to conduct data improvement projects. Since 2002, 
SaDIP funding has been provided to 34 states, and as of September 2005, 
awards ranged from $2,000 to $2 million, totaling approximately $21 
million. (See app. IV for funding distribution by state.) 

In February 2004, the Department of Transportation Inspector General 
issued a report on SafeStat, the system FMCSA uses to target its 
compliance reviews.[Footnote 19] The report identified a number of 
problems with SafeStat, many of which stemmed from the quality of the 
crash data being used in calculating carrier ratings. The State Safety 
Data Quality map, discussed later in the report, was developed in 
response to the recommendations in the 2004 report. 

States Continue to Have Problems with CMV Crash Data Quality, but Are 
Pursuing a Variety of Improvement Efforts: 

When measured against generally accepted standards for completeness, 
timeliness, accuracy, and consistency, states continue to have 
challenges with the quality of their CMV crash data. Many of these 
challenges are based in the collection and processing of data at the 
local level, and often reflect broader crash data quality problems for 
all types of vehicle crashes. To address remaining limitations, and 
improve the completeness, timeliness, accuracy, and consistency of the 
data, states are undertaking four main types of efforts: analyzing data 
to identify problem areas, eliminating backlogs of data not yet 
entered, creating electronic systems to expedite data entry, and 
training law enforcement officers in ways to improve the data they 
submit. 

Challenges Remain in Meeting Data Quality Standards: 

The completeness, timeliness, accuracy, and consistency of CMV crash 
data is currently not meeting generally accepted data quality 
standards. Table 1 provides examples of some of the overall problems we 
(and others) have identified. Appendix V, which contains summaries of 
the six states we visited, provides more specific examples. 

Table 1: Examples of Problems with Commercial Vehicle Crash Data 
Quality: 

Completeness; 
As of 2004, FMCSA estimates that about one-third of the reportable CMV 
crashes are not being submitted to its data system--the Motor 
Carrier Management Information System (MCMIS).[A] In addition, FMCSA 
estimates 20 percent of nonfatal crashes are not being reported. 
Further, studies indicate that even those crashes reported to MCMIS 
often have missing data.[B] 

Timeliness; 
The average length of time from when a crash occurs to when the crash 
data is uploaded to MCMIS is 99 days--9 days over the required time 
limit. While this is not far from the goal, there is substantial 
variation in timeliness among states. For example, our analysis of 
state CMV crash data shows timeliness ranges from 13 days to 339 
days.[C]. 

Accuracy; 
FMCSA assesses accuracy by determining the number and percentage of 
interstate crashes uploaded to MCMIS without enough information to 
determine a carrier's DOT number (known as a non-match). As of fiscal 
year 2004, 15 percent of CMV crash records in MCMIS cannot be matched 
to a carrier's DOT number.[D]. 

Consistency; 
According to an analysis by FMCSA, 33 of 50 states have crash reports 
that do not adequately follow the criteria for reporting commercial 
motor vehicle crashes to FMCSA.[E]. 

[A] MCMIS contains information on the safety fitness of commercial 
motor carriers and hazardous material shippers subject to the Federal 
Motor Carrier Safety Regulations and the Hazardous Materials 
Regulations. States upload their crash information to MCMIS via FMCSA's 
SafetyNet. 

[B] Source: FMCSA: 

[C] Source: Analysis of MCMIS data from fiscal year 2000 through fiscal 
year 2004. 

[D] Source: Analysis of fiscal year 2004 MCMIS data. 

[E] Source: Comparison of State Crash Reports with SafetyNet Selection 
Criteria and Key Data Secondary Elements Chart, FMCSA, August 12, 2005. 

[End of table] 

Data Quality Problems Often Reflect Difficulties in Collection and 
Processing of Crash Reports: 

CMV crash data quality problems often stem from issues that occur when 
data is initially collected at the scene of a crash and later processed 
through the state. We reviewed reports on crash data quality including 
individual state reviews conducted by the University of Michigan 
Transportation Research Institute (UMTRI).[Footnote 20] We also 
discussed these matters with state officials. We identified two key 
causes of poor data quality: (1) problems in interpreting how to fill 
out crash reports at the scene and (2) crash report processing issues 
ranging from competing priorities at the local level to complex routing 
of crash reports. 

Misinterpretation of Criteria and Definitions by Officers Filling Out 
Crash Reports: 

According to studies and our discussions with state officials, data 
collected at the scene of a crash can be flawed because of law 
enforcement misinterpretation of reporting criteria and 
definitions.[Footnote 21] Misinterpretation can occur for several 
reasons, such as infrequent opportunities for officers to receive 
training on filling out crash reports or unfamiliarity resulting from 
infrequent occurrences of CMV crashes in an officer's jurisdiction. 
Below are common problems with properly reporting CMV crashes:[Footnote 
22] 

Identifying reportable crashes. While crashes that result in a fatality 
are easily identifiable as reportable crashes, tow-away or injury 
crashes are more difficult to identify. For example, UMTRI's review of 
eight states showed that five of those states experienced problems with 
reporting crashes that did not involve a fatality. According to UMTRI, 
this is likely due to a lack of understanding of criteria for reporting 
CMV crashes to FMCSA (see fig. 1). 

Identifying reportable commercial motor vehicles. FMCSA is responsible 
for enforcing safety regulations for interstate carriers, but collects 
crash information on a variety of CMVs that meet certain criteria. 
According to UMTRI reports, smaller trucks were less likely to be 
reported because law enforcement officials are less likely to recognize 
them as qualifying CMVs. Further, law enforcement officials may be 
confused about whether to report interstate and intrastate carriers to 
FMCSA. UMTRI's review of eight states showed that five of those states 
encountered problems in identifying the appropriate vehicle type for 
reporting to FMCSA (see fig. 1). 

Crash Report Processing Issues: 

Several other issues relate to the processing of crash reports. These 
tend to fall into the following three main categories: 

Competing priorities at the officer level. Data collection is 
complicated because at the scene of a crash, an officer's first 
priority is to ensure the safety of those involved; data collection is 
often a lesser concern. Only after the situation has been stabilized 
can the officer fill out the crash report. Competing priorities can 
make it difficult for law enforcement to adequately complete a crash 
report. According to a report by NHTSA, crash reporting is often a low 
priority when traffic safety data is not perceived as relevant to the 
work of law enforcement. This was confirmed in states we visited. For 
example, one state official with whom we spoke said that submitting 
crash data is often dependent on how much priority the local police 
chief places on data quality. 

Manual crash report forms. Typically, law enforcement officers complete 
handwritten crash reports, which are then manually submitted for data 
processing at a local or state agency. According to an FMCSA official, 
nearly all states use manual crash reporting forms to some extent. 
Completing crash reports by hand and manually processing the 
information has the potential to lead to errors in the data. Data 
quality may be further compromised by the use of a supplemental CMV 
crash report. States may have a general crash report for use in all 
crashes and a supplemental CMV crash report to be completed if a CMV is 
involved. For example, four of the six case study states we reviewed 
used a supplemental form to report a CMV accident. When data has to be 
captured in a separate form, law enforcement officers may not always 
complete the form, and sometimes it may be separated from the original 
crash report or even lost. UMTRI found in one state that use of the 
supplemental crash report may be leading to CMV crashes not being 
submitted to FMCSA. 

Complex processing. Even with a correctly completed crash report that 
is properly identified as a CMV crash, states may have a complicated 
process in place to route CMV crash data to FMCSA and this may 
contribute to lengthy delays. UMTRI state assessments found that 
processing issues in some states led to delays and resulted in 
incomplete submission of crash data. One official told us that all law 
enforcement agencies in his state send crash reports to the state 
patrol office where the report is scanned into electronic format. The 
electronic version of the report is then sent to the state department 
of transportation where additional location data is entered into the 
report. From here, the report enters the state database and then is 
periodically uploaded to FMCSA. This process can take significant time, 
especially when the original crash report is missing information. 

States Are Pursuing a Variety of Efforts to Address Data Quality 
Issues: 

States are undertaking a variety of projects to improve the 
completeness, timeliness, accuracy, and consistency of CMV crash data. 
These projects encompass four general types of efforts: research and 
analysis, reduction of reporting backlogs, electronic processing of 
crash records, and training. Figure 2 shows state participation in each 
of these four efforts. 

Figure 2: State Participation in CMV Crash Data Improvement Efforts 
from Fiscal Year 2002 through Fiscal Year 2005: 

[See PDF for image] 

Note: Data available for the 34 states that have participated in 
FMCSA's data improvement program. Several states are participating in 
more than one effort. 

[End of figure] 

Data Analysis to Identify Problems: 

Eleven of the 34 states have research projects to evaluate crash data 
collection and reporting practices.[Footnote 23] Assessing data 
collection and reporting processes allows states to determine how well 
their data meet each of the data quality standards of completeness, 
timeliness, accuracy, and consistency, providing them with a useful 
tool for determining where they are going to concentrate their 
improvement efforts. 

During our case studies, we found examples of 
states conducting data analysis to identify their CMV crash data 
collection and reporting problems. (See app. V for more information on 
our case study states.) For example, one state conducted an analysis to 
determine why approximately half of the eligible crashes in the state 
crash file were not being reported to FMCSA. The analysis found that 
the state's crash file was not in the correct format to transfer to 
FMCSA. In addition, frequent errors in the state or zip code fields, 
and incomplete or inaccurate data, were leading to problems. A 
federally funded study of the state's crash data found that 
approximately 50 percent of the state's crash reports for one year 
(2003) were not reported to the federal crash file because of problems 
at the state level in coding and preparing crash data for transfer to 
FMCSA. Another state conducted a traffic records assessment that found 
the major deficiency in the state's crash file was the lack of timely 
data. It also identified the lack of effective quality controls during 
data entry and non-reporting of eligible crashes as deficiencies. 
Following the state's assessment, researchers made recommendations to 
help the state prioritize projects, including planning for the eventual 
implementation of electronic crash data collection. 

Reduction of Crash Database Backlogs: 

Fourteen of the 34 states have projects to reduce the backlog of crash 
reports that need to be entered into state crash files. In most states, 
until crashes are included in the state file, they are not reported to 
FMCSA. Hence, backlog reduction is essential for creating a complete 
state crash file and for providing FMCSA with complete data about CMV 
crashes. A complete data file is critical for performing crash and 
trend analysis, for making informed policy decisions, and for gaining a 
better understanding of CMV safety issues. 

Four of our six case study states are among the 14 participating in 
backlog reduction projects. In these four states, temporary employees 
were hired 
eliminate data backlogs. For example, in one state, contract employees 
were working to reduce a backlog of 600,000 crash reports. Data entry 
work began in July 2004 and state officials expect it to be completed 
by September 2005. Another state used temporary employees to eliminate 
a backlog of approximately 2 million crash records. This effort began 
in 2002, and as of June 2005, the data backlog had been completely 
eliminated and the state was processing crash reports within 3 days of 
receipt. 

Using Electronic Systems to Expedite CMV Crash Data: 

Twenty-seven of the 34 states have projects to collect or input CMV 
crash data through electronic systems. Electronic reporting allows for 
more accurate and timely transfer of crash data to state and federal 
crash files. Electronic transfer of crash data can reduce duplication 
and data entry error, because paper-based data collection systems are 
subject to human error and time delays. While these projects enhance 
the quality of CMV crash data, they are often large in scope and 
require law enforcement agencies to purchase hardware. Uniform 
electronic crash reporting systems are heavily dependent on acceptance 
from all stakeholders, and some jurisdictions may already have their 
own systems. 

All six of our case study states are among the 27 participating in 
electronic crash record projects. For example, one state has developed 
software that facilitates 
a nightly transfer of CMV crash records from the Division of Driver and 
Vehicle Services to the state patrol agency. The state patrol agency 
then submits the crash reports to FMCSA. The state is also working to 
provide computer hardware and Internet access so that state patrol 
officers can complete crash reports online. Another state is developing 
electronic crash reporting capabilities using handheld computers, which 
will eliminate handwritten crash reports and the need to manually enter 
crash report data into an electronic system. 

Improving Law Enforcement Training: 

States are providing training to law enforcement officers in CMV crash 
data collection. Training for law enforcement officers on how to 
correctly identify a CMV and how to accurately complete a police 
accident report can help to improve the completeness, accuracy, and 
consistency of CMV crash data since many of the mistakes in reporting 
are made at the crash site.[Footnote 24] Training is also an 
opportunity to highlight the link between data collection and its end 
use in planning and prevention. 

Seventeen of the 34 states have law enforcement training projects, and 
all six of our case study states had some form of CMV training. In 
addition, all case study states we 
visited recognized problems with the quality of CMV crash data 
resulting from issues at the collection point. For example, one state 
provides ongoing CMV training to teach officers to properly identify 
and report commercial vehicle crashes. This state also tries to show 
law enforcement officers how the data is used in planning and 
prevention in order to demonstrate the significance of good data. 
However, state officials told us that participation in the training by 
local law enforcement agencies has been limited. Another state is 
planning to develop a training video and visor cards[Footnote 25] for 
statewide dissemination. The visor cards will assist officers in CMV 
identification and provide information that will help police properly 
complete all the necessary crash reporting documentation. 

These state efforts are resulting in some progress. An analysis of 
FMCSA data shows that more crashes have been reported to FMCSA since 
2000. Overall, the total number of commercial motor vehicle crashes 
reported to FMCSA increased by 59 percent between fiscal year 2000 and 
fiscal year 2004, while the length of time it takes states to report 
these crashes to FMCSA has decreased. 

FMCSA's Efforts Have Contributed to CMV Data Quality Improvements: 

FMCSA's efforts appear to have had a positive impact on the quality of 
states' CMV crash data. Specifically, two efforts, the 
Safety Data Improvement Program (SaDIP) and the State Safety Data 
Quality map, have contributed to the changes. SaDIP funding has allowed 
states to improve data completeness, timeliness, accuracy, and 
consistency by supporting the implementation of new activities or 
increasing the scope and timeliness of existing projects. While FMCSA 
has provided adequate administration and oversight of SaDIP, the 
management of the program with regard to awarding grants raises some 
concerns. There are no formal guidelines in place for awarding funding 
to the states, and while this has not yet presented a problem, it may 
in the future. Funding continues to be made available, and more states 
continue to request funds for new projects. The State Safety Data 
Quality map is an evaluation tool that provides ratings for states' 
crash and inspection data quality, and displays the ratings so that a 
state's performance relative to other states is apparent. The map, 
which is publicized on FMCSA's Analysis and Information (A&I) Online 
Web site, allows state officials and the public to view the status of 
states' CMV crash and inspection data quality.[Footnote 26] The map is 
being used as an indicator of states' progress in improving the 
completeness, timeliness, and accuracy[Footnote 27] of crash data, and, 
by virtue of its public nature, it serves as an incentive for states to 
improve their crash information. However, the methodology used for the 
ratings 
has limitations that may hinder the map's effectiveness in monitoring 
the status of CMV crash data quality. 

FMCSA's SaDIP Has Supported State Efforts to Improve Data Quality: 

As of July 2005, FMCSA had provided about $21 million to 34 states in 
order to assist in improving the quality of CMV crash data.[Footnote 
28] Several states have received funding more than once, and in fiscal 
year 2005, FMCSA made available to states almost twice as much money as 
it had at the beginning of the program ($3.9 million in fiscal year 
2000 versus $7.3 million in fiscal year 2005). 
Additionally, recently passed transportation legislation authorizes a 
total of $11 million to be used for SaDIP funding for fiscal years 2006-
2009.[Footnote 29] The states that have participated in SaDIP account 
for about two-thirds of all CMV crashes reported to FMCSA's MCMIS 
database that occurred between April 1, 2004, and March 31, 2005, and 
70 percent of the 2003 fatalities reported to MCMIS. FMCSA's goal is to 
provide funding for projects that will have the largest impact on 
improving state data describing CMV crashes. FMCSA also encourages 
states that are participating in data improvement projects that are 
already funded to apply for SaDIP funds in order to provide additional 
assistance to these larger efforts. (See table 2 for the annual 
distribution of SaDIP funds and app. IV for fund distribution by 
state.) 

Table 2: Annual Distribution of SaDIP funds: 

Fiscal year: 2000; 
Funds budgeted for SaDIP: $3,865,000; 
Funds awarded to states: $0. 

Fiscal year: 2001; 
Funds budgeted for SaDIP: $2,748,000; 
Funds awarded to states: $0. 

Fiscal year: 2002; 
Funds budgeted for SaDIP: $5,462,614; 
Funds awarded to states: $5,207,014. 

Fiscal year: 2003; 
Funds budgeted for SaDIP: $4,967,500; 
Funds awarded to states: $854,732. 

Fiscal year: 2004; 
Funds budgeted for SaDIP: $4,913,990; 
Funds awarded to states: $5,449,467. 

Fiscal year: 2005; 
Funds budgeted for SaDIP: $7,340,800; 
Funds awarded to states: $9,180,181. 

Total; 
Funds budgeted for SaDIP: $29,297,904; 
Funds awarded to states: $20,691,394. 

Source: FMCSA. 

Note: These data are current as of September 2005. In addition to funds 
provided directly to states, FMCSA provided $1,200,000 to contractors 
to assist states in improving data quality, bringing the total amount 
spent on state data improvement efforts to $21,891,394. 

[End of table] 

SaDIP Has Yielded Several Positive Results: 

States have used the SaDIP funds to conduct a variety of projects, 
including those discussed earlier in this report. These funds have 
benefited states in several different ways, including increased focus 
on CMV data quality and advancement and expansion of ongoing broader 
data quality projects. 

SaDIP increased national attention to the problems associated with the 
quality of CMV crash data reported to the federal government. Several 
state officials we interviewed stated that they have noticed an 
increase in the amount of focus given to CMV crash data issues. For 
example, presentations made by FMCSA at several national conferences 
and workshops have highlighted the importance of data quality, and 
informed states of the various types of assistance, such as training 
and funding, that are available to them. During our case studies, we 
were told the following: 

* One state official said that by providing funds to be used for 
specific purposes, SaDIP had the effect of focusing attention on data 
quality improvement. SaDIP has sustained a high level of interest in 
data quality and has been a catalyst to improving traffic records 
coordination across state agencies. 

* Another state official said SaDIP helped to improve communication 
between the state patrol and the state's department of vehicle 
services. Because grants were provided to both agencies, analysts at 
the agencies are working together and have a better understanding of 
each other's data needs and share access to their respective databases. 

In half of our case study states, SaDIP has also allowed officials to 
expedite data improvement projects that were already planned for 
implementation. 

* One state official told us that the state had been considering a plan 
to improve traffic safety data, including CMV crash data, and SaDIP 
funds provided it with the means to implement this plan. The long-term 
funding provided by the SaDIP cooperative agreement to support a full 
time employee was also a crucial element in gaining support from state 
decision-makers. 

* One state official said the SaDIP grant made it significantly easier 
to prioritize and expedite a data project that the state was already 
considering implementing. The additional funding allowed the state to 
specifically address CMV crash data and use other funding to continue 
to address broader crash data issues. 

Finally, SaDIP has allowed states to increase the scope of ongoing 
projects or develop new initiatives. Two of our six case study states 
have used SaDIP funds in addition to other resources to develop 
comprehensive and long-term data quality initiatives to address the 
completeness, timeliness, accuracy, and consistency of the CMV crash 
data they report to FMCSA. 

* Officials in one state told us that SaDIP funding supports elements 
of the state's electronic traffic information processing initiative. 
SaDIP funding has allowed the state to resolve a considerable backlog 
in crash records, creating a more complete crash file, and work has 
begun to develop a new electronic crash report that will be used--first 
by the state police--and then by local law enforcement in the rest of 
the state. Until this backlog was resolved, the state was unable to 
fully implement the CMV data component of its electronic traffic 
information processing initiative; resolving it has given the state 
more timely, accurate, and consistent crash data. 

* SaDIP funding has helped another state develop an electronic crash 
report and traffic records system. As previously stated, this will 
allow the state to have more timely, accurate, and consistent crash 
data. The state's goal is to develop the new system by 2008 and the 
state is evaluating and using multiple sources of funding to achieve 
this goal. Early in the process, the state made a strategic decision to 
target all crash data, not just commercial vehicle data. While the 
state uses SaDIP to develop electronic crash reporting software, SaDIP 
funding has also been leveraged to help the state accomplish its data 
quality improvement goals more quickly. For example, the new crash 
reporting capabilities developed through the SaDIP grant allow for much 
better crash analysis and targeting of safety efforts along major 
commercial vehicle corridors in the state. 

FMCSA Has Made Improvements in the Administration of SaDIP; However, 
Management of Grant Awards Raises Concerns: 

SaDIP has evolved over time and FMCSA has made several efforts to 
improve the administration of the program. For example, beginning in 
fiscal year 2006, FMCSA will be implementing a new application package 
for states to use in applying for SaDIP. While SaDIP had some 
preexisting application requirements, the new package provides a 
uniform application for all states and, for the first time, requires 
states to submit quantifiable project objectives and program measures. 
This will allow FMCSA to begin to measure the effectiveness of state 
improvement efforts. 
Additionally, SaDIP will be posted on Grants.gov ([Hyperlink, 
http://www.grants.gov]) in 2006.[Footnote 30] This federal website 
provides a single source for grant applicants to electronically find 
and apply for federal funds. FMCSA will also be updating its State 
Program Manager's manual to include guidelines for the roles and 
responsibilities of state program managers in administering SaDIP. 
Finally, FMCSA will be issuing funds at designated times during the 
year. Currently, funds are awarded to states on a rolling basis, but 
beginning with the implementation of the new application package the 
funds will be awarded on specific dates to all applicants. This is 
expected to improve the program's organization and FMCSA's ability to 
keep track of grant progress as the number of program participants 
increases. Throughout these administrative changes, FMCSA has 
maintained sufficient oversight of the states participating in SaDIP. 
FMCSA has contracted with a company that is responsible for monitoring 
SaDIP participant states, and ensuring that they are submitting 
quarterly progress reports containing sufficient detail to FMCSA. This 
contractor also has regular conversations with the FMCSA Division 
Administrators in the SaDIP states, and it maintains copies of SaDIP-
related paperwork. 

While these efforts are positive steps, we have concerns with FMCSA's 
lack of guidelines for awarding funds to states. FMCSA has not yet 
established formal guidelines for determining how much money a state 
should receive, or if the state should receive the funds in the form of 
a grant or a cooperative agreement. Since the beginning of the program, 
funds awarded to states have ranged from $2,000 to $2 million for 
projects ranging from specific activities to broader efforts that span 
three or more years. (See app. IV for funds awarded by state.) These 
awards have taken the form of both grants and cooperative agreements 
between the state and the federal government. Currently, these 
decisions are made informally, on a state-by-state basis, by a small 
review panel. 
The state's application, data quality history, discussions with the 
state's FMCSA division administrator, and any other relevant and 
available data on the state are consulted when making funding awards. 

While the absence of guidelines has not proven to be problematic to 
date, having formal guidelines will better ensure consistency and 
discipline in managing the grant program among states, particularly as 
states' needs become more targeted or more states decide to participate 
in the program. It will also add integrity to the grant management 
system and assist in providing better administration and oversight of 
SaDIP projects. For example, such guidelines would likely allow FMCSA 
to better assess and prioritize states' funding requests, including 
more formally considering whether proposed activities adequately 
address problems states identify in the proposal, whether the amount of 
funding requested is appropriate for the proposed activities, and 
whether multiple-year versus one-year funding is appropriate. Further, 
formal guidelines would provide a more structured framework to evaluate 
the effectiveness of different SaDIP project activities and assist in 
guiding future state improvement efforts. 

Data Quality Map Has Spurred Improvements, but Limitations Curb Map's 
Continued Usefulness: 

The State Safety Data Quality map has encouraged states to improve 
their CMV data quality, but limitations exist that may hinder the map's 
usefulness as a tool for monitoring and measuring CMV crash data 
quality. The State Safety Data Quality map was created by FMCSA and the 
Volpe National Transportation Systems Center principally to provide 
context for both crash and inspection data used in the SafeStat system; 
however, it has evolved into a tool to evaluate state-reported crash 
and inspection data.[Footnote 31] It is based on a system that rates 
crash and inspection data quality--completeness, timeliness, and 
accuracy--as "good," "fair," or "poor."[Footnote 32] The map has proven 
to be a major incentive for states to initiate CMV crash data 
improvements, for gaining support in implementing these improvements, 
and as a tool to monitor CMV crash data quality. However, we have 
identified some important limitations that can affect the data quality 
map's future usefulness. 

The State Safety Data Quality Map Is an Incentive for States to Make 
CMV Crash Data Improvements: 

Since the State Safety Data Quality map is accessible by the public and 
presents data quality ratings in a simplified form, the map is a 
motivator for states to improve their CMV crash data. FMCSA officials 
told us the map has been very influential in encouraging these 
improvements. Corroborating comments came from the state level, where 
many of the officials in case 
study states provided anecdotes of how the data quality map served as 
an impetus for initiating improvements and gaining support.[Footnote 
33] Here are examples of how the map assisted states in initiating or 
expediting improvements in CMV crash information. 

* One state includes the data quality map ratings in weekly status 
reports to agency heads and in reports to the governor. An official in 
this state said that once the data quality map was shown to the 
governor, it raised the importance of improving CMV crash data. This 
state now posts the data quality map on state agency Web sites and uses 
it as a tool to compare its data quality efforts to the rest of the 
nation. 

* One state official told us that when agency leaders understand that 
their state has a "poor" rating, they are likely to make data 
improvement a priority and focus attention and resources on the issue. 

* Another official said that the map helped officials to "see the 
light." In this state, the data quality map helped initiate a process 
that led to improved communication and coordination for data quality at 
the highest levels for all safety and security projects within the 
state. 

The Data Quality Map Is Important in Addressing Data Quality: 

FMCSA and states we reviewed use the State Safety Data Quality map to 
measure states' CMV data quality and progress being made in their data 
improvement efforts. Our review indicated that although the map was 
created as a tool for providing context to FMCSA's SafeStat system, 
both FMCSA and the states were using it specifically for monitoring the 
status of CMV crash data quality. State officials were also using it to 
identify data quality problems and assist states in targeting state 
improvement efforts. Here are examples from the states we contacted: 

* In one state an official reported that it was very helpful for law 
enforcement officials and department heads to instantly recognize their 
state's data quality status on the map. 

* In another state, officials said they knew they had a data quality 
problem, but were unable to identify the specific issue. The data 
quality map indicated that the state's accuracy was poor, and the 
problem was specifically with matching carriers in the state database 
with crashes in MCMIS. 

* In another state, officials said the data quality map focused 
attention on the state's data quality spending and results, and it 
helped state officials re-prioritize spending. 

Both FMCSA and state officials in many of the states we spoke with 
recognized the map as an important tool in measuring progress in their 
crash data improvement efforts. Further, FMCSA officials said the map 
was used as an assessment of a state's success in the SaDIP program. 
While officials also said they reviewed data in the MCMIS system to 
monitor state progress, an important measure of success in the program 
was a state's status in the State Safety Data Quality map and the 
underlying CMV crash measures. 

The State Safety Data Quality Map Has Limitations That May Affect Its 
Effectiveness as a Tool for Monitoring and Measuring CMV Crash Data 
Quality: 

Given the importance that FMCSA and the states attach to the data 
quality map, it appears to be a good first step in monitoring states' 
data quality. However, we found limitations with many features of the 
map that diminish its usefulness as an effective tool for specifically 
monitoring and measuring state progress in their improvement efforts. 
Specifically, we identified several limitations, which should be 
addressed, with the methodology used to develop the data quality map's 
ratings, as well as with the measures themselves. Below are some of the 
key limitations; appendix II provides a more detailed discussion. 

* The data quality map measures used to calculate the completeness, 
timeliness, and accuracy of CMV crash data fall short of providing a 
complete measure of CMV crash data quality. While each data quality 
measure has some limitations, one key measure--completeness--has a 
number of difficulties. Most importantly, the completeness measure is 
limited in the data it assesses: it evaluates only fatal CMV crashes, 
which represent about 3 percent of all reportable CMV 
crashes.[Footnote 34] In addition, the completeness measure does not 
assess the completeness of the information contained within the crash 
report. 

The completeness measure's methodology relies solely on comparing CMV 
fatal crash data in MCMIS to data states submit to the Fatality 
Analysis Reporting System (FARS).[Footnote 35] This approach introduces 
additional limitations to the completeness measure. First, there are 
some definitional differences between FARS and MCMIS data that can 
account for about 4 percent of the crash records. Second, FARS data is 
not timely. For example, the June 2005 map relies on 2003 
FARS data; thus the completeness measure does not reflect the current 
status of CMV crash data. In addition, since FARS data is only released 
once a year, quarterly issuances of the map do not necessarily reflect 
changes in the number of fatal crashes. Table 3 below describes in more 
detail the limitations for each crash data quality measure. 
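As a rough sketch of how a rating like the completeness measure could be derived, the following compares a state's MCMIS fatal crash count against its FARS count for the same period. This is an illustration only; the good/fair/poor cutoff values are assumptions, not FMCSA's published thresholds.

```python
# Hedged sketch of a FARS-based completeness rating: compare the number
# of fatal CMV crashes a state reported to MCMIS against the FARS count
# for the same period. The cutoff values below are assumptions for
# illustration, not FMCSA's actual rating thresholds.

def completeness_rating(mcmis_fatal_count, fars_fatal_count,
                        good_cutoff=0.90, fair_cutoff=0.75):
    """Rate completeness from the MCMIS-to-FARS fatal crash ratio."""
    if fars_fatal_count == 0:
        return "no FARS data"
    ratio = mcmis_fatal_count / fars_fatal_count
    if ratio >= good_cutoff:
        return "good"
    if ratio >= fair_cutoff:
        return "fair"
    return "poor"

# A state reporting 85 of the 100 fatal crashes FARS recorded would
# fall in the middle band under these assumed cutoffs.
print(completeness_rating(85, 100))  # fair
```

Note that, as the limitations discussed above indicate, definitional differences between FARS and MCMIS can shift this ratio by roughly 4 percent, inflating or deflating a rating.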

Table 3: Comparison of Data Quality Standards and State Safety Data 
Quality Map Measures: 

Data quality standard: Completeness; 
Data quality standard: All reportable CMV crashes in the state and data 
on all appropriate crash variables are submitted to FMCSA. FMCSA 
recommends 20 CMV crash data variables that should be reported on.[A]; 
State Safety Data Quality Map measure: Percentage of Fatal Crash 
Records Reported. Compares the number of large trucks in crashes 
involving a fatality in MCMIS with those in FARS.[B]; 
Limitations: Includes only reported, fatal crashes; reflects only about 
3 percent of all CMV crashes required to be reported to FMCSA. Bases 
completeness on FARS data, though some differences exist in state 
definitions of a CMV fatal crash and may result in an inflated or 
deflated rating. Does not represent current status of completeness. 
Does not measure the completeness of CMV information within a crash 
report (missing variables). 

Data quality standard: Timeliness; 
Data quality standard: All reportable CMV crash records are available 
for state and federal analytical purposes in a useful timeframe. FMCSA 
recommends that CMV crash data be reported within 90 days of the crash 
occurrence; 
State Safety Data Quality Map measure: Percentage of Crash Records 
Reported within 90 Days. The percentage of State-reported fatal and non-
fatal crash records reported within 90 days in the MCMIS database for 
carriers over a 12-month time period.[C]; 
Limitations: Only reflects records that are uploaded into MCMIS and 
have not been changed or edited.[D] Backlogs of crash reports, once 
entered, can negatively affect the timeliness rating. 

Data quality standard: Accuracy; 
Data quality standard: All data within reportable CMV crash records are 
accurate and reliable; 
State Safety Data Quality Map measure: Percentage of Matched Crash 
Records. The percentage of State-reported fatal and non-fatal crash 
records in the MCMIS database for interstate carriers and intrastate 
hazardous material carriers over a 12-month time period that were 
matched to a motor carrier in MCMIS; 
Limitations: Only measures the accuracy of one variable (identity of 
the motor carrier--the U.S. DOT number); does not measure the accuracy 
of other recommended variables. 

Data quality standard: Consistency; 
Data quality standard: Crash data should be collected uniformly. 
Officials should use the same standards and apply the same criteria 
uniformly across jurisdictions. FMCSA provides guidelines and criteria 
for reporting CMV crashes; 
State Safety Data Quality Map measure: None; 
Limitations: N/A. 

Source: FMCSA and Volpe National Transportation Systems Center. Data 
Quality Standards are based on GAO's review of data quality guidelines 
from a variety of sources including FMCSA, NHTSA, and Data Nexus Inc., 
and includes references to the Model Minimum Uniform Crash Criteria, 
National Governors' Association Elements and Criteria, and the American 
National Standard Manual for Classification of Motor Vehicle Traffic 
Records. 

[A] In 1992, FMCSA adopted the National Governors' Association (NGA) 
recommended data elements, requiring these data to be collected and 
reported on motor carrier crashes. 

[B] FMCSA created a "Crash Consistency Overriding Indicator" to 
indicate consistency in reporting of non-fatal CMV crashes. The 
indicator flags states that may be experiencing major problems in 
reporting crash data. The Crash Consistency Overriding Indicator is the 
percentage of state-reported non-fatal crashes as compared to a three- 
year average of reported non-fatal crashes. States that have reported 
fewer than 50 percent of non-fatal crash records for the current year 
based on the previous three-year average of non-fatal crash records are 
flagged and receive a rating of "poor" regardless of their ratings in 
any of the other data quality indicators. This indicator is also 
limited because it only identifies extreme cases of under-reporting and 
does not assess whether there is a substantial increase in reported 
non-fatal crash records. 

[C] Includes both interstate carriers (those carriers that operate 
between states) and intrastate carriers (carriers that operate only 
within one state) though FMCSA is only responsible for regulating 
interstate carriers. 

[D] FMCSA's MCMIS currently does not have the ability to track the 
original upload date when a crash record is edited or changed. 
Therefore, records that are changed after the original upload are not 
used in the calculation for timeliness. For additional information see 
appendix II. 

[End of table] 
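The flagging rule that footnote B describes amounts to a simple 
threshold test. The sketch below is illustrative only--the function 
name and the exact computation are our assumptions based on the 
footnote's description, not FMCSA's actual code:

```python
# Illustrative sketch (not FMCSA's actual code) of the Crash Consistency
# Overriding Indicator described in footnote B: a state is flagged "poor"
# when its current-year non-fatal crash reporting falls below 50 percent
# of its previous three-year average.

def consistency_flagged(current_year_nonfatal, prior_three_years):
    """Return True if the state should be flagged under the 50-percent rule."""
    baseline = sum(prior_three_years) / len(prior_three_years)
    if baseline == 0:
        return False  # no baseline to compare against
    return current_year_nonfatal / baseline < 0.50
```

For example, a state reporting 400 non-fatal crashes against a 
three-year average of 1,000 (40 percent) would be flagged. The sketch 
also mirrors the footnote's limitation: a substantial increase in 
reported crashes is never flagged.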

* The data quality map is limited in its ability to meaningfully 
monitor and track CMV crash data quality over time. Since the map was 
first issued, over 90 percent of the states have been rated "good" or 
"fair" for completeness, and about three-quarters of the states have 
been rated "good" or "fair" for crash timeliness and crash accuracy. 
However, our review shows that states still need to make improvements. 
Because so many states have already reached the highest rating, it is 
difficult to measure further progress. For example, as of the June 
2005 data quality map, 41 states were rated "good" in crash 
completeness; for these states, the map cannot capture any subsequent 
progress in this data quality standard. 

There are other problems with using the data quality map to track 
trends. Based on our review, a state's rating can temporarily decline 
for a variety of reasons--even for implementing improvements. During 
such periods, the data quality map does not accurately reflect a 
state's true data quality status. Officials in one state told us that 
the state was implementing a new electronic system that would greatly 
improve its data quality in the long term. However, current CMV crash 
reporting slowed during implementation, and the state's rating fell 
from "fair" to "poor." 

* The data quality map's ratings for overall data quality combine data 
from crashes with data from FMCSA's inspections, making it difficult 
for map users to obtain an overall picture based solely on crash data. 
For the overall data quality rating, individual crash measures for 
completeness, timeliness, and accuracy are combined with inspection 
measures for timeliness and accuracy. For each of the five individual 
crash and inspection measures, a state receives a rating of "good," 
"fair," or "poor."[Footnote 36] A state's overall rating depends on how 
well the state did across all five measures (see table 4). 

Table 4: Overall Data Quality Rating: 

Rating: Good; 
Thresholds for overall data quality rating: No "poor" and a minimum of 
one "good" in a crash or inspection measure. 

Rating: Fair; 
Thresholds for overall data quality rating: Maximum of one "poor" in a 
crash or inspection measure. 

Rating: Poor; 
Thresholds for overall data quality rating: Two or more "poor" ratings 
in crash or inspection measures or a red flag in the Crash Consistency 
Overriding Indicator. 

Source: FMCSA. 

[End of table] 
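The thresholds in table 4 can be read as a small decision rule. The 
following sketch is illustrative only; the function and parameter 
names are our own, and the handling of an all-"fair" state (which the 
table's thresholds imply is "fair") is our reading of the criteria:

```python
# Illustrative sketch (not FMCSA's actual code) of how the table 4
# thresholds combine the five individual ratings (three crash measures
# plus two inspection measures) into one overall rating, including the
# Crash Consistency Overriding Indicator red flag.

def overall_rating(measure_ratings, consistency_flag=False):
    """measure_ratings: list of five strings, each 'good', 'fair', or 'poor'."""
    poors = measure_ratings.count("poor")
    goods = measure_ratings.count("good")
    if consistency_flag or poors >= 2:
        return "poor"   # two or more "poor" ratings, or a red flag
    if poors == 1:
        return "fair"   # maximum of one "poor"
    # no "poor" ratings: "good" additionally requires at least one "good"
    return "good" if goods >= 1 else "fair"
```

Note how the consistency flag overrides everything else, as table 4 
specifies: a flagged state is "poor" regardless of its other ratings.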

Currently, users can only view CMV crash data quality by individual 
measures (completeness, timeliness, and accuracy). Separating the 
inspection data and presenting a specific overall CMV crash data 
quality rating (a combined rating composed of completeness, timeliness, 
and accuracy for crash data only) would enhance a state's ability to 
understand its crash data status and to monitor progress in improving 
the information. 

* While the State Safety Data Quality map provides a description of 
the methodology used, it does not identify the methodology's 
limitations. Although FMCSA acknowledges many of these limitations, 
they are not displayed on the State Safety Data Quality map Web site. 
The absence of this information limits users' understanding of the 
map's data and increases the potential for incorrect inferences and 
improper decisions based on the map. 

FMCSA officials are aware of many of the limitations that we have 
identified, and we recognize their efforts to improve the State Safety 
Data Quality map to date. However, they do not have a formal plan in 
place to implement improvements. Further, it is important to 
acknowledge these issues so that users understand the limitations of 
the data quality map as a tool. During our review we learned that not 
only was the data quality map consulted in awarding SaDIP funds, it was 
also consulted when awarding Motor Carrier Safety Assistance Program 
(MCSAP) High Priority funds.[Footnote 37] Further, according to 
officials at FMCSA and Volpe National Transportation Systems Center, 
there has been some discussion of expanding the usage of the map in the 
future for CMV crash data efforts. As the data quality map gains wider 
use, it will become even more important that these limitations are 
addressed. 

Conclusions: 

The grant program and FMCSA's collaborative efforts with states have 
had a positive impact on improving the quality of states' crash data, 
thereby enhancing the ability of both federal and state governments to 
make highway planning and safety enforcement decisions. 
While states have made progress in improving the quality of this data-
-in terms of timeliness, completeness, accuracy, and consistency--much 
remains to be done. With additional funding through the reauthorization 
of the Safe, Accountable, Flexible, Efficient Transportation Equity 
Act: A Legacy for Users and as states refine and target areas needing 
further improvement with respect to their crash data, it is expected 
that additional states will participate in the program. FMCSA will need 
a more formal framework to better ensure that the decision-making 
process for awarding funds to SaDIP program applicants is conducted 
uniformly. 

FMCSA's efforts to improve the quality of commercial motor vehicle 
crash data have brought considerable attention to the issues associated 
with poor commercial motor vehicle crash data. Providing states with 
funding to improve their CMV data quality and publicizing a rating 
through a data quality map are incentives that work in tandem to 
maximize states' efforts. It is clear that states pay attention to 
their ratings of "good," "fair," or "poor" on the data quality map. 
However, the limitations we identified highlight some important 
concerns with the data quality map's ability to measure progress and 
accurately portray states' commercial motor vehicle crash data quality. 
As FMCSA continues to make improvements, it will be important for these 
ratings to become more precise, so that FMCSA and the states can obtain 
the clearest picture possible of the progress being made. 

Recommendations for Executive Action: 

We recommend that the Secretary of Transportation direct the 
Administrator of FMCSA to do the following: 

* Establish specific guidelines for assessing state proposals for SaDIP 
grants in order to better assess and prioritize states' funding 
requests and provide uniformity in awarding funds. 

* Increase the State Safety Data Quality map's usefulness as a tool for 
monitoring and measuring commercial motor vehicle crash data by 
ensuring that it adequately reflects the condition of the states' 
commercial motor vehicle crash data and continues to motivate states in 
their improvement efforts. Specifically, FMCSA should develop a plan 
for assessing and improving the data quality map's methodology. In 
addition, FMCSA should display an overall crash data rating separately 
from the inspection rating, and provide information on the limitations 
of the State Safety Data Quality map and the underlying ratings on 
FMCSA's Analysis and Information (A&I) Online Web site. 

Agency Comments and Our Evaluation: 

We provided a draft of this report to the Department of Transportation 
for its review and comment. The department agreed with our findings and 
recommendations in this report. Department officials provided some 
technical comments and some minor additions to provide more detail on 
FMCSA's training efforts. 

We will send copies of this report to the interested congressional 
committees, the Secretary of Transportation, and other interested 
parties. We will make copies available to others upon request. In 
addition, the report will be available at no charge on GAO's Web site 
at [Hyperlink, http://www.gao.gov]. 

If you or your staff have any questions about this report, please call 
me at (202) 512-6570. Contact points for our Offices of Congressional 
Relations and Public Affairs may be found on the last page of this 
report. Key contributors to this report are listed in appendix VI. 

Signed by: 

Katherine Siggerud: 
Director, Physical Infrastructure: 

[End of section] 

Appendixes: 

Appendix I: Objectives, Scope, and Methodology: 

Congress asked us to review the Federal Motor Carrier Safety 
Administration's (FMCSA) program for helping states improve their 
commercial motor vehicle crash data. As part of this review, we were 
asked to describe the benefits obtained through the program, identify 
what can be done to improve the effectiveness of the program, and 
address concerns regarding crash data raised in a February 2004 
Department of Transportation (DOT) Inspector General's report. The 
specific objectives of this report were to explain (1) what is known 
about the quality of commercial motor vehicle crash data and what 
states are doing to improve it, and (2) the results of FMCSA's efforts 
to facilitate the improvement of the quality of commercial motor 
vehicle crash data sent to the federal government. 

To provide information on the quality of states' commercial motor 
vehicle (CMV) crash data and efforts to improve it, we reviewed grant 
applications and project information from 39 states. Of these 39 
states, 34 participated in the Safety Data Improvement Program (SaDIP) 
and 5 states either chose not to participate in the program or their 
proposals were not accepted. We also conducted site visits to six of 
these states--Georgia, Minnesota, North Carolina, Ohio, Oklahoma, and 
Washington. We chose our case study states based on a variety of 
criteria, including participation in the SaDIP grant program, the type 
of agencies with which the state works under SaDIP, the number of CMV 
crashes in the state, the number of reported CMV crash fatalities, and 
data quality map ratings. To help ensure that our states reflected a 
variety of experiences, we chose states that had different combinations 
of these criteria. To understand the insights and experiences of states 
that no longer participated--or had never participated--in SaDIP, we 
also interviewed officials in Michigan, Missouri, New Hampshire, and 
New Jersey.[Footnote 38] While the results from the case studies and 
interviews cannot be projected to the universe of states, they are 
nonetheless useful in illustrating the uniqueness and variation of CMV 
crash data systems and the challenges states face in improving them. 
During our case study visits we met and discussed the status of state 
crash data systems with a variety of traffic safety data 
officials.[Footnote 39] The discussions included gathering information 
on FMCSA's data quality criteria[Footnote 40] but also included, for 
those participating in the program, state objectives and progress made 
with SaDIP funds. For additional perspective, we also interviewed key 
experts from organizations responsible for the development of crash 
data systems and models used by FMCSA, carrier industry officials, and 
public interest organizations. Finally, we conducted a literature 
review of studies published by the University of Michigan 
Transportation Research Institute (UMTRI). The Institute plans to 
conduct studies in all states to determine where problems are occurring 
in the collection of CMV crash data and in the reporting of this data 
to FMCSA. As of September 2005, studies had been conducted in eight 
states. We reviewed the studies conducted by UMTRI and determined that 
the methodologies used in assessing the quality of the Motor Carrier 
Management Information System (MCMIS) and state CMV crash data that 
states submit to FMCSA were sound and that the studies provided 
sufficiently reliable results for our purposes. Through site visits, a 
review of grant applications, interviews with relevant stakeholders 
and experts, and these studies, we were able to identify shortcomings 
in the reliability of FMCSA's CMV crash data. However, we determined 
that the data was sufficiently reliable for our purpose of case study 
selection. 

To provide results of FMCSA's efforts to facilitate the improvement of 
the state CMV crash data quality, we conducted interviews with 
officials from participating states and from FMCSA concerning the 
administration and management of SaDIP. We also analyzed the guidance 
and support provided by FMCSA to states for CMV data improvement 
efforts and assessed FMCSA's role in coordinating CMV data quality 
initiatives. In addition, we reviewed FMCSA's State Safety Data Quality 
map and assessed the methodology used by FMCSA in evaluating states' 
crash data quality. We interviewed officials and key experts at FMCSA 
and the Volpe National Transportation Systems Center responsible for 
developing and managing the data quality map. We also interviewed state 
officials from states participating--and not participating--in the 
SaDIP program to find out their views on the data quality map and its 
use as a monitoring tool for CMV crash data improvements. 

We conducted our review from February 2005 through November 2005 in 
accordance with generally accepted government auditing standards. 

[End of section] 

Appendix II: State Safety Data Quality Map Limitations: 

This appendix explains, in greater detail than the body of our report, 
the concerns we (and others) have raised about the limitations in the 
methodology FMCSA uses to develop ratings for the State Safety Data 
Quality map. The measures FMCSA employs to measure the completeness, 
timeliness, and accuracy of CMV crash data quality are limited, and do 
not provide comprehensive measurements of these attributes as 
established by the general data standards discussed in the body of this 
report. As a result, the ability to draw conclusions about the actual 
quality of a state's data is limited. 

The definitions FMCSA uses for each of the crash measures are shown in 
table 5, together with the criteria that constitute a rating of 
"good," "fair," or "poor." In the sections that follow we explain the 
limitations associated with each measure, followed by other limitations 
that stem from the current methodology. 

Table 5: State Safety Data Quality Map Measures for CMV Crashes: 

Data quality standard: Completeness; 
Measure: The measure compares CMV fatality data in the MCMIS database 
against those in the Fatality Analysis Reporting System (FARS). The 
FARS data is available through 2003; 
Criteria: Good = match to FARS is greater than or equal to 90 percent; 
Fair = match to FARS is between 80 and 89 percent; Poor 
= match to FARS is below 80 percent. 

Data quality standard: Timeliness; 
Measure: This measure reflects the percentage of state-reported crash 
records uploaded to the MCMIS database within the 90-day standard; 
Criteria: Good = the percentage of reported records within 90 days is 
greater than or equal to 85 percent; Fair = the percentage of reported 
records within 90 days is between 60-84 percent; Poor = the percentage 
of reported records is less than 60 percent. 

Data quality standard: Accuracy; 
Measure: This measure reflects the percentage of state-reported crash 
records (fatal and non-fatal) that were matched to a motor carrier in 
MCMIS over a 12-month period; 
Criteria: Good = the percentage of matched records is greater than or 
equal to 95 percent; Fair = the percentage of matched records is 
between 85 and 94 percent; Poor = the percentage of matched records is 
below 85 percent. 

Source: FMCSA. 

[End of table] 
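Read together, the criteria in table 5 map each measure's percentage 
to a rating through two cutoffs. A minimal sketch, with names of our 
own choosing:

```python
# Illustrative sketch of the table 5 rating criteria: each crash
# measure's percentage is mapped to "good", "fair", or "poor" using
# that measure's two cutoffs.

CUTOFFS = {                    # (good at or above, fair at or above), in percent
    "completeness": (90, 80),  # below 80 is "poor"
    "timeliness": (85, 60),    # below 60 is "poor"
    "accuracy": (95, 85),      # below 85 is "poor"
}

def rate_measure(measure, percent):
    """Return 'good', 'fair', or 'poor' for one crash measure."""
    good_cutoff, fair_cutoff = CUTOFFS[measure]
    if percent >= good_cutoff:
        return "good"
    if percent >= fair_cutoff:
        return "fair"
    return "poor"
```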

Completeness: Overall Completeness is Based on Fatal Crashes Only: 

A key limitation in FMCSA's measure of crash data completeness is the 
inability to evaluate completeness against nonfatal CMV crashes. 
Currently, the total number of nonfatal crashes occurring on a state- 
by-state basis is unknown; no baseline exists against which to measure 
these records.[Footnote 41] Consequently, FMCSA is limited to 
measuring crash data completeness with fatal CMV crashes, which 
represent approximately 3 percent of all CMV crashes.[Footnote 42] 

Even within this narrowed dataset, the use of FARS[Footnote 43] as the 
basis of comparison poses other limitations: 

* There are notable differences in definitions used in the two 
databases. Most importantly, MCMIS can be subject to individual state 
definitions that may differ from FARS. According to officials at the 
Volpe National Transportation Systems Center, the range of these state- 
driven definitional differences is unknown. However, based on anecdotal 
evidence received from states and reported by Volpe, these differences 
can vary by state and can account for about 4 percent of crash 
records.[Footnote 44] Some examples are below: 

* A crash fatality resulting from the private use of a large truck may 
not meet the criteria as a reportable crash for MCMIS, but it is 
considered a large truck crash fatality by FARS. 

* MCMIS defines a large truck as any truck greater than a 10,000 gross 
vehicle weight rating. Many states do not collect the gross vehicle 
weight rating of vehicles and instead define a large truck based on an 
older FMCSA reporting criterion of greater than six tires or another 
definition of their choosing. As such, counts of large trucks derived 
from state crash databases (and reported to FARS) may be inconsistent 
with counts of vehicles that were reported to MCMIS.[Footnote 45] 

The effect of these definitional differences is evident when MCMIS 
fatal CMV crashes are compared to FARS data--a state's completeness 
index for crash data can rise above 100 percent. The extent to which 
this occurs is substantial: in our analysis of the June 2005 issuance 
of the map, 24 of 51 states (47 percent) had a completeness measure of 
greater than 100 percent.[Footnote 46] 

* FARS data is not current. FARS is released once per year, detailing 
the prior calendar year's crash statistics. The lag between the most 
recently available FARS data and the most recent map issuance can be 
considerable. For example, the 2004 FARS annual report is not scheduled 
to be released until fall 2005 and as a result, the baseline for 
completeness data in the June 2005 map is based upon 2003 crash data. 
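The arithmetic behind a completeness index above 100 percent is 
straightforward; the counts in the sketch below are hypothetical:

```python
# Illustrative sketch with hypothetical counts: the completeness index is
# the ratio of fatal-crash large trucks a state reports to MCMIS to those
# recorded in FARS, so definitional differences between the two databases
# can push the index past 100 percent.

def completeness_index(mcmis_fatal_trucks, fars_fatal_trucks):
    """Percentage match of MCMIS fatal-crash trucks to the FARS baseline."""
    return 100.0 * mcmis_fatal_trucks / fars_fatal_trucks

# A state whose crash-report definition sweeps in trucks that FARS
# excludes (for example, counting by number of tires rather than by
# gross vehicle weight rating) can report more trucks to MCMIS than
# appear in FARS, yielding an index above 100.
```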

Timeliness: Timeliness is Not Based on All Reported Crashes: 

The timeliness measure currently relies on a subset of records states 
submit to MCMIS. Any record that has been edited or changed since it 
was originally entered into MCMIS is not included in the 
calculations.[Footnote 47] If a record is edited, the initial upload 
date is replaced with the date it was updated. Because FMCSA's 
timeliness rating is based on the percentage of crash reports uploaded 
within 90 days of the crash, and edited records no longer reflect their 
initial upload date, they cannot be used in the timeliness calculation 
without distorting a state's rating. The consequence, however, is that 
timeliness is not measured against the entire universe of crashes in 
MCMIS--the more records a state edits, the fewer records its timeliness 
rating represents. FMCSA has acknowledged this problem with the edited 
records and is taking steps to resolve it. 

Another limitation with this measure is that efforts to reduce backlogs 
of crash records--a positive effort--can have a negative effect on the 
timeliness rating. If a state submits a backlog of reports from CMV 
crashes that occurred more than 90 days previously, and the crashes 
took place during the period that the FMCSA rating covers, then the 
state's timeliness rating will be negatively affected. Conversely, if a 
state has a large backlog, its current timeliness rating may not be a 
meaningful representation of timeliness because the methodology has no 
way to capture those records that are accruing in the backlog. 
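A minimal sketch of the limitations described above--edited records 
drop out of the calculation, and late backlog submissions count 
against the rating. The field names here are hypothetical, not 
MCMIS's actual schema:

```python
# Illustrative sketch (not FMCSA's actual code) of the timeliness
# calculation's key limitation: records edited after upload lose their
# original upload date, so they are dropped from the calculation.

def timeliness_percent(records):
    """records: list of dicts with 'days_to_upload' (crash date to first
    upload) and 'edited' (True if changed after the original upload).
    Returns the percentage of usable records uploaded within 90 days."""
    usable = [r for r in records if not r["edited"]]
    if not usable:
        return None  # nothing left to measure
    on_time = sum(1 for r in usable if r["days_to_upload"] <= 90)
    return 100.0 * on_time / len(usable)
```

The more records a state edits, the smaller the usable set becomes, so 
the percentage represents an ever-shrinking share of the state's 
crashes; a backlog submitted late enters with days_to_upload well over 
90, dragging the rating down.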

Accuracy: Accuracy is Based on Only One Variable: 

The measure for accuracy on the data quality map is based only on 
matching a CMV crash report in MCMIS to a registered carrier in MCMIS. 
The U.S. Department of Transportation (DOT) assigns each interstate 
motor carrier a unique identifier--the DOT number. For this measure, 
accuracy is evaluated primarily by whether the DOT number identified 
on a crash report matches that of a registered carrier in 
MCMIS.[Footnote 48] 
invalid DOT number may be considered inaccurate, even if the rest of 
the information on the report provides accurate information on the 
crash. Currently, the data quality map does not assess the accuracy of 
any other data elements; however, FMCSA has been working with states to 
improve their collection of CMV crash information. For example, 
recently FMCSA distributed visor cards to state officials that explain 
how to determine who the carrier is, and where the correct carrier's 
DOT number can be found. See appendix III for copies of these cards. 

[End of section] 

Appendix III: FMCSA Reportable Crash, CMV, and Carrier Identification 
Visor Cards: 

The following are copies of visor identification cards that FMCSA 
created as educational tools for law enforcement officers. The cards 
can be placed in an officer's sun visor and referenced to determine 
whether a CMV crash should be reported, to identify a vehicle as a 
CMV, and to identify the correct carrier involved in the crash. FMCSA 
provided these cards to states to distribute to enforcement officers 
with the intention that they will increase the officers' ability to 
properly identify a CMV and a reportable CMV crash. 

Figure 3: FMCSA Reportable Crashes: 

[See PDF for image] 

[End of figure] 

Figure 4: Reportable Commercial Motor Vehicle Configurations and Cargo 
Body Type: 

[See PDF for image] 

[End of figure] 

Figure 5: Responsible Carrier and Correct DOT Number Identification: 

[See PDF for image] 

[End of figure] 

[End of section] 

Appendix IV: SaDIP Grant and Cooperative Agreement Distribution by 
State: 

Table 6: Distribution of SaDIP Grants by State: 

State: Alaska; 
FY2002: $198,000; 
FY2003: $0; 
FY2004: $0; 
FY2005: $218,626; 
Total: $416,626. 

State: California; 
FY2002: $0; 
FY2003: $0; 
FY2004: $100,000; 
FY2005: $0; 
Total: $100,000. 

State: Connecticut; 
FY2002: $234,056; 
FY2003: $0; 
FY2004: $0; 
FY2005: $0; 
Total: $234,056. 

State: Georgia; 
FY2002: $269,820; 
FY2003: $0; 
FY2004: $409,000; 
FY2005: $553,733; 
Total: $1,232,553. 

State: Indiana; 
FY2002: $179,321; 
FY2003: $242,423; 
FY2004: $0; 
FY2005: $0; 
Total: $421,744. 

State: Kansas; 
FY2002: $0; 
FY2003: $0; 
FY2004: $0; 
FY2005: $373,436; 
Total: $373,436. 

State: Kentucky; 
FY2002: $199,000; 
FY2003: $0; 
FY2004: $0; 
FY2005: $0; 
Total: $199,000. 

State: Maryland; 
FY2002: $145,600; 
FY2003: $0; 
FY2004: $0; 
FY2005: $0; 
Total: $145,600. 

State: Massachusetts; 
FY2002: $100,000; 
FY2003: $249,972; 
FY2004: $0; 
FY2005: $0; 
Total: $349,972. 

State: Michigan; 
FY2002: $0; 
FY2003: $115,845; 
FY2004: $0; 
FY2005: $0; 
Total: $115,845. 

State: Minnesota; 
FY2002: $363,000; 
FY2003: $0; 
FY2004: $620,000; 
FY2005: $0; 
Total: $983,000. 

State: Montana; 
FY2002: $0; 
FY2003: $0; 
FY2004: $0; 
FY2005: $328,564; 
Total: $328,564. 

State: Nebraska; 
FY2002: $0; 
FY2003: $0; 
FY2004: $2,000; 
FY2005: $3,342; 
Total: $5,342. 

State: Nevada; 
FY2002: $427,443; 
FY2003: $0; 
FY2004: $0; 
FY2005: $350,000; 
Total: $777,443. 

State: New Hampshire; 
FY2002: $0; 
FY2003: $98,068; 
FY2004: $0; 
FY2005: $0; 
Total: $98,068. 

State: New Mexico; 
FY2002: $0; 
FY2003: $0; 
FY2004: $0; 
FY2005: $250,000; 
Total: $250,000. 

State: New York; 
FY2002: $0; 
FY2003: $0; 
FY2004: $300,000; 
FY2005: $0; 
Total: $300,000. 

State: North Carolina; 
FY2002: $0; 
FY2003: $0; 
FY2004: $193,350; 
FY2005: $0; 
Total: $193,350. 

State: Oklahoma; 
FY2002: $0; 
FY2003: $0; 
FY2004: $131,111; 
FY2005: $0; 
Total: $131,111. 

State: Pennsylvania; 
FY2002: $0; 
FY2003: $0; 
FY2004: $0; 
FY2005: $2,000,000; 
Total: $2,000,000. 

State: Rhode Island; 
FY2002: $25,000; 
FY2003: $0; 
FY2004: $26,000; 
FY2005: $0; 
Total: $51,000. 

State: South Carolina; 
FY2002: $350,000; 
FY2003: $0; 
FY2004: $0; 
FY2005: $0; 
Total: $350,000. 

State: South Dakota; 
FY2002: $270,000; 
FY2003: $0; 
FY2004: $0; 
FY2005: $0; 
Total: $270,000. 

State: Tennessee; 
FY2002: $0; 
FY2003: $0; 
FY2004: $436,027; 
FY2005: $149,334; 
Total: $585,361. 

State: Texas; 
FY2002: $0; 
FY2003: $148,424; 
FY2004: $533,611; 
FY2005: $0; 
Total: $682,035. 

State: Utah; 
FY2002: $0; 
FY2003: $0; 
FY2004: $433,500; 
FY2005: $128,853; 
Total: $562,353. 

State: Vermont; 
FY2002: $0; 
FY2003: $0; 
FY2004: $0; 
FY2005: $134,000; 
Total: $134,000. 

State: Washington; 
FY2002: $0; 
FY2003: $0; 
FY2004: $803,935; 
FY2005: $0; 
Total: $803,935. 

State: West Virginia; 
FY2002: $157,500; 
FY2003: $0; 
FY2004: $0; 
FY2005: $603,943; 
Total: $761,443. 

Total; 
FY2002: $2,918,740; 
FY2003: $854,732; 
FY2004: $3,988,534; 
FY2005: $5,093,831; 
Total: $12,855,837. 

Source: FMCSA. 

Note: This data is current as of September 2005. 

[End of table] 

Table 7: Distribution of SaDIP Cooperative Agreements by State: 

State: Colorado; 
FY2002: $0; 
FY2003: $0; 
FY2004: $0; 
FY2005: $1,460,000; 
Total: $1,460,000. 

State: Georgia; 
FY2002: $0; 
FY2003: $0; 
FY2004: $0; 
FY2005: $612,051; 
Total: $612,051. 

State: Iowa; 
FY2002: $728,065; 
FY2003: $0; 
FY2004: $0; 
FY2005: $226,400; 
Total: $954,465. 

State: Louisiana; 
FY2002: $829,625; 
FY2003: $0; 
FY2004: $0; 
FY2005: $569,000; 
Total: $1,398,625. 

State: Ohio; 
FY2002: $406,330; 
FY2003: $0; 
FY2004: $91,000; 
FY2005: $224,400; 
Total: $721,730. 

State: Oklahoma; 
FY2002: $0; 
FY2003: $0; 
FY2004: $1,150,390; 
FY2005: $0; 
Total: $1,150,390. 

State: Tennessee; 
FY2002: $281,954; 
FY2003: $0; 
FY2004: $219,543; 
FY2005: $426,382; 
Total: $927,879. 

State: Vermont; 
FY2002: $0; 
FY2003: $0; 
FY2004: $0; 
FY2005: $136,500; 
Total: $136,500. 

State: Virginia; 
FY2002: $42,300; 
FY2003: $0; 
FY2004: $0; 
FY2005: $431,617; 
Total: $473,917. 

State: Total; 
FY2002: $2,288,274; 
FY2003: $0; 
FY2004: $1,460,933; 
FY2005: $4,086,350; 
Total: $7,835,557. 

Source: FMCSA. 

Note: This data is as of September 2005. 

[End of table] 

[End of section] 

Appendix V: SaDIP Case Study States: 

As part of our work, we conducted six case studies to examine how 
states are working to improve commercial motor vehicle crash data. Our 
visits to the six states yielded additional information about crash 
data quality improvement activities, the nature of their efforts, and 
the extent of progress made. States were chosen on a wide variety of 
factors, including crash data quality and participation in the SaDIP 
program. 

Georgia: 

Georgia received a total of $1,844,604 between 2002 and 2005 (see table 
8) to conduct crash data improvement projects. During this period, 
Georgia made significant improvement in its crash data quality, despite 
undergoing a major state government reorganization process. 

Table 8: Georgia SaDIP Funding History: 

Award year: 2005; 
Funds awarded: $612,051; 
Agency: FMCSA; 
Award type: Cooperative agreement. 

Award year: 2005; 
Funds awarded: $553,733; 
Agency: FMCSA; 
Award type: Grant. 

Award year: 2004; 
Funds awarded: $409,000; 
Agency: FMCSA; 
Award type: Grant. 

Award year: 2002; 
Funds awarded: $269,820; 
Agency: NHTSA; 
Award type: Grant. 

Source: FMCSA. 

[End of table] 

SaDIP Projects: 

Georgia received its first SaDIP grant from the National Highway 
Traffic Safety Administration (NHTSA), and this grant was used to 
accomplish several projects, including: (1) hiring temporary employees 
to extract CMV crash reports from backlogged paper copies and microfilm 
them; (2) precoding the paper crash reports in preparation for their 
entry into the state repository system; (3) renting a mobile trailer to 
house temporary employees; (4) developing a system to electronically 
transfer crash data from the state repository to the national database; 
(5) adding edit checks to the state crash database; (6) purchasing a 
new microfilm scanner/reader; and (7) hiring a contractor to update 
the state's crash report manual. Progress on these projects was 
reported quarterly to NHTSA. 

Georgia received two additional grants after FMCSA became the lead 
agency for the SaDIP program. The first grant was used to maintain the 
temporary employees. Additionally, these funds were used to hire a 
statistician as a SaDIP advocate. This grant allowed Georgia to 
eliminate its reporting backlog from 2003 to the present. The second 
grant that Georgia received was used to fund its electronic crash 
records system. This system will electronically transfer crash records 
from local jurisdictions to the state repository system. This project 
has been put on hold following a state government reorganization. 

Crash Data Collection/Reporting Issues: 

Georgia historically has had the worst CMV crash data quality rating in 
the nation (see crash data quality statistics, table 9), largely 
because, until recently, Georgia was not submitting crash data to 
FMCSA. Georgia's current overall rating is "good," and it is rated 
"good" for completeness and "fair" for timeliness and accuracy. For all 
other rating periods prior to March 2004, Georgia had an overall rating 
of "poor." This improvement in its data quality is due in large part to 
the SaDIP program. Georgia's non-reporting was due to a technical 
problem, but the state was able to correct it in a timely manner 
because of the SaDIP funding it received. 

Table 9: Georgia Crash Data Quality Statistics (Percentages): 

Completeness; 
March 2004: 117; 
June 2004: 115; 
September 2004: 95; 
December 2004: 95; 
March 2005: 95; 
June 2005: 95. 

Timeliness; 
March 2004: 6; 
June 2004: 8; 
September 2004: 27; 
December 2004: 48; 
March 2005: 65; 
June 2005: 80. 

Accuracy; 
March 2004: 84; 
June 2004: 82; 
September 2004: 82; 
December 2004: 83; 
March 2005: 94; 
June 2005: 93. 

Source: FMCSA. 

[End of table] 

SaDIP funds are the only federal funds currently being used to address 
data quality improvement in Georgia. Because the state's data quality 
was so poor, both FMCSA and the Federal Highway Administration 
indicated they would withhold federal funding unless Georgia improved 
its reporting. It was at this point that Georgia applied for SaDIP, and 
began focusing on its data quality. On July 1, 2005, Georgia began a 
major state government reorganization process, and officials were 
unsure of how this would affect the quality of its crash data because 
all collection and reporting functions are moving to new departments. 
Officials expect to see a decrease in the state's rating during the 
transition period, but do not expect long-term problems. 

State/Federal Coordination Issues: 

State and federal coordination on issues related to traffic safety data 
has not always been effective in Georgia. Between 1998 and 2002, 
Georgia submitted only 250 fatal CMV crash records to FMCSA, yet from 
2002 through 2003 Georgia reported 410 CMV fatalities to FARS. 
Georgia's Department of Public Safety, the agency responsible for crash 
reporting, did not have a functional computer system to upload the 
state's crash file to FMCSA. The problem was discovered in 2002 and 
FMCSA offered assistance to fix the problem, but the Department of 
Public Safety did not accept the offer. FMCSA then threatened to 
withhold Georgia's highway safety funds until the data issues were 
resolved; at this point the Department of Public Safety applied for-- 
and received--SaDIP funding, and began work to resolve the crash data 
problems. 

Data Quality Map: 

Georgia officials stated that the data quality map provided an 
incentive for the state to make data quality improvements in order to 
raise its rating from "poor" to "good." State and federal officials 
acknowledge that the map is one indication of the progress the state 
has made in improving its CMV data quality. 

Georgia officials never disagreed with their state's rating on the data 
quality map. However, officials felt that the map did not recognize 
improvements that were taking place within the state to improve crash 
data. The state plans to continue to make improvements to the crash 
records system even after the state reaches the "good" rating. 

Minnesota: 

Minnesota received a total of $983,000 in 2002 and 2004 (see table 10) 
to conduct crash data improvement projects. During this period, 
Minnesota received safety data improvement grants from NHTSA and FMCSA. 

Table 10: Minnesota SaDIP Funding History: 

Award year: 2004; 
Funds awarded: $370,000; 
Agency: FMCSA; 
Award type: Grant. 

Award year: 2004; 
Funds awarded: $250,000; 
Agency: FMCSA; 
Award type: Grant. 

Award year: 2002; 
Funds awarded: $363,000; 
Agency: NHTSA; 
Award type: Grant. 

Source: FMCSA. 

[End of table] 

SaDIP Projects: 

Minnesota received its first SaDIP grant while NHTSA was administering 
the program. This grant went to the Department of Public Safety, Office 
of Traffic Safety--the agency that receives NHTSA highway safety funds. 
The grant funded two projects: developing an electronic post-crash 
inspection form (Minnesota conducts in-depth post-crash inspections for 
crashes involving CMVs) and purchasing software that allows electronic 
transfer of CMV crash records between Driver and Vehicle Services and 
the state patrol (both agencies within the Department of Public 
Safety). In Minnesota, Driver and Vehicle Services 
collects crash reports from law enforcement agencies, and the state 
patrol is the office in charge of reporting crashes to FMCSA. 

The Minnesota state patrol received subsequent funding after FMCSA 
became the lead agency on the SaDIP program. These funds have been used 
to provide computer hardware so that state patrol officers can access 
the Internet and submit their crash reports electronically through an 
online form. The second component of the grant is to provide ongoing 
CMV crash training to law enforcement officers throughout the state. 
This training will focus on correctly identifying CMVs and teaching 
officers why CMV crash data is important, and what it is used for at 
the state and national levels. 

Crash Data Collection/Reporting Issues: 

Before SaDIP funds were received, the Minnesota state patrol had 
limited access to Driver and Vehicle Services records. As a result, the 
state patrol was only able to report crashes to FMCSA that were 
investigated by its officers or were sent directly to the state 
patrol by local jurisdictions. Now all CMV records that are available 
at the Driver and Vehicle Services are reported to FMCSA (see crash 
data quality statistics, table 11). 

Table 11: Minnesota Crash Data Quality Statistics (Percentages): 

Completeness; 
March 2004: 117; 
June 2004: 117; 
September 2004: 116; 
December 2004: 123; 
March 2005: 121; 
June 2005: 121. 

Timeliness; 
March 2004: 72; 
June 2004: 59; 
September 2004: 58; 
December 2004: 50; 
March 2005: 67; 
June 2005: 81. 

Accuracy; 
March 2004: 78; 
June 2004: 74; 
September 2004: 82; 
December 2004: 86; 
March 2005: 91; 
June 2005: 93. 

Source: FMCSA. 

[End of table] 

The State of Minnesota encountered difficulty receiving crash reports 
from local law enforcement agencies. In addition, law enforcement 
officers incorrectly identified CMVs on crash reports. If law 
enforcement officers do not correctly identify the vehicle involved in 
the crash as a CMV, then the crash may not be extracted from the state 
crash file for submission to FMCSA. 

State/Federal Coordination Issues: 

Minnesota officials said that the state now has a crash data users 
group as a subsection of its Traffic Records Coordinating Committee. 
This group allows all users of crash data to discuss their data needs 
and limitations, and provides a forum for making collaborative 
recommendations regarding crash data improvements. 

Minnesota officials and the FMCSA Division Administrator work very 
closely with each other to monitor and improve CMV crash data quality. 
Neither of these groups feels that the FMCSA data quality rating map is 
an accurate portrayal of a state's data quality. For example, 
Minnesota's rating decreased because it was implementing 
its electronic data transfer software, and during that period crash 
records were not reported to FMCSA in a timely fashion. The map showed 
the state's rating as "poor" during this period, which did not reflect 
the fact that the state was making strides in improving its data 
quality. 

Minnesota uses several sources of funding to improve its overall crash 
data and CMV crash data. In addition to FMCSA funding, NHTSA and state 
funds have been used to improve the state's data quality. 

Data Quality Map: 

Minnesota crash data has gone from an overall "fair" rating to "poor" 
to its current rating of "good." The drop in the state's rating was 
actually a result of the state not being able to report crashes during 
the implementation of SaDIP-funded improvement projects. After the 
electronic systems were fully implemented, the state's rating improved 
to "good" for completeness, and "fair" for timeliness and accuracy. 

Officials in Minnesota recognize that the map is an effective tool for 
getting the attention of decision-makers. While the map may not be an 
accurate assessment of the state's current data quality, both state and 
federal officials recognize that it does help to spotlight Minnesota's 
data quality status. Federal and state officials use the data quality 
map as a measure of the state's progress in improving data quality. 
State officials also use the map as an indicator of the success of the 
state's SaDIP project. Finally, officials in Minnesota said that the 
data quality map has been used to identify data quality problems in the 
state, including timeliness and accuracy, and to create projects to 
improve those measures. 

North Carolina: 

North Carolina received a total of $193,350 in 2004 (see table 12) to 
conduct crash data improvement projects. While North Carolina has 
excellent crash data reporting at the state level, North Carolina 
continues to have complications reporting that data to FMCSA. North 
Carolina is a priority state for FMCSA because it is one of the top 
five states in the nation for commercial motor vehicle crashes. 

Table 12: North Carolina SaDIP Funding History: 

Award year: 2004; 
Funds awarded: $193,350; 
Agency: FMCSA; 
Award type: Grant. 

Source: FMCSA. 

[End of table] 

SaDIP Projects: 

North Carolina is using its SaDIP grant to conduct two projects. First, 
the state conducted an analysis to determine what differences existed 
between the state data file and the federal crash file, and to try to 
determine why these differences were occurring. The completed analysis 
did not appear to fully address North Carolina's biggest problem, which 
is the data transfer between the two files. 

North Carolina is also using SaDIP funds to reduce its backlog of 
crash reports awaiting entry into the state data file by providing 
overtime pay to employees. 

Crash Data Collection/Reporting Issues: 

Historically, North Carolina has had poor CMV crash data (see table 
13). North Carolina is also one of FMCSA's priority states because of 
the large number of crashes that take place in the state. The state 
has been rated as having "poor" data in each period since FMCSA began 
rating states. 

Table 13: North Carolina Crash Data Quality Statistics (Percentages): 

Completeness; 
March 2004: 82; 
June 2004: 82; 
September 2004: 77; 
December 2004: 77; 
March 2005: 77; 
June 2005: 77. 

Timeliness; 
March 2004: 0; 
June 2004: 100; 
September 2004: 0; 
December 2004: 0; 
March 2005: 1; 
June 2005: 2. 

Accuracy; 
March 2004: 81; 
June 2004: 100; 
September 2004: 81; 
December 2004: 80; 
March 2005: 80; 
June 2005: 81. 

Source: FMCSA. 

[End of table] 

North Carolina is unique in that it does not have a problem receiving 
crash reports from localities across the state. In other states, this 
is the biggest problem contributing to the states' data quality 
issues.[Footnote 49] In North Carolina the problem is getting records 
from the state crash file into the FMCSA data systems. FMCSA officials 
worked extensively with North Carolina to fix the file compatibility 
issues that prevent the data from being exported to FMCSA, but 
North Carolina has failed to correct its information technology 
problems. According to state officials, the biggest issue in North 
Carolina is backlog. Even if the state corrects its backlog, the 
state's data rating will not improve because the records are not 
transferring correctly into the federal database. Additionally, North 
Carolina does not require certain elements to be captured on its crash 
report or in its state database that are required in the federal 
database. Specifically, North Carolina does not require that motor 
carrier DOT numbers be collected, which may contribute to the state's 
"poor" accuracy rating. 

While North Carolina has very good data at the state level, the state's 
largest problem is reporting that data to the federal database. It 
appears that North Carolina's SaDIP projects are not focusing on this 
issue. 

State/Federal Coordination Issues: 

In North Carolina, coordination among state agencies involved in CMV 
crash data has not always been effective. A major reorganization of the 
state's highway safety agency contributed to this lack of coordination, 
but the state is working to improve cooperation among these agencies 
with regard to traffic records. The largest factor contributing to 
North Carolina's poor crash data quality appears to be a lack of 
understanding by some state officials regarding how to convert a crash 
data file to the correct format in order to submit it to FMCSA. The 
SaDIP grant has provided the state the opportunity to review its crash 
data problems, a project that the state would not have conducted 
otherwise. The state is also using its own funds to implement an 
electronic crash reporting system in order to get crash data into the 
state data file more quickly. 

Data Quality Map: 

State and federal officials in North Carolina use the data quality map 
to measure the progress the state is making with its SaDIP grant and 
with data improvements overall. 

North Carolina state officials use the map as an incentive for 
implementing data improvements and they appear motivated to change 
North Carolina's rating for the better. The map has also been used in 
North Carolina to identify data quality problems and target improvement 
efforts, although more still needs to be done. 

Both federal and state officials expect to continue to make 
improvements to the state's traffic records infrastructure after the 
state achieves the highest data rating. 

Ohio: 

Ohio received a total of $721,730 between 2002 and 2005 (see table 14) 
to conduct crash data improvement projects. Ohio has been very 
proactive in addressing traffic safety data concerns and the state has 
paid particular attention to using crash data for planning. 

Table 14: Ohio SaDIP Funding History: 

Award year: 2005; 
Funds awarded: $224,400; 
Agency: FMCSA; 
Award type: Cooperative agreement. 

Award year: 2004; 
Funds awarded: $91,000; 
Agency: FMCSA; 
Award type: Cooperative agreement. 

Award year: 2002; 
Funds awarded: $406,330; 
Agency: NHTSA/FMCSA; 
Award type: Cooperative agreement. 

Source: FMCSA. 

[End of table] 

SaDIP Projects: 

Ohio was one of the first five pilot states to participate in the SaDIP 
program. It received funds through a cooperative agreement with the 
General Services Administration (GSA) and NHTSA. Through this 
agreement, the state created electronic crash reporting capabilities, 
purchased handheld devices, modified crash reporting software, and is 
providing training to law enforcement officers to help them properly 
identify CMV crashes. 

Until recently, when FMCSA gained authority for the SaDIP cooperative 
agreements, Ohio had not submitted any progress reports to NHTSA on the 
status of these projects. 

Crash Data Collection/Reporting Issues: 

Ohio has a "fair" overall data rating, with "good" ratings for 
completeness and accuracy, but a "poor" rating for timeliness (see 
crash data quality statistics, table 15). Ohio has been very aware of 
the importance of CMV crash data and state officials have been working 
to improve it for a long time, even before the SaDIP program. 

Table 15: Ohio Crash Data Quality Statistics (Percentages): 

Completeness; 
March 2004: 95; 
June 2004: 95; 
September 2004: 95; 
December 2004: 95; 
March 2005: 95; 
June 2005: 95. 

Timeliness; 
March 2004: 15; 
June 2004: 17; 
September 2004: 11; 
December 2004: 23; 
March 2005: 40; 
June 2005: 46. 

Accuracy; 
March 2004: 84; 
June 2004: 97; 
September 2004: 97; 
December 2004: 98; 
March 2005: 98; 
June 2005: 98. 

Source: FMCSA. 

[End of table] 

The state of Ohio has over 1,000 local law enforcement jurisdictions 
that are responsible for reporting CMV accidents. Both large and small 
jurisdictions are likely responsible for Ohio's "poor" crash data 
timeliness rating. The state has reduced the time lag in receiving 
crash reports from local jurisdictions from 62 days in 2004 to 30 days 
in 2005. 

State/Federal Coordination Issues: 

Ohio state agencies involved in the collection and reporting of CMV 
crash data appear coordinated with each other, and have an active 
Traffic Records Coordinating Committee, which includes participation by 
FMCSA division officials. Ohio uses multiple sources of funding to 
address data quality issues, including state funds, and the state is 
proactive in data improvement projects. 

Data Quality Map: 

State and federal officials said that the map provided one gauge of 
Ohio's data quality, but they felt the map would be more useful if it 
were updated more often. The map has brought more attention to data 
quality issues in Ohio and is included in reports to state leaders. 

State and federal officials in Ohio said that data quality improvements 
would still be a top priority regardless of the data quality map. Ohio 
officials also stated that inaccuracies in the map had a significant 
effect 
on staff morale. Local officials said they are working hard to improve 
their state's data quality, but the map does not accurately capture 
that improvement. State officials recognize the map provides a major 
incentive for implementing data quality improvements and maintaining 
the state's standing as a leader in traffic data quality. 

Oklahoma: 

Oklahoma received a total of $1,281,501 in 2004 (see table 16) to 
conduct crash data improvement projects. Oklahoma's traffic records 
coordinating committee has taken the lead in the coordination of crash 
data projects in the state. 

Table 16: Oklahoma SaDIP Funding History: 

Award year: 2004; 
Funds awarded: $1,150,390; 
Agency: FMCSA; 
Award type: Cooperative agreement. 

Award year: 2004; 
Funds awarded: $131,111; 
Agency: FMCSA; 
Award type: Grant. 

Source: FMCSA. 

[End of table] 

SaDIP Projects: 

Oklahoma began participation in the SaDIP program in July 2004 when it 
received a grant to purchase computer equipment, and then entered into 
a four-year cooperative agreement with FMCSA in December 2004 to 
support its long-term data quality improvement plans. These plans 
include 
funding data entry for the record backlog, revising the state's crash 
report, and initiating a mobile data collection pilot project. 

Oklahoma officials said that the projects that are taking place using 
SaDIP funds would not have been funded otherwise. The SaDIP grant and 
long-term cooperative agreement have allowed the state to focus on CMV 
data and to begin to make data-driven decisions in its highway safety 
planning. 

Crash Collection/Reporting Issues: 

Until recently, Oklahoma had a nine-month backlog of crash reports 
awaiting entry into the state system (see crash data quality 
statistics, table 17). The backlog was primarily due to insufficient 
state resources. Overtime hours funded by SaDIP have helped to 
alleviate this backlog. 

Table 17: Oklahoma Crash Data Quality Statistics (Percentages): 

Completeness; 
March 2004: 104; 
June 2004: 104; 
September 2004: 102; 
December 2004: 102; 
March 2005: 101; 
June 2005: 101. 

Timeliness; 
March 2004: 93; 
June 2004: 92; 
September 2004: 93; 
December 2004: 92; 
March 2005: 90; 
June 2005: 83. 

Accuracy; 
March 2004: 84; 
June 2004: 86; 
September 2004: 87; 
December 2004: 86; 
March 2005: 86; 
June 2005: 84. 

Source: FMCSA. 

[End of table] 

State/Federal Coordination Issues: 

Oklahoma has been working with GSA to receive payments on its 
cooperative agreement, and officials said that this has caused the 
state some confusion. 

Oklahoma's crash data divisions are housed in the state's Department of 
Public Safety and Department of Transportation. While crash records 
staff in both agencies work closely together to make sure that the 
state's crash file is complete, Oklahoma officials were unaware of the 
value of CMV crash data at the national level, or of the criteria that 
FMCSA uses to rate CMV crash data. 

Oklahoma is using funds from several sources to improve its entire 
crash data system, including funds provided by NHTSA and the Federal 
Highway Administration. SaDIP funds are the only funds that the state 
is using specifically for CMV data improvements. 

Data Quality Map: 

While high-ranking officials in Oklahoma's Department of Public Safety 
and the Highway Patrol are familiar with the data quality map, state 
officials involved with the SaDIP program were less familiar. We found 
that state officials involved with the SaDIP program had a detailed 
understanding of their state's data quality, but had not used the data 
quality map as an indicator of progress for the SaDIP cooperative 
agreement. 

Oklahoma officials stated that they would undertake their data 
improvement program regardless of whether or not the state was ranked 
"good." State officials, for example, recognized that more improvements 
could be made in the state's timeliness measure even though the state 
had a "good" rating in this category. 

Washington: 

Washington received a total of $803,935 in 2004 (see table 18) to 
conduct crash data improvement projects. The state's traffic records 
coordinating committee is leading an electronic information processing 
initiative designed to reduce crash reporting inefficiencies and help 
the state meet national traffic safety goals. 

Table 18: Washington SaDIP Funding History: 

Award year: 2004; 
Funds awarded: $188,460; 
Agency: FMCSA; 
Award type: Grant. 

Award year: 2004; 
Funds awarded: $615,475; 
Agency: FMCSA; 
Award type: Grant. 

Source: FMCSA. 

[End of table] 

SaDIP Projects: 

SaDIP funds are primarily being used to implement an electronic data 
feed 
between the Washington Department of Transportation and the state 
patrol. This will allow records to be submitted instantly to the state 
patrol when they are entered into the state department of 
transportation database. It will also make records searchable so that 
eligible CMV crashes that are misidentified and not sent to the state 
patrol can be identified as CMV crashes, improving the completeness of 
the state's CMV crash data. 

SaDIP funds are also being used to eliminate a six-month backlog of 
crash 
reports that need to be entered into the state's database housed at the 
Washington Department of Transportation. Until reports are entered into 
the state Department of Transportation system, they cannot be 
transferred to the state patrol electronically, nor can those reports 
incorrectly coded as non-CMV crashes be easily identified. Until 
this takes place, Department of Transportation employees identify CMV 
reports among all crash reports and provide the state patrol with paper 
copies of the reports. 

Crash Collection/Reporting Issues: 

The largest problem that Washington State has with its crash data is 
receiving the entire crash report, including supplemental form, from 
the law enforcement offices that generate them (see crash data quality 
statistics, table 19). 

Table 19: Washington Crash Data Quality Statistics (Percentages): 

Completeness; 
March 2004: 96; 
June 2004: 96; 
September 2004: 103; 
December 2004: 103; 
March 2005: 108; 
June 2005: 108. 

Timeliness; 
March 2004: 99; 
June 2004: 98; 
September 2004: 98; 
December 2004: 99; 
March 2005: 99; 
June 2005: 99. 

Accuracy; 
March 2004: 95; 
June 2004: 96; 
September 2004: 96; 
December 2004: 97; 
March 2005: 97; 
June 2005: 97. 

Source: FMCSA. 

[End of table] 

Washington State also has a problem with incorrect identification of 
CMVs on police accident reports. If police do not correctly identify 
the vehicle involved in the crash as a CMV, then it may not be 
extracted from the state crash file for submission to FMCSA. 

State/Federal Coordination Issues: 

Washington State has very good cooperation among state agencies 
involved in crash data collection and reporting. The state also has a 
good working relationship with its FMCSA division office. It also uses 
FMCSA and state funds to address data quality issues, making decisions 
on how to use these funds effectively through an active Traffic Records 
Coordinating Committee. 

While SaDIP initiatives in Washington State are a topic of discussion 
at the Traffic Records Coordinating Committee meetings, the committee 
coordinator was not specifically aware of FMCSA's grant-making process 
and how those grants are accessed and then allocated by the state. 
Whereas NHTSA funds are accessed via the Traffic Records Coordinating 
Committee forum, FMCSA funds are processed within the Washington State 
Patrol's Commercial Vehicle Division. To date there has been no problem 
targeting safety data funding to immediate priorities as identified by 
the Traffic Records Coordinating Committee, but this is an area that 
may need to be addressed in the future. 

Data Quality Map: 

Washington State officials recognize that the data quality map is an 
important indicator of the state's progress in improving its crash 
records system. According to state officials, the data quality map has 
been used to measure the progress of the state's data quality 
improvements in general. 

State officials also indicated that they would continue to make 
improvements to their traffic records systems regardless of the state's 
data quality rating. Specifically, the Traffic Records Coordinating 
Committee is helping to coordinate the state's electronic reporting 
system. 

Even though the state has consistently ranked "good" on the map since 
its inception, state officials report that the map has been helpful for 
identifying areas where the state can make data quality improvements. 

[End of section] 

Appendix VI: GAO Contact and Staff Acknowledgments: 

GAO Contact: 

Katherine Siggerud (202) 512-6570 or [Hyperlink, siggerudk@gao.gov] 

Staff Acknowledgments: 

In addition to the contact named above, Randall Williamson, Assistant 
Director; Tida Barakat; Jonathan Carver; Bert Japikse; Sara Ann 
Moessbauer; Fred Swenson; Sharon Silas; and Stan Stenersen made key 
contributions to this report. 

(542054): 

FOOTNOTES 

[1] In 2003, there were also 289 fatal crashes involving buses. 

[2] FMCSA was formerly a part of the Federal Highway Administration. 
Its creation as a separate entity was pursuant to the Motor Carrier 
Safety Improvement Act of 1999. Pub. L. No. 106-159, § 101. 

[3] In 1999, Pub. L. No. 106-159, § 225, directed the Secretary of 
Transportation to carry out a program, which became known as the 
Commercial Vehicle Analysis Reporting System (CVARS). It is currently 
known as the Safety Data Improvement Program (SaDIP). 

[4] Safe, Accountable, Flexible, Efficient Transportation Equity Act: 
A Legacy for Users (SAFETEA-LU), Pub. L. No. 109-59. 

[5] FMCSA began SaDIP in FY2000, but did not begin awarding funds to 
states until FY2002. In August 2005, Congress, through the 
reauthorization of surface transportation programs, authorized an 
additional $11 million for FMCSA's SaDIP over the next 4 years. This 
funding will be used in addition to FMCSA funds to provide assistance 
to states. 

[6] DOT Inspector General, Improvements Needed in the Motor Carrier 
Safety Status Measurement System, MH-2004-034, (Washington, D.C.: 
February 13, 2004). 

[7] Senate Report 108-342. 

[8] The 34 states are Alaska, California, Colorado, Connecticut, 
Georgia, Indiana, Iowa, Kansas, Kentucky, Louisiana, Maryland, 
Massachusetts, Michigan, Minnesota, Montana, Nebraska, Nevada, New 
Hampshire, New Mexico, New York, North Carolina, Ohio, Oklahoma, 
Pennsylvania, Rhode Island, South Carolina, South Dakota, Tennessee, 
Texas, Utah, Vermont, Virginia, Washington, and West Virginia. 
Participation was voluntary. 

[9] The six case studies were conducted in Georgia, Minnesota, North 
Carolina, Ohio, Oklahoma, and Washington. 

[10] Telephone interviews were conducted with FMCSA officials from New 
Hampshire, New Jersey, Michigan, and Missouri. We found during our 
interviews that New Hampshire is still participating in the program. 

[11] Compliance reviews are detailed audits of carriers, the results of 
which sometimes lead to enforcement action. 

[12] Safety audits review safety management systems and initial 
operating performance of new motor carriers. 

[13] Roadside inspections are conducted primarily by states. The 
information from these inspections is collected and reported to FMCSA. 

[14] The SafeStat data are also used to target carriers for roadside 
inspections. 

[15] Along with definitions developed by FMCSA, the National Governors' 
Association also provides guidance for submitting complete and accurate 
data on commercial vehicle crashes. In addition, the National Highway 
Traffic Safety Administration has provided guidance for reporting crash 
data based on the Model Minimum Uniform Crash Criteria and the American 
National Standard Manual for Classification of Motor Vehicle Traffic 
Records. 

[16] There are 20 data elements that were originally recommended by the 
National Governors' Association in 1989 and later adopted by FMCSA in 
1992. 

[17] Pub. L. No. 106-159, § 225. 

[18] The program was designed to be a system similar to the National 
Highway Traffic Safety Administration's Fatality Analysis Reporting 
System (FARS). 

[19] The SafeStat data are also used to target carriers for roadside 
inspections. Department of Transportation Inspector General, 
Improvements Needed in the Motor Carrier Safety Status Measurement 
System, MH-2004-034 (Washington, D.C.: February 13, 2004). 

[20] FMCSA contracts with the University of Michigan Transportation 
Research Institute (UMTRI) to conduct individual state data 
quality assessments. The assessments focused on the completeness of the 
data but also reviewed some accuracy and timeliness issues. To date, 
they have completed assessments of the following eight states: 
California (2005), Florida (2004), Illinois (2005), Michigan (2004), 
Missouri (2004), New Jersey (2005), North Carolina (2005), and Ohio 
(2003). UMTRI assessed 2003 data for all states except Ohio, for which 
it used 2000 data, and Missouri, for which it used 2001 data. 

[21] Appendix III provides detailed information on what is a 
"reportable crash" and what is a "reportable commercial motor vehicle." 

[22] While FMCSA regulates interstate motor carriers (i.e., carriers 
operating across states), information on crashes involving both 
interstate and intrastate carriers (i.e., carriers operating only 
within a state) should be submitted to FMCSA. Crashes involving CMVs 
used only for private purposes do not need to be reported to FMCSA. 

[23] Additional research and analysis are also being conducted by other 
public and private research entities. 

[24] FMCSA provides training classes for collecting and coding state 
crash data to representatives of the enforcement community in a growing 
number of states. In addition, FMCSA has developed and maintained 
customized large truck crash data collection training materials for 
each state, including train-the-trainer visuals and student workbook 
materials, to enable the states to carry the training forward to 
officers throughout their state and to incorporate the training into 
their academies. 

[25] These are information cards that can be attached to a law 
enforcement officer's sun visor. 

[26] The A&I Online Web site, developed and maintained by FMCSA's 
Analysis Division, provides useful motor carrier safety information and 
analysis over the Internet. The Web site increases FMCSA's 
effectiveness in carrying out its programs and provides the motor 
carrier industry and the public with information to make safety-minded 
decisions. The Web site is a valuable information resource to FMCSA in 
promoting motor carrier safety and is widely used by FMCSA and state 
field staff in their preparation for and in conducting on-site motor 
carrier safety audits. 

[27] The data quality map includes an overriding indicator of crash 
consistency for a state's non-fatal crash data submitted to FMCSA; 
however, it does not measure the consistency of the data. Rather, it is 
another indicator of completeness. 

[28] These data are current as of September 2005. In addition to funds 
provided directly to states, FMCSA provided $1,200,000 to contractors 
to assist states in improving data quality, bringing the total amount 
spent on state data improvement efforts to $21,891,394. 

[29] FMCSA's Office of Information Management plans to request funds in 
line with SAFETEA-LU: $2 million in grant funds for fiscal year 2006 
and $3 million for each of fiscal years 2007 through 2009, for a total 
of $11 million. 

[30] The creation of this Web site (http://www.grants.gov) is part of 
the federal government's grants streamlining initiative. The initiative 
is the government-wide set of organizations and activities responsible 
for implementing the Federal Financial Assistance Management 
Improvement Act of 1999. Pub. L. No. 106-107, §113. 

[31] Crash data include information about CMV crashes as reported by 
the states. Inspection data are submitted by the states to FMCSA and 
include information collected through roadside inspections. Crash data 
are assessed for completeness, timeliness, and accuracy. Inspection 
data are assessed only for timeliness and accuracy. 

[32] The map's methodology includes a "Crash Consistency Overriding 
Indicator." A state is flagged if it submits less than 50 percent of 
the average number of reported non-fatal crash records for the current 
year based on the previous three-year average of non-fatal crash 
records. 

[33] This is based on our six case study visits to states that are 
currently participating in SaDIP and telephone interviews with state 
officials who had previously participated or had not participated in 
SaDIP. 

[34] Of the 436,000 police-reported crashes involving large trucks in 
2003, about 150,000 were required to be reported to FMCSA and of these, 
4,289 resulted in at least one fatality. 

[35] NHTSA's National Center for Statistics and Analysis developed the 
Fatality Analysis Reporting System (FARS). Fatality 
information derived from FARS includes motor vehicle traffic crashes 
that result in the death of an occupant of a vehicle or a nonmotorist 
within 30 days of the crash. FARS contains data on all fatal traffic 
crashes within the 50 states, the District of Columbia, and Puerto 
Rico. Each state employs a federal analyst who conducts a number of 
quality control procedures to ensure correct information about the 
fatality crash. A final FARS file is completed once a year. 

[36] They are also assessed on whether a state's reporting meets a 
minimum threshold based on past reporting (i.e., the "Crash Consistency 
Overriding Indicator"). 

[37] MCSAP high priority funds are funds provided to states and local 
governments to carry out activities and projects that directly support 
the MCSAP, including supporting, enriching, or evaluating state 
commercial motor vehicle safety programs. These funds are allocated at 
the discretion of FMCSA. States apply for funding and are awarded funds 
in an 80/20 federal/state split (80 percent of funding is provided from 
federal sources, and states are required to provide the remaining 20 
percent), except when the funds are used for education and outreach 
activities. Currently, if a state is rated as "poor" in its crash data 
quality and it applies for High Priority funding, the state is required 
to use its High Priority funding for crash data improvements. 

[38] New Hampshire was intended to represent a state that no longer 
participated in SaDIP; however, we learned during our interview that 
the original grant expiration date had recently been extended to 2006. 

[39] These officials, in general, included representatives from state 
traffic records coordinating committees, the governor's highway safety 
offices, departments of public safety, departments of transportation, 
and departments of motor vehicles. 

[40] FMCSA's data quality criteria include completeness, timeliness, 
accuracy, and consistency. 

[41] FMCSA is currently engaged in two efforts to assess baselines for 
non-fatal crashes in individual states. One effort is being conducted 
by Data Nexus and another is being conducted through the UMTRI 
assessments mentioned earlier in the report. 

[42] Of the 436,000 police-reported crashes involving large trucks in 
2003, about 150,000 were required to be reported to FMCSA and of these, 
4,289 resulted in at least one fatality. 

[43] FARS is a database of all fatal vehicle crashes maintained by the 
National Highway Traffic Safety Administration. 

[44] Also, according to Volpe, one state official estimated that 
definitional differences account for about 10 percent of fatal CMV 
crashes in that state. 

[45] Recently, FMCSA distributed visor cards to state officials that 
provide illustrations of FMCSA reportable CMVs. See appendix III for 
copies of these cards. 

[46] This includes the District of Columbia. 

[47] Editing might be needed, for example, if activity at the state 
level discloses errors or incomplete information in the record as 
originally submitted to MCMIS. 

[48] In general, if a crash report is missing a DOT number, FMCSA 
attempts to use other identifiers in the report to match it to a 
registered carrier, but officials told us that these attempts have met 
with limited success. 

[49] Additionally, an FMCSA official told us that this problem is 
unique to North Carolina. 

GAO's Mission: 

The Government Accountability Office, the investigative arm of 
Congress, exists to support Congress in meeting its constitutional 
responsibilities and to help improve the performance and accountability 
of the federal government for the American people. GAO examines the use 
of public funds; evaluates federal programs and policies; and provides 
analyses, recommendations, and other assistance to help Congress make 
informed oversight, policy, and funding decisions. GAO's commitment to 
good government is reflected in its core values of accountability, 
integrity, and reliability. 

Obtaining Copies of GAO Reports and Testimony: 

The fastest and easiest way to obtain copies of GAO documents at no 
cost is through the Internet. GAO's Web site (www.gao.gov) contains 
abstracts and full-text files of current reports and testimony and an 
expanding archive of older products. The Web site features a search 
engine to help you locate documents using key words and phrases. You 
can print these documents in their entirety, including charts and other 
graphics. 

Each day, GAO issues a list of newly released reports, testimony, and 
correspondence. GAO posts this list, known as "Today's Reports," on its 
Web site daily. The list contains links to the full-text document 
files. To have GAO e-mail this list to you every afternoon, go to 
www.gao.gov and select "Subscribe to e-mail alerts" under the "Order 
GAO Products" heading. 

Order by Mail or Phone: 

The first copy of each printed report is free. Additional copies are $2 
each. A check or money order should be made out to the Superintendent 
of Documents. GAO also accepts VISA and MasterCard. Orders for 100 or 
more copies mailed to a single address are discounted 25 percent. 
Orders should be sent to: 

U.S. Government Accountability Office 

441 G Street NW, Room LM 

Washington, D.C. 20548: 

To order by Phone: 

Voice: (202) 512-6000: 

TDD: (202) 512-2537: 

Fax: (202) 512-6061: 

To Report Fraud, Waste, and Abuse in Federal Programs: 

Contact: 

Web site: www.gao.gov/fraudnet/fraudnet.htm 

E-mail: fraudnet@gao.gov 

Automated answering system: (800) 424-5454 or (202) 512-7470: 

Public Affairs: 

Jeff Nelligan, managing director, 

NelliganJ@gao.gov 

(202) 512-4800 

U.S. Government Accountability Office, 

441 G Street NW, Room 7149 

Washington, D.C. 20548: