This is the accessible text file for GAO report number GAO-09-683 
entitled 'Career and Technical Education: States Have Broad Flexibility 
in Implementing Perkins IV' which was released on July 29, 2009. 

This text file was formatted by the U.S. Government Accountability 
Office (GAO) to be accessible to users with visual impairments, as part 
of a longer term project to improve GAO products' accessibility. Every 
attempt has been made to maintain the structural and data integrity of 
the original printed product. Accessibility features, such as text 
descriptions of tables, consecutively numbered footnotes placed at the 
end of the file, and the text of agency comment letters, are provided 
but may not exactly duplicate the presentation or format of the printed 
version. The portable document format (PDF) file is an exact electronic 
replica of the printed version. We welcome your feedback. Please E-mail 
your comments regarding the contents or accessibility features of this 
document to Webmaster@gao.gov. 

This is a work of the U.S. government and is not subject to copyright 
protection in the United States. It may be reproduced and distributed 
in its entirety without further permission from GAO. Because this work 
may contain copyrighted images or other material, permission from the 
copyright holder may be necessary if you wish to reproduce this 
material separately. 

Report to Congressional Requesters: 

United States Government Accountability Office: 
GAO: 

July 2009: 

Career and Technical Education: 

States Have Broad Flexibility in Implementing Perkins IV: 

GAO-09-683: 

GAO Highlights: 

Highlights of GAO-09-683, a report to congressional requesters. 

Why GAO Did This Study: 

The Carl D. Perkins Career and Technical Education Act of 2006 (Perkins 
IV) supports career and technical education (CTE) in high schools and 
postsecondary institutions, such as community colleges. Perkins IV 
established student performance measures at the secondary and 
postsecondary levels for state agencies, such as state educational 
agencies, and local recipients, such as school districts, eligible to 
receive funds. GAO examined (1) how states have implemented the Perkins 
IV performance measures and what, if any, challenges they have faced in 
implementing the measures; (2) to what extent the Department of 
Education (Education) has ensured that states are implementing the new 
performance measures and supported states in their efforts; and (3) 
what Education knows about the effectiveness of CTE programs. To 
collect national-level data, GAO surveyed state CTE directors in the 50 
states and the District of Columbia between January and April 2009, and 
received responses from all states and the District of Columbia. To 
view survey results, click on [hyperlink, 
http://redesign-www.gao.gov/special.pubs/gao-09-737sp/index.html]. We 
provided a draft copy of this report to Education for comment. We 
received technical comments, which we incorporated into the draft where 
appropriate. 

What GAO Found: 

States are implementing some of the Perkins IV performance measures 
using different approaches and report that the greatest challenge is 
collecting data on technical skill attainment and student placement. 
Flexibility in Perkins IV and Education’s guidance permits differences 
in how states implement the measures. According to our surveys, 34 
states at the secondary level and 29 at the postsecondary level intend 
to adopt Education’s recommended use of assessments—such as those for 
industry certifications—to measure technical skills. States reported 
that they face the greatest challenges in collecting data on the 
technical skill attainment and student placement measures because of 
cost and concerns about their ability to access complete and accurate data. 

Education ensures states are implementing the Perkins IV accountability 
requirements through on-site monitoring and off-site document reviews, 
and supports states through technical assistance and guidance. 
Monitoring findings were most often related to states failing to submit 
complete or reliable data, and Education uses its findings to guide the 
technical assistance it provides to states. States reported that 
Education’s assistance has helped them implement the performance 
measures, but that more assistance with technical skill attainment 
would be helpful. Education is aware of states’ need for additional 
assistance and has taken actions to address this, including 
facilitating a state-led committee looking at technical assessment 
approaches. 

State performance measures are the primary source of data available to 
Education for determining the effectiveness of CTE programs, and 
Education relies on student outcomes reported through these measures to 
gauge the success of states’ programs. Because only 2 of 11 measures 
(secondary and postsecondary have 3 measures in common) have been 
implemented and reported on thus far, Education has little information 
to date on program outcomes. In addition, Perkins IV does not require 
states to report to Education the findings of their program 
evaluations. In our surveys of state CTE directors, nearly half of 
states responded that they have conducted or sponsored a study to 
examine the effectiveness of their CTE programs. We reviewed 7 of these 
studies and found that only 4 were outcome evaluations.  

Figure: Perkins IV Performance Measures: 

[Refer to PDF for image: illustration] 

Secondary measures: 
* Academic attainment in reading/language arts and mathematics; 
* Secondary school completion; 
* Student graduation rate. 

Overlapping measures: 
* Technical skill attainment; 
* Student placement; 
* Nontraditional participation and completion. 

Postsecondary measures: 
* Credential, certificate, or degree attainment; 
* Student retention or transfer. 

Source: GAO. 

[End of figure] 

What GAO Recommends: 

This report contains no recommendations. 

View [hyperlink, http://www.gao.gov/products/GAO-09-683] or key 
components. For more information, contact George A. Scott at (202) 512-
7215 or scottg@gao.gov. 

[End of section] 

Contents: 

Letter: 

Background: 

States Are Implementing Some Performance Measures Using Different 
Approaches and Report That the Greatest Challenge Is Collecting Data on 
Technical Skill Attainment and Student Placement: 

Education Uses Risk-Based Monitoring to Ensure Implementation of the 
Performance Measures and Supports States through Technical Assistance 
and Guidance: 

Education Relies on the Performance Measures to Gauge the Success of 
State CTE Programs: 

Concluding Observations: 

Agency Comments: 

Appendix I: Scope and Methodology: 

Appendix II: GAO Contact and Staff Acknowledgments: 

Tables: 

Table 1: Perkins IV Performance Measures at the Secondary Level: 

Table 2: Perkins IV Performance Measures at the Postsecondary Level: 

Table 3: Approaches That States Will Use to Collect Data on the Student 
Technical Skill Attainment Measure, by Number of States: 

Table 4: Education's Assistance to States for Perkins IV 
Implementation: 

Figures: 

Figure 1: Perkins IV Performance Measures at the Secondary and 
Postsecondary Levels: 

Figure 2: Number of States Planning to Use Technical Assessments 
Administered at Various Times, by Type of Assessment: 

Figure 3: Number of States Reporting Data Collection for Perkins 
Performance Measures as a Great or Very Great Challenge at the 
Secondary Level, by Performance Measure: 

Figure 4: Number of States Reporting Data Collection for Perkins 
Performance Measures as a Great or Very Great Challenge at the 
Postsecondary Level, by Performance Measure: 

Figure 5: Most Commonly Used Methods to Collect Student Placement Data, 
by Number of States and Educational Level: 

Abbreviations: 

CTE: career and technical education: 

GED: general educational development: 

GPA: grade point average: 

[End of section] 

United States Government Accountability Office: 
Washington, DC 20548: 

July 29, 2009: 

The Honorable Edward M. Kennedy: 
Chair: 
The Honorable Michael B. Enzi: 
Ranking Member: 
Committee on Health, Education, Labor, and Pensions: 
United States Senate: 

The Honorable Patty Murray: 
Chair: 
The Honorable Johnny Isakson: 
Ranking Member: 
Subcommittee on Employment and Workplace Safety: 
Committee on Health, Education, Labor, and Pensions: 
United States Senate: 

The shift to a global economy and rapid advances in technology 
underscore the importance of preparing our current and future workforce 
for high-demand careers with 21st century skills, such as those that 
emphasize problem solving and teamwork. In the 2006-2007 program year, 
[Footnote 1] more than 15 million high school and college students 
nationwide participated in career and technical education (CTE) 
programs, which are designed to provide students with the academic and 
career and technical skills to help them succeed in the workforce. As 
authorized by the Carl D. Perkins Career and Technical Education Act of 
2006[Footnote 2] (Perkins IV), Congress provided states with $1.2 
billion in fiscal year 2008 to support career and technical education 
in high schools and to support programs in postsecondary institutions, 
such as community colleges. The U.S. Department of Education 
(Education) estimates that approximately 5 percent of all funds that 
states use for CTE programs are federal funds, with state and local 
funding generally covering the remainder. The American Recovery and 
Reinvestment Act of 2009 provides additional funds that states can use 
to help support their CTE programs. Federal funds for CTE programs are 
likely to take on increasing importance as states continue to confront 
mounting fiscal pressures that may lead them to propose cuts to 
secondary and postsecondary education spending used to support career 
and technical education. 

Perkins IV aims to prepare students for current or emerging high-skill, 
high-wage, or high-demand jobs by emphasizing rigorous student academic 
and technical skill achievement, increased accountability for student 
outcomes, and enhanced coordination between secondary and postsecondary 
career and technical education. It also seeks to increase state and 
local flexibility in providing career and technical education by 
involving multiple groups such as students, parents, and local 
administrators in planning and administration, and by allowing states 
flexibility in the design of their accountability systems. To increase 
accountability for student outcomes, Perkins IV established student 
performance measures at the secondary and postsecondary levels for 
state agencies,[Footnote 3] such as state educational agencies or state 
college and university systems, as well as for local recipients of 
funds, such as school districts. Key performance measures include 
student attainment of academic content standards and student academic 
achievement standards, as adopted by the state in accordance with the 
requirements of Title I of the Elementary and Secondary Education Act. 
Overall, Perkins IV reflects a shift from an emphasis on vocational 
education--once considered by some to be an occupationally specific 
track for students with lower academic skills--to an emphasis on 
preparing students for entry into high-demand occupations. 

Education provides technical assistance and guidance to states 
regarding their data collection and student definitions and measurement 
approaches. States report annually to Education on their progress in 
meeting their performance targets for the measures. In light of a 
governmentwide focus on performance and accountability, you asked us to 
examine (1) how states have implemented the Perkins IV performance 
measures, and what, if any, challenges they have faced in implementing 
the measures; (2) to what extent Education has ensured that states are 
implementing the new performance measures and supported states in their 
efforts; and (3) what Education knows about the effectiveness of CTE 
programs. 

To answer our three research questions, we collected data through 
multiple methods. First, to gather state-level information on Perkins 
IV implementation, we collected information through two Web-based 
surveys of state CTE directors, at the secondary and postsecondary 
levels, in the 50 states and the District of Columbia. The surveys 
obtained information on the types of data states collect for the 
student performance measures and challenges they face; technical 
assistance, guidance, and monitoring states receive from Education; and 
how states evaluate their CTE programs. We administered the surveys 
between January and April 2009 and received responses from all 50 
states and the District of Columbia. While we did not fully validate 
specific information that state officials reported through our surveys, 
we reviewed the information to determine that their responses were 
complete and reasonable and found the information to be sufficiently 
reliable for the purposes of this report. This report does not contain 
all the results from the surveys. The surveys and a more complete 
tabulation of the results can be viewed online at GAO-09-737SP. In 
addition to our surveys, we collected information from site visits to 
California, Minnesota, and Washington state. These states represent 
variation across characteristics such as the type of state agency 
eligible to receive Perkins funds; amount of Perkins IV funds received 
in fiscal year 2008; and type of approach used to assess how students 
attain technical skills, a key program outcome. We interviewed 
secondary and postsecondary officials at the state level and officials 
from local recipients of Perkins funds--that is, school districts and 
postsecondary institutions--that varied by number of CTE students 
served, amount of Perkins funding received, and geographic location 
(urban versus rural). We also reviewed relevant federal legislation and 
agency guidance and interviewed Education officials to obtain 
information on how states have implemented the performance measures, 
how Education has monitored and supported states in their efforts to 
implement the performance measures, and what Education knows about how 
states are evaluating their local CTE programs. To analyze how states 
are evaluating CTE programs, we reviewed state Perkins plans and annual 
reports submitted to Education from the 50 states and the District of 
Columbia. See appendix I for detailed information on our surveys and 
site visits. 

We conducted this performance audit from August 2008 to July 2009 in 
accordance with generally accepted government auditing standards. Those 
standards require that we plan and perform the audit to obtain 
sufficient, appropriate evidence to provide a reasonable basis for our 
findings and conclusions based on our audit objectives. We believe that 
the evidence obtained provides a reasonable basis for our findings and 
conclusions based on our audit objectives. 

Background: 

Under Perkins IV, Education Allocates Funds for Career and Technical 
Education to States in order to Improve Local CTE Programs: 

The principal source of federal funding for CTE, Perkins IV authorizes 
federal grant funds for the enhancement of CTE for secondary and 
postsecondary students. In fiscal year 2008, Congress appropriated $1.2 
billion for the improvement of local CTE programs. Education's Office 
of Vocational and Adult Education allocates the funds to states, 
[Footnote 4] which retain up to 15 percent of the funds for 
administration and state leadership of CTE programs,[Footnote 5] before 
passing at least 85 percent of the funds on to local recipients of 
funds, such as local school districts and community colleges. States 
determine the percentage of funds that will be allocated to the 
secondary and postsecondary levels. The majority of funds allocated to 
the secondary level are passed on to local recipients based on the 
school district's share of students from families below the poverty 
level for the preceding fiscal year. Postsecondary funds are primarily 
allocated based on the institution's share of Pell Grant recipients. 
[Footnote 6] 

Perkins IV Established Performance Measures for Secondary and 
Postsecondary Levels and Requires States and Local Recipients to Report 
on Program Outcomes: 

Perkins IV established six student performance measures at the 
secondary level and five performance measures at the postsecondary 
level. These measures represent a range of student outcomes, such as 
attainment of technical skills and placement in employment or further 
education following the completion of CTE programs. In addition, the 
measures include the nontraditional participation and completion of 
students from an underrepresented gender in programs with significant 
gender disparities (such as women participating in auto repair), among 
others (see tables 1 and 2 for a description of the Perkins IV 
performance measures). To ease states' transition to the new provisions 
in Perkins IV, Education permitted states to submit a 1-year transition 
plan that covered only the first program year of Perkins IV 
implementation, 2007-2008. Accordingly, states were required only to 
implement and report performance on two secondary performance measures 
for the 2007-2008 program year: academic attainment and student 
graduation rates. These two measures are based on the same academic 
attainment and student graduation rate measures required by Title I of 
the Elementary and Secondary Education Act. Beginning in the 2008-2009 
program year, states are required to report on student outcomes for all 
of the performance measures. States will report these outcomes to 
Education in December 2009. 

Table 1: Perkins IV Performance Measures at the Secondary Level: 

Performance measure: Academic attainment in reading/language arts and 
mathematics; 
Description: Student attainment of challenging academic content and 
academic achievement standards, adopted from Title I of the Elementary 
and Secondary Education Act; 
Program year implemented: 2007-2008. 

Performance measure: Technical skill attainment; 
Description: Student attainment of career and technical skill 
proficiencies, including achievement on technical assessments aligned 
with industry-recognized standards, if available and appropriate; 
Program year implemented: 2008-2009. 

Performance measure: Secondary school completion; 
Description: Student rates of attainment of a secondary school diploma; 
a General Educational Development (GED) credential or equivalent; and a 
proficiency credential, certificate, or degree, in conjunction with a 
secondary school diploma; 
Program year implemented: 2008-2009. 

Performance measure: Student graduation rate; 
Description: Student graduation rates, as described in Title I of the 
Elementary and Secondary Education Act; 
Program year implemented: 2007-2008. 

Performance measure: Student placement; 
Description: Student placement in postsecondary education or advanced 
training, military service, or employment; 
Program year implemented: 2008-2009. 

Performance measure: Nontraditional participation and completion; 
Description: Student participation in, and completion of, CTE programs 
that lead to nontraditional fields, such as women in automotive 
programs or men in child development; 
Program year implemented: 2008-2009. 

Source: GAO analysis of Education program guidance and the Carl D. 
Perkins Career and Technical Education Act of 2006. 

Note: States report to Education on academic attainment in reading/language 
arts and mathematics as two separate measures, consistent with how states report on 
academic achievement under Title I of the Elementary and Secondary 
Education Act. 

[End of table] 

Table 2: Perkins IV Performance Measures at the Postsecondary Level: 

Performance measure: Technical skill attainment; 
Description: Student attainment of challenging career and technical 
skill proficiencies, including achievement on technical assessments 
aligned with industry-recognized standards, if available and 
appropriate; 
Program year implemented: 2008-2009. 

Performance measure: Credential, certificate, or degree attainment; 
Description: Student attainment of an industry-recognized credential, a 
certificate, or a degree; 
Program year implemented: 2008-2009. 

Performance measure: Student retention or transfer; 
Description: Student retention in postsecondary education or transfer 
to a baccalaureate degree program; 
Program year implemented: 2008-2009. 

Performance measure: Student placement; 
Description: Student placement in military service or apprenticeship 
programs, or placement or retention in employment; 
Program year implemented: 2008-2009. 

Performance measure: Nontraditional participation and completion; 
Description: Student participation in, and completion of, CTE programs 
that lead to employment in nontraditional fields, such as women in 
automotive programs or men in child development; 
Program year implemented: 2008-2009. 

Source: GAO analysis of Education program guidance and the Carl D. 
Perkins Career and Technical Education Act of 2006. 

[End of table] 

Perkins IV requires states to negotiate specific performance targets 
with Education and to annually report their performance to Education. 
It also requires local recipients to negotiate performance targets with 
the states and to annually report to the state their progress toward 
meeting these targets. Perkins IV established additional accountability 
requirements for states and local recipients, including actions to 
address states that do not meet all of their performance targets. Under 
Perkins IV, if a state does not meet at least 90 percent of its target 
for one or more of the performance measures, it is required to develop 
and implement a program improvement plan that describes how it will 
address the performance targets it did not meet. Prior to Perkins IV, states 
were only required to develop and implement a program improvement plan 
if they failed to meet their targets in all of their performance 
measures, not just one measure. States can also face financial 
sanctions. For example, Education can withhold all or a portion of 
funds if a state does not implement a program improvement plan, show 
improvement in meeting its failing performance measure, or meet the 
target for the same performance measure for 3 consecutive years. 
[Footnote 7] Local recipients that do not meet at least 90 percent of 
their performance targets have the same program improvement 
requirements as the state and face similar sanctions from the state. In 
the event of financial sanctions, Education is required to use the 
withheld funds to provide technical assistance to the state for 
improving its performance on the measures and the state is to use funds 
withheld from local recipients to provide CTE services and activities 
to students. 

Education Developed Nonregulatory Guidance to Assist States with 
Perkins IV: 

In order to implement the performance measurement requirements of 
Perkins IV, states must define which students will be included in the 
measures and collect data for each of the performance measures at the 
secondary and postsecondary levels. For example, states define the 
minimum requirements, such as a certain number of CTE credits that a 
student would need to obtain in order to be identified as a student 
concentrating in CTE. Education has taken a range of actions to help 
states with these activities. For example, in January 2007, Education 
began issuing nonregulatory guidance to states to help them develop 
their student definitions and data collection approaches for the 
performance measures.[Footnote 8] Education also issued guidance to 
states on the information states must include in their state Perkins 
plans and in the annual reports that they submit to Education. In the 
state plans, states must detail how they intend to implement the 
performance measures, and in the annual reports states must describe 
their progress in meeting the negotiated performance targets.[Footnote 
9] 

In Addition to Implementing the Perkins Performance Measures, Perkins 
IV Requires States to Annually Evaluate Their Local CTE Programs: 

In addition to implementing performance measures, states are required 
to evaluate programs, services, and activities supported with Perkins 
funds and to report to Education in their state plans how they intend 
to conduct these evaluations. To meet this requirement, states describe 
the approaches, such as the use of state-developed standards, they will 
use to evaluate local CTE programs. In addition, Education requires 
states to include a description of how they used Perkins funds to 
evaluate their local CTE programs in their annual reports. 

States Are Implementing Some Performance Measures Using Different 
Approaches and Report That the Greatest Challenge Is Collecting Data on 
Technical Skill Attainment and Student Placement: 

Flexibility in Law and Guidance Allows for Differences in How States 
Implement Some Performance Measures and Results in Variation in the 
Student Outcome Data Education Will Collect: 

A key feature of Perkins IV--to enhance state and local flexibility in 
developing, implementing, and improving career and technical education--
allows for considerable variation in how states implement some 
performance measures.[Footnote 10] While Perkins IV was designed to 
strengthen accountability for results at the state and local levels, it 
also allows states to establish their own accountability systems, 
including their own data collection methods for the performance 
measures. Of the 11 performance measures, the secondary and 
postsecondary levels have 3 measures in common: technical skill 
attainment, student placement, and participation in and completion of 
nontraditional programs (see figure 1).[Footnote 11] States may also 
include additional, state-developed performance measures in their 
accountability systems. For example, Washington state added three 
performance measures--earnings, employer satisfaction, and CTE student 
satisfaction--to its accountability system. 

Figure 1: Perkins IV Performance Measures at the Secondary and 
Postsecondary Levels: 

[Refer to PDF for image: illustration] 

Secondary measures: 
* Academic attainment in reading/language arts and mathematics; 
* Secondary school completion; 
* Student graduation rate. 

Overlapping measures: 
* Technical skill attainment; 
* Student placement; 
* Nontraditional participation and completion. 

Postsecondary measures: 
* Credential, certificate, or degree attainment; 
* Student retention or transfer. 

Source: GAO analysis of Education program guidance and the Carl D. 
Perkins Career and Technical Education Act of 2006. 

[End of figure] 

Consistent with Perkins IV, Education's guidance to states also allows 
for flexibility. Education issued nonregulatory guidance that proposed 
specific definitions that could be adopted by states to develop each of 
the secondary and postsecondary performance measures. It also 
identified preferred approaches for collecting data for certain 
measures such as student technical skill attainment. However, Education 
noted that in accordance with Perkins IV, states could propose other 
definitions and approaches to collect data for the required performance 
measures if they meet the requirements of the law. 

We found through our surveys of state CTE directors that states vary 
considerably in the extent to which they plan to follow Education's 
guidance--specifically with regard to the technical skill attainment 
and secondary school completion measures. As a result, Education will 
collect student outcome data that vary across states for the same 
measures. This can make it difficult for Education to aggregate student 
outcomes at the national level. For example, a majority of states 
reported that they will use technical assessments--the approach 
recommended in Education's guidance--to measure student attainment of 
skills at the secondary and postsecondary levels. These include 
assessments leading to industry-based certificates or state licenses. 
However, a number of states will rely on other approaches to collect 
data for the performance measure, including grade point average (GPA), 
program completion, or other methods (see table 3). 

Table 3: Approaches That States Will Use to Collect Data on the Student 
Technical Skill Attainment Measure, by Number of States: 

Approach: Technical assessments; 
Secondary level: 34; 
Postsecondary level: 29. 

Approach: Grade point average; 
Secondary level: 8; 
Postsecondary level: 17. 

Approach: Program completion; 
Secondary level: 13; 
Postsecondary level: 13. 

Approach: Other methods; 
Secondary level: 10; 
Postsecondary level: 9. 

Source: GAO analysis of secondary and postsecondary surveys of state 
CTE directors. 

Note: States may use more than one data collection method. 

[End of table] 

Officials in the states we visited provided a variety of reasons for 
their use of alternate methods to measure students' attainment of 
technical skills. For example, postsecondary state officials in 
California said that a CTE instructor's overall evaluation of a 
student's technical skill proficiency, in the form of a final grade, is 
a better measure of technical skill attainment than third-party 
technical assessments, and can more effectively lead to program 
improvement. They questioned the value of technical assessments, in 
part because assessments often cannot keep pace with technology and 
changing CTE program curricula, such as curricula for digital 
animation. A Washington state official told us that the state plans to 
use program completion to measure technical skills at the postsecondary 
level, noting that each postsecondary CTE program incorporates industry-
recognized standards into the curriculum. He noted that a national 
system of third-party assessments may not be adequate or appropriate, 
because it would not necessarily incorporate the same standards. Local 
school officials in Minnesota said that they will report on CTE course 
completion for this measure. Because CTE courses undergo curriculum 
review by teachers as well as industry advisors, and align with 
relevant postsecondary programs in the area, school officials told us 
course completion is sufficient to satisfy the definition of technical 
skill attainment. 

Education's guidance also allows for considerable variation in the 
types of technical assessments states can use and when they can 
administer them. Most states at the secondary level reported in our 
survey that they plan to use industry-developed certificates or 
credentials most often administered at the end of a program, such as a 
certificate awarded for an automotive technician. At the postsecondary 
level, states plan to most often rely upon the results of assessments 
for state licenses, such as state nursing licenses, to measure 
technical skills (see figure 2). 

Figure 2: Number of States Planning to Use Technical Assessments 
Administered at Various Times, by Type of Assessment: 

[Refer to PDF for image: multiple vertical bar graph] 

Secondary level: Assessment for industry-developed certificate or 
credential; 
Number of states: 32. 

Secondary level: Assessment for state license; 
Number of states: 25. 

Secondary level: Nationally developed assessment; 
Number of states: 29. 

Secondary level: State-developed assessment; 
Number of states: 19. 

Secondary level: Locally developed assessment; 
Number of states: 12. 

Postsecondary level: Assessment for industry-developed certificate or 
credential; 
Number of states: 26. 

Postsecondary level: Assessment for state license; 
Number of states: 29. 

Postsecondary level: Nationally developed assessment; 
Number of states: 15. 

Postsecondary level: State-developed assessment; 
Number of states: 12. 

Postsecondary level: Locally developed assessment; 
Number of states: 16. 

Source: GAO analysis of secondary and postsecondary surveys of state 
CTE directors. 

Note: A state may administer technical assessments at different times 
to CTE students. For example, assessments can follow the completion of 
a CTE course or program. This figure includes all states that reported 
that they will administer technical assessments at various times. 

[End of figure] 

However, we found that while a majority of states plan to use 
assessments to report to Education, the assessments are not currently 
in widespread use. For example, more than half of states at the 
secondary and postsecondary levels reported that they plan to use these 
assessments to report on few to none of their state-approved CTE 
programs in the 2008-2009 program year. Some states at the secondary 
level reported that they will use a combination of methods--including 
GPA or program completion--to report on technical skill attainment. 

We also found that states differ in whether they plan to report student 
data on GED credentials, part of the secondary school completion 
measure.[Footnote 12] Thirty states reported through our survey that 
they do not plan to report GED data to Education for the 2008-2009 
program year, while 18 reported that they would. About one-third of all 
states cited their ability to access accurate GED data as a great or 
very great challenge. For example, state officials we interviewed said 
states face difficulty tracking the students that leave secondary 
education and return, sometimes several years later, to earn a GED 
credential. An Education official said that the agency is aware of the 
challenges and limitations states face in collecting GED data and that 
the agency may need to provide technical assistance to states on ways 
to collect these data. 

States Face the Greatest Challenges Collecting Data on the Student Technical 
Skill Attainment and Placement Measures because of Cost and Data 
Concerns: 

States reported in our surveys that they face the most difficulty in 
collecting student data for two of the performance measures: technical 
skill attainment and student placement (see figure 3 and figure 4). 
Thirty-eight states at the secondary level reported that they face 
great or very great challenges in collecting data on student technical 
skill attainment, while 14 cited such challenges for the nontraditional 
participation measure, the next most frequently reported. The results 
were similar at the postsecondary level: 39 states reported great or 
very great challenges with the technical skill attainment measure and 
11 cited a similar level of difficulty with student placement. 

Figure 3: Number of States Reporting Data Collection for Perkins 
Performance Measures as a Great or Very Great Challenge at the 
Secondary Level, by Performance Measure: 

[Refer to PDF for image: horizontal bar graph] 

Secondary level: Technical skill attainment; 
Number of states: 38; 

Secondary level: Nontraditional participation; 
Number of states: 14; 

Secondary level: Nontraditional completion; 
Number of states: 6; 

Secondary level: Secondary school completion; 
Number of states: 6; 

Secondary level: Student placement; 
Number of states: 4. 

Source: GAO analysis of secondary surveys of state CTE directors. 

[End of figure] 

Figure 4: Number of States Reporting Data Collection for Perkins 
Performance Measures as a Great or Very Great Challenge at the 
Postsecondary Level, by Performance Measure: 

[Refer to PDF for image: horizontal bar graph] 

Postsecondary level: Technical skill attainment; 
Number of states: 39; 

Postsecondary level: Student placement; 
Number of states: 11; 

Postsecondary level: Credential, certificate, or degree; 
Number of states: 8; 

Postsecondary level: Student retention or transfer; 
Number of states: 7; 

Postsecondary level: Nontraditional completion; 
Number of states: 0; 

Postsecondary level: Nontraditional participation; 
Number of states: 0. 

Source: GAO analysis of postsecondary surveys of state CTE directors. 

[End of figure] 

States reported that the technical skill attainment measure at the 
secondary and postsecondary levels was most challenging to implement 
because of costs and limits on their ability to collect accurate and 
complete student data. Specifically, states reported that state-developed 
assessments and third-party technical assessments--such as those for 
industry certifications--are often too expensive for many districts, 
institutions, or students.[Footnote 13] 
Several state CTE directors commented in our surveys that their Perkins 
funds are inadequate to pay for these assessments and additional funds 
would be necessary to cover the costs. Another CTE director stated that 
economically disadvantaged students cannot afford the cost of 
assessments. In addition to challenges due to cost, states are limited 
in their ability to access accurate and complete data. For example, a 
state official said that Washington state does not have data-sharing 
agreements with assessment providers to receive the results of student 
assessments. As a result, the state will have to rely largely on 
students to self-report the results of their assessments, which raises 
concerns about data quality. Challenges such as these likely contribute to 
some states' use of other data--such as GPA or program completion--to 
collect and report information for this key student performance 
measure. 

Some states also reported difficulty collecting data on CTE students 
after they leave the school system. States at the secondary and 
postsecondary levels reported that their greatest challenge with the 
student placement measure is collecting data on students that are 
employed out of state. As we previously reported, state wage records, 
such as Unemployment Insurance data, track employment-related outcomes 
only within a state, not across states.[Footnote 14] A number of states 
commented in our surveys on challenges in tracking students because of 
the lack of data sharing across states.[Footnote 15] We found that 
states face challenges in tracking students employed out of state 
regardless of the method they most commonly use to collect student 
placement data. Thirty-eight states at the secondary level will use 
student survey data from the state, school district, or a third party 
to track student placement and report to Education, while 41 states at 
the postsecondary level will rely on state wage record data, despite 
potential gaps in student data (see figure 5). 

Figure 5: Most Commonly Used Methods to Collect Student Placement Data, 
by Number of States and Educational Level: 

[Refer to PDF for image: horizontal bar graph] 

Type of data used: Survey data; 
Secondary level: 38 states; 
Postsecondary level: 24 states. 

Type of data used: Unemployment insurance wage records; 
Secondary level: 17 states; 
Postsecondary level: 41 states. 

Source: GAO analysis of secondary and postsecondary surveys of state 
CTE directors. 

Note: States may use more than one data collection method. 

[End of figure] 

States also cited other challenges in obtaining data on student 
placement for CTE students. At the secondary level, states reported 
that their next greatest challenge is linking secondary and 
postsecondary data systems in order to track students that pursue 
higher education after graduation. To help overcome this challenge, 
Minnesota--one of the states we visited--recently passed legislation to 
allow data sharing between the secondary and postsecondary levels. 
[Footnote 16] Our survey also found that states' next greatest 
challenge at the postsecondary level was collecting data on students 
who are self-employed after leaving postsecondary institutions. 
Community college officials in California said that while they rely on 
Unemployment Insurance wage record data, the data are incomplete and do 
not capture information on the self-employed, a group that is important 
for the measurement of CTE outcomes at the postsecondary level. 

States face similar challenges related to cost and access to accurate 
data for the remaining performance measures. For example, states at the 
secondary level commented on data challenges for the academic 
attainment and student graduation rate measures.[Footnote 17] 
Specifically, several states cited problems in obtaining data from 
separate student data systems containing academic and CTE information. 
This can be particularly challenging for states that are trying to 
match student data from different systems in order to track required 
CTE student outcomes. In addition, at the postsecondary level, states 
cited challenges in tracking student retention in postsecondary 
education or student transfer to a baccalaureate degree program. In 
particular, accessing student data from out-of-state and private 
institutions and the high costs required to track these students were 
identified as the most challenging issues. States most often reported 
that they will track these students through their state postsecondary 
data systems. 

Education Uses Risk-Based Monitoring to Ensure Implementation of the 
Performance Measures and Supports States through Technical Assistance 
and Guidance: 

Education Uses Risk-Based Monitoring and Reviews State Annual Reports 
to Ensure Implementation of the Performance Measures: 

As we have previously reported, effective monitoring is a critical 
component of grant management. The Domestic Working Group's suggested 
grant practices state that financial and performance monitoring is 
important to ensure accountability and attainment of performance goals. 
[Footnote 18] Additionally, GAO recently reported on the importance of 
using a risk-based strategy to monitor grants, noting that agencies 
need to identify, prioritize, and manage potentially at-risk grant 
recipients, given the large number of grants awarded by federal 
agencies.[Footnote 19] Education's approach to monitoring Perkins is 
consistent with these suggested grant practices. According to its 
Perkins monitoring plan, Education selects which states to monitor 
based on a combination of risk factors and monitors states in two ways: 
through on-site visits and off-site reviews of state plans, budgets, 
and annual reports for those states not visited in a given year. 
[Footnote 20] To determine which states it will visit for on-site 
monitoring, Education uses a combination of risk factors, such as grant 
award size, issues identified through reviews of state Perkins plans, 
and time elapsed since Education's last monitoring visit. Education 
officials told us that their goal is to visit each state at least once 
every 5 years and reported that they have conducted on-site monitoring 
visits to 28 states since 2006. Education officials also told us that 
the same monitoring team performs both on-site and off-site reviews, 
which officials said helps to ensure continuity between the reviews. 
Furthermore, when conducting the off-site reviews, the monitoring team 
looks for trends in state data and for any problems with state data 
validity and reliability. The team uses a checklist to match 
performance data to the data states report in their required annual 
reports. 

According to Education's inventory of open monitoring findings, as of 
May 2009, 9 of the 28 open findings were related to accountability, 
including states failing to submit complete or reliable data. For example, in a 
February 2008 monitoring visit, Education found that the monitored 
state's data system had design limitations that affected the state's 
ability to collect and assess data on career and technical education 
students. Specifically, Education found that the various data systems 
across the local secondary and postsecondary levels did not share data 
with the state-level CTE data system. This data-sharing issue raised 
doubts about the validity and reliability of the state's Perkins data. 
Education tracks the findings from each state's monitoring visit in a 
database and reviews the findings in an internal report that is updated 
monthly. Additionally, if a state has open findings, the state may be 
required to report corrective actions to Education in the state's 
annual report. Officials told us that the amount of time it takes for a 
state to close out a finding depends upon the nature of the finding. 
For example, a finding related to accountability may take up to a year 
to resolve because a state may have to undertake extensive actions to 
address the deficiency. Education officials reported that their 
monitoring process emphasizes program improvement rather than focusing 
solely on compliance issues and that they use monitoring findings to 
guide the technical assistance they provide to the states. 

To evaluate its monitoring process, Education sends a survey to the CTE 
directors of states that were monitored that year and asks them to rate 
the format and content of Education's Perkins monitoring process. For 
example, the survey asks states to report on whether they received 
sufficient notice that the site visit was going to take place, whether 
the monitoring team provided on-site technical assistance, and whether 
the state received a written report within a reasonable time frame 
following the visit. We reviewed Education's summaries of the state 
surveys and found that for 2004 and 2005, the results of these surveys 
were generally positive. For example, in a 2004 monitoring evaluation 
report, the 10 states that were surveyed all reported that they had 
received sufficient notice about the monitoring visit and that 
Education staff provided on-site technical assistance. According to our 
survey of secondary-level CTE directors, about half of states have had 
a monitoring visit within the last 3 years, and almost all of the 
states whose monitoring visit resulted in findings said that Education 
worked with them to ensure that the findings were addressed. 

Education Supports States by Providing Technical Assistance and 
Guidance: 

Education provides states with guidance, technical assistance, and a 
variety of other resources and is taking actions to meet states' need 
for additional help. Since Perkins IV was enacted, Education has issued 
guidance to states on topics such as instructions for developing the 
state Perkins plans and annual reports, as well as guidance related to 
the performance measures. For example, Education's guidance provides 
clarification to states on what information each state has to submit to 
Education before it can receive its grant award for the next program 
year, such as any revisions a state wants to make to its definitions of 
student populations, measurement approaches, and proposed performance 
levels for each of the measures. Some of the guidance resulted from 
Education's collaborative efforts with states. For example, Education's 
guidance to states on student definitions and measurement approaches 
incorporated the input given by state CTE directors during national 
conference calls between states and Education. Other guidance addresses 
questions raised by states during national Perkins IV meetings, such as 
how a state should negotiate performance levels with its local 
recipients. 

In addition to guidance, Education offers states technical assistance 
from Education staff--called Regional Accountability Specialists--and 
through a private contractor. Education officials told us that each 
Regional Accountability Specialist works with a specific group of 
states to negotiate state data collection approaches for the 
performance measures. In addition, each specialist maintains regular 
contact with his or her states throughout the year and provides 
assistance on other issues, such as reporting requirements and program 
improvement plans. In addition to the Regional Accountability 
Specialists, Education also provides states with technical assistance 
by using MPR Associates, a private contractor.[Footnote 21] MPR 
Associates provides technical assistance that generally includes 
on-site visits and follow-up discussions to help states improve their CTE 
programs and facilitate data collection for the performance measures. 
For example, MPR Associates met with one state to assist with 
developing population definitions and measurement approaches that 
aligned with Education's guidance and helped another state with 
developing a plan for implementing secondary and postsecondary 
technical skill assessments. After providing technical assistance to a 
state, MPR Associates develops a summary report, which is then 
published on Education's information-sharing Web site, the Peer 
Collaborative Resource Network. Education also offers states a range of 
other resources, including data work groups and monthly conference 
calls. See table 4 for a description of the various ways in which 
Education provides assistance to states. 

Table 4: Education's Assistance to States for Perkins IV 
Implementation: 

Form of assistance: Data Quality Institute; 
Description: The Data Quality Institute is an Education-hosted seminar 
that helps states improve the quality and consistency of the data 
states use to report on the Perkins performance indicators. Education 
has hosted 15 seminars that have included participation by state 
officials at the secondary and postsecondary levels. Seminars typically 
focus on a particular issue, such as measuring technical skill 
attainment. Since 2000, the seminars have been held once a year for 2 
days. The 2009 seminar was held via the Internet and included 460 
participants representing 50 states. 

Form of assistance: Next Steps Work Group; 
Description: The Next Steps Work Group is primarily composed of 
Education accountability staff and state CTE directors and their 
accountability staff. Education officials told us that the work group 
is one of their primary communication tools for working with state data 
contacts. The Next Steps Work Group also has subgroups based on the 
larger group's interests. Current subgroups are focusing on technical 
skill assessments, data disaggregation, and the consistency of certain 
performance measures. 

Form of assistance: Peer Collaborative Resource Network; 
Description: The Peer Collaborative Resource Network is a resource- and 
information-sharing forum for state CTE professionals. It serves as a 
peer-to-peer forum for states to improve Perkins IV implementation and 
data quality, as well as providing information on other Education grant 
programs and national initiatives related to career and technical 
education. 

Form of assistance: State directors conference calls; 
Description: Education hosts periodic conference calls and Web-based 
seminars with state CTE directors, during which technical assistance 
and guidance are provided. Call topics have included technical skill 
assessments, recaps of national CTE policy meetings, and state 
presentations on past experiences with customized technical assistance. 

Form of assistance: Attendance at national conferences; 
Description: Education officials have conducted workshops and presented 
at a number of national conferences. These include conferences of the 
Association for Career and Technical Education and the National 
Association of State Directors of Career and Technical Education 
Consortium. 

Form of assistance: CTE Research; 
Description: Through the National Center for Research in Career and 
Technical Education, the Department of Education supports research and 
evaluation, development, dissemination, technical assistance and 
training activities, as well as other activities aimed at improving 
career and technical education. 

Source: GAO analysis of Education documents and interviews with 
Education officials. 

[End of table] 

Most states reported that the assistance provided by Education has 
helped them implement the performance measures, but that more 
assistance in the area of technical skill attainment would be helpful. 
In our survey, states responded positively about their Regional 
Accountability Specialist and all of Education's other forms of 
assistance, including the Data Quality Institute and the Next Steps 
Work Group. States also reported that more nonregulatory guidance and 
more individual technical assistance would improve their ability to 
implement the performance measures. Of the states that provided 
additional information on the areas in which they wanted assistance, 4 
of 16 states at the secondary level and 9 of 20 states at the 
postsecondary level said that they wanted assistance on the technical 
skill attainment measure. Specifically, some of the states that 
provided additional information said they would like Education to 
clarify its expectations for this measure, to provide states with a 
library of technical assessments, and to provide state-specific 
assistance with developing low-cost, effective technical assessments. 
States also raised issues regarding the performance measures and their 
state's data collection challenges. For example, one state reported 
that it was unsure how a state should report technical skill attainment 
as a single measure for over 400 distinct CTE programs. 

We found that Education officials were aware of states' need for 
additional assistance and that Education has taken some actions to 
address these needs, particularly in the area of technical assessments. 
For example, through the Next Steps Work Group, Education facilitated a 
technical skill attainment subgroup that is led by state officials and 
a national research organization. The subgroup reviewed the technical 
skill assessment strategies that states reported to Education in their 
Perkins plans and annual reports, for consideration in upcoming guidance. 
Education also collaborated with MPR Associates to conduct a study on 
the feasibility of a national technical assessment clearinghouse and 
test item bank.[Footnote 22] The study, conducted with several CTE 
research organizations and state-level consortia, proposed national 
clearinghouse models for technical assessments. MPR Associates 
concluded that clarifying ownership, such as who is responsible for the 
development and management of the system, and securing start-up funding 
were the two most likely impediments to creating such a system. The 
report was presented to states at the October 2008 Data Quality 
Institute seminar, and Education officials reported that they are 
working with organizations such as the National Association of State 
Directors of Career and Technical Education Consortium and the Council 
of Chief State School Officers to implement next steps. 

In addition to helping states with the technical skill attainment 
measure, Education also has taken actions to improve its 
information-sharing Web site, the Peer Collaborative Resource Network. 
Specifically, a Next Steps Work Group subcommittee surveyed states for 
suggested ways to improve the Web site and reported that states wanted 
to see the information on the site kept more current. The subcommittee 
reported in December 2008 that Education would use the survey results 
to develop a work plan to update the Web site. In May 2009, Education 
officials reported that they had implemented the work plan and were 
piloting the revamped site with selected state CTE directors before the 
department finalizes and formally launches the site. 

Education Relies on the Performance Measures to Gauge the Success of 
State CTE Programs: 

State performance measures are the primary source of data available to 
Education for determining the effectiveness of CTE programs, and 
Education relies on student outcomes reported through these measures to 
gauge the success of states' programs. While Perkins IV requires states 
to evaluate their programs supported with Perkins funds, it only 
requires states to report to Education--through their state plans--how 
they intend to evaluate the effectiveness of their CTE programs. It 
does not require states to report on the findings of their evaluations 
and does not provide any specific guidance on how states should 
evaluate their programs. 

Because only 2 of 11 measures have been implemented and reported on 
thus far, Education has little information to date on program outcomes. 
In program year 2007-2008, Education required states to implement and 
report only the academic attainment and graduation rate measures. 
States are required to provide Education with outcome data for the 
remaining 9 secondary and postsecondary measures in December 2009. 
According to Education's annual report for program year 2007-2008, 43 
states met their targets for the academic attainment in reading/ 
language arts measure, 38 states met their targets for the academic 
attainment in mathematics measure, and 46 states met their targets for 
the graduation rate measure.[Footnote 23] 

We analyzed the state plans of all 50 states and the District of 
Columbia and found that, as required by Perkins IV, states provide a 
description to Education on how they are evaluating their CTE programs. 
[Footnote 24] The type of information that states provided varied. For 
example, some states described the databases they use to capture key 
data and others explained how they use state-developed performance 
measures to evaluate their programs. Perkins IV does not require that 
states include information on what their evaluations may have found in 
terms of the success of a program. In our surveys of state CTE 
directors, nearly half of states (23 at the secondary level and 21 at 
the postsecondary level) responded that, in the past 5 years, they have 
conducted or sponsored a study to examine the effectiveness of their 
CTE programs. Following up on these survey results, we collected seven 
studies that states identified as evaluations of their programs' 
effectiveness. We developed an instrument for evaluating these studies 
and used it to determine the type of evaluation and methodology used in 
each study. We determined that four of the studies were outcome 
evaluations and that the remaining three were not outcome, impact, or 
process evaluations.[Footnote 25] For 
example, one state found in its outcome evaluation that high school 
graduates who completed a CTE program of study entered postsecondary 
institutions directly after high school at the same rate as all 
graduates. 

Concluding Observations: 

Perkins IV provides states with considerable flexibility in how they 
implement the required performance measures and how they evaluate the 
effectiveness of their CTE programs. While this flexibility enables 
states to structure and evaluate their programs in ways that work best 
for them, it may hinder Education's ability to gain a broader 
perspective on the success of state CTE programs. Specifically, 
differences in how states collect data for some performance measures 
may challenge Education's ability to aggregate student outcomes at a 
national level and compare student outcomes on a state-by-state basis. 
Further, Education is limited in what it knows about the effectiveness 
of state CTE programs, beyond what states report through the 
performance measures. Perkins only requires that states report on how 
they are evaluating their programs, and does not provide any guidance 
on how states should evaluate their programs or require that states 
report on the outcomes of their evaluations. Education is working with 
states to help them overcome challenges they face in collecting and 
reporting student outcomes, and over time, states may collect more 
consistent data for measures such as technical skill attainment. As 
states become more adept at implementing the Perkins performance 
measures, they will be better positioned to conduct more rigorous 
evaluations of their CTE programs. However, this information may not be 
reported to Education. If policymakers are interested in obtaining 
information on state evaluations, they will need to weigh the benefits 
of Education obtaining this information against the burden of 
additional reporting requirements. 

Agency Comments: 

We provided a draft of this report and the electronic supplement to the 
Department of Education for review and comment. Education provided 
technical comments on the report, which we incorporated as appropriate. 
Education had no comments on the electronic supplement. 

We are sending copies of this report to appropriate congressional 
committees, the Secretary of Education, and other interested parties. 
In addition, the report will be available at no charge on GAO's Web 
site at [hyperlink, http://www.gao.gov]. 

If you or your staff have any questions about the report, please 
contact me at (202) 512-7215 or scottg@gao.gov. Contact points for our 
Offices of Congressional Relations and Public Affairs may be found on 
the last page of this report. GAO staff who made major contributions 
to this report are listed in appendix II. 

Signed by: 

George A. Scott: 
Director, Education, Workforce, and Income Security Issues: 

[End of section] 

Appendix I: Scope and Methodology: 

Survey of States: 

To obtain national-level information on states' implementation of 
Perkins IV, we designed and administered two Web-based surveys, at the 
secondary and postsecondary levels, to state directors of career and 
technical education (CTE) in the 50 states and the District of 
Columbia. The surveys were conducted between January and April 2009, 
with 100 percent of state CTE directors responding to each survey. The 
surveys included questions about the types of data states collect for 
the student performance measures and challenges they face; the various 
kinds of technical assistance, guidance, and monitoring states received 
from Education; and how states evaluate their CTE programs. The surveys 
and a more complete tabulation of the results can be viewed at 
GAO-09-737SP. 

Because this was not a sample survey, there are no sampling errors. 
However, the practical difficulties of conducting any survey may 
introduce nonsampling errors, such as variations in how respondents 
interpret questions and their willingness to offer accurate responses. 
We took steps to minimize nonsampling errors, including pretesting 
draft survey instruments and using a Web-based administration system. 
Specifically, during survey development, we pretested draft instruments 
with officials in Minnesota, Washington state, and Vermont in December 
2008. We also conducted expert reviews with officials from the National 
Association of State Directors of Career and Technical Education 
Consortium and MPR Associates, who provided comments on the survey. In 
the pretests and expert reviews, we were generally interested in the 
clarity of the questions and the flow and layout of the survey. For 
example, we wanted to ensure that terms used in the surveys were clear 
and known to the respondents, that categories provided in closed-ended 
questions were complete and mutually exclusive, and that the ordering 
of survey sections and of the questions within each section was 
appropriate. On the 
basis of the pretests and expert reviews, the Web instruments underwent 
some revisions. A second step we took to minimize nonsampling errors 
was using Web-based surveys. Because respondents entered their 
responses directly into an electronic instrument, this method 
automatically created a record for each respondent in a data file and 
eliminated the need for, and the errors associated with, a manual data 
entry process. When the survey data were analyzed, a second, 
independent analyst checked all computer programs to further minimize 
error. 

While we did not fully validate all of the information that state 
officials reported through our surveys, we reviewed the survey 
responses overall to determine that they were complete and reasonable. 
We also validated selected pieces of information by corroborating them 
with other sources. For example, we compared selected state responses 
with information submitted to Education in state Perkins 
plans. On the basis of our checks, we believe our survey data are 
sufficiently reliable for the purposes of our work. 

Site Visits: 

To better understand Perkins IV implementation at the state and local 
levels, we conducted site visits to three states--California, 
Minnesota, and Washington state--between September 2008 and February 
2009. In each state we spoke with secondary and postsecondary officials 
at the state level with CTE and Perkins responsibilities. We also 
interviewed officials from local recipients of Perkins funds--that is, 
school districts and postsecondary institutions. Through our interviews 
with state and local officials, we collected information on efforts to 
implement the Perkins performance measures and uses of Perkins funding, 
experiences with Education's monitoring and technical assistance, and 
methods for CTE program evaluation. The states we selected represent 
variation across characteristics such as the type of state agency 
(i.e., state educational agencies or state college and university 
systems) eligible to receive Perkins funds, the amount of Perkins IV 
funds received in fiscal year 2008, and the type of approach used to 
measure student attainment of technical skills. The localities selected 
for site visits provided further variation in geographic location 
(urban versus rural), number of CTE students served, and amount of 
Perkins funding received. 

We conducted this performance audit from August 2008 to July 2009, in 
accordance with generally accepted government auditing standards. Those 
standards require that we plan and perform the audit to obtain 
sufficient, appropriate evidence to provide a reasonable basis for our 
findings and conclusions based on our audit objectives. We believe that 
the evidence obtained provides a reasonable basis for our findings and 
conclusions based on our audit objectives. 

[End of section] 

Appendix II: GAO Contact and Staff Acknowledgments: 

GAO Contact: 

George A. Scott, (202) 512-7215 or scottg@gao.gov: 

Staff Acknowledgments: 

In addition to the contact named above, Elizabeth Morrison (Assistant 
Director), Avani Locke, Robin Nye, Charlotte Gamble, Stephen 
Steigleder, Jessica Orr, Jean McSween, Christine San, and Jessica 
Botsford made key contributions to this report. 

[End of section] 

Footnotes: 

[1] The program year generally operates from July 1 to June 30. 

[2] 20 U.S.C. § 2301 et seq. 

[3] Throughout this report we refer to state agencies as "states." 

[4] Education initially allocates funds to states based on the 
population of three age groups (15 to 19, 20 to 24, and 25 to 65) and 
the state's per capita income. If a state's allocation exceeds the 
amount allocated during fiscal year 2006, a new formula is used to 
allocate funds at the state level. 
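
To illustrate the general shape of a population- and income-based 
allocation of this kind, the sketch below distributes a hypothetical 
appropriation across two hypothetical states in Python. The age-group 
weights, income adjustment, state figures, and appropriation amount are 
illustrative assumptions only; they are not the statutory Perkins IV 
weights or Education's actual calculation. 

# Illustrative sketch of a population- and income-based allocation.
# All weights and figures below are hypothetical, not the Perkins IV formula.

APPROPRIATION = 1_000_000  # hypothetical total to distribute, in dollars

# Hypothetical weights on the three age groups named in the act.
AGE_WEIGHTS = {"15-19": 0.50, "20-24": 0.30, "25-65": 0.20}

# Hypothetical state data: population by age group and per capita income.
STATES = {
    "State A": {"pop": {"15-19": 400_000, "20-24": 350_000, "25-65": 3_000_000},
                "per_capita_income": 45_000},
    "State B": {"pop": {"15-19": 150_000, "20-24": 120_000, "25-65": 1_100_000},
                "per_capita_income": 38_000},
}

def allocation_shares(states, age_weights, national_income):
    """Weight each state's population by age group, scale so that states
    with lower per capita income receive relatively more, and normalize."""
    raw = {}
    for name, data in states.items():
        weighted_pop = sum(age_weights[g] * data["pop"][g] for g in age_weights)
        income_factor = national_income / data["per_capita_income"]
        raw[name] = weighted_pop * income_factor
    total = sum(raw.values())
    return {name: value / total for name, value in raw.items()}

if __name__ == "__main__":
    national_income = 42_000  # hypothetical national per capita income
    shares = allocation_shares(STATES, AGE_WEIGHTS, national_income)
    for name, share in shares.items():
        print(f"{name}: {share:.1%} of appropriation = ${share * APPROPRIATION:,.0f}")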

[5] Administration funds may be used by states to support the 
development of state Perkins plans, the review of local plans, the 
monitoring and evaluation of program effectiveness, compliance with 
federal laws, technical assistance, and the support and development of 
state data systems. State leadership funds must be used by states to 
improve CTE programs through nine different activities, including 
assessing CTE programs and expanding the use of technology in CTE 
programs. 

[6] The Federal Pell Grant Program provides need-based grants to low- 
income undergraduate and certain postbaccalaureate students to promote 
access to postsecondary education. 

[7] Education officials told us that, to date, they have not withheld 
funds from any states. 

[8] Section 318 of the Carl D. Perkins Career and Technical Education 
Act of 2006 provides that the Secretary of Education may issue 
regulations only to the extent necessary to administer and ensure 
compliance with the specific requirements of the act. 

[9] Perkins IV requires each state receiving Perkins funds to submit to 
Education a state plan describing how the state will meet the 
requirements of the act. We refer to these plans as "state Perkins 
plans." States are also required by Perkins IV to submit an annual 
report that Education refers to as a "Consolidated Annual Report" and 
we refer to as an "annual report." 

[10] Section 2 of the Carl D. Perkins Career and Technical Education 
Act of 2006. 

[11] Slight variations exist in the definitions for the technical skill 
attainment and student placement measures at the secondary and 
postsecondary levels. 

[12] Although the collection of GED data is required by Perkins IV and 
included as part of Education's guidance, an Education official said 
that the agency approved all state Perkins plans even though some 
states would not be able to accurately track and report GED attainment 
data. The official said that the lack of GED data will be 
addressed in future state monitoring and auditing visits. 

[13] For example, Cisco computer-based certification exams generally 
range from $80 to $325. Some certifications, however, may cost as much 
as $1,400. 

[14] Each state maintains Unemployment Insurance wage records to 
support the process of providing unemployment compensation to 
unemployed workers. The records are compiled from data submitted to the 
state each quarter by employers and primarily include information on 
the total amount of income earned during that quarter by each of their 
employees. 

[15] An Education official said that the agency provides information to 
states about various potential sources of student placement data, 
including the Wage Record Interchange System. This system, a Department 
of Labor initiative, was developed to facilitate the exchange of wage 
data between participating states for the purpose of assessing and 
reporting on state and local performance for programs authorized under 
the Workforce Investment Act of 1998. 

[16] Under the Minnesota law, the following educational data may be 
shared between the state educational agency and the state office of 
higher education for improvement purposes: attendance data, including 
name of school or institution, school district, year or term of 
attendance, and term type; student demographic and enrollment data; 
academic performance and testing data; and special academic services 
received by a student. However, data may be analyzed or reported only 
in the aggregate. 

[17] In December 2008, states were required to report only on the 
student academic attainment and graduation rate measures as required 
under Title I of the Elementary and Secondary Education Act for the 
2007-2008 program year. 

[18] The Domestic Working Group Grant Accountability Project, Guide to 
Opportunities for Improving Grant Accountability, October 2005. The 
group was composed of representatives from federal, state, and local 
audit organizations and is chaired by the Comptroller General of the 
United States. 

[19] See [hyperlink, http://www.gao.gov/products/GAO-08-486]. 

[20] According to Education's fiscal year 2009 monitoring plan, full 
visits are weeklong, on-site reviews that address compliance in seven 
areas, including accountability, state or program administration, 
fiscal program responsibility, and programs of study. Targeted visits 
are 2-day, on-site reviews that address one or more of the seven areas, 
depending on the issues and needs of the state. An Education official 
told us that it is typical for some states to receive several targeted 
reviews before receiving a full review. Education officials also told 
us that targeted reviews can be used to follow up on a state's progress 
implementing corrective actions following a full monitoring review. 

[21] MPR Associates provides technical assistance to individual states 
as they implement Perkins IV and will have worked with about 20 states 
between 2007 and December 2009. 

[22] According to the MPR Associates' study, a test item bank contains 
questions submitted by various business, industry, and education 
sources and an assessment clearinghouse contains information about 
industry-recognized national assessments that may be adopted or adapted 
for use. 

[23] Student outcome data for these performance measures are collected 
and reported by the local recipients to the state. The state reports 
this information to Education. 

[24] Education's instructions to states mirror the language in Perkins 
IV. The instructions ask states to describe in their state plans how 
the eligible agency will annually evaluate the effectiveness of its 
career and technical education programs, and describe, to the extent 
practicable, how the state is coordinating such programs to ensure 
nonduplication with other federal programs. 

[25] GAO, Performance Measurement and Evaluation: Definitions and 
Relationships, [hyperlink, http://www.gao.gov/products/GAO-05-739SP] 
(Washington, D.C.: May 2005). This product explains three principal 
types of program evaluation. An outcome evaluation assesses the extent 
to which a program achieves its outcome-oriented objectives. It 
focuses on outputs and outcomes (including unintended effects) to judge 
program effectiveness but may also assess program process to understand 
how outcomes are produced. An impact evaluation is a form of outcome 
evaluation that assesses the net effect of a program by comparing 
program outcomes with an estimate of what would have happened in the 
absence of the program. This form of evaluation is employed when 
external factors are known to influence the program's outcomes, in 
order to isolate the program's contribution to achievement of its 
objectives. A process evaluation assesses the extent to which a program 
is operating as it was intended. It typically assesses program 
activities' conformance to statutory and regulatory requirements, 
program design, and professional standards or customer expectations. 

[End of section] 

GAO's Mission: 

The Government Accountability Office, the audit, evaluation and 
investigative arm of Congress, exists to support Congress in meeting 
its constitutional responsibilities and to help improve the performance 
and accountability of the federal government for the American people. 
GAO examines the use of public funds; evaluates federal programs and 
policies; and provides analyses, recommendations, and other assistance 
to help Congress make informed oversight, policy, and funding 
decisions. GAO's commitment to good government is reflected in its core 
values of accountability, integrity, and reliability. 

Obtaining Copies of GAO Reports and Testimony: 

The fastest and easiest way to obtain copies of GAO documents at no 
cost is through GAO's Web site [hyperlink, http://www.gao.gov]. Each 
weekday, GAO posts newly released reports, testimony, and 
correspondence on its Web site. To have GAO e-mail you a list of newly 
posted products every afternoon, go to [hyperlink, http://www.gao.gov] 
and select "E-mail Updates." 

Order by Phone: 

The price of each GAO publication reflects GAO's actual cost of 
production and distribution and depends on the number of pages in the 
publication and whether the publication is printed in color or black 
and white. Pricing and ordering information is posted on GAO's Web 
site, 
[hyperlink, http://www.gao.gov/ordering.htm]. 

Place orders by calling (202) 512-6000, toll free (866) 801-7077, or
TDD (202) 512-2537. 

Orders may be paid for using American Express, Discover Card,
MasterCard, Visa, check, or money order. Call for additional 
information. 

To Report Fraud, Waste, and Abuse in Federal Programs: 

Contact: 

Web site: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm]: 
E-mail: fraudnet@gao.gov: 
Automated answering system: (800) 424-5454 or (202) 512-7470: 

Congressional Relations: 

Ralph Dawn, Managing Director, dawnr@gao.gov: 
(202) 512-4400: 
U.S. Government Accountability Office: 
441 G Street NW, Room 7125: 
Washington, D.C. 20548: 

Public Affairs: 

Chuck Young, Managing Director, youngc1@gao.gov: 
(202) 512-4800: 
U.S. Government Accountability Office: 
441 G Street NW, Room 7149: 
Washington, D.C. 20548: