This is the accessible text file for GAO report number GAO-07-521 entitled 'Vocational Rehabilitation: Improved Information and Practices May Enhance State Agency Earnings Outcomes for SSA Beneficiaries' which was released on May 23, 2007.

This text file was formatted by the U.S. Government Accountability Office (GAO) to be accessible to users with visual impairments, as part of a longer term project to improve GAO products' accessibility. Every attempt has been made to maintain the structural and data integrity of the original printed product. Accessibility features, such as text descriptions of tables, consecutively numbered footnotes placed at the end of the file, and the text of agency comment letters, are provided but may not exactly duplicate the presentation or format of the printed version. The portable document format (PDF) file is an exact electronic replica of the printed version.

We welcome your feedback. Please E-mail your comments regarding the contents or accessibility features of this document to Webmaster@gao.gov.

This is a work of the U.S. government and is not subject to copyright protection in the United States. It may be reproduced and distributed in its entirety without further permission from GAO. Because this work may contain copyrighted images or other material, permission from the copyright holder may be necessary if you wish to reproduce this material separately.
United States Government Accountability Office:
GAO:
May 2007:

Report to Congressional Requesters:

Vocational Rehabilitation: Improved Information and Practices May Enhance State Agency Earnings Outcomes for SSA Beneficiaries:

GAO-07-521:

GAO Highlights:
Highlights of GAO-07-521, a report to congressional requesters:

Why GAO Did This Study:

State vocational rehabilitation (VR) agencies, under the Department of Education (Education), play a crucial role in helping individuals with disabilities prepare for and obtain employment, including individuals receiving disability benefits from the Social Security Administration (SSA). In a prior report (GAO-05-865), GAO found that state VR agencies varied in the rates of employment achieved for SSA beneficiaries. To help understand this variation, this report analyzed SSA and Education data and surveyed state agencies to determine the extent to which (1) agencies varied in earnings outcomes over time; (2) differences in state economic conditions, client demographic traits, and agency strategies could account for agency performance; and (3) Education's data could be used to identify factors that account for differences in individual earnings outcomes.

What GAO Found:

Our analysis of data on state agency outcomes for SSA beneficiaries completing VR found that state agencies varied widely across different outcome measures for the years of our review. For example, from 2001 to 2003 average annual earnings levels among those SSA beneficiaries with earnings during the year after completing VR varied across state agencies from about $1,500 to nearly $17,000.

Figure: Distribution of State Agency Average Annual Earnings for SSA Beneficiaries with Earnings during the Year after VR:

[See PDF for image]

Source: GAO analysis of SSA data.

Note: Earnings are in 2004 dollars.
[End of figure]

After controlling for a range of factors, we found that much of the variation in state agency earnings outcomes could be explained by state economic conditions and the characteristics of the agencies' clientele. Together, state unemployment rates and per capita income levels accounted for roughly one-third of the differences between state agencies in the proportion of SSA beneficiaries who had earnings during the year after VR. The demographic profile of SSA clients being served at an agency--such as the proportion of women beneficiaries--also accounted for some of the variation in earnings outcomes. We also found that, after controlling for other factors, a few agency practices appeared to yield positive earnings results. For example, state agencies with a higher proportion of state-certified counselors had more SSA beneficiaries with earnings during the year after completing VR. However, we were unable to determine what factors might account for differences in earnings outcomes at the individual level. This was due in part to Education's data, which lacked information on important factors that research has linked to work outcomes, such as detailed data on the severity of clients' disabilities. Although Education collects extensive client-level data, some key data are self-reported and not always verified by state agencies.

What GAO Recommends:

GAO recommends that Education promote certain promising practices identified in our analysis, reassess the data it collects on clients, and consider economic factors when measuring state agency performance. Education generally agreed with our recommendations, but disagreed that economic factors should be incorporated into performance measures. It considers these factors during monitoring and believes its approach to be effective. We maintain that these factors are critical to measuring agencies' relative performance.

[Hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-07-521].
To view the full product, including the scope and methodology, click on the link above. For more information, contact Denise Fantone at (202) 512-4997 or fantoned@gao.gov.

[End of section]

Contents:

Letter:
Results in Brief:
Background:
State VR Agencies Consistently Showed Very Different Rates of Success for SSA Beneficiaries Who Completed VR Programs:
State Economic Conditions and SSA Beneficiary Characteristics Account for Much of the Difference in State VR Agency Success Rates:
A Few Agency Practices Appeared to Yield Better Earnings Outcomes, while the Results of Other Practices Were Inconclusive:
Limitations in Education's Data May Have Hampered Analyses of Individual Earnings Outcomes:
Conclusions:
Recommendations for Executive Action:
Agency Comments and Our Evaluation:

Appendix I: Scope and Methodology:
Section 1: Data Used, Information Sources, and Data Reliability:
Section 2: Study Population and Descriptive Analyses:
Section 3: Econometric Analyses:
Section 4: Limitations of our Analyses:

Appendix II: Comments from the Department of Education:
Appendix III: Comments from the Social Security Administration:
Appendix IV: GAO Contacts and Staff Acknowledgments:

Related GAO Products:

Tables:
Table 1: Explanatory Variables from the TRF Subfile:
Table 2: Explanatory Variables from Education's RSA-2 Data:
Table 3: State Economic and Demographic Explanatory Variables and Their Sources:
Table 4: Explanatory Variables from the VR Agency Survey Data:
Table 5: Dependent Variables Used in the Analyses:
Table 6: Coefficients for Multivariate Models Estimating the Effects of State and Agency Characteristics on Three VR Outcomes, and the Proportion of Variance Explained (R-Squared) by Each Model:

Figures:
Figure 1: Distribution of State VR Agencies by Percentage of SSA Beneficiaries with Earnings during the Year after VR:
Figure 2: Distribution of State VR Agency Average Annual Earnings for SSA Beneficiaries with Earnings during the Year after VR:
Figure 3:
Distribution of State VR Agencies by Percentage of SSA Beneficiaries Leaving the Rolls:
Figure 4: Range across State VR Agencies of the Percentage of SSA Beneficiaries with Earnings during the Year after VR by Year:
Figure 5: Range of State VR Agency Average Earnings for SSA Beneficiaries by Year:
Figure 6: Range across State VR Agencies of the Percentage of SSA Beneficiaries with Earnings during the Year after VR by Agency Type:
Figure 7: Range of State VR Agency Average Earnings for SSA Beneficiaries by Agency Type:
Figure 8: Range of State VR Agency Average Rates of SSA Beneficiaries Leaving the Rolls by Agency Type:

Abbreviations:
CPI-U: Consumer Price Index for All Urban Consumers:
CSPD: Comprehensive System of Personnel Development:
DI: Disability Insurance:
GSP: gross state product:
IPE: individual plan of employment:
MEF: Master Earnings File:
OLS: ordinary least squares:
SSA: Social Security Administration:
SSI: Supplemental Security Income:
TRF: Ticket Research File:
VR: vocational rehabilitation:
WIA: Workforce Investment Act:

United States Government Accountability Office:
Washington, DC 20548:

May 23, 2007:

The Honorable Charles B. Rangel:
Chairman:
The Honorable Jim McCrery:
Ranking Minority Member:
Committee on Ways and Means:
House of Representatives:

The Honorable Michael R. McNulty:
Chairman:
The Honorable Sam Johnson:
Ranking Minority Member:
Subcommittee on Social Security:
Committee on Ways and Means:
House of Representatives:

The Honorable Sander M. Levin:
House of Representatives:

State vocational rehabilitation (VR) agencies, under the auspices of the Department of Education (Education), play a crucial role in helping individuals with disabilities prepare for and obtain employment. In fiscal year 2005, state VR agencies received $2.6 billion to provide people with disabilities a variety of supports such as job counseling and placement, diagnosis and treatment of impairments, vocational training, and postsecondary education.
The VR program serves about 1.2 million people each year, and over a quarter of those who complete VR are beneficiaries of the Disability Insurance (DI) or Supplemental Security Income (SSI) programs administered by the Social Security Administration (SSA). This proportion has increased steadily since 2002. As our society ages, the number of SSA disability beneficiaries is expected to grow, along with the cost of providing SSA disability benefits, and it will be increasingly important to manage this growth by optimizing the ability of VR programs to help and encourage SSA beneficiaries to participate in the workforce. In 2005, GAO reported that state VR agencies varied substantially in terms of the employment rates they achieved for their clients, particularly for SSA beneficiaries who, according to research, attain lower employment and earnings outcomes than other VR clients.[Footnote 1] Depending on the state agency, as many as 68 percent and as few as 9 percent of SSA beneficiaries exited VR with employment. In addition, GAO found that Education's management of the VR program was lacking in several respects and recommended that Education revise its performance measures to account for economic differences between states, make better use of incentives for state VR agencies to meet performance goals, and create a means for disseminating best practices among state VR agencies. Education agreed with these recommendations but has yet to implement them. As a follow-up to our 2005 report, you asked us to determine what may account for the wide variations in state VR agency outcomes with respect to SSA beneficiaries. 
Therefore, we examined the extent to which (1) differences in VR agency outcomes for SSA beneficiaries continued over several years and across different outcome measures, (2) differences in VR agency outcomes were explained by state economies and demographic traits of the clientele served, (3) differences in VR agency outcomes were explained by specific policies and strategies of the VR agencies, and (4) Education's data allowed for an analysis of factors that account for differences in individual-level (as opposed to agency-level) outcomes. To perform our work, we used several data sources: (1) a newly available longitudinal dataset that includes administrative data from Education and SSA on SSA beneficiaries who completed the VR program between 2001 and 2003,[Footnote 2] (2) original survey data collected by GAO from 78 of the 80 state VR agencies, (3) data from Education on yearly spending information by service category for each VR agency, and (4) data from the Census Bureau, Bureau of Labor Statistics, and other data sources regarding state demographic and economic characteristics. We conducted reliability assessments of these data and found them to be sufficiently reliable for our analyses. We took several steps to analyze these data. To answer our questions, we analyzed outcomes by state agency using three different earnings outcomes: (1) the percentage of beneficiaries with earnings during the year after VR, (2) the average beneficiary's annual earnings level during the year after VR, and (3) the percentage of beneficiaries that left the disability rolls by the close of 2005.[Footnote 3] For objective one, we conducted descriptive statistical analyses of the data. For objectives two, three, and four, we conducted econometric analyses that controlled for a variety of explanatory factors.[Footnote 4] We also identified and interviewed academic and agency experts in an effort to determine what variables to include in our models. 
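The three earnings outcomes described above can be sketched in code. The record layout and the numbers below are invented for illustration and do not reflect the actual structure of SSA's or Education's administrative files:

```python
# Hypothetical sketch: computing the three agency-level outcome measures
# described above from per-beneficiary records. Field names and data are
# illustrative assumptions, not the actual SSA/Education file layout.

def agency_outcomes(beneficiaries):
    """Summarize one agency-year of SSA beneficiaries who completed VR.

    Each record: {'earnings': annual earnings in the year after VR,
                  'left_rolls': True if off the disability rolls by 2005}.
    Returns (percentage with earnings, average earnings among those with
    earnings, percentage who left the rolls).
    """
    n = len(beneficiaries)
    with_earnings = [b for b in beneficiaries if b['earnings'] > 0]

    # Measure 1: percentage of beneficiaries with any earnings after VR.
    pct_with_earnings = 100.0 * len(with_earnings) / n
    # Measure 2: average annual earnings among those with earnings.
    avg_earnings = (sum(b['earnings'] for b in with_earnings) /
                    len(with_earnings)) if with_earnings else 0.0
    # Measure 3: percentage who left the disability rolls.
    pct_left_rolls = 100.0 * sum(b['left_rolls'] for b in beneficiaries) / n
    return pct_with_earnings, avg_earnings, pct_left_rolls

records = [
    {'earnings': 9000, 'left_rolls': True},
    {'earnings': 0,    'left_rolls': False},
    {'earnings': 5000, 'left_rolls': False},
    {'earnings': 0,    'left_rolls': False},
]
pct, avg, left = agency_outcomes(records)
# For this toy agency-year: 50 percent with earnings, $7,000 average
# earnings among earners, and a 25 percent departure rate.
```

Note that, as in the report, the earnings average is taken only over beneficiaries who had earnings, so the first and second measures move independently of one another.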
As is the case with most statistical analyses, our work was limited by certain factors, such as the unavailability of certain information and the inability to control for unobservable characteristics and those that are not quantifiable. Our results only describe earnings outcomes of SSA beneficiaries included in our study and cannot be generalized beyond that population. We conducted our review from December 2005 through April 2007 in accordance with generally accepted government auditing standards. See appendix I for a more detailed description of our scope and methods.

Results in Brief:

When we analyzed state agency outcomes for SSA beneficiaries who completed VR between 2001 and 2003, we found that differences in agency outcomes continued over several years and across several outcome measures--i.e., rates of beneficiaries with earnings, earnings levels, and departures from the disability rolls. The proportion of beneficiaries with earnings during the year after their completion of the VR program ranged from as low as 0 percent in one state agency to as high as 75 percent in another. Similarly, average annual earnings levels among those SSA beneficiaries with earnings varied across state agencies from $1,500 to nearly $17,000 in the year following VR. Additionally, the proportion of SSA beneficiaries who left the disability rolls varied greatly among agencies, with departure rates ranging from 0 to 20 percent. After controlling for certain economic, demographic, and agency factors, we found that state economic conditions and the characteristics of agencies' clientele accounted for much of the variation in average earnings outcomes across state agencies. Specifically, state unemployment rates and state per capita income levels accounted for a substantial portion--as much as one-third--of the differences between state agencies' VR outcomes for SSA beneficiaries.
For example, significantly fewer SSA beneficiaries had earnings during the year after VR in those states with higher unemployment rates and lower per capita incomes. Despite the significant effect that state economies have on state agency outcomes, Education currently does not consider such factors when analyzing state agency outcomes and assessing their performance. Variations in the demographic profile of SSA client populations also accounted for some of the differences in earnings outcomes among agencies. For example, state VR agencies serving a higher percentage of women beneficiaries had significantly fewer SSA clients with earnings during the year after VR. We also found, after controlling for the same factors, that a few agency practices helped explain differences in state agency outcomes for SSA beneficiaries--and some were associated with positive outcomes. For example, agencies with a higher proportion of state-certified VR counselors--a certification now mandated by Education--had more SSA beneficiaries exiting the VR program with earnings. Further, agencies with closer ties to the business community also achieved higher average annual earnings for SSA beneficiaries and higher rates of departures from the disability rolls. Currently, Education promotes ties to the business community through an employer network. Our findings also show that agencies that received a greater degree of support and cooperation from other public programs or that spent a greater proportion of their service expenditures on training of VR clients had higher average annual earnings for SSA beneficiaries completing VR. We were unable to account for differences in individual beneficiary outcomes, which might further explain differences in state agency outcomes, in part because of limitations in Education's data. 
Our statistical models were able to explain a greater percentage of the differences in earnings outcomes when we analyzed state agency earnings outcomes compared to individual earnings outcomes (i.e., as much as 77 percent compared to 8 percent). With so little variation explained by our analyses of individual-level outcomes, we decided not to report our individual-level analyses. Education's data lack information that we believe is critical to assessing earnings outcomes, and this may have hindered our ability to explain the variation in individual earnings outcomes. Specifically, although Education collects extensive client-level data, it does not systematically collect data that research has linked to work outcomes, such as detailed information on the severity of the client's disability--data that some state agencies independently collect for program purposes. Knowing the severity of a disability can indicate whether a person is physically or mentally limited in his or her ability to perform work, a fact that may influence the person's earnings outcomes. Further, other key data are self-reported and may not be verified by state agencies. We are recommending that Education consider the implications of the results of our analyses in its management of the VR program. Specifically, Education should further promote certain agency practices that we found to have a positive effect on state agency outcomes and reassess the client-level data it collects through its state agencies. We also continue to believe that, as we recommended in our 2005 report, Education should consider economic factors, such as unemployment rates, when evaluating state agency performance. We received written comments on a draft of this report from Education and SSA. While Education generally agreed with the substance of our recommendations, it disagreed on when economic conditions and state demographics should be considered in assessing performance.
Instead of using this information to help set performance measures, the department said that it takes these factors into account when it monitors agency performance results and believes that its approach is more effective. We continue to believe that incorporating this contextual information in assessing performance measures is essential to provide the state agencies with a more accurate picture of their relative performance. Although Education stated that it was open to our recommendation on improving data quality, it said that validating self-reported information would be a potential burden to state agencies and suggested other approaches, such as conducting periodic studies. Our recommendation that Education explore cost-effective ways to validate self-reported data was based on the experience of some VR agencies that have obtained data successfully from official sources and not relied solely on self-reported information. SSA stated that our report has methodological flaws that introduced aggregation bias and false correlations, and suggested that we should have focused on individual-level analysis or reported the results of both individual and aggregate-level analysis. We used aggregated data--a widely used means of analysis--because our primary objective was to understand better the wide variation in outcomes for state VR agencies that serve SSA beneficiaries rather than the outcomes for individuals. We used appropriate statistical techniques to guard against bias and false correlations. Both Education and SSA provided additional comments, which we have addressed or incorporated, as appropriate. Education's and SSA's comments are reprinted in appendixes II and III respectively, along with our detailed responses.

Background:

Challenges Facing the Social Security Disability Program:

In 2005, the Social Security Administration provided income support to more than 10 million working-age people with disabilities.
This income support is provided in the form of monthly cash benefits under two programs administered by the Social Security Administration--the Disability Insurance program and the Supplemental Security Income program. Some individuals, known as concurrent beneficiaries, qualify for both programs. The federal government's cost of providing these benefits was almost $101 billion in 2005. Over the last decade, the number of disability beneficiaries has increased, as has the cost of both the SSI and DI programs. This growth, in part, prompted GAO in 2003 to designate modernizing federal disability programs as a high-risk area--one that requires attention and transformation to ensure that programs function in the most economical, efficient, and effective manner possible. GAO's work found that federal disability programs were not well positioned to provide meaningful and timely support for Americans with disabilities. For example, despite advances in technology and the growing expectations that people with disabilities can and want to work, SSA's disability programs remain grounded in an outmoded approach that equates disability with incapacity to work. In 1999, GAO testified that even relatively small improvements in return-to-work outcomes offer the potential for significant savings in program outlays. GAO estimated that if an additional 1 percent of working age SSA disability beneficiaries were to leave the disability rolls as a result of returning to work, lifetime cash benefits would be reduced by an estimated $3 billion. 
SSA has had a long-standing relationship with Education's VR program, whereby SSA may refer beneficiaries to the VR program for assistance in achieving employment and economic independence.[Footnote 5] As part of this relationship, SSA reimburses VR state agencies for the cost of providing services to beneficiaries who meet SSA's criteria for successful rehabilitation (i.e., earnings at the substantial gainful activity level for a continuous 9-month period). To further motivate beneficiaries to seek VR assistance and expand the network of VR providers, Congress enacted legislation in 1999 that created SSA's Ticket to Work (Ticket) Program.[Footnote 6] Under the Ticket program, beneficiaries receive a document, known as a ticket, which can be used to obtain VR and employment services from an approved provider such as a state VR agency. Thus far, only a small fraction of SSA beneficiaries have used the Ticket program to obtain VR services. Administered by SSA, this program was intended to (1) increase the number of beneficiaries participating in VR by removing disincentives to work, and (2) expand the availability of VR services to include private VR providers. To date private VR providers have not participated heavily in the Ticket program, with over 90 percent of SSA beneficiaries participating in the Ticket program still receiving services from state VR agencies. Despite programs such as Ticket, SSA beneficiaries who wish to participate in the workforce still face multiple challenges. As we have previously reported, some SSA beneficiaries will not be able to return to work because of the severity of their disability.[Footnote 7] But those who do return to work may face other obstacles that potentially deter or prevent them from leaving the disability rolls, such as (1) the need for continued health care, (2) lack of access to assistive technologies that could enhance their work potential, and (3) transportation difficulties. 
Description of Education's Vocational Rehabilitation Program: The Vocational Rehabilitation Program is the primary federal government program helping individuals with disabilities to prepare for and obtain employment. Authorized by Title I of the Rehabilitation Act of 1973, the VR program is administered by the Rehabilitation Services Administration, a division of the Department of Education, in partnership with the states. The Rehabilitation Act contains the general provisions states should follow in providing VR services. Each state and territory designates a single VR agency to administer the VR program--except where state law authorizes a separate agency to administer VR services for blind individuals. Twenty-four states have two separate agencies, one that exclusively serves blind and visually impaired individuals (known as blind agencies) and another that serves individuals who are not blind or visually impaired (known as general agencies). Twenty-six states, the District of Columbia, and five territories have a single combined agency that serves both blind and visually impaired individuals and individuals with other types of impairments (known as combined agencies). In total, there are 80 state VR agencies.[Footnote 8] Although Education provides the majority of the funding for state VR agencies, state agencies have significant latitude in the administration of VR programs. Within the framework of legal requirements, state agencies have adopted different policies and approaches to achieve earnings outcomes for their clients. For example, although all state VR agencies are required to have their VR counselors meet Comprehensive System of Personnel Development (CSPD) standards, states have the ability to define the CSPD certification standard for their VR counselors. 
Specifically, under the CSPD states can establish certification standards for VR counselors based on the degree standards of the highest licensing, certification, or registration requirement in the state, or based on the degree standards of the national certification. For example, if an agency bases its certification standard on the national standard, VR counselors are required to have a master's degree in vocational counseling or another closely related field, hold a certificate indicating they meet the national requirement, or take certain graduate-level courses. Regardless of the individual state's definition of the certification standard, research has shown that VR agencies are concerned about meeting their needs for state-certified counselors because many experienced VR counselors may retire in the coming years, and a limited supply of qualified VR counselors is entering the labor market.[Footnote 9] VR agencies also vary in their locations within state government and their operations. Some are housed in state departments of labor or education, while others are free-standing agencies or commissions. Similarly, while all VR agencies are partners in the state workforce investment system, as mandated in the Workforce Investment Act (WIA) of 1998, VR agencies vary in the degree to which they coordinate with other programs participating in this system.[Footnote 10] For example, some VR agencies have staff colocated at WIA one-stop career centers, while others do not. By law, each of the 80 VR agencies is required to submit specific information to Education regarding individuals who apply for, and are eligible to receive, VR services. Some of the required information includes (1) the types and costs of services the individuals received; (2) demographic factors, such as impairment type, gender, age, race, and ethnicity; and (3) income from work at the time of application to the VR program.
Education also collects additional information such as (1) the weekly earnings and hours worked by employed individuals, (2) public support received,[Footnote 11] (3) whether individuals sustained employment for at least 90 days after receiving services,[Footnote 12] and (4) summary information on agency expenditures in a number of categories from each state VR agency. Education also monitors the performance of state VR agencies, and since 2000, Education has used two standards for evaluating their performance. One assesses the agencies' performance in assisting individuals in obtaining, maintaining, or regaining high-quality employment. The second assesses the agencies' performance in ensuring that individuals from minority backgrounds have equal access to VR services. Education also publishes performance indicators that establish what constitutes minimum compliance with these performance standards. Six performance indicators were published for the employment standard, and one was published for the minority service standard. To have passing performance, state VR agencies must meet or exceed performance targets in four of the six categories for the first standard, and meet or exceed the performance target for the second standard. In 2005, GAO reported that Education could improve performance of this decentralized program through better performance measures and monitoring.[Footnote 13] Specifically, we recommended that Education account for additional factors such as the economies and demographics of the states' populations in its performance measures, or its performance targets, for individual state VR agencies to address these issues. We also noted that whatever system of performance measures Education chooses to use, without consequences or incentives to meet performance standards, state VR agencies will have little reason to achieve the targets Education has set for them. 
We recommended that Education consider developing new consequences for failure to meet required performance targets and incentives for encouraging good performance. While Education agreed with our recommendations, it is currently considering them as part of the development of its VR strategic performance plan, and has not adopted them to date. Earlier this year, GAO reported on national-level earnings outcomes for SSA beneficiaries who completed VR from 2000 to 2003.[Footnote 14] Among other findings, this report estimated that as a result of work, some DI and concurrent beneficiaries saw a reduction in their DI benefits--for an overall annual average benefit reduction of $26.6 million in the year after completing VR compared to the year before VR. Further, we reported that 10 percent of SSA beneficiaries who exited VR in 2000 or 2001 were able to leave the disability rolls at some point. However, almost one quarter of those who left had returned by 2005 for at least 1 month. State VR Agencies Consistently Showed Very Different Rates of Success for SSA Beneficiaries Who Completed VR Programs: Before controlling for factors that might explain differences in outcomes among state VR agencies, our analysis of state agency outcomes over a 3-year period showed very different rates of success for SSA beneficiaries. This was the case in terms of the proportion of beneficiaries with earnings, earnings levels, and departures from the disability rolls. The wide range in average earnings outcomes among agencies was generally consistent from 2001 through 2003 and within each of the three types of agencies--referred to as blind, general, and combined agencies. Proportion with Earnings, Earnings Levels, and Departures from the Disability Rolls for SSA Beneficiaries Differed Substantially among State Agencies: Between 2001 and 2003, VR agencies varied widely in terms of outcomes for SSA beneficiaries who completed their VR programs. 
While, across agencies, an average of 50 percent of SSA beneficiaries had earnings during the year following VR, the proportion varied substantially among agencies: from 0 to 75 percent. (See fig. 1.) Figure 1: Distribution of State VR Agencies by Percentage of SSA Beneficiaries with Earnings during the Year after VR: [See PDF for image] Source: GAO analysis of SSA data. Note: n = 234, average = 50 percent. The 234 observations result from 78 VR agencies providing data for 3 years (2001 through 2003). [End of figure] Similarly, while the average annual earnings level for SSA beneficiaries who had earnings was $8,140, such earnings ranged by agency from about $1,500 to nearly $17,000. (See fig. 2.) Figure 2: Distribution of State VR Agency Average Annual Earnings for SSA Beneficiaries with Earnings during the Year after VR: [See PDF for image] Source: GAO analysis of SSA data. Note: n = 232, average = $8,140. The number in figure 2 differs from that in figure 1 because two agencies did not have any beneficiaries with reported earnings in fiscal year 2002. All earnings are in 2004 dollars. [End of figure] Agencies also differed in the proportion of SSA beneficiaries who had left the disability rolls by 2005, with departure rates ranging from 0 to 20 percent. The average departure rate was 7 percent. (See fig. 3.) Figure 3: Distribution of State VR Agencies by Percentage of SSA Beneficiaries Leaving the Rolls: [See PDF for image] Source: GAO analysis of SSA data. Note: n = 234, average = 7 percent. [End of figure] Trends Were Similar over Time and by Agency Type: In general, the range of earnings outcomes across agencies was similar over the 3 years we examined.
While the average percentage of SSA beneficiaries with earnings during the year after VR declined slightly over this period from 53 percent in 2001 to 48 percent in 2003, the percentage of beneficiaries with earnings remained widely dispersed across agencies for all 3 years, as shown in figure 4. Figure 4: Range across State VR Agencies of the Percentage of SSA Beneficiaries with Earnings during the Year after VR by Year: [See PDF for image] Source: GAO analysis of SSA data. [End of figure] Likewise, the range of average earnings among agencies was similar for all 3 years, as shown in figure 5.[Footnote 15] Figure 5: Range of State VR Agency Average Earnings for SSA Beneficiaries by Year: [See PDF for image] Source: GAO analysis of SSA data. Note: Two agencies did not have any beneficiaries with reported earnings in fiscal year 2002. All earnings are in 2004 dollars. [End of figure] There were also wide differences in performance within the three types of agencies that serve different types of clientele--known as blind, general, and combined agencies. Specifically, among blind agencies, the percentage of SSA beneficiaries with earnings during the year after VR ranged from 23 to 67 percent, with an average of 46 percent. Among general agencies, the percentage of SSA beneficiaries with earnings after VR varied from 37 to 74 percent, with an average of 55 percent, and for combined agencies the percentage varied from 0 to 75 percent, with an average of 49 percent. (See fig. 6.) Figure 6: Range across State VR Agencies of the Percentage of SSA Beneficiaries with Earnings during the Year after VR by Agency Type: [See PDF for image] Source: GAO analysis of SSA data. [End of figure] Average annual SSA client earnings among blind agencies varied the most--from $4,582 to $16,805, with an average of $10,699 per year. SSA client earnings among the combined agencies varied from $1,528 to $10,889, with an average of $7,088 per year.
General agencies showed the least variation in earnings among their SSA clients--from $4,654 to $9,424--but the lowest average ($6,867). (See fig. 7.) Figure 7: Range of State VR Agency Average Earnings for SSA Beneficiaries by Agency Type: [See PDF for image] Source: GAO analysis of SSA data. Note: Two combined agencies did not have any beneficiaries with reported earnings in fiscal year 2002. All earnings are in 2004 dollars. [End of figure] Finally, for rates of departure from the SSA disability rolls by 2005, blind agencies ranged from 0 to 16 percent, with an average of 6.7 percent; general agencies varied from 4 to 15 percent, with an average of 7.5 percent; and combined agencies varied from 0 to 20 percent, with an average of 7 percent. (See fig. 8.) Figure 8: Range of State VR Agency Average Rates of SSA Beneficiaries Leaving the Rolls by Agency Type: [See PDF for image] Source: GAO analysis of SSA data. [End of figure] State Economic Conditions and SSA Beneficiary Characteristics Account for Much of the Difference in State VR Agency Success Rates: After controlling for a range of factors, we found that much of the differences in state VR agency success rates could be explained by state economic climates and the characteristics of the SSA beneficiary populations at the VR agencies. Specifically, among a range of possible factors we considered, the economic conditions of the state appeared to explain up to one-third of the differences between state agency outcomes for SSA beneficiaries.[Footnote 16] Additionally, differences in the characteristics of the clientele accounted for some of the variation in performance among VR agencies. Differences in Agency Outcomes Were Largely Due to a State's Economic Conditions: When we controlled for a variety of factors using multivariate analysis, we found that state economic conditions accounted for a substantial portion of the differences in VR outcomes across state agencies. 
Not surprisingly, we found that fewer SSA beneficiaries had earnings during the year after completing VR in states with high unemployment rates, after controlling for other factors. Moreover, our analysis showed that for each 1 percent increase in the unemployment rate, the percentage of SSA beneficiaries who had earnings during the year after completing VR decreased by over 2 percent.[Footnote 17] Across agencies, unemployment rates ranged from 2.3 to 12.3 percent between 2001 and 2003, with an average of 4.7 percent. We also found that after controlling for other factors, VR agencies in states with lower per capita incomes saw fewer SSA beneficiaries who had earnings, lower earnings levels, and fewer departures from the disability rolls in the year after VR. Across states, per capita incomes ranged from approximately $4,400 to $46,000, with an average of approximately $28,000. Together, state unemployment rates and per capita incomes explained over one-third of the differences between state agencies in the proportion of SSA beneficiaries who had earnings during the year after VR and the proportion that left the rolls.[Footnote 18] Agency officials commented that difficult economic environments result in lower earnings outcomes because a state's economy has a direct impact on an agency's ability to find employment for individuals. Our findings are also consistent with past research that has found labor market conditions to be among the most influential determinants of agency performance.[Footnote 19] Education, however, does not currently consider state economic conditions when evaluating agency performance.[Footnote 20] Although Education agreed with our prior recommendation to consider economic and demographic characteristics when evaluating agency performance, Education is currently considering it as part of the development of its VR strategic performance plan and has not yet adopted this recommendation.
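The unemployment-rate relationship described in this section can be translated into a simple worked prediction. The sketch below is purely illustrative: the slope of -2.2 percentage points and the baseline values are hypothetical numbers chosen to be consistent with the ranges reported above, not GAO's estimated coefficients.

```python
# Hypothetical illustration of how a multivariate-model coefficient of the
# kind described in the text translates into predicted outcomes. The slope
# (-2.2) and the baseline values are assumptions, not GAO's estimates.

COEF_UNEMPLOYMENT = -2.2  # assumed change in "percent with earnings" per
                          # 1-point rise in the state unemployment rate

def predicted_pct_with_earnings(baseline_pct, baseline_unemp, new_unemp):
    """Predict the percentage of SSA beneficiaries with earnings after a
    change in the unemployment rate, holding other factors constant."""
    return baseline_pct + COEF_UNEMPLOYMENT * (new_unemp - baseline_unemp)

# At a hypothetical average agency (50 percent with earnings, 4.7 percent
# unemployment), a 1-point rise in unemployment predicts about a 2-point drop.
print(predicted_pct_with_earnings(50.0, 4.7, 5.7))  # prints approximately 47.8
```

Holding other factors constant, the same function predicts larger drops for larger unemployment increases, which is the "over 2 percent per 1 percent" relationship the analysis describes.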
Demographic Characteristics and the Types of Disabilities of Clientele Also Accounted for Some of the Disparities in State Agency Performance: After controlling for a variety of factors, certain characteristics of the clientele served by state agencies accounted for some of the state agency differences in earnings outcomes for SSA beneficiaries. The factors we examined included demographic characteristics, types of disabilities, and the proportion of SSA beneficiaries served by each state agency.[Footnote 21] Demographic Differences: Several clientele characteristics influenced state agency earnings outcomes.[Footnote 22] In particular, after controlling for other factors, state agencies that served a higher proportion of women beneficiaries had fewer beneficiaries with earnings during the year after completing VR. According to our analysis, a 10 percent increase in the percentage of women served by a VR agency resulted in a 5 percent decrease in the percentage of SSA beneficiaries with earnings. Research on low-income adults with disabilities has found that women have lower employment rates than men.[Footnote 23] Further, we found that after controlling for other factors, state agencies serving a larger percentage of SSA beneficiaries between 46 and 55 years old when they applied for the VR program saw fewer SSA beneficiaries leave the disability rolls.[Footnote 24] For every 10 percent increase in the percentage of beneficiaries in this age group, the percentage of SSA beneficiaries leaving the rolls decreased by approximately 1 percent. Differences in Types of Disabilities: When we considered the influence of various types of medical impairments on earnings outcomes, we found that some state agency outcomes were related to the proportion of SSA beneficiaries who had mental or visual impairments.
Average earnings and departures from the disability rolls for SSA beneficiaries were lower in agencies that served a larger percentage of individuals with mental impairments, after controlling for other factors. Specifically, our analysis indicated that a 10 percent increase in the proportion of the beneficiary population with a mental impairment resulted in a decrease of almost 1 percent in the proportion of SSA beneficiaries who left the rolls. Some SSA beneficiaries may not leave the disability rolls because, as research has shown, they fear a loss of their public benefits or health coverage.[Footnote 25] This is particularly true for individuals with mental impairments. Agencies with a higher proportion of blind or visually impaired beneficiaries had fewer departures from the disability rolls after controlling for other factors. We found that an increase of 10 percent in the proportion of individuals with a visual impairment resulted in a 0.5 percent decrease in the proportion of beneficiaries leaving the rolls. Some SSA beneficiaries with visual impairments are classified as legally blind. As such, they are subject to a higher earnings threshold than those who are not legally blind before their benefits are reduced or ceased. Our analysis also showed that holding other factors equal, blind agencies--those serving only clientele with visual impairments--had fewer SSA beneficiaries with earnings during the year after completing VR than agencies that served a lower proportion of beneficiaries with visual impairments.[Footnote 26] Proportion of SSA Beneficiaries Served: Differences in the proportion of SSA beneficiaries served by an agency also affected earnings outcomes for SSA beneficiaries. Specifically, agencies with a greater proportion of SSA beneficiaries had more beneficiaries with earnings during the year after VR, but saw lower earnings levels for their SSA beneficiaries, holding other factors constant.
VR state agency officials and experts with whom we consulted were unable to provide an explanation for this result.[Footnote 27] We also found that after controlling for other factors, agencies with a higher proportion of SSA beneficiaries who were DI beneficiaries had lower average annual earnings among SSA beneficiaries and a lower percentage of beneficiaries leaving the rolls. The earnings result might be explained by differences in the work incentive rules between the two programs. Specifically, the work incentive rules are more favorable for SSI beneficiaries who want to increase their earnings while not incurring a net income penalty.[Footnote 28] The lower rates of departures from the rolls among agencies with a greater proportion of DI beneficiaries might be due to the limited time frames of our study and the fact that DI beneficiaries are allowed to work for a longer period of time before their benefits are ceased.[Footnote 29] A Few Agency Practices Appeared to Yield Better Earnings Outcomes, while the Results of Other Practices Were Inconclusive: When we analyzed outcomes at the agency level, a few agency practices appeared to yield some positive results, albeit in different ways. Specifically, after controlling for other factors, we found that state agencies with a higher proportion of state-certified VR counselors, or stronger relationships with businesses or other public agencies appeared to have better earnings outcomes. Further, agencies that devoted a greater proportion of their service expenditures to training of VR clients had higher average annual earnings for SSA beneficiaries completing VR, holding other factors equal. On the other hand, our multivariate analyses suggest that agencies using in-house benefits counselors saw fewer beneficiaries with earnings following VR, but these results may not be conclusive because the benefits counseling program has changed considerably since the time period of our study. 
Agencies with State-Certified Counselors or Strong Relationships with Businesses or Other Public Agencies Appeared to Have Better Earnings Outcomes: State VR agencies that reported employing a higher percentage of counselors meeting the state certification standards had a higher percentage of beneficiaries with earnings among those who completed VR between 2001 and 2003, holding other factors constant. On average, 62 percent of counselors at an agency met their state's certification requirements, although the range across agencies was from 0 to 100 percent. According to our analysis, for every 10 percent increase in the percentage of counselors meeting state requirements, the percentage of SSA beneficiaries with earnings during the year after VR increased by 0.5 percent. This appeared to be consistent with research indicating that more highly qualified VR counselors are more likely to achieve successful earnings outcomes.[Footnote 30] While the certification requirements vary by state, agency officials reported that counselors with master's degrees in vocational rehabilitation are more likely to be versed in the history of the VR program and the disability rights movement and are likely to be more attuned to the needs of their clients than those without specialized degrees. VR agencies that had stronger relationships with the business community had higher average earnings among SSA beneficiaries during the year after completing VR and higher rates of departures from the disability rolls, holding other factors equal. These were agencies that reported interacting with the business community more frequently by sponsoring job fairs, hosting breakfasts, attending business network meetings, meeting with local businesses, meeting with local chambers of commerce, and interacting with civic clubs.
To support these practices, Education has helped establish the Vocational Rehabilitation Employer Business and Development Network, which aims to connect the business community to qualified workers with disabilities through the efforts of staff located at each of the VR agencies who specialize in business networking.[Footnote 31] VR agency officials with whom we spoke said that through interaction with the business community, they could dispel myths about the employability of people with disabilities, and they could tailor services for their clients to the specific needs of different businesses. In addition to business outreach, our multivariate analysis indicated that agencies that reported receiving a greater degree of support and cooperation from more than one public program--such as from state social services, mental health, and education departments--also showed higher average earnings among SSA beneficiaries. One VR agency official commented that people with disabilities need multiple supports and services and therefore are more effectively served through partnerships between government programs.[Footnote 32] Another VR official said that coordination with other programs facilitated the provision of a complete package of employment-related services. For example, VR might provide employment training to an individual, while the department of labor might provide transportation services to get the person to work. Although many agencies said they were successful in coordinating with other programs, some reported difficulties. For example, they cited barriers to coordinating with WIA one-stops such as inability to share credit for successful earnings outcomes, staff that are not trained to serve people with disabilities, and inaccessible equipment, particularly for those with visual or hearing impairments. 
Agency Expenditures on Training Yield Positive Outcomes: Additionally, agencies with a greater proportion of their service expenditures spent on training of VR clients--including postsecondary education, job readiness and augmentative skills, and vocational and occupational training--had higher average annual earnings for SSA beneficiaries completing VR, holding other factors equal.[Footnote 33] The average percentage of service expenditures devoted to training of VR clients was 47 percent, but this ranged from 3 to 84 percent across agencies. Research has shown that the receipt of certain types of training services, such as business and vocational training, leads to positive earnings outcomes.[Footnote 34] Effect of Using In-house Benefits Counselors is Unclear: Our analysis suggests that after controlling for other factors, agencies with in-house benefits counselors--counselors who advise VR clients on the impact of employment on their benefits--had lower rates of SSA beneficiaries with earnings during the year after completing VR than agencies without them. Over the years we studied, only 14 percent of state agencies reported using in-house benefits counselors. However, this was a period of transition for the benefits counseling program. There was wide variation in how this service was provided, and clients in states that did not have on-site benefits counselors may have received benefits counseling from outside the agency. According to one researcher, the benefits counseling program has become more standardized since that period. In fact, other empirical research shows that benefits counselors have had a positive effect on earnings.[Footnote 35] VR Officials in Some Agencies Credited Other Practices with Yielding Results: Some agency officials credited certain other practices with yielding positive results, but we were not able to corroborate their ideas with our statistical approach. 
For example, VR agency officials cited the following practices as being beneficial: (1) collaborative initiatives between the state VR agency and other state agencies aimed at helping specific client populations, such as individuals with mental impairments or developmental disabilities; (2) unique applications of performance measures, such as measuring performance at the team level rather than the individual counselor level; and (3) improved use of computer information systems, such as real-time access to the status of individual employment targets. Although we were able to examine many state practices with our survey data, too few agencies employed these practices for us to determine whether they led to improved earnings outcomes for SSA beneficiaries. Limitations in Education's Data May Have Hampered Analyses of Individual Earnings Outcomes: Although we were able to explain a large amount of the differences in earnings outcomes among state agencies, we could explain only a small amount of the differences in earnings outcomes among individual SSA beneficiaries. Specifically, while our models accounted for between 66 and 77 percent of the variation in agency-level earnings outcomes, our models using the individual-level data had low explanatory power, accounting for only 8 percent of variation in earnings levels across individuals and rarely producing reliable predictions for achieving earnings or leaving the rolls. With so little variation explained in individual-level outcomes, we could not be confident that our individual-level analyses were sufficiently reliable to support conclusions. As a result, we chose not to report on these analyses. Other researchers told us they have experienced similar difficulties using Education's client database to account for individual differences in earnings outcomes among VR clients.
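The gap between agency-level and individual-level explanatory power is a general statistical phenomenon: averaging over many individuals washes out person-level noise, so the same underlying factors can explain most of the variation in agency averages while explaining little of the variation across individuals. The sketch below demonstrates this with entirely synthetic data (the agency means, noise levels, and sample sizes are invented for illustration; this is not GAO's model or data):

```python
import random

random.seed(0)

# Synthetic setup: each "agency" has a true mean outcome; each individual's
# outcome adds large person-level noise around that mean.
agencies = {a: random.uniform(20, 80) for a in range(30)}
individuals = [(a, mu + random.gauss(0, 40))
               for a, mu in agencies.items() for _ in range(50)]

def r_squared(pairs):
    """R-squared for a list of (prediction, actual) pairs."""
    actual = [y for _, y in pairs]
    mean_y = sum(actual) / len(actual)
    ss_tot = sum((y - mean_y) ** 2 for y in actual)
    ss_res = sum((y - p) ** 2 for p, y in pairs)
    return 1 - ss_res / ss_tot

# Individual level: predict each person's outcome from the agency's true mean.
indiv_r2 = r_squared([(agencies[a], y) for a, y in individuals])

# Agency level: predict each agency's observed average from its true mean.
obs = {}
for a, y in individuals:
    obs.setdefault(a, []).append(y)
agency_r2 = r_squared([(agencies[a], sum(v) / len(v))
                       for a, v in obs.items()])

# The same predictor explains far more agency-level than individual-level
# variation, because averaging 50 individuals shrinks the noise.
print(round(indiv_r2, 2), round(agency_r2, 2))
```

This does not imply either level of analysis is wrong; it simply shows why high agency-level explanatory power (such as the 66 to 77 percent above) can coexist with low individual-level explanatory power (such as the 8 percent above).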
Education's data lack information that we believe is critical to assessing earnings outcomes, and not having this information may have hindered our ability to explain differences in individual earnings outcomes.[Footnote 36] Specifically, Education does not collect certain information on VR clients that research has linked to work outcomes, such as detailed information on the severity of the disability and historical earnings data. Research indicates that both of these factors are, or could be, important to determining employment success for people with disabilities.[Footnote 37] Knowing the severity of a client's disability can indicate the extent to which the person is physically or mentally limited in the ability to perform work, a fact that may influence earnings outcomes. While Education's client data include information indicating whether a disability is significant--a designation defined by the Rehabilitation Act--the data do not include more detailed information on the severity of the disability, such as the number and extent of functional limitations.[Footnote 38] Additionally, Education does not collect information on a client's historical earnings, which may provide a broader understanding of the client's work experience and likelihood of returning to work. States may be able to obtain earnings data from other official sources, such as other state and federal agencies. Another limitation of Education's data is that they include self-reported client information that may not be validated by the VR agency. For example, one agency official said that clients are asked to report their earnings at the time of application--information that Education is legally required to collect--and that these data may not be accurate. Reliable information on a client's earnings at the time of application to VR is essential for evaluating the impact of the VR program on earnings.
However, some clients may misreport their earnings. One researcher reported, for example, that VR clients sometimes report net as opposed to gross earnings. Instead of relying on self-reported information, agencies may be able to obtain or validate this information from official sources. Specifically, some state VR agencies have agreements with other state and federal agencies to obtain earnings data on their clients. For example, agency officials from one state told us that they match their data against earnings data from the Department of Labor, while another agency relies on data from its state's Employment Development Department. However, in some cases state agencies are required to pay for these data. Conclusions: The federal-state vocational rehabilitation program is still the primary avenue for someone with a disability to prepare for and obtain employment. Given the growing size of the disability rolls and the potential savings associated with moving beneficiaries into the workforce, it is important to make the nation's VR program as effective as possible in helping people with disabilities find employment. Our findings indicate that it will be difficult to maximize the effectiveness of the VR program with assessments of state agency performance that do not account for important factors, such as the economic health of the state; such comparisons will be misleading. Without credible indicators, VR agencies do not have an accurate picture of their relative performance, and Education may remain reluctant to use sanctions or incentives to encourage compliance. Our findings underscore the recommendation that we made in 2005 that Education consider economic factors in assessing the performance of state vocational rehabilitation agencies. Moreover, our study points to deficiencies in Education's data that may hinder more conclusive analyses of individual-level earnings outcomes.
Without data on the severity of a client's disability or information on historical earnings, VR programs may not be able to conduct valuable analyses to explain differences in individual outcomes. With the growing emphasis on the role of VR in helping people with disabilities enter the workforce, the need for such analyses--and data that can be used to conduct them--is likely to increase. Despite the deficiencies in Education's data, our findings show that certain agency practices may improve VR success across the country and give weight to current efforts by Education to promote such practices. The fact that agencies with stronger ties to the business community have achieved higher earnings among their SSA beneficiaries underscores the importance of efforts such as Education's initiative to promote business networks. Our findings also demonstrate the value of having VR counselors meet state certification standards and having agencies collaborate with more than one supportive public agency to help their clients. Our study also suggests that other practices, such as state agencies devoting more resources to targeted training services for VR clients, may be beneficial. Recommendations for Executive Action: To improve the effectiveness of Education's program evaluation efforts and ultimately the management of vocational rehabilitation programs, we recommend that the Secretary of Education: 1. Further promote agency practices that show promise for helping more SSA disability beneficiaries participate in the workforce. Such a strategy should seek to increase: * the percentage of VR staff who meet state standards and certifications established under the CSPD, * partnership or involvement with area business communities, and * collaboration with other agencies that provide complementary services. 2. Reassess Education's collection of VR client data through consultation with outside experts in vocational rehabilitation and the state agencies.
In particular, it should: * consider the importance of data elements that are self-reported by the client and explore cost-effective approaches for verifying these data, and * consider collecting additional data that may be related to work outcomes, such as more detailed data on the severity of the client's disability and past earnings history, collaborating whenever possible with other state and federal agencies to collect this information. 3. In a 2005 report, we recommended that Education revise its performance measures or adjust performance targets for individual state VR agencies to account for additional factors. These include the economic conditions of states, as well as the demographics of a state's population. We continue to believe that Education should adopt this recommendation, especially in light of our findings on the impact of state unemployment rates, per capita incomes, and demographic factors on earnings outcomes. Agency Comments and Our Evaluation: We received written comments on a draft of this report from Education, which oversees the VR program, and SSA, from which we received data that were used to evaluate its Ticket to Work program. Education commended our use of multiple data sources and said that it opens up new analytical possibilities in evaluating how VR programs serve SSA beneficiaries, including identifying low-performing and high-performing VR programs. However, Education also questioned whether the statistical relationships we found can be applied to how it administers a state-operated formula grant program. We continue to believe our findings have important implications for improving what data are collected and how VR services are delivered. While Education generally agreed with the substance of our recommendations, it disagreed on when economic conditions and state demographics should be considered in assessing agency performance.
Instead of using this information to help set performance measures, the department said that it takes these factors into account when it monitors agency performance results and believes that its approach is effective. We believe that incorporating this contextual information into assessing performance is essential to provide the state agencies with a more accurate picture of their relative performance. Although Education stated that it was open to our recommendation on improving data quality, it noted that validating self-reported information could burden state agencies and suggested other approaches, such as conducting periodic studies. Our recommendation that Education explore cost-effective ways to validate self-reported data was based on the experience of some VR agencies that have obtained data successfully from official sources and not relied solely on self-reported information. We made additional technical changes as appropriate based on Education's comments. See appendix II for a full reprinting of Education's comments and our detailed responses. SSA stated that our report has methodological flaws that introduced aggregation bias and false correlations, and suggested that we should have focused on individual-level analysis or reported the results of both individual and aggregate-level analyses. We used aggregated data--a widely used means of analysis--because our primary objective was to better understand the wide variation in outcomes for state VR agencies that serve SSA beneficiaries rather than the outcomes for individuals. Further, we used appropriate statistical techniques to guard against bias due to clustering of individual cases within agencies (see app. I for a more detailed discussion). Because we used aggregated data, we did not attempt to infer the effects of individual behavior or individual outcomes.
Additionally, SSA had concerns about the implications of our analysis of state economic factors on agency-level outcomes. Our findings related to the influence of state economic characteristics were highly statistically significant as well as corroborated by previous research, and we believe these results have important implications for VR agency performance measures. SSA provided additional comments, which we addressed or incorporated, as appropriate. See appendix III for a full reprinting of SSA's comments as well as our detailed responses. Copies of this report are being sent to the Secretary of Education, the Commissioner of SSA, appropriate congressional committees, and other interested parties. The report is also available at no charge on GAO's Web site at http://www.gao.gov. If you have any questions about this report, please contact me at (202) 512-7215. Other major contributors to this report are listed in appendix IV. Signed by: Denise M. Fantone: Acting Director, Education, Workforce, and Income Security Issues: [End of section] Appendix I: Scope and Methodology: To understand the variation in state agency outcomes for Social Security Administration (SSA) disability beneficiaries completing the vocational rehabilitation (VR) program, we conducted two sets of analyses. First, we used descriptive analyses to compare agency performance with three measures of earnings outcomes from 2001 to 2003. Second, using agency and survey data, we conducted econometric analyses of the three measures of earnings outcomes to determine how state and agency characteristics related to state agency performance. We developed our analyses in consultation with GAO methodologists, an expert consultant, and officials from SSA and the Department of Education (Education).[Footnote 39] To choose the appropriate variables for our analyses, we reviewed pertinent literature and consulted with agency officials and academic experts. 
This appendix is organized in four sections: Section 1 describes the data that were used in our analyses and our efforts to ensure data reliability. Section 2 describes the study population, how the dependent variables used in the analyses were constructed, and the descriptive analyses of those variables. Section 3 describes the econometric analyses. Section 4 explains the limitations of our analyses. Section 1: Data Used, Information Sources, and Data Reliability: This section describes each of the datasets we analyzed, the variables from each dataset that were used in our analyses, and the steps that were taken to assess the reliability of each dataset. To conduct our analyses, we used several data sources: (1) a newly available longitudinal dataset that includes information from several SSA and Education administrative databases on all SSA disability beneficiaries who completed the VR program from 2001 through 2003; (2) data from Education on yearly spending information by service category for each state VR agency; (3) data from the Census Bureau, the Bureau of Labor Statistics, and other data sources regarding state demographic and economic characteristics; and (4) original survey data collected by GAO from state VR agencies. To perform our analyses, we used variables from each of the above datasets by merging, by agency and year, each of the datasets into one large data file. Education and SSA Beneficiary Data: We obtained a newly available longitudinal dataset--a subfile of SSA's Ticket Research File (TRF)--which contains information from several SSA and Education administrative databases on all SSA disability beneficiaries who completed the federal-state VR program between 1998 and 2004.[Footnote 40] SSA merged this dataset with its Master Earnings File (MEF), which contains information on each beneficiary's annual earnings from 1990 through 2004. 
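The merge step described above--combining the beneficiary, expenditure, state-characteristic, and survey datasets into one analysis file keyed on agency and year--can be sketched schematically. In this illustration the agency identifier, variable names, and values are all hypothetical; a real implementation would operate on the actual TRF subfile, RSA-2, and survey files.

```python
# Schematic sketch of merging several agency-year datasets into one analysis
# file keyed on (agency, year). All identifiers and values are hypothetical.

beneficiary_outcomes = {("CA-general", 2001): {"pct_with_earnings": 55.0}}
expenditures = {("CA-general", 2001): {"training_share": 0.47}}
state_economy = {("CA-general", 2001): {"unemployment": 5.4}}

def merge_by_agency_year(*datasets):
    """Combine dictionaries keyed on (agency, year), pooling the variables
    from each source into one record per agency-year."""
    merged = {}
    for data in datasets:
        for key, fields in data.items():
            merged.setdefault(key, {}).update(fields)
    return merged

analysis_file = merge_by_agency_year(beneficiary_outcomes, expenditures,
                                     state_economy)
print(analysis_file[("CA-general", 2001)])
# {'pct_with_earnings': 55.0, 'training_share': 0.47, 'unemployment': 5.4}
```

Keying every source on the same (agency, year) pair is what allows outcome measures, expenditure patterns, and state economic conditions to appear as columns of a single record in the econometric analyses.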
The combined data provide information about each beneficiary's disability benefits, earnings, and VR participation.[Footnote 41] See section 2 of this appendix for a description of how these data were used to create our dependent variables on earnings outcomes. We were interested in how earnings outcomes were affected by differences across agencies, including differences in characteristics of the individuals served by the different agencies. Table 1 shows information from the TRF subfile on characteristics of our study population that we included among our explanatory variables.[Footnote 42] Table 1: Explanatory Variables from the TRF Subfile: State agency demographic characteristics: Percentage of beneficiaries between the ages of 18 and 25. State agency demographic characteristics: Percentage of beneficiaries between the ages of 26 and 35. State agency demographic characteristics: Percentage of beneficiaries between the ages of 36 and 45. State agency demographic characteristics: Percentage of beneficiaries between the ages of 46 and 55. State agency demographic characteristics: Percentage of beneficiaries between the ages of 56 and 64. State agency demographic characteristics: Percentage of female beneficiaries. State agency demographic characteristics: Percentage of white beneficiaries. State agency demographic characteristics: Percentage of African-American beneficiaries. State agency demographic characteristics: Percentage of Native-American beneficiaries. State agency demographic characteristics: Percentage of Asian and Pacific Islander beneficiaries. State agency demographic characteristics: Percentage of Hispanic beneficiaries. State agency demographic characteristics: Percentage of multiracial beneficiaries. State agency medical characteristics: Percentage of beneficiaries who are blind or have visual impairments. State agency medical characteristics: Percentage of beneficiaries with sensory impairments. 
State agency medical characteristics: Percentage of beneficiaries with physical impairments. State agency medical characteristics: Percentage of beneficiaries with mental impairments. State agency program participation: Percentage of beneficiaries receiving Supplemental Security Income. State agency program participation: Percentage of beneficiaries receiving Disability Insurance. State agency program participation: Percentage of concurrent beneficiaries (receiving both SSI and DI). State agency program participation: Proportion of SSA beneficiaries served by an agency[A]. Source: SSA and Education data. [A] To construct this variable, additional information was obtained from Education on the total number of clients completing the VR program. [End of table] To determine the reliability of the TRF subfile, we: * reviewed SSA and Education documentation regarding the planning for and construction of the TRF subfile, * conducted our own electronic data testing to assess the accuracy and completeness of the data used in our analyses, and: * reviewed prior GAO reports and consulted with GAO staff knowledgeable about these datasets. On the basis of these steps, we determined that despite the limitations outlined in section 4, the data that were critical to our analyses were sufficiently reliable for our use. VR Agency Administrative Data: To determine whether differences in agency size and expenditure patterns affected earnings outcomes, we obtained information on state VR agency expenditures for the years 2000 through 2002 from the RSA-2 data, an administrative dataset compiled by Education. The RSA-2 data contain aggregated agency expenditures for each of the 80 state VR agencies as reported in various categories, such as administration and different types of services. Table 2 shows the variables that were derived from the RSA-2 data. 
Table 2: Explanatory Variables from Education's RSA-2 Data: Agency structure: Type of agency: (1) general, (2) blind, and (3) combined agencies. Agency structure: Number of people receiving services (proxy for size). Agency structure: Total expenditures on services (proxy for size). Agency expenditures: Percentage of all service expenditures spent on assessment. Agency expenditures: Percentage of all service expenditures spent on diagnosis/treatment. Agency expenditures: Percentage of all service expenditures spent on training services for VR clients. Agency expenditures: Percentage of all service expenditures spent on maintenance. Agency expenditures: Percentage of all service expenditures spent on transportation. Agency expenditures: Percentage of all service expenditures spent on personal assistance services. Agency expenditures: Percentage of all service expenditures spent on placement services. Agency expenditures: Percentage of all service expenditures spent on post employment services. Agency expenditures: Percentage of all service expenditures spent on other services. Agency expenditures: Percentage of total service expenditures (not including assessment, counseling, guidance, and placement) spent on assessment[A]. Agency expenditures: Percentage of total service expenditures (not including assessment, counseling, guidance, and placement) spent on diagnosis/treatment[A]. Agency expenditures: Percentage of total service expenditures (not including assessment, counseling, guidance, and placement) spent on training services for VR clients[A]. Agency expenditures: Percentage of total service expenditures (not including assessment, counseling, guidance, and placement) spent on maintenance[A]. Agency expenditures: Percentage of total service expenditures (not including assessment, counseling, guidance, and placement) spent on transportation[A]. 
Agency expenditures: Percentage of total service expenditures (not including assessment, counseling, guidance, and placement) spent on personal assistance services[A]. Agency expenditures: Percentage of total service expenditures (not including assessment, counseling, guidance, and placement) spent on placement[A]. Agency expenditures: Percentage of total service expenditures (not including assessment, counseling, guidance, and placement) spent on post employment services[A]. Agency expenditures: Percentage of total service expenditures (not including assessment, counseling, guidance, and placement) spent on other services[A]. Agency expenditures: Percentage of total expenditures spent on administration. Agency expenditures: Percentage of total expenditures spent on services provided directly by VR personnel. Agency expenditures: Percentage of total expenditures spent on purchased services. Agency expenditures: Percentage of total expenditures spent on services purchased from public vendors. Agency expenditures: Percentage of total expenditures spent on services purchased from private vendors. Agency expenditures: Percentage of total expenditures spent on services to individuals with disabilities. Agency expenditures: Percentage of total expenditures spent on services to groups with disabilities. Source: Education data. [A] These total expenditures include those optional services that are provided to clients based on their specific needs. They do not include assessment, counseling, guidance, and placement services provided directly by VR personnel since these services are generally provided to all VR clients. [End of table] To determine the reliability of the RSA-2 data, we: * reviewed relevant agency documentation and interviewed agency officials who were knowledgeable about the data, and: * conducted our own electronic data testing to assess the accuracy and completeness of the data used in our analyses. 
On the basis of these steps, we determined that the data that were critical to our analyses were sufficiently reliable for our use. State Economic and Demographic Data: We were interested in how differences in state characteristics affected earnings outcomes of SSA beneficiaries completing VR at different VR agencies. The state characteristics we considered included economic conditions (unemployment rates, per capita income, and gross state product growth rates), population characteristics (including size, density, and the percentages living in rural areas and receiving Disability Insurance), and availability of the Medicaid Buy-In program. Data on state characteristics were downloaded from several sources, including federal agencies and research institutes. The research institutes from which we obtained data included Cornell University Institute for Policy Research and Mathematica Policy Research, Inc., both authorities in social science research. Table 3 summarizes the state data that were collected and the sources for those data. Table 3: State Economic and Demographic Explanatory Variables and Their Sources: Variable: Annual state unemployment rates; Data source: Department of Labor, Bureau of Labor Statistics. Variable: Gross state product (GSP) growth rate; Data source: Department of Commerce, Bureau of Economic Analysis. Variable: Annual per capita income; Data source: Department of Commerce, Bureau of Economic Analysis. Variable: Annual population; Data source: Department of Commerce, Census Bureau. Variable: Population density; Data source: Department of Commerce, Census Bureau. Variable: Percentage of rural population; Data source: Department of Commerce, Census Bureau. Variable: Medicaid Buy-In participation; Data source: Cornell University Institute for Policy Research and Mathematica Policy Research, Inc. (primary sources). Variable: Ticket to Work program implementation; Data source: Mathematica Policy Research, Inc. Source: Various data sources listed in table. 
[End of table] For each of these data sources we reviewed documentation related to the agency's or research organization's efforts to ensure the accuracy and integrity of their data. On the basis of these reviews, we concluded that the data were sufficiently reliable for the purposes of our review. VR Agency Survey Data: We were also interested in how differences in the VR agencies themselves affected earnings outcomes. To obtain information about the policies, practices, and environment of each state VR agency, we conducted a detailed survey of all state agencies. The survey was intended to collect information that may be relevant to explaining earnings outcomes of SSA beneficiaries who exited the VR program between federal fiscal years 2001 through 2003. Specifically, we collected information on the structure of the VR program, staffing and turnover rates, performance measures, service portfolios, and the extent of integration with outside partners such as other state and federal agencies and the business community.[Footnote 43] In developing our survey, we identified relevant areas of inquiry by conducting a review of the literature on state VR agency performance and consulting with state agency officials and outside researchers. For the final survey, we sent e-mail notifications asking state agency officials to complete either a Web-based version of the survey (which was accessible to those with visual impairments) or a Microsoft Word version of the survey by August 4, 2006. We closed the survey on August 22, 2006. We obtained survey responses from 78 of the 80 state VR agencies, for a response rate of 98 percent. Because this was not a sample survey, it has no sampling errors. However, the practical difficulties of conducting any survey may introduce errors, commonly referred to as nonsampling errors. 
For example, difficulties in interpreting a particular question or sources of information available to respondents can introduce unwanted variability into the survey results. We took steps in developing the questionnaire, collecting the data, and analyzing them to minimize such nonsampling error. For example, we pretested the content and format of our survey with officials from 17 state agencies to determine if it was understandable and the information was feasible to collect, and we refined our survey as appropriate. When the data were analyzed, an independent analyst checked all computer programs. Since the data were collected with a Web-based and Word format survey, respondents entered their answers directly into the electronic questionnaire, thereby eliminating the need to key data into a database, minimizing another potential source of error. The variables that we analyzed from the survey data are presented in table 4. These included the structure of the agency (stand-alone agencies, umbrella agencies with and without autonomy over staff and finances, and others), agency staffing, agency management, indicators of the existence of performance targets and incentives, specialized caseloads, case management systems and system components, and integration with outside partners and the business community. Since we had data on each of the earnings outcomes and most of the state and agency characteristics for each of the 3 years, we included in our analysis an indicator for year. Table 4: Explanatory Variables from the VR Agency Survey Data: Agency structure. Agency structure 1--indicates whether agency is (1) part of an umbrella agency with autonomy over its own staff and finances, (2) part of an umbrella agency without autonomy over its own staff and finances, (3) a stand-alone agency, and (4) other type of agency. Agency structure 2--indicates whether agency is part of an umbrella agency. 
Agency structure 3--indicates whether agency is in an umbrella agency that was a part of (1) social services, (2) education, (3) labor, (4) human services, (5) a stand-alone, or (6) other type of agency. Agency staffing. Percentage of service delivery sites staffed full-time[A]. Percentage of service delivery sites staffed part-time[A]. Percentage of service delivery sites shared with social services[A]. Percentage of service delivery sites shared with education[A]. Percentage of service delivery sites shared with labor[A]. Percentage of service delivery sites shared with human services[A]. Percentage of service delivery sites shared with other agencies[A]. Indicates whether the VR program experienced a hiring freeze in a given fiscal year. Indicates whether the VR program experienced a large number of retirements in a given fiscal year. Indicates whether the VR program experienced a large influx of new hires in a given fiscal year. Indicates whether the VR program experienced downsizing through layoffs in a given fiscal year. Indicates whether the VR program experienced unusual changes in staffing in a given fiscal year. Indicates whether VR counselors were affiliated with a union in a given fiscal year. Agency management. Number of clients per VR counselor[A]. Number of counselors employed (proxy for agency size)[A]. Indicates whether the director had authority over developmental disability services. Indicates whether the director had authority over independent living services. Indicates whether the director had authority over disability determination services. Indicates whether the director had authority over other programs or services. Percentage of counselors who left VR agency (turnover)[A]. Percentage of counselors meeting comprehensive system of personnel development (CSPD) standards[A]. Percentage of senior managers who left VR agency (turnover)[A]. Length of time director has held his/her position (director tenure)[A]. 
Length of time director has been with the VR agency (director experience)[A]. Length of time the director has held his/her position as a percent of their time at the agency[A]. Indicates whether the agency operated under an order of selection. Indicates whether the program had a wait list. Length of wait list. Indicates whether the program had a wait list and, if so, its length. Performance targets/incentives. Scale indicating number of reported specific and numerical targets including SSA reimbursements, individual plans for employment (IPE) initiated, client referrals, contacts with businesses, client satisfaction, and other client employment outcomes by year. Indicates whether counselors had performance expectations with numerical targets based on successful VR employment outcomes (status 26 closures). Nature of performance expectations. Indicates whether counselors had numerical targets in their performance expectations. Average number of status 26 case closures required for satisfactory performance[A]. Indicates whether there were performance expectations that contained numerical targets for SSA reimbursements. Indicates whether there were performance expectations that contained numerical targets for the number of IPEs initiated. Indicates whether there were performance expectations that contained numerical targets for the number of client referrals. Indicates whether there were performance expectations that contained numerical targets for the number of contacts made with businesses for job development. Indicates whether there were performance expectations that contained numerical targets for client satisfaction rates. Indicates whether there were performance expectations that contained numerical targets for any other outcomes. Indicates whether there were monetary performance incentives to VR counselors. Indicates how frequently a VR program reported on agencywide performance. Specialized caseloads. Indicates whether there were in-house benefits counselors. 
Number of benefits counselors[A]. Indicates whether there were job development specialists. Number of job development specialists[A]. Scale measuring the number of types of specialized caseloads covered, including transitioning high school students, mental health, developmental disabilities, traumatic brain/spinal cord injuries, hearing impairments, visual impairments (not counted for blind-serving agencies), or other groups. Percentage of counselors with specialized caseloads serving transitioning high school students[A]. Percentage of counselors with specialized caseloads serving clients with mental health issues[A]. Percentage of counselors with specialized caseloads serving clients with developmental disabilities[A]. Percentage of counselors with specialized caseloads serving clients with traumatic brain/spinal cord injuries[A]. Percentage of counselors with specialized caseloads serving clients with hearing impairments[A]. Percentage of counselors with specialized caseloads serving clients with visual impairments[A]. Percentage of counselors with specialized caseloads serving any other group of clients[A]. Case management system. Scale indicating the sophistication of the case management system including the ability of the case management system to collect Education data, collect fiscal data, generate IPEs, generate client letters, produce state-level management reports, and produce counselor- level management reports. Indicates whether an agency used an automated case management system. Indicates whether the automated case management system was new if an agency used one. Indicates whether an agency used an automated case management system and if so, whether the system was new. Indicates whether case management system could collect RSA 911 data. Indicates whether case management system could collect fiscal data. Indicates whether case management system could generate IPEs. Indicates whether case management system could generate client letters. 
Indicates whether case management system could generate state level management reports. Indicates whether case management system could generate reports at VR counselor level. Integration with outside partners. Indicates whether any VR staff worked full-time or part-time at Workforce Investment Act (WIA) one-stops. Total number of staff (both full- and part-time) that worked at a WIA site. Indicates whether VR program purchased any services from public or private vendors. Indicates how many purchased services had fee-for-service arrangements. Indicates how many purchased services had contracts with outcome-based performance measures. Indicates how many purchased services had vendor fees tied to meeting performance measures. Indicates how many purchased services had renewal of their contracts tied to meeting performance measures. Indicates how many purchased services were evaluated by VR to see whether performance measures were met at contract end. Indicates how many purchased services were evaluated by VR by group or type of vendor. Scale indicating the average support level received from different types of programs including WIA one-stops, social service departments, mental health departments, education systems, Medicaid program, Medicare program, substance abuse departments, and developmental disabilities programs. Indicates the extent to which a VR program received support from the state WIA one-stop system. Indicates the extent to which a VR program received support from state social services. Indicates the extent to which a VR program received support from the state mental health department. Indicates the extent to which a VR program received support from the state education system. Indicates the extent to which a VR program received support from the state Medicaid program. Indicates the extent to which a VR program received support from the state Medicare program. 
Indicates the extent to which a VR program received support from the state substance abuse department. Indicates the extent to which a VR program received support from the state developmental disabilities program. Indicates the extent to which a VR program received support from another state program. Integration with business community. Scale indicating agency's level of integration with the business community, including the average frequency with which the agency sponsors job fairs, attends business network meetings, meets with local businesses, meets with chambers of commerce, interacts with civic clubs, and hosts employer breakfasts. Frequency with which agency sponsored job fairs. Frequency with which agency representatives attended job fairs. Frequency with which agency representatives attended meetings of business networks. Frequency with which agency met with local businesses. Frequency with which agency met with local chambers of commerce. Frequency with which agency representatives interacted with civic clubs. Frequency with which agency hosted employer breakfasts. Frequency with which agency representatives participated in other business outreach. Source: GAO survey data. [A] Indicates variables that were categorized. [End of table] To determine whether the survey data were sufficiently reliable for our analysis, we collected and analyzed additional data. Specifically, we included questions in the survey that were designed to determine whether each state VR agency used certain practices to monitor the quality of computer-processed data that were used to complete the survey.[Footnote 44] From these questions, we developed a variable to indicate whether a particular agency might have unreliable data. To determine whether there was a relationship between agencies with data reliability issues and the earnings outcomes we were studying, we included this variable in our three models of earnings outcomes (described below). 
We found two issues associated with the survey data that are related to our findings. First, net of other effects, agencies that reported having a data reliability issue had significantly lower rates of SSA beneficiaries departing the disability rolls.[Footnote 45] Although we suspect that data quality issues do not have a direct effect on the rates of SSA beneficiaries departing the rolls, poor data quality might be correlated with some other characteristic that we were not able to measure (e.g., agency efficiency), which may have an impact on the rate of departures from the rolls. Second, 11 agencies did not report the percentage of CSPD-certified counselors (a variable that we found to be significantly related to the percentage of SSA beneficiaries with earnings during the year after completing VR) for at least 1 year. For these agencies, the percentage of counselors was imputed using the mean derived from agencies that did report. Statistical tests were conducted to ensure that the observations for which data were imputed did not have significantly different rates of having earnings than those for which the data were not missing. Section 2: Study Population and Descriptive Analyses: Study Population: In consultation with SSA officials and contractors as well as Education officials, we selected as our study population working-age individuals who (1) were either receiving Disability Insurance (DI) only, Supplemental Security Income (SSI) only, or both DI and SSI benefits concurrently; and (2) exited VR after having completed VR services.[Footnote 46] To use the most recent data available, we further refined this population to include those beneficiaries who: * Began receiving VR services no earlier than 1995 and who completed VR after having received services in fiscal years 2001 through 2003. * Had received a DI or SSI benefit payment at least once during the 3 months before application for VR services. 
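The inclusion criteria above amount to a straightforward filter on the merged beneficiary file. A minimal sketch in pandas follows; the column names (vr_start_year, vr_exit_fy, benefit_months_pre_app) are hypothetical stand-ins for illustration, not the actual TRF subfile field names:

```python
import pandas as pd

def select_study_population(trf: pd.DataFrame) -> pd.DataFrame:
    """Apply the inclusion criteria described above: began VR services
    no earlier than 1995, completed VR in fiscal years 2001-2003, and
    received a DI or SSI payment at least once in the 3 months before
    VR application."""
    mask = (
        (trf["vr_start_year"] >= 1995)
        & trf["vr_exit_fy"].between(2001, 2003)
        & (trf["benefit_months_pre_app"] >= 1)  # at least 1 of the 3 pre-application months
    )
    return trf.loc[mask]

# Tiny illustrative file: only the second record satisfies all criteria.
df = pd.DataFrame({
    "vr_start_year": [1994, 1999, 2000],
    "vr_exit_fy": [2001, 2002, 2004],
    "benefit_months_pre_app": [3, 2, 1],
})
print(len(select_study_population(df)))
```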
Beneficiaries were defined as concurrent if they received both DI and SSI benefits for at least 1 month in the 3 months before VR application. We selected a 3-month window to account for the fact that many beneficiaries, SSI beneficiaries in particular, fluctuate in their receipt of benefits for any given month. We excluded from our study population those disability beneficiaries who: * Completed VR after 2003, because we lacked at least 1 year of post-VR earnings data. * Applied for or started VR services, but did not complete VR. * Began receiving disability benefits after receiving VR services because these beneficiaries may have differed in certain important characteristics from those receiving benefits before VR participation. * Reached age 65 or died at any point in their VR participation or during the time frame of our study. We excluded the beneficiaries who died or reached age 65 because they would have left the disability rolls for reasons unrelated to employment. For example, beneficiaries who reach age 65 convert to SSA retirement benefits. Computation of Dependent Variables: Using the Ticket Research File (TRF) subfile combined with data from SSA's Master Earnings File (MEF), we computed three measures of earnings outcomes for the 2001 through 2003 exit cohorts for each state VR agency: (1) the percentage of beneficiaries who had earnings during the year after receiving VR services, (2) the average amount they earned,[Footnote 47] and (3) the percentage that left the disability rolls by 2005. The data sources for our three earnings outcomes or dependent variables are shown in table 5. Table 5: Dependent Variables Used in the Analyses: Dependent variable: Percentage of beneficiaries with earnings during the year after VR; Dataset from which variable was derived: MEF. Dependent variable: Average annual earnings for SSA beneficiaries among those with earnings during the year after exiting VR; Dataset from which variable was derived: MEF. 
Dependent variable: Percentage of beneficiaries that left the rolls by 2005; Dataset from which variable was derived: TRF subfile. Source: SSA data. [End of table] To adjust for inflation, all of our earnings figures were computed in 2004 dollars using the Consumer Price Index for All Urban Consumers (CPI-U). The CPI-U, maintained by the Bureau of Labor Statistics, represents changes in prices of all goods and services purchased for consumption by urban households. The CPI-U can be used to adjust for the effects of inflation, so that comparisons can be made from one year to the next using standardized dollars. We standardized the value of average annual earnings to 2004 dollars because this was the most recent year for which earnings data were available at the time of our analysis. Departures from the Disability Rolls: To determine whether disability beneficiaries left the rolls before 2005, we used data from the TRF subfile that indicated the month in which a beneficiary left the rolls because of work. We included all beneficiaries who left the rolls after their VR application date. Concurrent beneficiaries were considered to have left the rolls only if they stopped receiving benefits from both programs. Descriptive Analyses: To depict the variation of agency performance in earnings outcomes of SSA beneficiaries completing VR from 2001 to 2003, we performed two descriptive analyses. First, we developed distributions of each earnings outcome. Second, we computed the means and ranges of these outcomes by year and agency type. With data from 78 agencies over 3 years (from persons who exited the state VR programs from 2001 to 2003), we had 234 cases in our data file.[Footnote 48] Both sets of analyses are presented in the findings section of the report. 
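The CPI-U inflation adjustment described in this section is a simple rescaling: nominal earnings are multiplied by the ratio of the 2004 index to the index for the earnings year. The sketch below illustrates the computation; the index values are illustrative placeholders, not official Bureau of Labor Statistics figures:

```python
# Illustrative CPI-U annual index values (placeholders, not official BLS data).
CPI_U = {2001: 177.1, 2002: 179.9, 2003: 184.0, 2004: 188.9}

def to_2004_dollars(nominal_earnings: float, year: int) -> float:
    """Convert nominal earnings for a given year into 2004 dollars by
    scaling with the ratio of the 2004 CPI-U to that year's CPI-U."""
    return nominal_earnings * CPI_U[2004] / CPI_U[year]

# Earnings from 2001 are inflated upward when expressed in 2004 dollars.
print(round(to_2004_dollars(10_000.0, 2001), 2))
```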
Section 3: Econometric Analyses: To identify key factors related to the earnings outcomes of SSA beneficiaries completing VR programs, we used econometric methods to analyze data from various sources related to VR agencies and the SSA beneficiaries who exited them from 2001 through 2003. Our econometric analyses focused on the differences across agencies for the three different dependent variables: (1) the percentage of beneficiaries who had earnings during the year after leaving VR; (2) among those with earnings, the average beneficiary earnings level during the year after leaving VR; and (3) the percentage of beneficiaries that left the disability rolls as a result of finding work by the end of 2005. We began our econometric analysis with ordinary least squares (OLS) and logistic regression models to analyze differences in outcomes based on individual characteristics. That is, we started with as many observations as there were individuals in our study population, each observation being assigned the characteristics of the agency as well as of the individual. Given that our data were multilevel (i.e., included information on both individuals and agency-level characteristics), we used statistical techniques to assess the feasibility of using ordinary least squares and logistic regression at the individual level rather than hierarchical modeling techniques.[Footnote 49] As a result of these analyses, we chose to use robust standard errors to account for clustering in agencies rather than hierarchical modeling techniques. 
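The clustering adjustment just described can be illustrated with a cluster-robust ("sandwich") covariance estimator for OLS, which allows errors to be correlated within an agency across years. This is a rough numpy sketch of the general technique on synthetic data, not GAO's actual estimation code:

```python
import numpy as np

def ols_cluster_robust(X, y, clusters):
    """Fit OLS and return (coefficients, cluster-robust standard errors).
    X: (n, k) design matrix including an intercept column.
    clusters: length-n array of agency identifiers."""
    XtX_inv = np.linalg.inv(X.T @ X)
    beta = XtX_inv @ X.T @ y
    resid = y - X @ beta
    # "Meat" of the sandwich: sum of per-cluster score outer products,
    # which permits arbitrary error correlation within each agency.
    meat = np.zeros((X.shape[1], X.shape[1]))
    for g in np.unique(clusters):
        score = X[clusters == g].T @ resid[clusters == g]
        meat += np.outer(score, score)
    V = XtX_inv @ meat @ XtX_inv
    return beta, np.sqrt(np.diag(V))

# Synthetic data mimicking the report's setup: 78 agencies x 3 years = 234 cases.
rng = np.random.default_rng(0)
n = 234
clusters = np.repeat(np.arange(78), 3)
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = 1.0 + 2.0 * X[:, 1] + rng.normal(size=n)
beta, se = ols_cluster_robust(X, y, clusters)
```

In practice a statistical package would be used for this; the loop above simply makes explicit what "robust standard errors to account for clustering in agencies" computes.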
However, preliminary analyses using the individual-level data to model binary outcomes and each individual's earnings revealed that regression and logistic models frequently failed statistical tests when compared to a null model with no explanatory variables, and only accounted for a small fraction of the variability in the outcomes of interest to us.[Footnote 50] Because our econometric models using individual-level data explained very little variation in earnings outcomes (i.e., low predictive power), we proceeded to model outcomes at the agency level. Specifically, we combined data on the aggregate characteristics of individuals within agencies (such as the percentage of female beneficiaries or Disability Insurance recipients within an agency) with agency-level data on structure, expenditures, and policies and practices. In other words, rather than assess whether individuals differed in the likelihood of getting a job or leaving the rolls or had different earnings, we analyzed whether the agencies' earnings outcomes varied as a function of the characteristics of the agencies, the aggregate characteristics of beneficiaries within each agency, and the characteristics of the states the agencies were located in.[Footnote 51] Our dependent variables thus contained, for each agency in a given fiscal year, the average earnings level among those with jobs, the percentage at each agency who had earnings during the year after completing VR, and the percentage of those leaving the rolls due to work. As with our descriptive analysis, we had 234 cases in our data file, a number that was fairly small relative to the large number of agency characteristics whose effects we wanted to estimate.[Footnote 52] We could not, as a result, fit models that estimated the effects of all of the characteristics of interest simultaneously to determine which were statistically significant. 
We therefore chose to proceed by first estimating, in a series of bivariate regression models, which state and clientele characteristics (i.e., the case mix, or characteristics of the types of SSA beneficiaries served in each agency) were significant. After obtaining preliminary estimates, we aggregated the sets of significant state and clientele characteristics into single models for each of the three outcomes, and reassessed the significance of their net effects when they were estimated simultaneously in a multivariate regression model.[Footnote 53] We next tested the stability and magnitude of statistically significant coefficients for the state and clientele characteristics under different model specifications, and then introduced the agency characteristics (e.g., structure, management, expenditures) one at a time into these base models containing the significant state and case mix characteristics. After determining which agency characteristics were individually significant, we used an iterative procedure to reassess agency-level effects, testing model stability and which variables were and were not significant when others were included, and retesting the significance of selected state, case mix, and agency characteristics that were marginally significant in prior models.[Footnote 54] In all cases we used robust regression procedures to account for the clustering of cases within agencies (i.e., the lack of independence within agencies over time), and we weighted the cases in our analyses according to either the total number of beneficiaries in each agency in each year (for models of having earnings or leaving the rolls) or the total number of beneficiaries with earnings in each year (for models of average earnings). Ultimately, we obtained the models shown in table 6. Each of the models consisted of 7 to 9 characteristics that jointly accounted for between 66 and 77 percent of the variability in each dependent variable.
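The final estimation step — a weighted regression on agency-year cases with standard errors clustered on agency — can be sketched as follows. The simulated data below mimic the report's 234 cases (78 agencies over 3 exit years), but the coefficients, software choice (statsmodels), and variable names are our illustrative assumptions.

```python
# Hedged sketch of the agency-level weighted regression: cases are
# weighted by the number of beneficiaries per agency-year, and standard
# errors are clustered on agency because the same agency appears in
# multiple years. All data are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
rows = []
for a in range(78):                      # 78 agencies x 3 years = 234 cases
    base = rng.normal(0, 5)              # unobserved agency-level shock
    for y in (2001, 2002, 2003):
        rows.append({
            "agency": a,
            "exit_year": y,
            "unemployment": rng.uniform(3, 8),
            "per_capita_income_10k": rng.uniform(2.5, 4.5),
            "n_beneficiaries": int(rng.integers(50, 500)),
            "base": base,
        })
df = pd.DataFrame(rows)
df["pct_with_earnings"] = (60 - 2.2 * df["unemployment"]
                           + 3.9 * df["per_capita_income_10k"]
                           + df["base"] + rng.normal(0, 3, len(df)))

fit = smf.wls("pct_with_earnings ~ unemployment + per_capita_income_10k",
              data=df, weights=df["n_beneficiaries"]).fit(
    cov_type="cluster", cov_kwds={"groups": df["agency"]})
print(fit.params)
```

For the average-earnings outcome, the same structure would apply with weights equal to the number of beneficiaries with earnings in each agency-year, as the section describes.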
Although certain characteristics were significant in some specifications for each outcome, the limited degrees of freedom allowed us to include only the variables that were most consistently significant and stable across models. In the models that estimated factors affecting the percentage of SSA beneficiaries who had earnings and factors affecting average earnings, state characteristics accounted for a substantial portion of the explained variance. Although state characteristics were also important in the model estimating the percentage getting off the rolls by 2005, the year that beneficiaries exited the agency accounted for the greatest portion of the variance explained, a result reflecting that those who exited VR earlier had more time to leave the rolls. Table 6: Coefficients for Multivariate Models Estimating the Effects of State and Agency Characteristics on Three VR Outcomes, and the Proportion of Variance Explained (R-Squared) by Each Model: Significant explanatory variables for percentage of beneficiaries with earnings during the year after VR (R-squared = 0.66): Unemployment rate; Effect coefficient: -2.22; Robust standard error: .358; P-value: <.001. Per capita income (per $10,000); Effect coefficient: 3.90; Robust standard error: 1.42; P-value: .008. Population size (per 1 million); Effect coefficient: -.40; Robust standard error: .047; P-value: <.001. Percentage of female beneficiaries; Effect coefficient: -.508; Robust standard error: .202; P-value: .014. Combined agency; Effect coefficient: 8.08; Robust standard error: 1.95; P-value: <.001. General agency; Effect coefficient: 10.35; Robust standard error: 2.22; P-value: <.001. Proportion of SSA beneficiaries served; Effect coefficient: .397; Robust standard error: .140; P-value: .006. Percentage of counselors meeting CSPD requirements; Effect coefficient: 5.63; Robust standard error: 2.06; P-value: .008.
In-house benefits counselor; Effect coefficient: -3.61; Robust standard error: 1.251; P-value: .005. Constant; Effect coefficient: 60.19; Robust standard error: 9.89; P-value: <.001. Significant explanatory variables for average earnings among SSA beneficiaries (R-squared = 0.77): Per capita income (per $10,000); Effect coefficient: 1684.54; Robust standard error: 185.25; P-value: <.001. Percentage of beneficiaries with mental impairments; Effect coefficient: -82.16; Robust standard error: 3.61; P-value: <.001. Percentage of beneficiaries on Disability Insurance; Effect coefficient: -64.52; Robust standard error: 8.20; P-value: <.001. Agency integration with business community; Effect coefficient: 727.37; Robust standard error: 350.63; P-value: .041. Degree of support/cooperation with other agencies; Effect coefficient: 858.15; Robust standard error: 249.87; P-value: .001. Percentage of expenditures on training; Effect coefficient: 12.54; Robust standard error: 3.95; P-value: .002. Proportion of SSA beneficiaries served; Effect coefficient: -27.55; Robust standard error: 11.77; P-value: .022. Constant; Effect coefficient: 9059.65; Robust standard error: 824.75; P-value: <.001. Significant explanatory variables for percentage of beneficiaries leaving the disability rolls (R-squared = 0.76): Exit year 2002; Effect coefficient: -1.89; Robust standard error: .170; P-value: <.001. Exit year 2003; Effect coefficient: -4.07; Robust standard error: .194; P-value: <.001. Per capita income (per $10,000); Effect coefficient: 1.98; Robust standard error: .326; P-value: <.001. Population size (per 1 million); Effect coefficient: -.068; Robust standard error: .014; P-value: <.001. Percentage of beneficiaries on Disability Insurance; Effect coefficient: -.040; Robust standard error: .016; P-value: .021. Percentage of beneficiaries 46 to 55 years of age; Effect coefficient: -.078; Robust standard error: .043; P-value: .073.
Percentage of beneficiaries with mental impairments; Effect coefficient: -.084; Robust standard error: .016; P-value: <.001. Percentage of beneficiaries with visual impairments; Effect coefficient: -.045; Robust standard error: .010; P-value: <.001. Agency integration with business community; Effect coefficient: 1.62; Robust standard error: .788; P-value: .044. Constant; Effect coefficient: 11.64; Robust standard error: 1.58; P-value: <.001. Source: GAO analysis of SSA and Education data, GAO survey results, and data from various sources listed in table 3. Note: While earnings are coded in units of dollars, per capita income is coded in units of $10,000 so that the coefficient represents the effect of a $10,000 change in per capita income. Population size is coded in millions of persons. Percentage and proportion variables are coded between 0 and 100; this includes the percentage of beneficiaries with earnings, leaving the rolls, on Disability Insurance, and with mental and visual impairments, as well as the percentage of agency budget spent on training and the percentage of CSPD-certified counselors. Agency integration with the business community and support and cooperation with other agencies are scaled between 0 and 1. [End of table] Taking each outcome one at a time, the coefficients in table 6 suggest the following: * The percentage of SSA beneficiaries who had earnings was significantly lower in more populous states and states with higher unemployment rates, and significantly higher in states with higher per capita incomes. The percentage of SSA beneficiaries who exited VR agencies and had earnings was also significantly lower in agencies in which greater percentages of the beneficiaries were female. Independent of these effects of the state in which the agencies were located and the gender composition of the beneficiaries who exited the agencies, there were also significant net effects of certain agency characteristics.
Agencies that served only blind beneficiaries had lower percentages of SSA beneficiaries who had earnings than combined agencies or general agencies that did not serve the blind. The percentage of SSA beneficiaries who had earnings was also higher in agencies that served a higher proportion of SSA beneficiaries and had a higher percentage of counselors who met CSPD requirements, but lower in agencies that had an in-house benefits counselor. * Among the SSA beneficiaries who had earnings, those in agencies in states with higher per capita incomes had higher average incomes. Net of these effects, agencies that (1) were more integrated with the business community, (2) had a higher degree of support and cooperation from other agencies, and (3) spent more of their total budget on training had higher average annual incomes among SSA beneficiaries completing VR services. Agencies with higher percentages of beneficiaries on Disability Insurance or with mental impairments, and agencies that served a higher proportion of SSA beneficiaries, had lower average annual incomes among SSA beneficiaries completing VR services. * With respect to leaving the disability rolls by 2005, our final model showed that beneficiaries who exited agencies more recently were less likely to leave the rolls by 2005. (See section 4 for an explanation of why this might be the case.) Net of this, agencies in states with larger populations had lower percentages of beneficiaries who left the rolls by 2005, while agencies in states with higher per capita incomes had higher percentages who left the rolls by 2005. The characteristics of clientele served by each agency had a significant effect on the percentage of SSA beneficiaries who got off the rolls by that year. Agencies with higher percentages of beneficiaries who were blind or had mental impairments had lower percentages getting off the rolls by 2005.
Those agencies with a higher proportion of DI beneficiaries had lower percentages that got off the rolls by 2005. The only agency characteristic with consistent statistical significance was integration with the local business community; a greater proportion of beneficiaries in agencies with better integration with businesses left the rolls.[Footnote 55] Agencies with a greater proportion of beneficiaries from 46 through 55 years of age had fewer recipients leaving the rolls, but this effect was only significant at the 90 percent confidence level in our final model. Section 4: Limitations of our Analyses: Our results cannot be generalized to the larger population of all SSA disability beneficiaries or all VR participants because we looked only at VR participants who were SSA beneficiaries. Because VR participation is voluntary, beneficiaries who participate in VR may have certain characteristics that make them different from other SSA beneficiaries and, therefore, more likely or less likely to succeed in the workforce. Because our primary goal was not to conduct an impact evaluation of the VR program, but rather to conduct a comparative analysis of earnings outcomes across state VR agencies to determine what might account for differences in state agency performance, we determined that our analysis did not require a control group of SSA beneficiaries who did not receive VR services. Nonetheless, as a secondary analysis goal, we attempted to identify such a group using the data that were available to us on SSA beneficiaries who had applied for but did not receive VR services. However, we were unable to identify a subset of this group that was sufficiently similar to the VR participants for us to be confident that any differences in earnings outcomes between them and those who completed the VR program would be attributable to the VR program rather than to differences in individual characteristics.
Therefore, our findings do not allow us to report on the overall effectiveness of the VR program. Our earnings data had several limitations that may have resulted in an under- or overestimation of beneficiaries' earnings. For example, although the beneficiary earnings data were provided to SSA by the Internal Revenue Service and are considered to be the most comprehensive and accurate measure of earnings available, they excluded several categories of workers who participated in alternative retirement systems and whose earnings may not have been reported to SSA.[Footnote 56] Such omissions could have resulted in an underestimate of beneficiary earnings. On the other hand, some earnings reported to SSA may have included income derived from work activity in a previous year, such as commissions or bonuses. Further, the earnings data included some forms of nonwork income, such as sick leave earnings and profit sharing. These additional sources of income could not be identified and, therefore, could result in an overestimation of beneficiaries' earnings in a particular year. The data did not allow us to estimate the magnitude of the effect of these factors on our analyses. Our findings that beneficiaries receiving DI and beneficiaries from later cohort years were less likely to leave the rolls are likely due to several factors related to program structure and the updating of data. First, under current program rules, DI beneficiaries are allowed a trial work period (9 months) and an extended period of eligibility (36 months) before they are considered off the rolls.[Footnote 57] SSI beneficiaries who earn enough so that they do not receive a benefit for 12 months are taken off the rolls. Therefore, given that we measured whether beneficiaries left the rolls by 2005, beneficiaries from earlier cohort years would have had more time to leave the rolls.
Further, by 2005 many DI beneficiaries may not yet have entered or completed their extended period of eligibility or reached the point where they would have been considered off the rolls. In addition, delays in the reporting of earnings may also have contributed to our finding that relatively more SSI beneficiaries and beneficiaries from earlier cohort years left the rolls. There can be a significant delay--up to 3 years--between when beneficiaries begin work and when SSA is notified or learns of their earnings. This delay is more likely to occur with DI beneficiaries, whose earnings were reviewed on a yearly basis as compared to monthly earnings reviews for SSI beneficiaries during the time frame of our study. Because of this reporting delay, the TRF subfile data that indicated whether a beneficiary had left the rolls may not have contained completely up-to-date data, especially for later cohorts. For these two reasons, we did not include comparisons of the rates of departures from the disability rolls by exit year because they would be misleading. With respect to earnings after VR, we included all earnings in the calendar year after VR, irrespective of the time gap between VR completion and the first month of earnings. Therefore, the start month for calculating earnings in the year after VR could have ranged from 1 to 12 months after VR, depending on which month the beneficiary exited. In other words, beneficiaries who exited VR in January 2000 would have their 2001 annual earnings calculated beginning in January 2001--12 months after their exit from VR. In contrast, beneficiaries who completed VR in December 2000 would have been out of VR for 1 month when their 2001 annual earnings calculation started in January 2001. We have no indication of clustering in earnings relative to VR completion and therefore expect a fairly even distribution of earnings over time. We do not expect the time lag in the earnings calculation to vary systematically by year or cohort.
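The exit-month arithmetic described above can be captured in a minimal sketch; the function name is ours, purely for illustration of the 1-to-12-month gap.

```python
# Illustrative helper: months between a beneficiary's VR exit month and
# January of the following calendar year, when the annual earnings
# measurement window begins. (Function name is hypothetical.)
def months_until_earnings_window(exit_month: int) -> int:
    """exit_month is 1-12; the window always opens the next January."""
    return 13 - exit_month

# A January exit waits 12 months; a December exit waits only 1 month.
assert months_until_earnings_window(1) == 12
assert months_until_earnings_window(12) == 1
```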
Finally, our analysis of the impact of agency characteristics on earnings outcomes of particular exit cohorts was limited by the time frames of our agency data. Specifically, we used information on the agency that pertained to the year before each exit cohort completed VR to explain the earnings outcomes of that exit cohort (e.g., agency data from 2000 were used to explain 2001 beneficiary outcomes). We did this for two reasons. First, beneficiaries, on average, receive services from VR for approximately 2 years. Therefore, for any given exit cohort, data on the agency from the year prior to exit will cover the most beneficiaries in that exit cohort. Second, although data from previous years might also explain beneficiary outcomes for a given cohort, we did not want to impose an inordinate burden on our survey respondents by collecting data on many years, especially those prior to 2000. In conducting preliminary tests of our survey questions, we also learned that the quality of the data may have been lower in earlier years because some agencies retain data for a limited time period. [End of section] Appendix II: Comments from the Department of Education: Note: GAO's comments supplementing those in the report text appear at the end of this appendix. United States Department Of Education: Office Of Special Education And Rehabilitative Services: The Assistant Secretary: Apr 17 2007: Denise M. Fantone: Acting Director: Education, Workforce and Income Security Issues: United States Government Accountability Office: 441 G Street N.W. Washington, D.C. 20548: Dear Ms. Fantone: Thank you for the opportunity to review the draft report to Congressional Requesters: Vocational Rehabilitation-Improved Information, Performance Measures, and Promotion of Agency Practices May Enhance Earnings Outcomes for State Agencies, GAO-07-521. We are pleased to provide comments from the Department of Education. 
This study was conducted as part of an examination of Social Security Administration (SSA) rehabilitation and was also, to some extent, a follow-up to the findings in GAO's 2005 report on Vocational Rehabilitation (VR) generally. The 2005 report noted that state VR agencies varied substantially in terms of the employment rates they achieved for their clients, particularly for SSA beneficiaries, who, according to research, attain lower employment and earnings outcomes than other VR clients. The purpose of the current study is to examine factors related to the wide variations in state VR agency outcomes with respect to SSA beneficiaries. As noted in the limitations section, the results of the study are applicable only to the experiences and outcomes of SSA beneficiaries who were also VR consumers and cannot be generalized to either the larger population of SSA beneficiaries or VR consumers participating in these programs. Understanding the draft report requires substantial knowledge of the major federal programs that provide services and benefits to individuals with disabilities. While we acknowledge the length constraints of your reports, we believe that it is important to provide readers with more context so that they can better understand the data and findings presented. For example, a very brief summary of the current SSA benefit structures and the specific criteria for a successful rehabilitation would be useful. It is also important for the report to note that SSA's requirements for a successful rehabilitation are considerably more stringent than those used by VR agencies for regular rehabilitations. We believe it is also important to mention that state VR agencies may be reimbursed by SSA for a successful rehabilitation that meets SSA's earnings and duration tests. This would require a description of SSA's Ticket to Work program, which would be appropriate in that data from that program were considered in GAO's work.
Finally, the role of VR agencies as SSA contractors in providing disability determination services should be mentioned because these agencies provide gate-keeping services for SSI and SSDI recipients entering the rolls, and thus these agencies have a unique and early opportunity to examine potential rehabilitants. Possibly much of this contextual information - which we believe is important to understanding - could be provided simply by referencing SSA and related website material. We commend GAO's use of multiple data sources. GAO's combination of data sources including the Ticket to Work Research File, the SSA Master Earnings File, the VR RSA-911 Case Service Report and your survey of state VR agencies opens up entirely new analytic possibilities concerning SSA beneficiaries served through the VR program. The Department is interested in discussing the data, methodology, and findings in more detail after the report is issued to assist with future evaluation and policy development. We are particularly interested in examining specific low-performing and high-performing VR programs identifiable through your work. The draft report is highly technical in approach and analysis. The Department does not take issue with the multivariate analyses GAO has conducted. We do not have access to all of the datasets and work papers we would need for a detailed review. We are not certain, however, that the relationships you describe are significant in the context of the administration of a large federally funded state-operated formula grant program. The Department does not actively manage detailed programmatic decisions made by formula grantees. Measures of statistical significance do not necessarily translate into issues of programmatic significance in a formula grant program. The draft report implies that the earnings and benefits of SSA rehabilitants are related to VR agency characteristics to a greater extent than we believe may actually be the case. 
Realistically, only a small proportion of SSI and SSDI recipients are good candidates for leaving the rolls. Many persons in the SSI population are elderly, have multi-organ systems disease and are in or will be in long-term care. Others have chronic mental illness or significant developmental disabilities and require continuing social supports and medical services programmatically linked to continued SSI eligibility. The draft report contains several recommendations for the Secretary of Education to improve the effectiveness of the Department's program evaluation efforts and ultimately the management of vocational rehabilitation programs. We agree, in part, with these recommendations. The first recommendation is to further promote agency practices that show promise for helping more SSA disability beneficiaries participate in the workforce. The report states that such a strategy should seek to increase: * Use of CSPD and other standards and certifications for VR staff, * Partnership or involvement with area business communities; and: * Collaboration with other agencies that provide complementary services. We will discuss each of these three points in turn. However, we note that there is a very important distinction to be made concerning the wording of this recommendation. "Helping more SSA disability beneficiaries participate in the workforce" is a much less specific and stringent outcome than that of helping people leave the benefit rolls. We believe the latter outcome is the primary concern of GAO's requesters. The draft report's recommendation related to the use of the state's CSPD or "comprehensive system of personnel development" appears to misconstrue the nature of this personnel system requirement. All state VR agencies must have an approved CSPD in place as a condition of funding because the CSPD is part of the state plan for vocational rehabilitation services. 
CSPDs vary among states and are typically keyed to job classifications or employee responsibilities in the state personnel system. There is no statutory requirement that a VR agency employee have a master's degree or a degree in a particular academic field or hold a private, third-party certification. State law may or may not provide for the certification or licensure of vocational rehabilitation counselors. The statement on page 26 of the draft report that "Our findings also demonstrate the value of Education's policies requiring that VR counselors be certified" is misleading. We believe GAO may mean employees meeting higher requirements of the internal CSPD of the state agency employing the counselor. The draft report's reference to the range of counselors meeting CSPD requirements varying among agencies from zero to 100 percent would support this interpretation. In any event, all VR agencies have and use personnel standards. State agency staff need appropriate skills to carry out the duties of their positions. In federally funded state formula grant programs, staffing decisions are made by state officials within the general framework of each state's personnel system. Each state's approved CSPD is an integral part of the VR agency personnel system. Finally, the CSPD is a mechanism for personnel development, rather than a standard. We agree that partnerships or other involvement with area business communities are important. The Department has a variety of national efforts and initiatives under way with the business community in addition to the Employer Business and Development Network you have cited in the draft report. The Rehabilitation Act and the Department's implementation of the VR program emphasize the importance of collaboration with other agencies. However, the draft report appears to suggest that collaboration and cooperation are directly linked to SSA rehabilitations.
This is certainly true in some cases, but conditions and circumstances of collaboration with other agencies providing complementary services vary widely. Overall, collaboration may have the effect of improving outcomes, but some collaborations providing collectively better services to individuals with disabilities may not always support more or better SSA rehabilitations. Individuals with developmental disabilities and serious mental illness are likely to require long-term supports to live in the community. It is quite possible that the more attention a state VR agency focuses on these populations, the fewer departures from the SSA rolls may occur, even if earnings do increase to an extent. Services to these populations are not provided in isolation but are virtually always undertaken in cooperation and collaboration with the other service agencies. These circumstances are a possible, even likely, explanation for the draft report's observation that "agencies with a greater proportion of SSA beneficiaries had more beneficiaries with earnings during the year after VR, but saw lower earnings levels for their SSA beneficiaries" (p. 20 of the draft report). For example, close relationships with state developmental disability (DD) agencies result in better VR services and extended support services for individuals with disabilities who need supported employment. However, many individuals will not leave the benefit rolls because either the DD program or the related Medicaid program provides a variety of social supports the individual needs to live in the community. SSA benefit eligibility is necessary to generate the funds for these community supports. Cooperative or collaborative relationships between the state DD agency and the VR agency can result in the state VR agency serving more individuals who are SSA beneficiaries, and having employment outcomes with increased earnings, but with very few individuals actually leaving the benefit rolls.
The same situation can exist with mental health agencies. The second recommendation of the draft report is for the Department to reassess the collection of VR client data through consultation with outside experts in vocational rehabilitation and the state agencies. In particular, GAO recommends that the Department: * Consider the importance of data elements that are self-reported by the client and explore cost-effective approaches for verifying these data, and: * Consider collecting additional data that may be related to work outcomes, such as more detailed data on the severity of the client's disability and past earnings history, collaborating with other state and federal agencies to collect this information. When GAO conducted its 2005 study on the vocational rehabilitation program, the Department raised and discussed with GAO staff issues about the reliability and validity of data submitted to the Rehabilitation Services Administration by state VR agencies. Although at that time GAO deemed the data quality acceptable, the present draft report questions the validity of RSA's client-level data, apparently because it is self-reported. We do not understand these differing points of view, because the data have not changed. We remain open to improvements in data quality. We are also open to the future possibility of using SSA earnings records to benchmark both historical and post-rehabilitation earnings for VR clients as an alternative to self-reported data. We would be interested in comparisons (using the data GAO has obtained from SSA) of Education's earnings data with SSA earnings data for the same clients. Additionally, a Department-funded rehabilitation research and training center on vocational rehabilitation is commencing work and may help to address some of these data issues. We would point out that many VR agencies maintain internal case management systems containing richer information than is federally reported. 
We are, of course, sensitive to burden issues and will not collect information without a clear and compelling purpose and use. RSA carries out program evaluation through analysis of annual state agency data collections and through targeted studies. It is not always necessary or efficient to require all state agencies to collect data from a secondary source on all clients on an annual basis. If the data collection is excessively burdensome, it may be more efficient to periodically conduct studies using data from secondary sources to validate self-reported data collected on an annual basis and to enhance our understanding of program outcomes. The draft report's third recommendation reiterates the recommendation made in GAO's 2005 report to revise performance measures or adjust individual state VR agencies' performance targets to take into consideration economic conditions and state demographics such as state unemployment rates and state per capita income. The Department agreed to consider the prior recommendation and we continue to do so. We note that a major economic adjustment is incorporated in the VR program's statutory funding formula; the formula allocates relatively more funds to poorer states based on per capita income. These additional funds help offset the lack of other resources in the state and help meet performance expectations and the needs of individuals with disabilities. We have concerns, also, that the profile of all jobs available within a state's economy may not realistically correspond with the types of employment available to individuals seeking assistance from publicly funded programs such as vocational rehabilitation. For example, the geographic availability of employment opportunities may not correspond to the location of individuals in the VR program who are seeking jobs. If such is the case, general statewide employment data might not provide an accurate or consistent benchmark for VR program measurement. 
Local economic conditions are taken into consideration in our discussions with VR agencies in the course of our program monitoring, agency-by-agency. When the Department's Rehabilitation Services Administration (RSA) conducts monitoring reviews of state VR agencies, a number of factors are considered including, but not limited to, economic conditions and the demographics of states' populations. RSA and the state agency analyze the extent to which a range of factors affect the state's performance and how the factors may affect the planned improvements that RSA and the state agencies agree to undertake. RSA believes that this approach to addressing such factors is more effective than revising performance measures or adjusting performance targets to account for factors such as economic conditions and the demographics of states' populations. RSA uses performance measures and targets as the starting point in its discussions with state agencies about how to improve performance. If it were possible to adjust the measures and targets according to a uniform standard, this adjustment would reset the starting point of this dialogue, but it would not contribute to it substantively. We suggest that the title of this draft report, "Vocational Rehabilitation-Improved Information, Performance Measures, and Promotion of Agency Practices May Enhance Earnings Outcomes for State Agencies," be modified to indicate that the sample was limited to SSA beneficiaries. In particular, not mentioning SSA in the title would likely complicate searches for information on the actual subject of the study. Thank you for your continued interest in the operation and efficiency of the Department's programs for individuals with disabilities. As we have mentioned, we would like to schedule follow-up meetings to discuss technical issues after the report is released. Sincerely, Signed by: John H. Hager: The following are GAO's comments on the Department of Education's letter dated April 17, 2007. 
GAO Comments: 1. Education noted that more contextual information would help readers better understand the data and findings in the report. We added additional information to the report about VR reimbursement for successful SSA beneficiary rehabilitations, the role of disability determination services in VR referral, and a reference on where to find more information on the structure of federal disability programs. 2. We disagree with Education's view that, because the VR program is a formula grant program, our measures of statistical significance do not necessarily translate into issues of programmatic significance, or that the agency characteristics we identified as having a significant impact on agency-level performance with respect to SSA beneficiaries may be overstated. Given Education's important leadership role in overseeing the VR agencies, we believe that our findings are relevant to the guidance and information that Education may choose to provide to state VR agencies. While we acknowledge that many SSA disability beneficiaries will not be able to return to work and leave the rolls for a variety of reasons, such as the severity of their disability, we analyzed numerous versions of our models and only reported on the relationships that were consistently significant across many versions of the model. As such, we believe these relationships are valid and deserve careful consideration. 3. Education stated that there is a significant difference between helping more SSA beneficiaries participate in the workforce and helping more leave the disability rolls. While we agree, we believe that participating in the workforce is an important first step and improves SSA beneficiaries' potential for leaving the rolls. 4. We agree with Education that our description of the states' CSPD certifications could be misconstrued. We clarified the CSPD language in the background, findings, and recommendation sections of the report. 5. 
Education stated that overall collaboration between VR agencies and other agencies providing complementary services may improve outcomes, but that some collaboration resulting in better services to individuals with disabilities may not always support more or better rehabilitations for SSA beneficiaries. When we tested the effect of receiving support from specific agencies on SSA beneficiary outcomes, we did not find support from individual agencies to be significant. (See table 4 in app. I under "Integration with Outside Partners" for a list of variables we tested.) However, we found that when these relationships were aggregated, agencies that received a greater degree of support from more than one public agency had significantly higher levels of earnings among SSA beneficiaries. 6. Education suggested that VR agencies with high proportions of SSA beneficiaries may also have high levels of collaboration with other agencies because SSA beneficiaries may require long-term supports to live in the community, which may in turn necessitate cooperation with other public programs. The department noted that this may account for our findings because benefit eligibility may be necessary to receive certain supports from outside agencies. We added a footnote with this potential explanation to the report. 7. Education noted that we questioned the validity of certain self-reported data in this report, but deemed similar data acceptable in our 2005 VR report.[Footnote 58] In relation to our 2005 report, this report references different self-reported data for different purposes. Specifically, this report refers to clients' earnings data at the time of application, whereas our 2005 report used clients' earnings data after exiting VR (which was not used in this report). More importantly, this report used data as part of an econometric model whereas our 2005 report used self-reported data for descriptive purposes. 
While it is always preferable to verify self-reported data, our reliability tests are limited to our intended use of the data, and the data's reliability for that purpose. Education said it was open to our recommendation, but sensitive to the reporting burden on the VR agencies. Our recommendation that Education explore cost-effective ways to validate self-reported data was based on the experience of some VR agencies that have obtained data successfully from official sources and not relied solely on self-reported information. 8. Education disagreed with our recommendation on when economic conditions and state demographics should be considered in assessing agency performance. Instead of using this information to help set performance measures, the department said that it takes these factors into account when it monitors agency performance results and believes that its approach is effective. However, on the basis of the statistical significance of economic factors in our analysis, we believe that incorporating this contextual information in assessing performance measures is essential to provide state agencies with a more accurate picture of their relative performance. Education also stated that the VR program's statutory funding formula allocates relatively more funds to poorer states based on per capita income to offset the lack of other resources in the state. However, if the additional funds allocated to VR agencies located in states with low per capita incomes actually offset the lack of other resources in the state, we would not have found a significant relationship between per capita income and state VR performance. Finally, Education stated the overall state unemployment rate may not entirely correspond to the jobs available to SSA beneficiaries. In our analysis, however, this variable was highly significant in explaining the percentage of SSA beneficiaries with earnings for state VR agencies. 9. 
We agree with Education that the report's title should indicate that our sample was limited to SSA beneficiaries and we modified the title accordingly. [End of section] Appendix III: Comments from the Social Security Administration: Note: GAO's comments supplementing those in the report text appear at the end of this appendix. Social Security: The Commissioner: April 13, 2007: Ms. Denise M. Fantone: Acting Director, Education, Workforce, and Income Security Issues: U.S. Government Accountability Office: 441 G Street, NW: Washington, D.C. 20548: Dear Ms. Fantone: Thank you for the opportunity to review and comment on the draft report, "Vocational Rehabilitation: Improved Information, Performance Measures, and Promotion of Agency Practices May Enhance Earnings Outcomes for State Agencies" (GAO-07-521). While the report contains no recommendations for SSA, we have serious concerns with the content of this report. At the February 8, 2007 exit conference, our analysts and statisticians raised a number of points regarding serious methodological flaws in the data analysis and the accompanying conclusions. Also, many, if not all, of the weaknesses cited in our March 2, 2007 response to the GAO report, "Vocational Rehabilitation: Workforce Participation Increases For Many SSA Beneficiaries after Receiving VR Services, But Most Earnings Were Below Substantial Gainful Activity" (GAO-07-332) are applicable to this report because the same data were used for both reviews. We understand how important it is to make the nation's VR program as effective as possible to help people with disabilities participate in the workforce and truly appreciate your efforts in this area. However, we do not believe the data in either report (GAO-07-332 or GAO-07-521) are reliable enough to serve as the basis for making changes to VR programs at this time. Finally, we agree that additional data would be helpful in providing a more definitive analysis. 
However, we caution that more data may not necessarily provide GAO with greater explanatory power in an individual-level model. The attached comments provide detailed information, and specific examples, to support the rationale for our response. We also provide technical comments intended to enhance the accuracy of the report. If you have any questions, please contact Ms. Candace Skurnik, Director, Audit Management and Liaison Staff, at (410) 965-4636. Sincerely, Signed by: Michael J. Astrue: Enclosure: Comments On The Government Accountability Office (GAO) Draft Report, "Vocational Rehabilitation: Improved Information, Performance Measures, And Promotion Of Agency Practices May Enhance Earnings Outcomes For State Agencies" (GAO-07-521): We appreciate the opportunity to review and comment on the report. We are disappointed and have serious concerns with the content of this report. At the February 8, 2007 exit conference, the Social Security Administration's (SSA) analysts and statisticians raised a number of points regarding serious methodological flaws in the data analysis and the accompanying conclusions. Additionally, many, if not all, of the weaknesses cited in our March 2, 2007 response to the GAO draft report "Vocational Rehabilitation: Workforce Participation Increases For Many SSA Beneficiaries after Receiving VR Services, But Most Earnings Were Below Substantial Gainful Activity" (GAO-07-332) are applicable to this report because the same data were used for both reviews. Those comments are too extensive to repeat here but can be found in their entirety beginning on page 53 of that report. Given the growing size of the disability rolls and the potential savings associated with moving beneficiaries into the workforce, we acknowledge how important it is to make the nation's vocational rehabilitation (VR) program as effective as possible to help people with disabilities participate in the workforce. 
While we appreciate your efforts in conducting these reviews, we do not believe the data in either report (GAO-07-332 or GAO-07-521) are reliable enough to serve as the basis for making changes to the VR programs at this time. We agree that additional data would be helpful in providing a more definitive analysis. However, we caution that more data may not necessarily provide GAO with greater explanatory power in an individual-level model. It is very possible that unobservable factors, or at least factors that are difficult to measure, such as motivation, are the true drivers of employment outcomes among the disabled. The following detailed information and specific examples provide the rationale for our response. Statistical Technique: The most serious methodological flaw with the analysis is the statistical technique that was employed. We do not believe aggregate data should be used to determine the effects of individual behavior. In one example, GAO cites the lack of a measure of severity of the individual's disability as a limitation in the individual data, but the aggregate data model is no better at predicting outcomes as it has no measure of severity either. At the exit conference, we emphasized our concerns about the potential consequences of aggregation bias and false correlations in aggregate data. At that time, we urged GAO to focus on the individual-level analysis, or at least to report both the individual-level and aggregate-level analyses and permit the reader to decide whether the findings are justified. The vast majority of statisticians and econometricians would agree that there are serious limitations to aggregate data analysis and that the aggregate results are not adequate as measures of individual behavior or outcomes. While some researchers would argue that at times we may have to rely on aggregate data analysis when there are no individual-level data, this analysis is only a preliminary step toward understanding the potential micro-level relationships. 
A half century of peer-reviewed research shows that, as a general rule, aggregate data is not a meaningful measure of individual behavior. In a seminal article on the subject in 1950, Robinson[Footnote 59] argued that one cannot use aggregate data as a substitute for individual-level data, stating, "While it is theoretically possible for the two to be equal, the conditions under which this can happen are far removed from those ordinarily encountered in data. From a practical standpoint, therefore, the only reasonable assumption is that an ecological correlation is almost certainly not equal to its corresponding individual correlation." In his research, Robinson demonstrated that aggregating individual data can actually yield results that are the opposite of those found in the individual data. More recently, Stoker[Footnote 60] reviewed the literature on aggregation over individuals and noted that, "The problem is that for any equation connecting aggregates, there are a plethora of behaviorally different 'stories' that could generate the equation, which are observationally equivalent from the vantage point of aggregate data alone. If one invents a paradigm that is not consistent with individual data, or based on fictitious coordination between agents, then the results of estimating an aggregate equation based on that paradigm are not well founded, and are not to be taken seriously." This research strongly suggests that the results generated by the GAO should not be used to target programs and policies for individuals. In summary, the fact that GAO's individual-level analysis did not yield the results it expected, or results that supported the benefits of VR, does not mean that the model was flawed. In fact, it could very well be that the proper model was used and the reality is that there are few, or no, measurable factors that relate to observed outcomes. 
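The aggregation reversal Robinson demonstrated can be reproduced in a few lines. The sketch below uses synthetic numbers (illustrative only, not drawn from GAO's or SSA's data): within every hypothetical group the individual-level relationship between the two variables is perfectly negative, yet the "ecological" correlation computed on the group averages is perfectly positive.

```python
# Minimal sketch of Robinson's ecological-fallacy reversal, using
# synthetic data: within-group (individual-level) correlations are
# negative, while the correlation of group averages is positive.
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Three hypothetical "agencies"; inside each, y falls as x rises.
groups = [
    ([1, 2, 3],    [5, 4, 3]),
    ([11, 12, 13], [15, 14, 13]),
    ([21, 22, 23], [25, 24, 23]),
]

within = [pearson(xs, ys) for xs, ys in groups]   # each is -1.0
agg_x = [mean(xs) for xs, _ in groups]            # group averages
agg_y = [mean(ys) for _, ys in groups]
ecological = pearson(agg_x, agg_y)                # +1.0

print(within, ecological)
```

The sign flip occurs because between-group variation dominates the group averages, which is precisely why conclusions about individual behavior cannot be read directly off aggregate correlations.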
Factors not readily measurable, such as individual motivation, may be the driving force in obtaining successful employment outcomes. Differences among the States: When GAO compares State VR agencies in terms of their success in finding employment, raising earnings, and moving beneficiaries off the benefit rolls, a major assumption is being made about how disability beneficiaries are referred to VR. If a State has a rule that allows a large percentage of beneficiaries to be referred, that State may end up with poor outcome statistics in terms of employment, earnings, and ending benefits. On the other hand, a State that targets beneficiaries narrowly so that only the most likely candidates for successful employment are referred may have fewer beneficiaries rehabilitated but a better success rate. The GAO analysis should recognize that such State-to-State differences in how candidates are referred to VR may affect success rates. Ticket to Work: The time period under study (2001 through 2003) was a period of transition in the VR program as the Ticket to Work (TTW) was implemented in phases, with groups of States being added to the program over this time frame. The analysis does not appear to have controlled for the presence of the TTW program in each State. The inclusion of non-State VR providers and changes in reimbursement methods contributed to variations in State outcomes over the period under study. We believe the exclusion of variables to account for the rollout of the TTW program represents a serious misspecification of the model estimated by GAO. State Economic Factors: GAO reports the two measures having the greatest impact on outcomes were the unemployment rate and level of per-capita income in the State. If one were to accept these relationships, GAO's analysis would suggest that during periods of higher unemployment less funding should be allocated towards VR services. 
In addition, it suggests more funding should be directed to States with higher per-capita income, such as California and New York, and less funding to low-income States such as Mississippi and Louisiana (at least to the point that the cost-benefit ratio of VR expenditures balances across States). GAO suggests, on page 18, that the significance of unemployment rates and per-capita income has some connection with better labor market opportunities and hence better outcomes. Alternatively, one might suggest that States with higher per-capita income (and/or lower unemployment) have greater tax revenues and are able to spend more on VR services, and thus have better outcomes. Ultimately, neither GAO nor SSA understands what these State-level variables measure or how they contribute to the observed individual outcomes, and thus no implications or recommendations for increasing the effectiveness of VR for individuals can be offered. State Certified VR Counselors: GAO provides information on several variables that, at least conceptually, appear to be directly policy-relevant and that were found to be statistically significant in increasing positive outcomes, including the proportion of State-certified VR counselors and stronger business sector ties. However, the limitations of aggregate data preclude strong conclusions with respect to the benefits of working with certified counselors, as the micro data do not indicate that the individuals who had successful outcomes actually received any services from certified counselors. While this finding may be consistent with prior research, the statement on page 21 that "this appeared to corroborate research" is too strong. If past research found that certified counselors are more successful in getting beneficiaries back to work, GAO should cite that research and make recommendations based on those findings, not on this analysis. 
In-House Benefit Counselors: GAO reports that the presence of in-house benefit counselors was found to diminish favorable return-to-employment outcomes. It further suggests that this actually runs counter to other research that shows benefit counselors have a positive impact on earnings. This conflicting information is not helpful to policymakers, and GAO needs to reconcile the value of its analytical result with that of prior research. If the prior research had a strong, supportable research design, this only serves to strengthen the concerns about the appropriateness of the current study's methodological approach and diminishes the value of GAO's findings. Demographic Differences: On page 19, a finding was made regarding lower employment rates among women than men. We would recommend that future studies also examine the concept of primary caregiver for children in the household. When minor children reside in a household, the role of primary caregiver generally rests with women. Lack of daycare or other factors related to this caregiver role could be a considerable contributing factor to the differences in employment rates by gender. Legally Blind: On page 20, first full paragraph, the report states "Higher earnings thresholds for the legally blind might reduce their incentive to leave the rolls." We believe the statement should read, "Higher earnings limits make it less likely that blind individuals will leave disability rolls." The higher earnings limits coupled with additional incentives utilized by the blind (e.g., Impairment-Related Work Expenses, Blind Work Expenses, and Un-incurred Business Expenses) provide these individuals with the opportunity to earn more than other categories of disabled workers. Earnings Levels for DI and SSI: The last paragraph on page 20 includes a discussion on the examination of the difference in earnings levels between Supplemental Security Income (SSI) and Social Security Disability Insurance (DI) beneficiaries. 
While we do not dispute the finding that SSI beneficiaries have lower average annual earnings, we believe that a contributing factor to this trend that was not taken into account was past work history. As a general rule, the reason a person receives SSI rather than DI benefits is that he or she lacks the work history needed to earn quarters of coverage. This lack of recent employment would be more likely to lead SSI beneficiaries into entry-level jobs than their DI counterparts, who have already established an employment history. Generalizations: We felt that there were generalizations, on pages 19 and 20, which should not be used given the lack of explanatory power of the individual data on page 24. Examples of those are: * Clientele characteristics such as higher numbers of women beneficiaries served resulted in lower employment outcomes, * A higher number of SSA beneficiaries between 46 and 55 years old resulted in decreased employment outcomes, * Serving a larger percentage of individuals with mental impairments decreased the proportion of SSA beneficiaries leaving the rolls, * A higher proportion of blind or visually impaired beneficiaries had fewer departures from the disability rolls, and: * Agencies serving a higher proportion of SSDI beneficiaries had lower average annual earnings among SSA beneficiaries and a lower percentage of beneficiaries leaving the rolls. Technical Comment: Several tables have vertical axes labeled "percentage," but the scale is in decimals such as .20 (see pages: highlights page, 10, 11, and 12). Throughout the report GAO incorrectly refers to beneficiaries who have "left the disability rolls" due to work. They are actually referring to beneficiaries who are in suspense or termination status (receiving a zero cash benefit) due to work. In most cases, given the timeframe of the study, most of the beneficiaries in this status would be on the SSA rolls, but in cash benefit suspense. 
In footnote 3 on page 3, GAO notes what it means by "left the disability rolls." This term is so different from what it represents that it is likely to lead to misinterpretation by policy makers and other readers. In figures 4-8, pages 13-17, ranges are presented for VR agencies for percent of beneficiaries with any earnings, earnings amounts, and those "leaving the rolls," and the variation in these ranges is then compared. These ranges are a poor way to present variation; simply presenting the mean and range of values can be misleading because it gives equal weight to both common and outlier values. This is why we usually present variation with medians and in terms of standard deviations from the mean; doing so provides a more accurate representation of the central tendency and spread. The following are GAO's comments on the Social Security Administration's letter dated April 13, 2007. GAO Comments: 1. We disagree with SSA that many of the comments provided on our previous report (GAO-07-332) apply to this report because the methods and data we used differed significantly from our earlier report. Prior to submitting this report for agency comment, we carefully reviewed and incorporated any comments from the earlier report that were relevant. 2. SSA stated that the report has methodological flaws that introduced aggregation bias and false correlations. It suggested that we should have focused on individual-level analysis or reported the results of both the individual- and aggregate-level analyses. We disagree, as the primary goal of our analysis was to analyze agency-level outcomes, not individual-level outcomes. Specifically, our objective was to understand what "may account for the wide variation in state VR agency outcomes with respect to SSA beneficiaries." In doing so, we used aggregated data, which is a widely used and, at times, necessary means of analysis throughout the social sciences. 
Because we used aggregated data, we did not attempt to infer the effects of individual behavior or individual outcomes and noted such in our report. For example, we did not find that a lower percentage of women beneficiaries had earnings relative to male beneficiaries. Rather, we found that agencies serving a higher proportion of women beneficiaries had lower percentages of beneficiaries with earnings relative to other agencies. We did not report the results from the individual-level analyses, as recommended by SSA, because we did not find them sufficiently reliable as a basis for findings. Specifically, we did not find the individual-level results to be reliable, as we were not able to control for some factors at the individual level--for example, severity of disability--that were crucial to an individual-level analysis, but not crucial to analyses at the aggregated level. Although we chose not to report individual-level results, they were, in fact, consistent with the results of our aggregate analyses. We conducted statistical tests prior to our agency-level analyses to ensure that our aggregate analyses were not biased by a failure to account for certain types of correlations between individuals within agencies. Our tests did not reveal such correlations. In the absence of such correlations, several respected authorities agree that aggregate-level analyses that incorporate aggregated individual-level characteristics will not result in biased estimates.[Footnote 61] To further ensure our methods were appropriate and robust, our final report was reviewed and validated by an expert in statistical analysis.[Footnote 62] 3. We agree with SSA that state agency rules about whether and how disability beneficiaries are referred to VR may have an effect on agency success rates, and controlled for it to the extent possible in our analysis. 
While we were not able to control for differences in the way states target beneficiaries for referral to VR as SSA suggested, we did include a variable reflecting the percentage of SSA beneficiaries served by a VR agency (computed as the percentage of all clientele served at that agency). This variable was significant in two of our three models. 4. SSA had concerns that the Ticket to Work program was implemented during the time frame of our study and should have been controlled for in our analysis. Although there was a very slight overlap between the time frame of our study and the timing of the Ticket to Work program, we nevertheless conducted tests to determine whether the rollout of the Ticket program had an effect on VR agency outcomes for SSA beneficiaries. The rollout of the Ticket program was not significant and, therefore, we did not report its effect. 5. SSA questioned the value of measuring state-level economic factors and the resultant implications for VR. Our findings on the influence of state economic characteristics were highly significant and are corroborated by previous research, and therefore we believe that implications and recommendations can be offered from our analysis of state economic factors on agency-level outcomes. However, nowhere in our report do we indicate that our findings suggest that during times of high unemployment, less funding should be allocated to VR agencies. To the contrary, we suggest that economic factors should be controlled, or accounted for, when assessing agency performance. Moreover, while we agree that economic conditions are associated with tax revenues, we found that total state agency expenditures on services (and several other expenditure variables listed in table 2 of app. I) were not significant predictors of agency-level earnings outcomes for SSA beneficiaries. 6. SSA had concerns about our findings on benefits counselors because they differed from those of other research. 
While prior research focused on the impact of benefits counseling in one state, our analysis focused on the impact of benefits counseling across all state agencies. Additionally, we noted that the time frame of our study was a period of transition for the benefits counseling program. Therefore, while we believe our findings are accurate, we also noted the contradictory findings in other research. 7. We agree with SSA that the higher earnings thresholds for the legally blind allow them to earn more than other categories of workers with disabilities while still keeping their disability benefit and have modified our explanation of the results on beneficiaries who are blind. 8. SSA stated that SSI beneficiaries generally lack a work history that qualifies them for DI benefits, and that this lack of work history is more likely to lead SSI beneficiaries into entry-level jobs, resulting in lower average annual earnings than DI beneficiaries. While we agree past work history can be a contributing factor, we found the opposite effect. We found that agencies with a higher proportion of SSA beneficiaries who were DI beneficiaries had lower average annual earnings among SSA beneficiaries and a lower percentage of beneficiaries leaving the rolls. We offer potential explanations for these results in the report. 9. We incorporated SSA's technical comments as appropriate. [End of section] Appendix IV: GAO Contacts and Staff Acknowledgments: GAO Contact: Denise M. Fantone, Acting Director, (202) 512-7215, fantoned@gao.gov: Acknowledgments: In addition to the contact named above, Robert E. Robertson, Director; Michele Grgich, Assistant Director; Amy Anderson; Melinda Cordero; Erin M. Godtland; Jay Grussing; Robert Marek; Brittni Milam; Nisha R. Unadkat; and Rick M. Wilson made significant contributions to all phases of this report. In addition, Susan Bernstein, Cindy Gilbert, Lisa Mirel, Thomas McCool, Anna Maria Ortiz, Daniel A. Schwimer, Doug Sloane, and Shana B. 
Wallace provided technical assistance. [End of section] Related GAO Products: Vocational Rehabilitation: Earnings Increased for Many SSA Beneficiaries after Completing VR Services, but Few Earned Enough to Leave SSA's Disability Rolls. GAO-07-332. Washington, D.C.: March 2007. Vocational Rehabilitation: Better Measures and Monitoring Could Improve the Performance of the VR Program. GAO-05-865. Washington, D.C.: September 2005. SSA Disability: SGA Levels Appear to Affect the Work Behavior of Relatively Few Beneficiaries, but More Data Needed. GAO-02-224. Washington, D.C.: January 2002. SSA Disability: Other Programs May Provide Lessons for Improving Return- to-Work Efforts. GAO-01-153. Washington, D.C.: January 2001. Social Security Disability Insurance: Multiple Factors Affect Beneficiaries' Ability to Return to Work. GAO/HEHS-98-39. Washington, D.C.: January 1998. Social Security: Disability Programs Lag in Promoting Return to Work. GAO/HEHS-97-46. Washington, D.C.: March 1997. SSA Disability: Program Redesign Necessary to Encourage Return to Work. GAO/HEHS-96-62. Washington, D.C.: April 1996. Vocational Rehabilitation: Evidence for Federal Program's Effectiveness is Mixed. GAO/PEMD-93-19. Washington, D.C.: August 1993. FOOTNOTES [1] To determine a beneficiary's earnings in the year after VR, we calculated earnings in the calendar year after the year in which beneficiaries completed VR. For example, whether the beneficiary completed VR in January or December 2000, earnings from January 2001 through December 2001 would have been used to determine earnings in the year after VR. [2] David C. Stapleton and William A. Erickson, "Characteristics or Incentives: Why Do Employment Outcomes for the SSA Beneficiary Clients of VR Agencies Differ, on Average, from Those of Other Clients?" (Rehabilitation Research and Training Center for Economic Research on Employment Policy for Persons with Disabilities, Cornell University, Ithaca, New York, Oct. 2004). 
[3] The longitudinal dataset from SSA and Education contains information on beneficiaries for a longer time horizon (i.e., 1998 through 2004). However, we focused on the cohorts completing VR between 2001 and 2003 because, at the time of our analysis, data were not available on earnings after 2004. Further, we excluded earlier cohort years due to limitations associated with collecting survey data from VR agencies prior to 2000. See appendix I for more information on our data. [4] For the purposes of our study, leaving the rolls is defined as the termination of cash disability benefits due to work. [5] We conducted our analyses using multivariate regression analysis. [6] Individuals may be referred from SSA to state VR agencies by state disability determination services (DDS), which are funded by SSA to render the initial decision on whether an individual qualifies for DI or SSI benefits, and thus are in a good position to consider whether the individual is an appropriate candidate for VR. [7] Ticket to Work and Work Incentives Improvement Act of 1999, Pub. L. No. 106-170 (1999). The Ticket to Work Program was implemented in three phases, beginning in February 2002. Under the Ticket program, VR agencies and other providers can opt for one of two different reimbursement methods, one based on a successful outcome, the other based on successfully reaching milestones. State VR agencies can also continue to be reimbursed under the traditional cost reimbursement program if the beneficiary does not utilize his or her ticket to obtain services. [8] GAO, Social Security: Disability Programs Lag in Promoting Return to Work, GAO/HEHS-97-46 (Washington, D.C.: March 1997). [9] In this report, when we refer to state VR agencies, we are including agencies in the states and territories. 
[10] Tsze Chan, Recruiting and Retaining Professional Staff in State VR Agencies: Some Preliminary Findings from the RSA Evaluation Study, a special report prepared at the request of the Department of Education, October 2003. [11] WIA requires states and localities to bring together a number of federally funded employment and training services into a single system--the one-stop system. Funded through different federal agencies, these programs are to provide services through a statewide network of one-stop career centers to adults, dislocated workers, and youth. [12] Public support refers to cash payments made by federal, state, or local governments for any reason, including an individual's disability, age, economic, retirement, and survivor status. This excludes any noncash support payments such as Medicaid, Medicare, food stamps, and rental subsidies. [13] Education tracks individuals in terms of seven types of case closures, which can be collapsed into four categories for individuals who (1) exited without employment, during the application phase; (2) exited without employment, with limited services; (3) exited without employment, after receiving services under an employment plan; and (4) exited with at least 90 days of employment, after receiving services under an employment plan. [14] GAO-05-865, 39. [15] GAO, Vocational Rehabilitation: Earnings Increased for Many SSA Beneficiaries after Completing VR Services, but Few Earned Enough to Leave SSA's Disability Rolls, GAO-07-332 (Washington, D.C.: March 2007). [16] See appendix I for an explanation of why we did not compare agencies' rates of SSA beneficiaries leaving the rolls over this period. [17] All findings discussed in this section are statistically significant at the 0.05 level, unless otherwise noted. 
[18] Unless otherwise indicated, the effects being discussed in this and the next section are marginal effects (i.e., the effect of a 1 unit change in the explanatory variable on the dependent variable, holding other factors constant). See appendix I, section 3, for more details on our econometric analyses. [19] We also found that in states with larger populations, fewer SSA beneficiaries (1) had earnings during the year after completing VR and (2) left the disability rolls. A study conducted by RTI International noted that states with small populations reported having improved access to other agencies and better collaboration with state leaders due to closer work and personal relationships. [20] Michael D. Tashjian, et al., Study of Variables Related to State Vocational Rehabilitation Agency Performance Revised Draft Final Report, a special report prepared at the request of the Department of Education, October 2004. [21] Although state economic and demographic conditions are not factored into performance measures and targets, Education considers these factors through its monitoring of state agencies. In addition, the statutory funding formula for VR agencies allocates relatively more funds to poorer states based on per capita income to help offset a lack of resources. [22] See appendix I for a detailed list of the factors we controlled for. [23] These clientele characteristics appeared to influence one or more of the earnings outcomes measured, but not necessarily all three. [24] David Wittenburg and Melissa Favreault, "Safety Net or Tangled Web? An Overview of Programs and Services for Adults with Disabilities" (Occasional Paper No. 68, the Urban Institute, Washington, D.C.: 2003). [25] While other variables were significant at the 0.05 level, this variable was significant at the 0.10 level. See appendix I for more information. 
[26] Timothy Tremblay et al., "Effect of Benefits Counseling Services on Employment Outcomes for People with Psychiatric Disabilities," Psychiatric Services, vol. 57, no. 6 (2006). [27] Specifically, holding other factors constant, agencies known as combined or general agencies had more SSA beneficiaries with earnings during the year after VR than agencies known as blind agencies. [28] In its comments on our report, Education suggested that VR agencies with high proportions of SSA beneficiaries may also have high levels of collaboration with other agencies because the long-term supports that may be required to live in the community necessitate cooperation with other public programs. The department noted that this may account for our findings because benefit eligibility may be necessary to receive certain supports from outside agencies. [29] See GAO-07-332 for a more detailed description of the differing DI and SSI benefit structures. [30] See appendix I, section 4, for a detailed description of why, given the time frames of our study, the rates of departures from the rolls might be lower for DI beneficiaries. [31] Edna Mora Szymanski, "Relationship of Level of Rehabilitation Counselor Education to Rehabilitation Client Outcome in the Wisconsin Division of Vocational Rehabilitation," Rehabilitation Counseling Bulletin, vol. 35, no. 1 (1991). [32] The Council of State Administrators of Vocational Rehabilitation is also developing a national VR-business network whose aim is to coordinate VR outreach efforts to businesses. In addition to these national-level efforts, state VR agencies also participate in state-level business networks. In Utah, for example, the VR agency participates in the Utah Business Employment Team, which serves as a business-to-business network for recognizing and promoting best practices in hiring, retaining, and marketing to people with disabilities. 
[33] Past GAO reports have highlighted the need for greater coordination among agencies delivering services to people with disabilities. See, for example, GAO, Federal Disability Assistance: Wide Array of Programs Needs to Be Examined in Light of 21st Century Challenges, GAO-05-626 (Washington, D.C.: June 2, 2005). [34] The expenditures considered for this calculation do not include assessment, counseling, guidance, and placement services provided directly by VR personnel since these services are generally provided to all VR clients. The total expenditures in this calculation include those optional services that are provided to clients based on their specific needs. [35] Becky J. Hayward and Holly Schmidt Davis, Longitudinal Study of the Vocational Rehabilitation Services Program Final Report 2: VR Services and Outcomes, a special report prepared at the request of the Department of Education, 2003. [36] Other research finds a positive effect of benefits counseling on earnings among beneficiaries with psychiatric disabilities and clients in the state of Vermont. See Timothy Tremblay, et al., "Effect of Benefits Counseling Services on Employment Outcomes for People with Psychiatric Disabilities," Psychiatric Services, vol. 57, no. 6 (2006). [37] We cannot say with certainty that our results were detrimentally affected by these limitations because we do not have data without these limitations with which to test our hypotheses. [38] Mitchell P. LaPlante and H. Stephen Kaye, "The Employment and Health Status of Californians with Disabilities" (Institute of Health and Aging, University of California, San Francisco, June 2005). [39] In evaluating the significance of a disability, some state VR agencies already collect such information. 
[40] We are especially grateful to Professor Herbert Smith--Professor of Sociology and Director, Population Studies Center at the University of Pennsylvania, and an expert in the area of statistical analysis--who provided valuable advice on our statistical methods. [41] In 2003, SSA contracted with Mathematica Policy Research to conduct a full evaluation of the Ticket to Work Program. As part of this evaluation, Mathematica constructed the Ticket Research File, a compilation of longitudinal data from SSA. An extract of the TRF was merged with vocational rehabilitation data from the Department of Education's RSA-911 database by an SSA official. [42] Education's data on VR closures were available from 1998 to 2004. Data from SSA's TRF database were available from 1994 to 2004, with MEF earnings data available from 1990 to 2004. Social Security's MEF data are annual earnings based on Internal Revenue Service W-2 tax filings. At the time we obtained this dataset from SSA, earnings data for 2005 were not available. [43] For the purposes of this study, the term "explanatory variable" is used to describe a variable that is used to predict the value of another variable, and the term "dependent variable" is used to describe a variable whose values are predicted by the explanatory variable. [44] Electronic copies of the survey are available upon request. 
[45] Specifically, we inquired about whether (1) there were written procedures that define data elements or specify how the data for each data system were collected and if so, how well the procedures were followed; (2) anyone conducted routine internal reviews of the data to check for errors in completeness, accuracy, or reasonableness; (3) anyone independent of the organization conducted periodic monitoring or audits of the data to check for errors in completeness, accuracy, or reasonableness; and (4) there were any potential problems or limitations with the reliability of the data that were used to answer the survey questions. [46] This variable was significant at the 0.10 level. [47] Our study population included disabled adult children and disabled widow(er)s, who may receive DI benefits based on their parents' or spouses' Social Security earnings record. While their benefits are paid from the Old-Age and Survivors Insurance Trust Fund, these individuals are disabled and are eligible for VR services. [48] We have only 232 observations in our model of earnings because we considered average earnings among only those beneficiaries with earnings in the year following their exit from VR. Two agencies did not have any beneficiaries with reported earnings from employment in 2002. [49] We used Stata's xtreg and xtlogit commands to calculate the intraclass correlation coefficient rho. These analyses revealed minimal clustering among individuals within agencies (rho of 0.02 and below); that is, individuals' characteristics and employment outcomes appeared to vary as much within agencies as across agencies. This suggests that inferences derived from OLS and logistic regressions with robust standard errors are not misleading as a result of failure to hierarchically account for clustering of individuals within agencies. [50] For example, our multivariate models of earnings were only able to explain, at best, approximately 8 percent of the variation in individuals' earnings. 
[51] Although the alternative of looking at individual outcomes with individual data might have allowed us to control for individual characteristics somewhat better before estimating the effects of the state and agency characteristics, we believe modeling the variability in outcomes using the aggregate data was more appropriate given the objective of assessing which agency-level characteristics are related to employment outcomes. However, because aggregation reduces our degrees of freedom and may compound individual measurement error in variables such as earnings, we recognize that our estimated coefficients may not be as precise as ones generated using individual characteristics. See section 4 of this appendix for more information on measurement issues. [52] We have only 232 observations in our model of earnings because two agencies did not have any beneficiaries with reported earnings from employment in 2002. [53] Statistical significance was measured at a p-value <0.05 and marginal significance was measured at p-value <0.10. [54] Although we considered this full range of variables in the series of models leading to our final specifications for each outcome, not all variables are significant for each outcome. We used a variety of factors to decide which characteristics to include or exclude in the model for each outcome. We considered statistical significance, magnitude of each effect, stability of included coefficients across model specifications with different regressors, changes in the proportion of variance explained using F-tests for nested models (when appropriate), and theoretical considerations based on past research and input from agencies we surveyed and interviewed. [55] Several other agency characteristics, notably the percentage of expenditures on purchased services, had marginally significant effects and were not included in the final model. 
[56] Workers who may have been excluded include federal civilian employees hired before 1984 and certain state and local government employees. [57] The 9-month trial work period must occur within a 60-month period. [58] GAO-05-865. [59] W.S. Robinson, "Ecological Correlations and the Behavior of Individuals," American Sociological Review, vol. XV (1950), pp. 351-357 (quote on page 357). [60] T.M. Stoker, "Empirical Approaches to the Problem of Aggregation Over Individuals," Journal of Economic Literature, December 1993 (quote on page 1871). [61] See, for example, Leigh Burstein, "The Analysis of Multilevel Data in Educational Research and Evaluation," Review of Research in Education, vol. 8, pp. 158-233 (1980); Judith Singer, "Using SAS PROC MIXED to Fit Multilevel Models, Hierarchical Models, and Individual Growth Models," Journal of Educational and Behavioral Statistics, vol. 24, no. 4 (1998); and Stephen W. Raudenbush and Anthony S. Bryk, Hierarchical Linear Models: Applications and Data Analysis Methods, second ed. (Thousand Oaks, California: Sage Publications, 2002), 99-159. [62] Herbert Smith, Professor of Sociology, Director of Population Studies, University of Pennsylvania. GAO's Mission: The Government Accountability Office, the audit, evaluation and investigative arm of Congress, exists to support Congress in meeting its constitutional responsibilities and to help improve the performance and accountability of the federal government for the American people. GAO examines the use of public funds; evaluates federal programs and policies; and provides analyses, recommendations, and other assistance to help Congress make informed oversight, policy, and funding decisions. GAO's commitment to good government is reflected in its core values of accountability, integrity, and reliability. Obtaining Copies of GAO Reports and Testimony: The fastest and easiest way to obtain copies of GAO documents at no cost is through GAO's Web site (www.gao.gov). 
Each weekday, GAO posts newly released reports, testimony, and correspondence on its Web site. To have GAO e-mail you a list of newly posted products every afternoon, go to www.gao.gov and select "Subscribe to Updates." Order by Mail or Phone: The first copy of each printed report is free. Additional copies are $2 each. A check or money order should be made out to the Superintendent of Documents. GAO also accepts VISA and Mastercard. Orders for 100 or more copies mailed to a single address are discounted 25 percent. Orders should be sent to: U.S. Government Accountability Office 441 G Street NW, Room LM Washington, D.C. 20548: To order by Phone: Voice: (202) 512-6000 TDD: (202) 512-2537 Fax: (202) 512-6061: To Report Fraud, Waste, and Abuse in Federal Programs: Contact: Web site: www.gao.gov/fraudnet/fraudnet.htm E-mail: fraudnet@gao.gov Automated answering system: (800) 424-5454 or (202) 512-7470: Congressional Relations: Gloria Jarmon, Managing Director, JarmonG@gao.gov (202) 512-4400 U.S. Government Accountability Office, 441 G Street NW, Room 7125 Washington, D.C. 20548: Public Affairs: Paul Anderson, Managing Director, AndersonP1@gao.gov (202) 512-4800 U.S. Government Accountability Office, 441 G Street NW, Room 7149 Washington, D.C. 20548: