This is the accessible text file for GAO report number GAO-15-77 entitled 'Veterans' Reemployment Rights: Department of Labor Has Higher Performance Than the Office of Special Counsel on More Demonstration Project Measures' which was released on November 25, 2014. This text file was formatted by the U.S. Government Accountability Office (GAO) to be accessible to users with visual impairments, as part of a longer-term project to improve GAO products' accessibility. Every attempt has been made to maintain the structural and data integrity of the original printed product. Accessibility features, such as text descriptions of tables, consecutively numbered footnotes placed at the end of the file, and the text of agency comment letters, are provided but may not exactly duplicate the presentation or format of the printed version. The portable document format (PDF) file is an exact electronic replica of the printed version. We welcome your feedback. Please E-mail your comments regarding the contents or accessibility features of this document to Webmaster@gao.gov. This is a work of the U.S. government and is not subject to copyright protection in the United States. It may be reproduced and distributed in its entirety without further permission from GAO. Because this work may contain copyrighted images or other material, permission from the copyright holder may be necessary if you wish to reproduce this material separately. United States Government Accountability Office: GAO: Report to Congressional Committees: November 2014: Veterans' Reemployment Rights: Department of Labor Has Higher Performance Than the Office of Special Counsel on More Demonstration Project Measures: GAO-15-77: GAO Highlights: Highlights of GAO-15-77, a report to congressional committees. Why GAO Did This Study: USERRA protects the employment and reemployment rights of workers who leave civilian jobs to perform military or other uniformed service. VBA directed DOL and OSC to establish a second demonstration project for investigating and resolving USERRA claims filed against federal agencies. Congress established the demonstration project to facilitate a review of relative agency performance and mandated GAO to report on relative performance across a number of areas specified in the VBA. This report assesses agencies' relative performance using VBA performance metrics--case outcomes, customer satisfaction, timeliness, cost, and capacity. To determine agencies' relative performance, GAO analyzed agency data on the aforementioned metrics. GAO reviewed VBA requirements and relevant guidance, and interviewed agency officials. In addition, GAO conducted tests on each agency's cost and accounting data to ensure agreed-upon cost components were reflected in the totals. What GAO Found: Demonstration Project Performance: Between August 2011 and August 2014, the Department of Labor (DOL) demonstrated relatively higher performance than the Office of Special Counsel (OSC) on three of five performance metrics in the Veterans' Benefits Act of 2010 (VBA). The relative performance was influenced--to a varying extent--by a number of factors, such as the investigative approach. Case Outcomes (as of July 31, 2014): OSC provided relief to about 26 percent of its claimants, and DOL provided relief to about 20 percent of its claimants. DOL resolved 308 (or 97 percent of the 319) Uniformed Services Employment and Reemployment Rights Act (USERRA) cases, and OSC resolved 366 (or 84 percent of the 434) cases it received. 
OSC received a greater number of cases due to a requirement to investigate 27 cases involving a prohibited personnel practice (PPP) and to the random assignment of cases from servicemembers whose Social Security numbers end in an odd digit. GAO did not evaluate the appropriateness of agencies' case outcomes. Although the agencies had 10 months to prepare, OSC officials stated they had limited capacity to investigate and resolve claims during the first 6 months of the demonstration project. In fiscal years 2013 and 2014, both agencies closed about as many cases as they received. Customer Satisfaction: On a survey sent to claimants and administered by the Office of Personnel Management (OPM), DOL respondents reported higher average satisfaction on every question than OSC respondents, with pronounced differences in scores on timeliness, access to staff, and overall experience. For example, 66 percent of DOL's respondents (n=100) were satisfied with overall customer service, whereas 34 percent of OSC's respondents (n=151) were satisfied. In light of the low survey response rates, GAO conducted additional statistical analyses to control for potential bias and ensure conclusions could be drawn from survey results. Differences in satisfaction between agencies persisted after controlling for variables such as case outcome and timeliness. Timeliness: DOL's average investigation time for closed cases was about 41 days and OSC's was about 151 days. GAO examined factors potentially influencing timeliness, such as OSC's responsibility to investigate cases involving a PPP, and whether relief was obtained for claimants. GAO found these factors were not primary contributors to OSC's relatively longer average times. Agencies have different policies for extending case investigation timeframes. Officials from OSC said they allow for open-ended case extensions, whereas DOL does not. Cost: DOL spent about $1,112 per case, whereas OSC spent about $3,810. The relative difference in agencies' costs was affected by factors such as the number of hours dedicated to case investigations and pay levels, among others. Capacity: The agencies demonstrated different capabilities to investigate and resolve cases in areas such as staffing, training, and information technology. For example, DOL had 31 staff investigating USERRA demonstration project cases, as well as other nonfederal USERRA or veterans' preference cases. These DOL investigators had an average annual demonstration project caseload of five. OSC had 7 staff investigating demonstration project cases, with an average annual caseload of 28. GAO could not determine relative performance on agency capacity due to the lack of a specific and comparable metric. What GAO Recommends: GAO recommends that any agency chosen to investigate USERRA claims continue efforts to collect claimants' satisfaction survey information and consider ways to increase the survey response rate. DOL agreed with GAO's recommendations. OSC neither agreed nor disagreed with GAO's recommendations, and disagreed with GAO's findings regarding agencies' relative performance. View [hyperlink, http://www.gao.gov/products/GAO-15-77]. For more information, contact Yvonne D. Jones at (202) 512-2717 or jonesy@gao.gov. 
[End of section] Contents: Letter: Background: DOL Has Relatively Higher Performance Than OSC for More Demonstration Project Performance Measures: Customer Satisfaction Can Provide Meaningful Feedback for Service Improvements: Conclusions: Recommendations for Executive Action: Agency Comments and Our Evaluation: Appendix I: Objectives, Scope, and Methodology: Appendix II: USERRA Demonstration Project Customer Satisfaction Survey Administration and Instrument: Appendix III: Nonresponse and Multivariate Analysis of Customer Satisfaction Data: Appendix IV: Comments from the Department of Labor: Appendix V: Comments from the Office of Special Counsel: Appendix VI: GAO Contact and Staff Acknowledgments: Related GAO Products: Tables: Table 1: The Office of Special Counsel Resolved a Greater Proportion of Cases in Favor of the Claimant: Table 2: On Average, DOL Received Higher Scores on Claimant Interaction with Staff and Overall Satisfaction with Customer Service and Investigation: Table 3: DOL Has More Investigative Staff with Generally Lower Pay Levels While OSC Had Fewer Investigative Staff with Higher Caseloads: Table 4: Survey Response Rates at DOL and OSC: Table 5: Case Processing Time for Responders and Non-Responders, by Agency and Overall: Table 6: Discrimination Allegations for Responders and Non-Responders, by Agency and Overall: Table 7: Favorable Outcomes for Responders and Non-Responders, by Agency and Overall: Table 8: Agency Reported Outcome by Respondent Perception of the Outcome: Table 9: Responses to 15 Satisfaction Items on the USERRA Survey Questionnaire, by Agency: Table 10: Odds and Odds Ratios Indicating Differences in Satisfaction at DOL and OSC before and after Adjusting for Whether the Outcome Was Favorable, Case Processing Time, and Whether Discrimination Was Alleged: Table 11: Odds and Odds Ratios Indicating Differences in Satisfaction at DOL and OSC Before and After Adjusting for Whether the Outcome Was Favorable, Case Processing Time, and Whether Discrimination Was Alleged: Figures: Figure 1: USERRA Claims Processing under the Demonstration Project: Figure 2: Department of Labor Closed About As Many Cases As It Opened: Figure 3: Office of Special Counsel Received More Cases Than It Closed During the Early Years but the Gap Narrowed During the Later Years: Figure 4: Department of Labor Respondents Were More Likely to Express Satisfaction: Figure 5: Department of Labor Respondents Were More Likely to Express Satisfaction: Figure 6: Department of Labor Resolved Cases Faster: Figure 7: Office of Special Counsel Had a Greater Proportion of Cases Open More Than 90 Days: Figure 8: OSC Uses ADR Process to Mediate USERRA Cases to Offer Claimants Additional Resolution Options: Figure 9: OPM 2012 Customer Satisfaction Survey Instrument: Abbreviations: ADR: Alternative Dispute Resolution: DOD: Department of Defense: DOL: Department of Labor: DOL-VETS: Department of Labor Veterans' Employment and Training Service: MSPB: Merit Systems Protection Board: NVTI: National Veterans Training Institute: OMB: Office of Management and Budget: OPM: Office of Personnel Management: OSC: Office of Special Counsel: OSC 2000: Office of Special Counsel case tracking system: PMF: Presidential Management Fellow: PPP: prohibited personnel practices: SSN: Social Security number: UIMS: USERRA Information Management System: USERRA: Uniformed Services Employment and Reemployment Rights Act of 1994: VBA: Veterans' Benefits Act of 2010: VBIA: Veterans Benefits Improvement Act of 2004: [End of section] 
United States Government Accountability Office: GAO: 441 G St. N.W. Washington, DC 20548: November 25, 2014: The Honorable Bernie Sanders: Chairman: The Honorable Richard Burr: Ranking Member: Committee on Veterans' Affairs: United States Senate: The Honorable Jeff Miller: Chairman: The Honorable Michael Michaud: Ranking Member: Committee on Veterans' Affairs: House of Representatives: Over the next 4 to 5 years, more than a million servicemembers are expected to leave the military and transition into civilian life, according to the Department of Defense.[Footnote 1] In making this transition, some servicemembers will face significant challenges reentering the workforce and maintaining employment. Many factors--such as workplace absences due to overseas deployments, translating military skills to civilian job requirements, and employers' lack of awareness regarding reemployment rights--can contribute to the difficulties servicemembers face when seeking a return to the civilian workforce. To protect the employment and reemployment rights of federal and nonfederal employees when they leave their civilian employment to perform military or other uniformed service, Congress enacted the Uniformed Services Employment and Reemployment Rights Act of 1994 (USERRA).[Footnote 2] USERRA applies to a wide range of employers, including federal, state, and local governments as well as to private-sector firms. Among other rights, servicemembers who meet the statutory requirements are entitled to reinstatement to the positions they would have held if they had never left their employment (or reinstatement to positions of similar seniority, status, and pay). As part of the Veterans Benefits Improvement Act of 2004 and the Veterans' Benefits Act of 2010 (VBA), Congress established two demonstration projects involving the Department of Labor (DOL) and the Office of Special Counsel (OSC). The first demonstration project was implemented from February 8, 2005, through December 31, 2007. The second demonstration project was implemented from August 9, 2011, through August 9, 2014. Claims filed by servicemembers under USERRA have remained relatively steady over time despite ongoing efforts to improve outreach to employers and improve agencies' training and guidance. Between fiscal years 2008 and 2012, servicemembers filed more than 1,400 employment and reemployment claims each year (in 2013, the number of claims fell to fewer than 1,300). In addition, in fiscal years 2012 and 2013, more than 200 USERRA claims were filed against federal executive agencies. VBA mandated us to report on the relative performance of DOL and OSC across a number of areas specified in the act. This assessment of the 36-month demonstration project (2011-2014) (1) covers agencies' relative performance under performance metrics including case outcomes, customer satisfaction, timeliness, cost, and capacity as mandated by VBA; and (2) identifies actions agencies can take to improve customer satisfaction.[Footnote 3] To assess agencies' relative performance, we reviewed the requirements of the demonstration project set forth in VBA and compared final agency performance data on case outcome, timeliness, customer satisfaction, and cost. We also analyzed agency data to provide comparative descriptions of capacity. 
In conducting our work, we obtained data on case tracking, customer satisfaction, cost, and capacity from DOL, OSC, and the Office of Personnel Management (OPM) from the beginning of the USERRA demonstration project in August 2011 to July 2014. To assess case outcomes and timeliness, we reviewed and analyzed data from DOL's case tracking system (the USERRA Information Management System) and OSC's case tracking system for demonstration project cases opened between August 9, 2011, and July 31, 2014. To assess customer satisfaction, we reviewed and analyzed data and narrative responses from the USERRA customer satisfaction survey, which was administered by OPM on behalf of both agencies. To assess demonstration project costs, we reviewed and analyzed cost and accounting data from DOL and OSC, including supporting documentation such as the number of hours dedicated to demonstration project cases. We assessed the reliability of data on case outcomes, customer satisfaction, timeliness, and cost, and determined the data we used to evaluate the relative performance of agencies were sufficiently reliable for the purpose of this report. We reviewed and analyzed information on agencies' capacity based on factors identified in the VBA mandate such as staffing levels, grade level, training, education, and caseload. We also reviewed DOL's and OSC's unique characteristics that enable them to investigate and resolve claims. We interviewed key officials involved with the USERRA demonstration project at DOL and OSC. We also reviewed pertinent reports, guidance, plans, relevant federal laws, directives, and other documents. To identify actions agencies can take to improve customer service, we analyzed agency customer satisfaction data and compared results to customer service principles and guidance outlined in executive orders and Office of Management and Budget guidance. We also interviewed agency officials about their views on related procedures and practices that worked well or needed improvement. We conducted this performance audit from April 2014 to November 2014 in accordance with generally accepted government auditing standards. These standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We believe the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives. Background: Under USERRA, an employee or applicant for employment who believes that his or her USERRA rights have been violated may file a claim with DOL's Veterans' Employment and Training Service (DOL-VETS), which investigates and attempts to resolve the claim. If DOL-VETS cannot resolve the claim and the servicemember is a federal government employee or applicant to a federal agency, DOL is to inform the claimant of the right to have his or her claim referred to OSC for further review and possible OSC representation before the Merit Systems Protection Board (MSPB).[Footnote 4] DOL is also to inform the claimant that he or she may file a claim directly with MSPB. 
The Veterans Benefits Improvement Act of 2004 (VBIA) established a demonstration project for the period February 8, 2005, through December 31, 2007, during which OSC was authorized to receive and investigate certain USERRA cases while DOL remained in an investigative role for others.[Footnote 5] In 2007, as mandated by VBIA, we evaluated the demonstration project and made recommendations to DOL to help establish internal controls for case review, claimant notification, and data management.[Footnote 6] Specifically, we found that the data DOL used to track case investigation time and the data both DOL and OSC used to track and report case outcomes were not reliable for monitoring, tracking, and reporting on the agencies' performance. This adversely affected Congress's ability to assess how well federal USERRA claims were being investigated as well as to assess whether changes would be needed in the future.[Footnote 7] To improve the USERRA process, we recommended that the Secretary of Labor develop an internal review mechanism for all unresolved cases before they are closed and claimants are notified. We also recommended establishing internal controls to ensure the accuracy of data entered into DOL's case tracking database. DOL agreed with and implemented our recommendations. Congress passed the VBA, which directed DOL and OSC to establish a second demonstration project (36-month duration) for receiving, investigating, and resolving USERRA claims filed against federal executive agencies. Procedures in the second demonstration project were similar to those in the first demonstration project. DOL and OSC each received claims and were authorized to investigate and seek corrective action for those claims. Specifically, DOL is authorized to investigate and seek corrective action for those claims filed against federal executive agencies if the servicemember's Social Security number (SSN) ends in an even number. OSC is authorized to investigate and seek corrective action for USERRA claims against federal executive agencies if the servicemember's SSN ends in an odd number. If a claim does not contain an SSN, DOL will assign a claim number based on the date of the month the claim is received. For example, claims filed on an odd-numbered date will be assigned an odd case number and forwarded to OSC. Claims filed on an even-numbered date will be assigned an even case number and be investigated by DOL. Also, under the demonstration project, OSC is authorized to handle any "mixed claims" in which a claimant files a USERRA claim against a federal executive agency and also brings a related prohibited personnel practice claim. There are 13 prohibited personnel practices (PPP), including discrimination, retaliation, and granting an unauthorized preference or improper advantage.[Footnote 8] VBA mandated us to evaluate how DOL and OSC designed the demonstration project and assess their relative performance during and at the conclusion of the demonstration project. Figure 1 depicts USERRA claims processing under the demonstration project. Figure 1: USERRA Claims Processing under the Demonstration Project: [Refer to PDF for image: processing illustration] Claimant submits claim (Form 1010) electronically or hard copy: Department of Labor (DOL)/Veterans' Employment and Training Service (VETS): 90 days: Investigative process under the demonstration project: 1. Odd-numbered claim?[A] Yes: Refer odd-numbered cases to OSC[A]; No: Even-numbered claim investigated[A]. 2. 
Claim resolved? Yes: Claimant is notified of resolution; No: Claimant is notified that claim is unresolved and of right of referral to OSC. 60 days: Referral phase under Uniformed Services Employment and Reemployment Rights Act (USERRA): 3. If claimant requests referral to OSC, VETS investigator prepares memorandum of referral (MoR). 4. VETS regional office reviews MoR. 5. DOL Solicitor conducts legal review, prepares analysis and representation recommendation; Continue to #5 under Office of Special Counsel (OSC) process. Office of Special Counsel (OSC): 90 days: Investigative process under the demonstration project: 1. Odd-numbered or prohibited personnel practice (PPP[A]) cases screened for alternative dispute resolution (ADR) mediation. 2. Mediation offered: If parties agree, case is mediated. 3. Claim resolved? Yes: Claimant is notified of resolution; No: Case is investigated. 4. Claim resolved? Yes: Claimant is notified of resolution; No: Claimant notified of investigation results and of right to have OSC consider claim for possible representation before Merit Systems Protection Board (MSPB). 60 days: Referral phase under Uniformed Services Employment and Reemployment Rights Act (USERRA): 5. OSC reviews investigative file. 6. OSC determines claim has merit? Yes: OSC attempts resolution, including offering representation before MSPB; No: Claimant informed of OSC decision and of option to file claim with MSPB without OSC representation. Source: GAO (data); Art Explosion (image). GAO-15-77. [A] If, during initial processing or investigation phase, DOL personnel identify a possible PPP case, DOL and OSC will jointly determine at what point, if at all, the case should be transferred to OSC for investigation. [End of figure] Our previous reports on USERRA included recommendations to improve data quality and develop comparable data and processes to facilitate evaluating agency performance. In June 2011, we reported on the methods and procedures that DOL and OSC had agreed to establish for the demonstration project. We recommended that both agencies take a number of steps to ensure a comparable process and collect sufficiently reliable data.[Footnote 9] In response to our recommendations, DOL and OSC entered into an interagency agreement with OPM to establish and regularly administer a customer satisfaction survey.[Footnote 10] The customer satisfaction survey provides comparable information and includes a survey plan and protocols for contacting respondents, in line with the recommendation from our demonstration project design assessment. Furthermore, by the start of the demonstration project in August 2011, both agencies established a cost accounting system to collect and track actual time spent investigating demonstration project cases. In September 2012, we issued an interim assessment of the demonstration project.[Footnote 11] At that time, we reported that both DOL and OSC had established methods and procedures that would allow them to report comparable and reliable performance data for the demonstration project, as required by VBA. We also identified additional actions that agencies could take to improve the quality of the customer satisfaction and cost data. Specifically, we recommended that DOL and OSC take additional steps to increase their customer satisfaction survey response rates and address any potential survey response bias. We also recommended that both agencies establish and document procedures for compiling and reporting the cost data during the demonstration project. 
In response to our 2012 recommendations to increase survey response rates and to address potential response bias, the agencies agreed to conduct additional outreach to claimants by providing an initial survey notification, and contracted with OPM to conduct a nonresponse analysis. OPM provided a nonresponse analysis to agencies in April 2013. In response to our recommendation that agencies establish and document procedures for compiling and reporting cost data, DOL provided written instructions to its staff on methods for reporting its time and related costs, developed written procedures for tracking and reporting costs, and implemented a quarterly audit of cost information. OSC did not implement our recommendation to document procedures for compiling and reporting cost data. Specifically, according to OSC's USERRA Unit Chief, the agency established, but did not document, procedures for compiling the cost data associated with the demonstration project. OSC officials explained that in lieu of documenting their procedures, they held a training session with staff to discuss the method they use to track and report the time and costs associated with demonstration project cases. However, standards for internal control in the federal government require that internal controls and all transactions and other significant events be clearly documented, and the documentation should be readily available for examination. The documentation should appear in management directives, administrative policies, or operating manuals, and may be in paper or electronic form. DOL Has Relatively Higher Performance Than OSC for More Demonstration Project Performance Measures: OSC Resolved a Greater Proportion of Cases in Favor of the Claimant and DOL Resolved More of the Cases It Received: During the demonstration project, DOL resolved more of the cases it received. Between August 9, 2011, and July 31, 2014, DOL received 319 demonstration cases and OSC received 434 cases. Of the cases received, OSC closed 366 cases, or 84 percent. DOL closed 308 cases, or 97 percent. OSC had 68 demonstration project cases remaining open, and DOL had 11 cases remaining open at the end of July 2014. We identified two factors that may explain why OSC received more demonstration project cases than DOL. One factor is that more claimants' Social Security numbers ended in odd digits than in even digits, and the second is OSC's responsibility to investigate and resolve all cases that involve a PPP.[Footnote 12] Most cases were randomly assigned to each agency by using the last digit of the claimant's SSN. OSC received more cases based on this assignment. In addition, OSC was required to handle all cases that involved a PPP, which contributed to 27 more cases being assigned to OSC. These 27 cases amounted to about 7 percent of OSC's total claims and 23 percent of the 115 additional cases that OSC received (see table 1). From fiscal year 2012 to fiscal year 2014, DOL closed about as many cases as it opened, as shown in figure 2. DOL received more than 100 cases during fiscal years 2012 and 2013, and closed about the same number each year. Because the demonstration project began near the end of fiscal year 2011, both agencies received a smaller number of cases during that period. Furthermore, the demonstration project ended about 2 months prior to the end of fiscal year 2014. Figure 2: Department of Labor Closed About As Many Cases As It Opened: [Refer to PDF for image: vertical bar graph] Fiscal year: 2011; Cases received: 18; Cases closed: 5. 
Fiscal year: 2012; Cases received: 114; Cases closed: 113. Fiscal year: 2013; Cases received: 101; Cases closed: 102. Fiscal year: 2014; Cases received: 86; Cases closed: 89. Source: GAO analysis of Labor UIMS case tracking data (8/9/2011 to 7/31/14). GAO-15-77. [End of figure] OSC received more cases than it closed in the first two fiscal years of the demonstration project, which included less than 2 months of fiscal year 2011 and all of fiscal year 2012. OSC officials explained that the agency had limited capacity to investigate and resolve claims during the first 6 months of the demonstration project. During this time, the agency reported it was hiring new staff, negotiating a reimbursement agreement with DOL, and expanding the capabilities of its USERRA Unit to handle demonstration project cases. However, agencies had about 10 months to prepare and assemble the resources required to implement the project. VBA established the requirement for the demonstration project on October 13, 2010 (the date VBA was enacted), and the demonstration project began on August 9, 2011. As shown in figure 3, between the beginning of fiscal year 2013 and the end of July 2014, OSC closed the same number of cases as it received. Figure 3: Office of Special Counsel Received More Cases Than It Closed During the Early Years but the Gap Narrowed During the Later Years: [Refer to PDF for image: vertical bar graph] Fiscal year: 2011; Cases received: 28; Cases closed: 1. Fiscal year: 2012; Cases received: 149; Cases closed: 108. Fiscal year: 2013; Cases received: 129; Cases closed: 136. Fiscal year: 2014; Cases received: 128; Cases closed: 121. Source: GAO analysis of Office of Special Counsel 2000 case tracking data (8/9/2011 to 7/31/14). GAO-15-77. [End of figure] OSC Resolved a Greater Proportion of Cases in Favor of the Claimant: Officials from OSC and DOL hold similar views on their investigative role under USERRA and told us they view their roles as those of impartial investigators. Specifically, officials from both agencies told us that their role is to determine if a claim has merit, and if so, to resolve the claim appropriately. Importantly, we did not independently assess the quality of agencies' case investigations to determine if DOL and OSC arrived at the appropriate case outcomes. As such, we were not able to determine the relative performance of agencies for this measure. Between August 9, 2011, and July 31, 2014, OSC obtained relief for claimants in about 26 percent, or 94, of its closed demonstration project cases. DOL obtained relief for claimants in about 20 percent, or 62, of its closed cases (see table 1). OSC officials told us that their ability to close a greater proportion of cases resulting in relief for claimants is partially attributable to their expertise on federal sector employment matters, the quality of their work, and the composition of their investigative team, which is largely staffed by attorneys. DOL officials told us their goal is to obtain the correct case outcome, even if it is not in favor of the claimant. DOL officials also expressed concern that associating higher performance with case outcomes that provide relief for claimants may create an incentive to pursue relief for cases that do not warrant corrective action. 
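The favorable-resolution percentages reported in table 1 follow directly from the case counts, using closed cases (not cases received) as the denominator. The following minimal sketch (in Python, added here for illustration only; the counts are those reported by the agencies) shows the calculation:

    # Favorable-resolution rate = cases resolved in favor of claimant / closed cases.
    closed = {"DOL": 308, "OSC": 366}
    favorable = {"DOL": 62, "OSC": 94}
    for agency in ("DOL", "OSC"):
        rate = 100.0 * favorable[agency] / closed[agency]
        print(f"{agency}: {rate:.1f} percent of closed cases resolved in favor of claimant")
    # Prints about 20.1 percent for DOL and 25.7 percent for OSC, matching table 1.

Computing against cases received instead (319 for DOL and 434 for OSC) would yield about 19 percent and about 22 percent, which is why the number of cases still open at each agency matters when comparing outcome rates. 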
Table 1: The Office of Special Counsel Resolved a Greater Proportion of Cases in Favor of the Claimant[A]: Department of Labor; Number of Cases Received: 319; Number of Cases Closed: 308; Number of Cases Resolved In Favor of Claimant: 62; Percent of Cases Resolved In Favor of Claimant: 20.1%. Office of Special Counsel; Number of Cases Received: 434; Number of Cases Closed: 366; Number of Cases Resolved In Favor of Claimant: 94; Percent of Cases Resolved In Favor of Claimant: 25.7%. Source: GAO analysis of DOL's USERRA Information Management System and OSC's case tracking system data. GAO-15-77. [A] The case information presented in table 1 may vary somewhat from information reported in conjunction with our nonresponse analysis reported in appendix III, because the data presented in the table were updated from agencies following our nonresponse analysis. Furthermore, our nonresponse analysis is based on data provided by OPM, rather than case tracking data collected directly from agencies, as shown in this table. [End of table] We worked with agencies prior to the demonstration project to develop a method to ensure that case outcomes could be described in a consistent manner, and a comparison could be made at the conclusion of the demonstration project. Accordingly, in August 2011, agencies developed a cross-walk of case resolution codes to facilitate a comparison of case outcomes. In our interim report issued in September 2012, we reported case outcomes based on this cross-walk. This final review also relies on the cross-walk for the purpose of comparing agency case outcomes. DOL and OSC both tracked the disposition of closed cases to determine if cases had been resolved in favor of the claimant, and in some cases, to track the type of corrective action agreed to, or to provide the reason a case was not resolved in favor of the claimant. Specifically, OSC uses the case resolution codes "dispute resolved," "corrective action," and "corrective action declined by claimant" to identify cases resolved in favor of the claimant, whereas DOL uses the resolution codes "claim granted," "claim settled," and "merit, not resolved." A number of case resolution codes, including OSC's "corrective action declined by claimant" code, were not included in the cross-walk of case resolution codes. Because these codes were not included in the cross-walk, we had to rely on agencies' determination of whether certain cases were resolved in favor of the claimant. Specifically, OSC's totals include 3 cases in which corrective action was declined by the claimant, or about 3 percent of the 94 cases the agency resolved in favor of the claimant. DOL's totals include 7 cases that had merit, but which were not resolved, or about 11 percent of the 62 cases the agency resolved in favor of the claimant. According to OSC officials, the code "corrective action declined by claimant" indicates that the agency offered resolution but the claimant declined it. According to DOL officials, the code "merit, not resolved" indicates that the claim was meritorious, but the agency did not offer resolution, or the claimant declined the resolution that was offered. OSC officials told us that the relief provided to claimants included, but was not limited to, initial job offers, reinstatement, promotions, restored benefits, accommodations for service-connected disabilities, back pay, USERRA training for federal officials, and systemic changes to agency policies and procedures to better comply with USERRA. 
For example, OSC officials explained that they investigated a claim by a National Guardsman who worked for the Defense Commissary Agency and alleged he was improperly denied reemployment upon returning from a tour of duty. OSC determined the claim was valid and intervened with the agency to identify appropriate corrective action. According to OSC, the agency agreed to reinstate the individual to his former position, restore his benefits and seniority, and provide him with back pay. Agencies also identified cases that were not resolved in favor of the claimant, and provided a reason, or disposition, to explain why. For example, some cases were not resolved favorably because the claimant withdrew the claim, investigators determined the claim had no merit, there was insufficient evidence to support the claim, or the claimant failed to supply evidence for further action, among other reasons. Respondents Reported Greater Customer Satisfaction with DOL: DOL received higher scores from respondents than OSC on every question asked on the customer satisfaction survey administered by OPM. However, survey response rates were low, which can affect the conclusions that can be drawn from the results. Specifically, 32 percent of claimants responded to DOL's survey, while 42 percent responded to OSC's survey. In light of the low survey response rates, we conducted additional statistical analyses to control for potential bias and ensure conclusions could be drawn from survey results. Our analyses revealed that differences in satisfaction scores for each question remained statistically significant and pronounced even after controlling for variables that could affect the claimants' views of the customer service provided. These variables include case investigation time, whether the claimant indicated that the case was resolved in his or her favor, and whether discrimination was alleged by the claimant. For a more detailed explanation of these analyses and their findings, see appendix III. Side bar: USERRA Customer Satisfaction Survey: Select Narrative Responses: "[A DOL investigator] provided timely updates in regards to the steps she was taking to determine the disposition of the case." "[I] would like to be kept informed [by OSC] of what is going on." Source: OPM survey. [End of side bar] The differences in scores between the two agencies were especially pronounced on questions relating to timeliness, access to staff, and overall experience. Narrative responses to the survey provided additional detail on aspects of agencies' USERRA investigations that respondents said were working well, and areas that may require improvement. Select narrative responses are provided throughout this section to highlight certain aspects of agencies' customer service. More details on the survey's administration and survey instrument can be found in appendix II. Our analysis found that a higher percentage of DOL respondents agreed with survey statements and expressed satisfaction with DOL's service than did OSC respondents. Figures 4 and 5 display the percentage of DOL and OSC respondents who expressed satisfaction with various elements of customer service and the case investigation by responding to survey statements. 
Figure 4: Department of Labor Respondents Were More Likely to Express Satisfaction: [Refer to PDF for image: vertical bar graph] Percentage who agreed with survey statements: Statement: The staff are courteous: Labor: 82% (n=100); Office of Special Counsel: 65% (n=149). Statement: The staff are competent: Labor: 68% (n=98); Office of Special Counsel: 37% (n=144). Statement: The staff are professional: Labor: 83% (n=100); Office of Special Counsel: 51% (n=148). Statement: The staff provides consistently good service: Labor: 64% (n=92); Office of Special Counsel: 29% (n=143). Statement: The staff policies and procedures are customer friendly: Labor: 63% (n=92); Office of Special Counsel: 35% (n=145). Statement: I have adequate access to staff for advice and assistance: Labor: 66% (n=96); Office of Special Counsel: 31% (n=147). Statement: The staff keep me informed of significant case developments: Labor: 67% (n=100); Office of Special Counsel: 31% (n=149). Statement: I know whom to contact if I have additional questions: Labor: 71% (n=100); Office of Special Counsel: 43% (n=149). Statement: The staff responded to my questions in a timely manner: Labor: 76% (n=100); Office of Special Counsel: 32% (n=148). Source: GAO analysis of Office of Personnel Management customer satisfaction survey data. GAO-15-77. [End of figure] Figure 5: Department of Labor Respondents Were More Likely to Express Satisfaction: [Refer to PDF for image: vertical bar graph] Percentage who were satisfied: Statement: Thoroughness of investigation: Labor: 54% (n=92); Office of Special Counsel: 24% (n=139). Statement: Clarity of written communication: Labor: 69% (n=97); Office of Special Counsel: 39% (n=145). Statement: Clarity of verbal communication: Labor: 69% (n=100); Office of Special Counsel: 40% (n=142). Statement: Customer service: Labor: 66% (n=100); Office of Special Counsel: 34% (n=151). Statement: Investigation of complaint: Labor: 55% (n=101); Office of Special Counsel: 25% (n=151). Statement: Results of investigation: Labor: 43% (n=101); Office of Special Counsel: 25% (n=150). Source: GAO analysis of Office of Personnel Management customer satisfaction survey data. GAO-15-77. [End of figure] Side bar: USERRA Customer Satisfaction Survey: Select Narrative Responses: "[The DOL investigator] went above and beyond the call of duty to help me." "OSC has very dedicated measures and personnel in place offering sound advice and service along every step of the process." Source: OPM survey. [End of side bar] Some DOL and OSC respondents expressed dissatisfaction with the thoroughness of the investigation, as figure 5 shows. One respondent wanted DOL "to be more proactive and actually contact the individuals that are committing the wrongdoings and getting away with affecting the veterans that want to work." One OSC respondent wanted OSC to "investigate cases better and conduct interviews before they make their final decisions and not follow by only what the employer says." As the average scores on the customer satisfaction survey show, the largest reported difference in satisfaction between DOL and OSC was in agency timeliness, and the smallest was in staff courteousness. Survey responses also showed that claimants felt DOL kept them better informed, provided better access to staff, and better satisfied them with the overall investigation of their claims. 
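The multivariate analyses discussed later in this section (and in appendix III) summarize such differences as odds ratios. As a minimal sketch of the unadjusted version of that calculation, added here for illustration, consider the overall customer service item in figure 5, where 66 percent of DOL respondents and 34 percent of OSC respondents were satisfied (Python):

    # Odds = p / (1 - p); an odds ratio compares the two agencies' odds.
    p_dol = 0.66  # share of DOL respondents satisfied with customer service
    p_osc = 0.34  # share of OSC respondents satisfied with customer service
    odds_dol = p_dol / (1 - p_dol)  # about 1.9
    odds_osc = p_osc / (1 - p_osc)  # about 0.5
    print(f"odds ratio (DOL vs. OSC): {odds_dol / odds_osc:.1f}")  # about 3.8

This unadjusted ratio of about 3.8 sits within the "two to more than six times" range of adjusted odds ratios described below; the adjusted ratios reported in appendix III come from multivariate logistic regressions that additionally control for case outcome, case processing time, and whether discrimination was alleged. 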
Claimants' responses were coded on a scale of 1 to 5, with 1 representing a "strongly disagree" or "very dissatisfied" response, 3 representing a neutral score, and 5 representing a "strongly agree" or "very satisfied" response. Therefore, the higher the average score, the more positive respondents felt about agency performance. Table 2 depicts the average scores on the customer satisfaction survey for select survey questions, and a table with all the average scores can be found in appendix III. Table 2: On Average, DOL Received Higher Scores on Claimant Interaction with Staff and Overall Satisfaction with Customer Service and Investigation: Question: The staff is courteous; DOL Mean: 4.21; OSC Mean: 3.64; Difference (DOL-OSC): 0.57. Question: I have adequate access to staff for advice and assistance; DOL Mean: 3.79; OSC Mean: 2.63; Difference (DOL-OSC): 1.17. Question: The staff responded to my questions in a timely manner; DOL Mean: 4.10; OSC Mean: 2.61; Difference (DOL-OSC): 1.49. Question: The staff kept me informed of significant case developments; DOL Mean: 3.82; OSC Mean: 2.52; Difference (DOL-OSC): 1.30. Question: Satisfaction with customer service; DOL Mean: 3.70; OSC Mean: 2.50; Difference (DOL-OSC): 1.20. Question: Satisfaction with investigation of complaint; DOL Mean: 3.35; OSC Mean: 2.13; Difference (DOL-OSC): 1.21. Survey sample size; DOL: n=101; OSC: n=151. Key: 1 represents a score of "strongly disagree" or "very dissatisfied" and 5 represents a score of "strongly agree" or "very satisfied". Source: GAO Analysis of OPM Customer Satisfaction Survey Data. GAO-15-77. [End of table] Side bar: USERRA Customer Satisfaction Survey: Select Narrative Responses: "Upon making the complaint [to DOL] an investigator was quickly appointed." "[I] was never able to reach [the] OSC case worker directly. I had to leave voice mail and email messages and wait for a call back which would be 3-5 days." Source: OPM survey. [End of side bar] In light of low survey response rates, we identified variables available for both respondents and nonrespondents that might affect satisfaction and conducted additional nonresponse and multivariate analyses to control for these variables. We found the differences in satisfaction scores for each question remained statistically significant and pronounced even after we took account of, and controlled statistically for, variables that could affect the claimants' views of the customer service provided. These variables include the differences across agencies in case investigation time, whether the claimant indicated that the case was resolved in his or her favor, and whether discrimination was alleged by the claimant. For example, our analysis of the survey responses revealed that more individuals whose cases were not resolved in their favor responded to OSC's surveys than to DOL's. We also found that case outcome significantly affected respondents' mean satisfaction score. To account for and control for these factors, we conducted multivariate regression analyses of survey responses, and calculated adjusted average scores that correct for sources of response and nonresponse bias. Specifically, this analysis showed that the likelihood (or odds) that DOL respondents agreed with statements or expressed satisfaction with DOL was two to more than six times higher than the likelihood (or odds) that OSC respondents agreed with or expressed satisfaction with OSC. 
For example, the likelihood of DOL claimants agreeing that the agency responded to questions in a timely manner was 6.6 times higher than the likelihood of OSC claimants agreeing that the agency responded to questions in a timely manner. For adjusted customer satisfaction scores, and a more detailed explanation of these analyses and their findings, see appendix III. DOL Investigated and Resolved Cases Faster Than OSC: DOL's average case investigation time was less than one-third of OSC's (about 27 percent of the time OSC used). As shown in figure 6, between August 9, 2011, and July 31, 2014, DOL's average investigation time for closed cases was about 41 days, whereas OSC's average investigation time was about 151 days. Average case investigation time also varied over the course of the demonstration project. Both agencies experienced an increase in the average time to investigate cases between fiscal years 2011 and 2013. In fiscal year 2013, DOL's average investigation time was about 48 days, and OSC's average investigation time was about 168 days. Between fiscal year 2013 and the end of July 2014, OSC's average investigation time remained about the same at 167 days, and DOL's average investigation time fell to about 43 days. Figure 6: Department of Labor Resolved Cases Faster: [Refer to PDF for image: 2 vertical bar graphs] Average case investigation time: Number of days: Agency: Labor: 41 (n=308); Agency: Office of Special Counsel: 151 (n=366). Average case investigation time by fiscal year: Number of days: Fiscal year 2011: Agency: Labor: 34 (n=5); Agency: Office of Special Counsel: 7 (n=1). Fiscal year 2012: Agency: Labor: 34 (n=113); Agency: Office of Special Counsel: 111 (n=108). Fiscal year 2013: Agency: Labor: 48 (n=102); Agency: Office of Special Counsel: 168 (n=136). Fiscal year 2014: Agency: Labor: 43 (n=89); Agency: Office of Special Counsel: 167 (n=121). Source: GAO analysis of Labor UIMS and Office of Special Counsel 2000 case tracking data (8/9/2011 to 7/31/14). GAO-15-77. [End of figure] As of July 31, 2014, OSC had a substantially greater proportion of demonstration project cases with investigation times greater than 90 days, when compared with DOL. As shown in figure 7, about 56 percent of OSC's cases were open more than 90 days, with about 32 percent of cases between 30 and 90 days. At DOL, about 9 percent of cases were open more than 90 days, with 45 percent of cases between 30 and 90 days. Under USERRA, DOL is required to investigate and attempt to resolve USERRA claims within 90 days of receipt, unless the claimant agrees to an extension. Under the demonstration project, the same 90-day time limit also applies to OSC. Both agencies explained that they requested extensions from the claimant for cases requiring more than 90 days to investigate. Figure 7: Office of Special Counsel Had a Greater Proportion of Cases Open More Than 90 Days: [Refer to PDF for image: 2 pie-charts] Office of Special Counsel case timeliness: More than 90 days: 56%; 241 cases; Less than 30 days: 12%; 53 cases; Between 30 and 90 days: 32%; 140 cases. Labor case timeliness: More than 90 days: 9%; 28 cases; Less than 30 days: 46%; 148 cases; Between 30 and 90 days: 45%; 143 cases. Source: GAO analysis of Labor UIMS and Office of Special Counsel 2000 case tracking data (8/9/2011 to 7/31/14). GAO-15-77. [End of figure] OSC received extensions for investigations that, at times, lasted longer than 1 year. 
OSC officials explained that they sometimes received open-ended extensions from claimants to provide additional time as necessary to complete their investigation. DOL officials told us their policy does not allow for open-ended extensions; rather, each extension provides claimants with a date certain by which the investigator must either complete the investigation or request an additional extension. As of July 31, 2014, OSC had a total of 48 cases that had been open for more than 1 year, which represents about 11 percent of the 434 total cases they received. At this time, DOL had no cases that had been open for more than 1 year. Of the 48 OSC cases taking more than 1 year to close, 16 were still open at the end of July 2014. Of the 32 cases closed, 6 were resolved in favor of the claimant, or about 19 percent of these cases. As such, the favorable resolution rate for cases open more than 1 year (about 19 percent) was somewhat lower than the rate for all of the agency's closed demonstration project cases (about 26 percent). OSC officials noted that cases open longer than 1 year are often the most complex cases, and therefore take longer to investigate even if they are ultimately not resolved in favor of the claimant. Factors Potentially Influencing Timeliness: OSC officials provided a few potential explanations for their agency's relatively longer average case investigation time. For example, OSC officials said that claimants may withdraw their cases and pursue relief on their own at any time, but many prefer that the agency complete its work and attempt to resolve their claims, even if that takes a significant amount of time. Unlike DOL, OSC does not terminate its investigation if the claimant retains private counsel who is also involved in the case. A DOL official told us the agency may terminate these cases if private counsel investigations interfere with their case investigation. OSC and DOL did not collect information that would allow analysis of timeliness for cases where outside counsel was involved. Therefore, we are unable to determine if this was a primary contributor to OSC's average investigation time. DOL officials attributed their relatively faster case investigation time to factors such as: * their institutional structure, which includes staff in field locations that are available to immediately investigate cases across the country; * the USERRA-specific training provided to staff; * standard operating procedures and related guidance on timely completion of case investigations; and * the composition of their staff, which includes many veterans who are dedicated to the agency's mission to assist servicemembers. Furthermore, DOL officials said supervisors reinforce the importance of timeliness and closely monitor cases to ensure timely resolution. DOL officials acknowledged that some cases take longer than 90 days to resolve, but these should be exceptions. As required by VBA, OSC receives all USERRA cases that involve a prohibited personnel practice (PPP). DOL does not handle cases that involve a PPP. OSC officials explained that these cases are often more complex and may take more time to investigate and resolve than other cases because they often involve multiple allegations and can require much more extensive fact-finding and legal analysis before a determination or resolution can be reached. 
To determine if these cases contributed to OSC's relatively longer case investigation times, we calculated the average investigation time for cases that included a PPP allegation and compared it with the average investigation time for all of OSC's demonstration project cases. We found that during the demonstration project, OSC closed 24 cases involving a PPP allegation, or about 7 percent of all closed cases. The average investigation time for these cases was 201 days, which is about 50 days longer than the average case investigation time for OSC's demonstration project cases. As such, on average, PPP cases take about 34 percent more time to investigate than the average case. Although these cases do take more time to investigate, due to the relatively small number of PPP cases, and because 50 days falls within the normal variation of case investigation time at OSC, we have concluded that PPP allegations were not a primary contributor to OSC's relatively longer case investigation times compared with DOL's. We also considered the potential relationship between case outcome and timeliness. We found that DOL investigations for cases resolved in favor of the claimant took longer, on average, than investigations for cases with an unfavorable outcome; whereas, at OSC, the average case investigation times were about the same. At DOL, the average investigation time for cases that were resolved in favor of the claimant was about 64 days, and for cases not resolved in favor of the claimant was about 35 days (with an average of 41 days for all cases). At OSC, the average investigation time for cases resolved in favor of the claimant was about 150 days, and for cases not resolved in favor of the claimant was about 151 days. While favorable case outcomes are associated with longer case investigation times at DOL, they do not appear to be a contributing factor to case investigation timeliness at OSC. DOL's Case Investigation Costs Were Lower Than OSC's: On average, DOL's cost to investigate a demonstration project case was about 29 percent of OSC's, or less than one-third the cost. The relative difference in agencies' costs was affected by factors such as the number of hours dedicated to case investigations and pay levels, among others. For example, we found that OSC used, on average, more than twice as many staff hours per case as DOL. We did not evaluate OSC's demonstration project costs prior to August 12, 2012, because these cost data were incomplete and we were not able to assess their reliability. As we recommended in our preliminary assessment of the demonstration project's design, DOL and OSC established cost accounting systems by the start of the demonstration project on August 9, 2011, to collect and track actual time spent investigating USERRA demonstration project cases.[Footnote 13] While the cost accounting systems developed at each agency differ somewhat in the way they track time spent, both systems track actual salary, benefits, and indirect cost components by applying an hourly rate that includes those components for each specific employee who works on, and tracks time spent on, demonstration project cases. We conducted tests on each agency's cost and accounting data to ensure these cost components were accurately reflected in their total costs. To determine the total cost, each agency multiplied the hourly rate of each employee who participated in the demonstration project by the time that employee spent working on USERRA demonstration project investigations. 
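The per-case cost figures that follow result from this rate-times-hours computation, divided by the number of cases received. The following sketch illustrates the structure of the calculation (Python, added here for illustration; the employee names, rates, hours, and case count are hypothetical, not actual agency data):

    # Total cost = sum of (employee-specific hourly rate, which folds in salary,
    # benefits, and indirect cost components) x (hours charged to demonstration
    # project cases), plus tracked miscellaneous costs such as shipping.
    hours = {"investigator_a": 310.5, "investigator_b": 268.0}   # hypothetical hours
    rates = {"investigator_a": 62.10, "investigator_b": 55.40}   # hypothetical $/hour
    misc_costs = 150.00                                          # hypothetical, e.g., shipping

    total = misc_costs + sum(rates[emp] * hrs for emp, hrs in hours.items())
    cases_received = 12                                          # hypothetical
    print(f"total: ${total:,.2f}; average per case: ${total / cases_received:,.2f}")

Applied to the actual totals reported below, the same division yields the report's per-case averages: DOL's $354,712 across 319 cases is about $1,112 per case, and OSC's $1,055,377 across the 277 cases it opened in the comparable period is about $3,810 per case. 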
Agencies also tracked indirect miscellaneous costs, such as shipping, and included these costs in their totals. OSC was not able to provide us with complete cost information for the demonstration project in time for us to analyze the complete set of data and assess its reliability. On August 12, 2012, OSC changed the methodology it used to track and report costs during the demonstration project to be more consistent with the approach used by DOL. As such, we are not evaluating demonstration project costs at OSC prior to that date, and are only able to report on the cost information that we received and assessed for reliability. Between the beginning of the demonstration project on August 9, 2011, and July 31, 2014, DOL investigated 319 claims at a total cost of $354,712. During the period for which we have comparable and reliable data, from August 12, 2012, to August 1, 2014, OSC's demonstration project costs totaled $1,055,377. Importantly, these totals include costs for both closed and ongoing cases during the periods described. Because 79 demonstration project cases were still open at the end of July, and due to the incomplete information received from OSC, the final cost of the demonstration project was not known at the time of this product's issuance. Of the 319 claims investigated by DOL, the agency closed 308. As such, there were 11 demonstration project cases still open. The total cost of $354,712 covers all 308 closed cases, and work completed on the 11 open cases during this period. On average, the agency spent $1,112 on each case investigated during the demonstration project. The agency reported that demonstration project staff dedicated 6,579 hours to investigate demonstration project cases, or about 21 hours for each case received during this period. DOL officials explained that their total cost does not include some costs for support staff, case intake processing, and managers who assisted, as needed, with some demonstration project activities. DOL officials explained that they did not track these costs because it was not practically feasible, and the agency would have incurred these costs regardless of their participation in the USERRA demonstration project. To compare agencies, we also linked OSC's cost information to the number of cases that were opened and closed during the period for which comparable and reliable cost information was available. Between August 12, 2012, and August 1, 2014, OSC opened 277 cases and closed 275 cases. As such, OSC spent about $3,810 for each case that it opened during this period, or about $3,838 for each case that the agency closed during this period. Furthermore, the agency reported that demonstration project staff dedicated 14,864 hours to investigate demonstration project cases, or about 54 hours for each case received during this period. On average, this is more than double the number of hours spent per case at DOL. The total cost of $1,055,377 covers the investigation costs of all cases worked during this period, and is not necessarily limited to the cases that were opened and closed during this time. Each Agency Demonstrated Different Capabilities to Investigate and Resolve Cases: VBA refers to five criteria for consideration when evaluating DOL's and OSC's capacity to investigate federal USERRA claims: staffing levels, caseload, training, education, and grade level. 
In addition, agencies provided their view of other distinguishing characteristics that enhance their ability to effectively and efficiently investigate and resolve USERRA claims. We could not determine relative performance on agency capacity due to the lack of a specific and comparable metric. DOL Has More USERRA-Dedicated Staff, While OSC Has More Cases Assigned Per Investigator and Higher Graded Staff: During the demonstration project, DOL had more staff available to investigate cases, with lower average USERRA caseloads, than OSC. OSC's investigative staff generally had higher pay levels (or higher pay grades) than DOL's. Both agencies' staff had varying levels of education and experience. During the demonstration project, DOL had 31 staff investigating demonstration project cases, and other nonfederal USERRA or veterans' preference cases.[Footnote 14] OSC had 7 staff investigating demonstration project cases.[Footnote 15] According to DOL officials, their investigators have varying levels of education; the officials did not provide specific information. Rather, DOL officials suggested that the level of investigators' experience can serve as a proxy for education, as will be discussed later. Of OSC's seven investigators, five are attorneys with Juris Doctor degrees (J.D.), one has a Master's degree, and one has a Bachelor's degree. OSC also had additional staff throughout the demonstration project, including a part-time Alternative Dispute Resolution (ADR) specialist--also with a J.D. and a Master's degree in Conflict Analysis and Resolution--from OSC's ADR unit. OSC also employed six legal interns who were J.D. candidates and two different Presidential Management Fellows (PMF) with J.D.s during the demonstration project. The PMFs were full-time employees and were responsible for investigating cases, whereas the interns, some of whom worked part-time and some full-time, provided case intake and research support and were not assigned cases to investigate. On average, these individuals served for periods of 3 to 6 months each. Table 3: DOL Has More Investigative Staff with Generally Lower Pay Levels While OSC Had Fewer Investigative Staff with Higher Caseloads: DOL Investigators; Number of Staff who Investigated Demonstration Project Cases: 31; General Schedule (GS) Pay Range[A]: GS 12-13; Average Annual Caseload[B]: 5. OSC Investigators[C]; Number of Staff who Investigated Demonstration Project Cases: 7; General Schedule (GS) Pay Range[A]: GS 11-14; Average Annual Caseload[B]: 28. Source: GAO analysis of DOL and OSC information. GAO-15-77. [A] The General Schedule classification system is a mechanism for organizing work, notably for the purposes of determining pay, based on a position's duties, responsibilities, and qualification requirements, among other things. [B] These figures represent the average number of cases an investigator would have been assigned over a 12-month period. We calculated weighted averages because the number of staff at both agencies was not always constant and the demonstration project was not ongoing for the entirety of fiscal years 2011 and 2014. [C] This number includes a supervisor who also investigated cases. It does not include the part-time ADR specialist who supports the USERRA Unit. [End of table] DOL's investigators were assigned an average of about five demonstration cases per investigator per year during the demonstration project.
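As note [B] indicates, these caseload figures are weighted averages. The following is a minimal sketch of that computation in Python, using hypothetical per-year staffing and case figures rather than the agencies' actual data.

# Hypothetical per-fiscal-year figures: (demonstration cases opened,
# investigators on staff, months the demonstration project was ongoing).
fiscal_years = [
    (30, 6, 2),     # FY2011: project began in August 2011
    (120, 18, 12),  # FY2012
    (100, 20, 12),  # FY2013
    (74, 19, 10),   # FY2014: data collected through July 2014
]

# Each year's caseload (cases opened per investigator) is weighted by the
# number of months the project ran in that fiscal year, then averaged.
weighted_sum = sum((cases / staff) * months
                   for cases, staff, months in fiscal_years)
total_months = sum(months for _, _, months in fiscal_years)
print(f"weighted average annual caseload: {weighted_sum / total_months:.1f}")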
Demonstration project investigators had other responsibilities, such as other nonfederal USERRA or veterans' preference cases, and the average demonstration project caseload varied by year. The average demonstration project caseload at DOL ranged from about seven cases per investigator in fiscal year 2012 to about four cases in fiscal year 2014. The agency's investigators did not work solely on demonstration cases during the demonstration project. They were assigned other non-demonstration cases as well, as will be discussed later. Our analysis of the caseload showed that, over the course of the demonstration project, OSC averaged about 28 cases opened per employee each year.[Footnote 16] OSC officials told us each non-supervisory USERRA Unit investigator or attorney generally has between 10 and 20 open cases on his or her docket at any given time, and that the number per fiscal year fluctuates based on the complexity and timing of each case. When the demonstration project began, OSC had four attorneys staffed to its USERRA Unit to investigate and resolve cases, including the Unit Chief. Beginning in the spring of 2012, OSC began hiring other investigators and temporary staff to help with the caseload. OSC's average caseload ranged from 39 cases per attorney per year during part of fiscal years 2011 and 2012 to 20 cases per attorney per year during part of fiscal year 2012.[Footnote 17] Similar to the DOL investigators, OSC investigators had additional responsibilities during the demonstration project. DOL and OSC Staff Received Case Investigation and Resolution Training: DOL investigators receive formal USERRA-specific training as well as training on conducting investigations, according to DOL officials. The investigators must complete an online training course and a 2-week class at the National Veterans Training Institute (NVTI) before investigating cases, and have additional training and professional development opportunities. The online training course has four components, and the NVTI class focuses on basic USERRA training and investigation training. Specifically, investigators are trained on USERRA and its regulations, claim processing, determining eligibility for USERRA, USERRA remedies, contacting employers and claimants, negotiation skills and techniques, investigations and evidence, resolution conferences, negotiations, interviewing methods and techniques, types of respondents, credibility of witnesses/witness statements, and confidentiality and ethics. In addition, DOL investigators receive on-the-job mentoring and shadowing prior to independently investigating cases. According to DOL officials, additional training opportunities are contingent on the training budget. Some investigators took DOL investigative training while also participating in external training classes through organizations such as the Army Inspector General School. OSC does not have a formal USERRA training program for its staff because, according to OSC officials, a majority of the current USERRA staff has experience from the previous demonstration project. However, many USERRA Unit staff received formal training on ADR techniques for USERRA investigations. Specifically, OSC's ADR Unit conducted training on mediation, conflict resolution, beginning negotiation, and advanced negotiation that was attended by USERRA Unit staff.
To help employees understand the documentation needed to perform case investigations, OSC provides its new USERRA employees with several training materials, including a PowerPoint slide presentation, sample correspondence, copies of relevant laws and regulations, and a series of written training modules that include fact sheets, flow charts, common scenarios, and questions and answers on USERRA law. According to OSC officials, these training materials are given to new employees to use for background information and to reference when performing casework, and are periodically updated to reflect recent court decisions and legislative changes. The USERRA Unit Chief and other experienced members also provided on-the-job training and mentoring to interns, PMFs, and a law clerk during the demonstration project. The USERRA Unit also maintains a number of training modules covering common scenarios, which USERRA Unit staff can access. These modules are periodically updated as new cases arise and during an annual review of USERRA-related cases. DOL Has More Investigators with More Experience and Information Technology Infrastructure Benefiting USERRA Investigations: DOL has been investigating USERRA claims since the USERRA law was passed in 1994, and its investigators have experience investigating these claims. Among all of DOL's current investigators, 41 percent have less than 5 years of experience, 41 percent have 5 to 10 years of experience, and 19 percent have more than 10 years of experience at the agency.[Footnote 18] The investigators who worked on demonstration project cases had an average of 8.9 years of experience. In addition to having a relatively large demonstration project staff, DOL's investigators also investigate numerous claims annually. The 31 investigators who worked on demonstration cases were assigned an average of about 15 USERRA cases (demonstration and non-demonstration) per investigator per year during the time they were employed by DOL. Overall, all current DOL investigators have been assigned an average of about 11 USERRA cases per year since 1996, when the agency began tracking these claims.[Footnote 19] According to DOL, its larger pool of investigators is a benefit because USERRA work will not be affected by individual staff availability or turnover. There will be other DOL investigators who can step in to perform the USERRA work as needed. DOL said the agency is investing in its USERRA Information Management System electronic case tracking database through upgrades that will enhance its capabilities and transform the case tracking database into a case management system. According to DOL officials, the upgrades to the database will better protect the personally identifiable information in the case files, allow for better oversight of case investigations, and increase the efficiency of case processing. DOL plans to make this investment regardless of the results of the demonstration project because federal USERRA cases represented only about 20 percent of all USERRA cases over the past 2 years. DOL has an online system called the elaws advisor that assists potential USERRA claimants. The elaws advisor has a logic and decision-tree function that asks claimants questions, provides information on USERRA to help potential claimants decide whether they think their claim is valid, and explains how to file a claim.
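The advisor's question-and-answer logic can be pictured as a simple decision tree. The sketch below is hypothetical; it does not reproduce DOL's actual elaws questions or determinations.

# Hypothetical decision tree: each node poses a yes/no question and routes
# to another node or to an outcome message; outcomes are illustrative only.
TREE = {
    "service": ("Did you leave a civilian job to perform uniformed service?",
                {"yes": "notice", "no": "no_claim"}),
    "notice": ("Did you give your employer advance notice of your service?",
               {"yes": "reapply", "no": "maybe_claim"}),
    "reapply": ("Did you seek reemployment within the required period?",
                {"yes": "maybe_claim", "no": "no_claim"}),
}
OUTCOMES = {
    "no_claim": "USERRA may not apply; see the information pages for details.",
    "maybe_claim": "You may have a valid claim; here is how to file one.",
}

def advise(answers):
    """Walk the tree using a dict mapping node name to 'yes' or 'no'."""
    node = "service"
    while node in TREE:
        _question, routes = TREE[node]
        node = routes[answers[node]]
    return OUTCOMES[node]

# Example: a claimant who served, gave notice, and reapplied on time.
print(advise({"service": "yes", "notice": "yes", "reapply": "yes"}))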
According to DOL officials, this online system has reduced the number of phone calls received, improved communication, and enabled claimants to more easily submit claims. OSC Uses Alternative Dispute Resolution to Facilitate Resolution and Has Additional Responsibility to Represent Claimants: OSC Alternative Dispute Resolution. According to OSC officials, in September 2012, OSC updated its mediation process after conversations with stakeholders, agency counsel, and servicemember organizations, including the Employer Support of Guard and Reserve, and now uses a mediation-based ADR program to help claimants and agencies resolve USERRA claims. According to OSC officials, the program follows the requirements of the Administrative Dispute Resolution Act of 1996 for conducting federal agency mediations.[Footnote 20] OSC designed and implemented a USERRA-focused ADR program to provide additional resolution process options (such as mediation) for servicemembers who filed USERRA claims and the agencies against which the claims are filed. The ADR process is voluntary, but if the parties agree to use it, OSC mediators bring them together in a confidential, nonadversarial environment to find a mutually satisfactory resolution to the dispute. As such, the ADR process relies heavily on mediation, through which a neutral third party works with the disputing parties to open lines of communication and explore interests. OSC officials told us the ADR process gives servicemembers the opportunity to resolve their claims with more carefully tailored results that may better meet their needs than strict legal remedies. OSC officials explained that the agency uses a small number of core mediators, generally not USERRA Unit employees, who have a combined 50 years of experience, education, and training in dispute resolution. Figure 8 shows OSC's ADR and case investigation process. Figure 8: OSC Uses ADR Process to Mediate USERRA Cases to Offer Claimants Additional Resolution Options: [Refer to PDF for image: process illustration] 1. Refer odd-numbered cases to Office of Special Counsel. 2. Cases simultaneously assigned for investigation and reviewed for potential alternative dispute resolution (ADR): ADR is not offered or parties decline: go to #3; ADR is appropriate: go to #4. 3. Case is investigated: Case referred for ADR during investigation: go to #4. 4. Mediation offered; if parties agree, case is mediated; if no settlement occurs: return to #3. 5. If settlement occurs, case is closed. Source: Office of Special Counsel. GAO-15-77. [End of figure] According to agency officials, since September 2012, OSC has used the ADR process for 19 demonstration cases, settling 17 of the 19 cases, or 89 percent. Some of the settlements have resulted in systemic changes in agencies' policies and procedures that will impact other servicemembers. For example, OSC secured a change to one federal agency's "extended absence" form so that it now includes a category for extended absence to perform military service, which OSC believes will better ensure that servicemembers receive their full USERRA entitlements following service. OSC's Legal Experience.
Furthermore, OSC officials attribute their ability to handle USERRA claims to their expertise on the federal workforce and federal personnel law; training and experience in investigating, resolving, and litigating federal employment claims, including USERRA and PPPs; well-established relationships with federal agencies; and an expanded ADR program. OSC also has the responsibility of deciding whether to represent a claimant before the Merit Systems Protection Board (MSPB). OSC officials told us that OSC has had this responsibility since 1994 and has successfully resolved dozens of USERRA cases before MSPB without litigation. Three of the five attorneys in the USERRA Unit have between 3 and 7 years of experience handling the USERRA cases referred from DOL and deciding whether to offer claimants representation before MSPB, according to OSC officials. In addition, five of the seven employees who worked on current demonstration project cases also worked on cases received during the prior demonstration project from 2005 to 2008. OSC IT Infrastructure. OSC officials also stated that they are investing in updates to their primary case tracking database, OSC 2000, and are developing a system whereby all case files will be retained digitally. This will reduce waste and make the assignment of cases more efficient. OSC officials noted that they updated OSC's official website to include a complaint dashboard where prospective claimants can select the type of claim they would like to file, with a brief description of each type of violation--USERRA or the other types of cases OSC investigates. OSC officials told us they also added functionality on their website enabling prospective claimants to file claims online. OSC Received Reimbursement from DOL for Demonstration Project Costs. In January 2012, DOL and OSC entered into an interagency agreement that provided OSC with reimbursement for demonstration project costs. This agreement was signed more than a year after Congress passed VBA and about 5 months after the start of the demonstration project. The agreement was based on a similar interagency agreement negotiated in 2005 during the first USERRA demonstration project. According to agency officials, the reimbursement rates were based on the rates agreed to in 2005, adjusted upward for inflation. DOL reimbursed OSC between $3,184 and $3,379 per demonstration case it closed, depending on the fiscal year the case was referred to OSC. Customer Satisfaction Can Provide Meaningful Feedback for Service Improvements: Customer Satisfaction Survey Ends: DOL and OSC no longer receive feedback on customer satisfaction now that the demonstration project has ended. During the demonstration project, OPM administered the customer satisfaction survey, analyzed the survey data annually, and gave each agency the quantitative results and the qualitative comments respondents provided. Both agencies reported gaining insights on improving service to claimants based on information provided through the survey. For example, both agencies used information collected from survey respondents and survey analysis provided by OPM to make incremental improvements in their USERRA demonstration project operations. Specifically, DOL officials told us that they used survey information to improve their communication with claimants. DOL officials said they now engage in more telephone and email interactions with claimants.
OSC officials also took actions to improve their customer service based on survey results by providing earlier and more frequent contact with claimants (at least once a month), informing the claimant of the preliminary determination via phone before mailing the determination letter, and establishing a goal to respond to claimant emails and phone calls within 1 business day. The Office of Management and Budget (OMB) has emphasized the importance of agencies setting customer service standards, regularly soliciting customer feedback, and using the feedback received to improve their services. To this end, OMB established a cross-agency priority goal aimed at adopting customer service best practices. According to this priority goal, government programs that directly serve the public can benefit from understanding customer expectations and service needs, and regularly evaluating and improving program effectiveness in meeting those needs. One way to accomplish this is by conducting customer satisfaction surveys and analyzing the results to identify opportunities to improve service. The customer satisfaction survey for the USERRA demonstration project ceased on July 28, 2014, because the survey was created solely to support the demonstration project. Neither DOL nor OSC has an ongoing agreement with OPM to continue administering the customer satisfaction survey for USERRA claimants. Also, neither agency has developed other plans to continue the customer satisfaction survey. The cost to administer the customer satisfaction survey was $20,000. This amount represents slightly more than 5 percent of DOL's total investigation costs, and about 2 percent of OSC's investigation costs between August 2012 and August 2014. As previously discussed, DOL spent $354,712 and OSC spent more than $1,055,377 on investigating demonstration cases. Although agencies used survey results to make adjustments, the survey ended in July 2014. An ongoing feedback mechanism would provide agencies a continued opportunity to enhance customer service. Low Response Rates Limit Agency Efforts to Make Service Improvements: Both agencies had a low response rate to the customer satisfaction survey. As of July 28, 2014, DOL had 101 claimants respond, for a response rate of 32 percent, and OSC had 151 claimants respond, for a response rate of 42 percent.[Footnote 21] Of the 101 DOL respondents, 69 provided narrative comments on the final two questions. Of the 151 OSC respondents, 135 provided narrative comments on the final two questions. A high response rate increases the likelihood that the survey results reflect the views and characteristics of the target population, whereas a low response rate can be an indicator of potential nonresponse bias, which can undermine the accuracy of a study's results in a variety of ways. OMB guidance recommends that executive branch agencies try to achieve the highest practical rates of survey response. Moreover, OMB suggests that when survey response rates are less than 80 percent, agencies conduct a nonresponse analysis to identify potential limitations of the data.
As we recommended in 2012, agencies undertook additional efforts to increase the response rate, including providing claimants with an initial notification of the survey; however, agencies did not pursue other methods to increase response, such as contacting respondents over the phone or providing more than two additional follow-up notifications, because they did not want to aggravate claimants with repeated follow-up requests.[Footnote 22] Conclusions: We analyzed agencies' relative performance on the five demonstration project metrics outlined in the VBA, including case outcomes, customer satisfaction, timeliness, cost, and capacity. Based on our analysis, DOL demonstrated relatively higher levels of performance than OSC on most of these performance metrics. Specifically, DOL demonstrated higher levels of customer satisfaction and resolved cases in about one-third of the time and for about one-third of the cost, on average; whereas OSC resolved a greater proportion of cases in favor of the claimant. The relative difference in agencies' costs was affected by factors such as the number of hours dedicated to case investigations and pay levels, among others. However, there are other considerations affecting agency performance, such as differing resource levels, staffing levels and qualifications, and case review and investigative approach. Our report provides Congress with agencies' relative performance information that may help inform the policy decision on the future responsibilities of the two agencies for the processing of USERRA claims against federal executive agencies. In response to our past recommendations, DOL and OSC worked together to establish and administer a customer satisfaction survey that solicited feedback on service provided to claimants. The customer satisfaction survey concluded on July 28, 2014, and, although DOL and OSC undertook additional efforts to increase the response rate, the response rate remained low. Agencies do not have plans to administer a customer satisfaction survey in the future. Officials at both agencies told us they benefited from the information collected during the survey, and used survey information to make improvements in their operations to better serve claimants. Without ongoing access to customer satisfaction information, both agencies will be unable to track satisfaction levels over time, and may miss opportunities to receive feedback from servicemembers and make additional improvements to federal USERRA operations in the future. Recommendations for Executive Action: We recommend that any federal agency designated to investigate future USERRA claims against federal executive agencies take the following two actions: * Continue administering a customer satisfaction survey, whether administered by OPM or the agency, so the agency investigating federal USERRA claims can receive consistent feedback and improve service to claimants. * Undertake efforts to increase the response rate of the customer satisfaction survey if it continues to be administered, so more tenable conclusions can be drawn from its data. Such efforts may include follow-up phone calls to nonrespondents, additional email notifications requesting participation in the survey, or making the survey easier to complete and submit. Agency Comments and Our Evaluation: We provided a draft of this report to DOL and OSC for review and comment. DOL concurred with our two recommendations and said it is committed to continuous improvement of the USERRA program.
DOL comments are reprinted in appendix IV. In response to our recommendations, DOL stated it has plans to continue a customer satisfaction survey for USERRA claimants in fiscal year 2015 and will take steps to maximize response rates. Recognizing the need for continuous improvement, DOL added that the agency invested in an electronic case management system for implementation in fiscal year 2015. DOL also provided technical comments, which we incorporated as appropriate. In its written comments, OSC did not say whether it agreed with our two recommendations, and it expressed concerns about our characterization of performance data and conclusions. OSC's comments, including examples of case outcomes, are reprinted in appendix V. OSC also provided technical comments, which we incorporated as appropriate. OSC expressed a concern that we ignored its efforts and successes in securing relief for veterans. OSC characterized the report as containing unreliable data, unsupportable conclusions, and a subjective assessment of relative performance. Our report presents a fair, balanced, and objective portrayal of relative performance between OSC and DOL. Since the beginning of the demonstration project in 2011, we worked with both agencies to develop an approach for collecting and reporting comparable performance data. Our report acknowledges the complexity of assessing relative performance for the performance metrics outlined in VBA. As we report, performance can be affected by factors such as the investigative approaches used by agencies, case type, and other factors. Where appropriate, we have provided additional information or clarification to ensure that the performance information is viewed in the appropriate context. Our responses to the specific points raised by OSC are as follows. OSC expressed several concerns with our analysis of case outcomes. OSC commented that we diminished and obscured performance information regarding case outcomes. OSC also claimed we abdicated responsibility by not providing a qualitative assessment of case outcomes based on summary case information. Our report provides clear information regarding the number of cases received and closed, as well as the percentage of cases resolved in favor of the claimant. It was not our intention to assess the merits of agencies' case outcomes. At the beginning of the demonstration project, we made clear to OSC that it would be inappropriate for us to review case files and make an independent determination of the merits of agencies' case outcomes. Moreover, as previously stated, we worked with the agencies prior to the demonstration project to develop a method to ensure that case outcomes could be described in a consistent manner, and that a comparison could be made at the conclusion of the demonstration project. While we did not attempt to assess the merits of agencies' case outcomes, our report provides examples of specific relief claimants received and information on the types of outcomes OSC achieved. OSC objected to our inclusion of 7 cases from DOL that were decided in favor of the claimant but did not result in the claimant receiving relief. To address this concern, our report provides additional details describing our treatment of such cases. We had to rely on agencies' determination of whether certain cases were resolved in favor of the claimant because certain case outcomes were not included in the cross-walk.
Specifically, as we report, OSC's totals include 3 cases in which corrective action was declined by the claimant, or about 3 percent of the 94 cases the agency resolved in favor of the claimant. DOL's totals include 7 cases that had merit but which were not resolved, or about 11 percent of the 62 cases the agency resolved in favor of the claimant. According to OSC officials, the code "corrective action declined by claimant" indicates that the agency offered resolution but the claimant declined it. According to DOL officials, the code "merit, not resolved" indicates that the claim was meritorious, but the agency did not offer resolution, or the claimant declined the resolution that was offered. OSC expressed several concerns with our analysis of agency cost data. OSC stated that the cost information presented is unverifiable. We believe the cost information presented in our report is reliable. We took steps to assess the reasonableness of cost figures reported by agencies. We conducted tests on each agency's cost and accounting data to ensure appropriate cost components were accurately reflected in their total costs. We independently reviewed supporting documentation and verified that the cost information was reasonable. For more details regarding our analyses and data reliability assessments, see our scope and methodology in appendix I. OSC stated that DOL's total costs are incomplete because the total cost and hours reported constitute only about 105 hours per year per DOL investigator, or about 5 percent of the 2,088-hour annual work schedule. OSC's statement is based on the false assumption that the 31 DOL investigators who worked on USERRA demonstration project cases worked on them full time. We have added language to make clear that DOL investigators had other responsibilities. Specifically, we added information explaining that DOL's investigators also worked on other nonfederal USERRA or veterans' preference cases. OSC also speculated that the average annual caseload of 15 USERRA demonstration and non-demonstration project cases constituted DOL investigators' full workload. As we clarified, investigators also worked on veterans' preference cases, which are not included in this average annual caseload figure. Furthermore, we did not collect information on the time spent on non-demonstration project cases or other duties. Thus, we cannot draw conclusions about the average time, or proportion of time, dedicated to non-demonstration project related work activities. OSC stated we reported that neither agency tracked the costs of individual cases and that we therefore were unable to report the cost of closed cases. However, we reported the total cost and the average cost of investigating a case during the demonstration project. OSC stated that, in 2012, we requested that the agency change its methodology for tracking costs to be more consistent with DOL's. In 2011, at the beginning of the demonstration project, we recommended that agencies establish comparable methods and procedures for tracking and reporting demonstration project costs. We did not suggest a specific method or approach for agencies to follow, but asked that agencies agree to a comparable approach that would facilitate a relative comparison of costs at the conclusion of the demonstration project. Agencies implemented this recommendation, which enabled us to provide comparable cost information for both agencies in this report. OSC expressed concern that we failed to put the customer satisfaction survey results in the proper context.
OSC stated that we did not expand on the limitations and potential biases of the survey data until the end of our report. We acknowledge the low survey response rate throughout the report and identify the actions we took that enabled us to draw conclusions from the survey. Our analyses revealed that differences in DOL and OSC satisfaction scores for each question remained statistically significant and pronounced even after controlling for variables that could affect the claimants' views of the customer service provided. However, we made minor adjustments in the Highlights page language to clarify that, due to the low response rate, we undertook additional statistical analyses to control for potential sources of bias. OSC stated that we failed to include OPM concerns about biases in the survey data. OPM's conclusions were based on a nonresponse analysis conducted in 2013, and not the final analyses provided in this report. OPM's analyses did not control for variables that could affect claimants' views of the services received from each agency. OSC stated that our decision to report the raw (not adjusted) customer satisfaction survey scores in the body of the report is misleading. OSC said that we reported adjusted scores only in appendix III. We reported on both raw and adjusted scores in our report. Both the actual and adjusted scores demonstrate, with a high degree of confidence, that DOL's respondents were more satisfied than OSC's respondents. We will send copies of this report to the Secretary of Labor, the Special Counsel, and other interested parties. This report will also be available at no charge on GAO's website at [hyperlink, http://www.gao.gov]. If you have any questions on this report, please contact me at (202) 512-2717 or jonesy@gao.gov. Contact points for our offices of Congressional Relations and Public Affairs may be found on the last page of this report. Key contributors to this report are listed in appendix VI. Signed by: Yvonne D. Jones: Director, Strategic Issues: [End of section] Appendix I: Objectives, Scope, and Methodology: The Veterans' Benefits Act of 2010 (VBA) required us to undertake a final assessment of the demonstration project and provide a report to Congress 90 days after the end of the demonstration project. This report (1) assesses agencies' relative performance under VBA performance metrics including case outcomes, customer satisfaction, timeliness, cost, and capacity; and (2) identifies actions agencies can take to improve service. To assess the agencies' relative performance, we reviewed the demonstration project requirements set forth in VBA and compared final agency data on case outcomes, timeliness, customer satisfaction, and cost. We also reviewed information on agency capacity, including staffing levels, grade level, training, education, and caseload. The demonstration project period began on August 9, 2011, and concluded on August 9, 2014; however, to ensure we met the mandated reporting deadline, this report includes data collected through the end of July 2014. We also provided comparative and descriptive explanations of agency capacity. Case Outcomes and Timeliness. To assess case outcomes and timeliness, we analyzed data from the Department of Labor's (DOL) case tracking system (the USERRA Information Management System, or UIMS) and the Office of Special Counsel's (OSC) case tracking system (OSC 2000) for demonstration cases opened between August 9, 2011, and July 31, 2014.
We also reviewed relevant agency documents and interviewed DOL and OSC staff. We assessed case tracking data to identify the number of cases received, the number of cases resolved in favor of the claimant, and those not resolved in favor of the claimant. To identify the number of cases received, we removed duplicate and non-unique claims from both agencies, and totaled the number of claims received and resolved each fiscal year. As such, our case totals may vary from the number of cases reported by agencies in prior Uniformed Services Employment and Reemployment Rights Act of 1994 (USERRA) progress reports issued to Congress. To determine the cases resolved in favor of the claimant, we reviewed the agencies' cross-walk of case outcomes to identify corresponding case closure dispositions and case closure codes. We also interviewed agency officials. DOL uses the resolution codes claim granted, claim settled, and merit, not resolved, to identify cases resolved in favor of claimants. OSC uses the codes dispute resolved, corrective action, and complainant declines corrective action offered. To assess the timeliness of cases, we calculated the average case investigation time for each fiscal year and for all demonstration project cases closed between August 9, 2011, and July 31, 2014. We also calculated the average age of demonstration project cases that were open as of July 31, 2014. In addition, we grouped cases into one of three categories based on the investigation processing times for both open and closed cases--cases open less than 30 days, cases open between 30 and 90 days, and cases open more than 90 days. We also considered factors that may have contributed to the timeliness of case resolution, such as whether the case was decided in favor of the claimant, and whether the case involved a prohibited personnel practice. To assess the reliability of agency case tracking data, we reviewed agency documentation on any significant operational or case management changes occurring since our last report, issued in September 2012. We tested the data for missing entries, errors, and duplicate entries, and performed other logic testing. We also reviewed related agency documentation, including our previous reports, and interviewed DOL and OSC staff. We determined that internal controls for the demonstration project had not changed substantially since our past reviews. We generally found low rates of missing data or erroneous dates pertinent to our analysis. For cases in which we found missing information, or dates or Social Security numbers out of sequence, we followed up with the agency and, as appropriate, updated our analysis files with corrected information. For example, we found that OSC had received a number of demonstration project cases from claimants who did not provide a Social Security number and that had an even-numbered opening date. According to the demonstration project procedures agreed to by both agencies, OSC was authorized to accept claims that did not have a corresponding Social Security number if they were filed on an odd date, whereas DOL would accept such cases filed on even dates. We asked OSC about this discrepancy, and officials there explained that for some cases that did not have a corresponding Social Security number, the agency received claims on an odd-numbered day but did not enter the claim into its case tracking system until the next day. OSC provided us with the correct case opening date for these cases.
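A minimal sketch of these timeliness calculations in Python, using hypothetical case records rather than data from either agency's case tracking system:

from datetime import date

AS_OF = date(2014, 7, 31)  # cutoff used for open cases in this sketch

# Hypothetical cases as (opened, closed) date pairs; closed is None if open.
cases = [
    (date(2013, 1, 10), date(2013, 2, 5)),
    (date(2013, 3, 1), date(2013, 7, 20)),
    (date(2014, 6, 15), None),
]

def days_open(opened, closed):
    # Closed cases yield investigation time; open cases yield age at cutoff.
    return ((closed or AS_OF) - opened).days

def category(days):
    # The three groupings used in the report; exact boundary handling at
    # 30 and 90 days is an assumption of this sketch.
    if days < 30:
        return "open less than 30 days"
    if days <= 90:
        return "open between 30 and 90 days"
    return "open more than 90 days"

closed_times = [days_open(o, c) for o, c in cases if c is not None]
print(f"average investigation time: {sum(closed_times) / len(closed_times):.0f} days")
for opened, closed in cases:
    print(category(days_open(opened, closed)))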
DOL officials told us they followed the agency's existing USERRA operations manual during the demonstration project to ensure data reliability and validity, while OSC drafted a data reliability plan specifically for the demonstration project. Based on the collective results of our data reliability assessment, we consider the data elements we assessed in DOL and OSC case tracking databases to be sufficiently reliable for the purposes of evaluating relative performance of DOL and OSC during the demonstration project. Customer Satisfaction. To assess customer satisfaction, we analyzed data, results, and narrative responses from the USERRA customer satisfaction survey. Surveys were sent by each agency to claimants upon resolution of their cases. Survey data were collected independently by the Office of Personnel Management (OPM). The survey data collection ended July 28, 2014, for our reporting purposes. Agencies entered into an agreement with OPM to administer the satisfaction survey and provide interim reports with customer satisfaction performance information. Because the response rate to the survey was low, we performed additional analyses, including multivariate and nonresponse analyses, to understand what conclusions could be drawn from the data and which variables might affect the results. For a detailed description of the analyses completed and their findings, see appendix III. We also interviewed agency officials at OSC, DOL, and OPM to gather supporting documentation, including the survey instrument and other relevant information, to facilitate our analyses of the customer satisfaction survey data. Furthermore, we reviewed relevant documents, such as interim survey reports and the interagency agreement between DOL and OPM for the survey administration, and the statement of work between DOL, OSC, and OPM for survey responsibilities. We assessed the reliability of the customer satisfaction survey data by testing the data for missing entries and errors, and employing other logic testing. We reviewed OPM documentation and interviewed OPM, DOL, and OSC staff. In addition, OPM described and provided supporting documentation of the procedures it has in place to ensure data reliability and validity, including running checks on the data for completeness. Based on the collective results of our data reliability assessment, we consider data provided by OPM on the customer satisfaction survey to be sufficiently reliable for the purposes of evaluating relative performance of DOL and OSC during the demonstration project. Agency Costs. To assess demonstration project costs, we reviewed and analyzed cost and accounting data from DOL and OSC, including supporting documentation such as the number of hours dedicated to demonstration project cases. While the cost accounting systems developed at each agency differ somewhat in the way they track time spent, both systems track actual salary, benefits, and indirect cost components by applying an hourly rate that includes those components for each specific employee who works on, and tracks time spent on, demonstration project cases. To determine the total cost, the agencies multiplied the hourly rate for all personnel who participated in the demonstration project by the total time spent working on USERRA demonstration project investigations. We also interviewed DOL and OSC staff responsible for collecting and reporting cost information. DOL provided cost and accounting data for the full demonstration project, covering the period August 9, 2011, to July 31, 2014.
OSC provided complete and comparable cost data for the period August 12, 2012, to August 1, 2014, or about 24 months out of the 36-month demonstration project. OSC officials explained that they did not provide complete and comparable cost data prior to August 12, 2012, because the agency changed its process and methods for tracking and reporting data to be more consistent with the process and methods used by DOL. Prior to August 12, 2012, OSC tracked the costs of each case and reported costs of closed cases. After August 12, 2012, OSC began collecting and reporting costs for all demonstration project cases worked, including cases that were still open, and stopped tracking costs on a case-by-case basis. On September 4, 2014, OSC provided us with additional cost information for demonstration project cases that were investigated between May 2012 and August 12, 2012. These data were received about three weeks after the deadline we established for submitting cost and performance information. OSC also provided cost and accounting information for cases that were closed between August 2011 and May 2012. These data were incomplete because they excluded costs incurred for cases that remained open during this period. We did not include these data in our assessment of demonstration project costs because they were incomplete or not directly comparable to DOL costs, and because we did not have sufficient time to verify the reliability of all of the data prior to our congressionally mandated reporting deadline. We assessed the reliability of DOL's and OSC's USERRA cost accounting systems by testing the data for missing entries and errors, employing other logic testing, reviewing DOL and OSC documentation, and interviewing agency staff. Furthermore, we determined that agencies developed steps for ensuring the reliability of cost data, including developing USERRA operations manuals, providing instructions to staff entering the data, and describing the steps for reviewing the data after entry by staff. Because OSC did not implement our recommendation to document procedures for compiling and reporting cost information, we also conducted a limited trace-to-file process to determine whether the agency's reported monthly costs accurately reflected the time and cost reported by employees. Specifically, we identified a random sample of five individual time and cost lines from the master cost and accounting spreadsheet and compared the totals to the agency's time and cost records for those months. During this assessment, we identified an error in the hourly rate of an employee used to determine demonstration project costs. OSC corrected this error and provided us with an updated time and cost spreadsheet. We then conducted a second trace-to-file sample of five randomly chosen individual time and cost lines and determined the data were sufficiently reliable for our purposes. We also performed a data check on the DOL cost data. We used the annualized cost rates for a number of investigators to manually calculate their total demonstration project costs, and compared our results to DOL's reported total costs for these investigators. Based on this check, we found the data to be sufficiently accurate. Based on the collective results of our data reliability assessment, we consider the DOL and OSC cost accounting data to be sufficiently reliable for the purposes of evaluating relative performance of DOL and OSC during the demonstration project. Agency Capacity.
To describe agencies' capacity, we analyzed agency data on staff levels, grade levels, training, education, and caseload. We also reviewed agency documentation, and interviewed agency officials about factors impacting agency performance and capacity. To determine the average caseload of DOL investigators who worked on demonstration cases by fiscal year, we divided the number of demonstration cases opened per fiscal year by the number of investigators who worked on demonstration cases during that year. To determine DOL's overall average caseload per fiscal year for the entire length of the demonstration project, we took the average caseloads for each fiscal year and used them to create a weighted average--weighting each fiscal year by the number of months the demonstration project was ongoing--because demonstration cases were not investigated for the entirety of fiscal years 2011 and 2014. We also calculated the average caseload of all active DOL investigators by (1) dividing the number of cases each active DOL investigator was assigned by the number of years he or she had been investigating cases, and then (2) dividing the sum of the individual investigators' averages by the total number of active investigators to calculate the overall average caseload for all active DOL investigators. To determine the average number of cases opened by OSC staff members investigating claims, we calculated the average number of cases opened per staff member for each period of time that had a stable number of staff. Specifically, each time a new staff member was hired or a staff member left OSC, we created a new time period to better capture the number of cases opened per staff member. Finally, we used the resulting nine period averages to create a weighted average, based on the number of months in each time period, to calculate the average number of cases opened per staff member investigating claims at OSC. In addition, we reviewed agency documentation and testimonial evidence, and analyzed agency data, to describe DOL's and OSC's staffing levels, training programs on USERRA investigations and general investigation techniques, and the agencies' views on their unique qualifications to investigate claims. To identify actions agencies can take to improve service, we analyzed agency customer satisfaction data and compared results to customer service principles and guidance outlined in executive orders, Office of Management and Budget guidance, and the governmentwide performance plan at [hyperlink, http://www.performance.gov]. We also interviewed agency officials about their views on related procedures and practices that worked well or needed improvement. We conducted this performance audit from April 2014 to November 2014 in accordance with generally accepted government auditing standards. These standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We believe the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives.
[End of section] Appendix II: USERRA Demonstration Project Customer Satisfaction Survey Administration and Instrument: The Veterans' Benefits Act of 2010 includes customer satisfaction as one of five performance metrics to be used to assess the relative performance of the Department of Labor (DOL) and the Office of Special Counsel (OSC) during the Uniformed Services Employment and Reemployment Rights Act of 1994 (USERRA) demonstration project. In response to our previous recommendation, both agencies agreed on the method by which customer satisfaction data would be collected. Specifically, the agencies agreed to administer a customer satisfaction survey, and entered into an interagency agreement with the Office of Personnel Management (OPM) to collect survey data and provide regular reports to agencies with comparative performance information. Furthermore, with cooperation from agencies, OPM developed a survey plan and developed protocols for contacting respondents.[Footnote 23] The customer satisfaction survey was provided to all claimants whose cases were closed between the start of the demonstration project on August 9, 2011, and July 28, 2014. OPM established July 28, 2014, as the end date for collection of survey data for reporting purposes. The customer satisfaction survey was initially sent via an email link on April 19, 2012, 8 months after the start of the demonstration project, to all claimants whose cases had been closed since August 9, 2011. Since then, DOL and OSC have sent the survey on an ongoing basis after cases are closed. When the survey began in April 2012, DOL and OSC emailed the claimants a link to the customer satisfaction survey, followed by a reminder emailed one week after case resolution. Two weeks following the initial notification, DOL and OSC sent a hard-copy reminder to claimants with the survey link. However, our interim report on the demonstration project found the response rate to the customer satisfaction survey was low for both agencies and recommended that DOL and OSC take actions to increase the response rate.[Footnote 24] In response, beginning on May 20, 2013, both agencies provided claimants with an initial notification that a survey would be provided to them upon completion of their cases, requesting their participation in the survey. In addition, both agencies sent claimants two follow-up emails. The survey allows respondents to report their satisfaction regarding several aspects of their experiences with DOL and OSC, as shown in figure 9. The survey included nine statements regarding claimants' experience, and provided respondents the option to respond to statements by selecting "strongly disagree," "disagree," "neither agree nor disagree," "agree," "strongly agree," or "no basis to judge." The survey also included four categories for respondents to express their level of satisfaction with the service provided by selecting "very dissatisfied," "dissatisfied," "neither," "satisfied," "very satisfied," or "no basis to judge"; and one question for respondents to express their level of satisfaction with the complaint form used to file USERRA claims. The survey included two open-ended questions allowing respondents to describe what went well and what needed to change about their agency experience. In addition, the survey asked claimants to self-report the outcome of their case and their military affiliation. Figure 9 shows the survey instrument OPM used to collect customer satisfaction data.
Figure 9: OPM 2012 Customer Satisfaction Survey Instrument: [Refer to PDF for image: survey data] Customer Satisfaction Survey: Part 1: Background Information: 1. Which of the following best describes the outcome of your complaint? My complaint was decided in my favor; My complaint was decided partially in my favor; My complaint was not decided in my favor; Do Not Know. 2. What is your military affiliation? National Guard; Reserves; Active Duty Military; Other-Please Specify. Part 2: Customer Experiences: (Response options for statements 3 through 11: Strongly Disagree; Disagree; Neither Agree nor Disagree; Agree; Strongly Agree; No Basis to Judge.) 3. The OSC/DOL staff are courteous. 4. The OSC/DOL staff are competent. 5. The OSC/DOL staff are professional. 6. OSC/DOL provides consistently good service. 7. OSC/DOL policies and procedures are customer friendly. 8. I have adequate access to OSC/DOL staff for advice and assistance. 9. The OSC/DOL staff keep me informed of significant case developments. 10. I know whom to contact at OSC/DOL if I have additional questions. 11. OSC/DOL staff responded to my questions in a timely manner. 12. How satisfied are you with the following products and services: (Response options: Very Dissatisfied; Dissatisfied; Neither; Satisfied; Very Satisfied; No Basis to Judge.) a. Thoroughness of investigation; b. Clarity of written communication; c. Clarity of verbal communication. The E-1010 (complaint) form is well designed and easy to use. (Non-evaluation item) (Response options: Very Dissatisfied; Dissatisfied; Neither; Satisfied; Very Satisfied; No Basis to Judge.) Part 3: Overall Satisfaction and Comments: (Response options for questions 13 through 15: Very Dissatisfied; Dissatisfied; Neither; Satisfied; Very Satisfied.) 13. Overall, how satisfied are you with the customer service provided by OSC/DOL? 14. Overall, how satisfied are you with OSC's/DOL's investigation of your complaint? 15. Overall, how satisfied are you with the results of the investigation? 16. Use the following space to describe what OSC/DOL is doing well. (2000 characters) 17. Use the following space to describe what you would like to see OSC/DOL change. (2000 characters) Source: OPM. GAO-15-77. [End of figure] [End of section] Appendix III: Nonresponse and Multivariate Analysis of Customer Satisfaction Data: In light of the low response rates to the customer satisfaction survey, we performed additional analyses, including nonresponse and multivariate analyses, to understand what conclusions could be drawn from the data and which variables might affect the results.
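The likelihood-ratio chi-square statistics (L2) reported with the tables in this appendix can be reproduced with standard statistical software. The following is a minimal sketch in Python using scipy, applied to the table 4 counts; the lambda_ and correction settings are assumptions chosen to match the reported statistic.

import numpy as np
from scipy.stats import chi2_contingency

# Observed counts from table 4: rows are agencies (DOL, OSC); columns are
# (non-responders, responders).
observed = np.array([[213, 101],
                     [212, 151]])

# lambda_="log-likelihood" requests the likelihood-ratio (G) statistic,
# reported as L2 in the tables; correction=False omits the Yates continuity
# correction so the result matches the reported value up to rounding.
stat, p, df, _expected = chi2_contingency(observed, correction=False,
                                          lambda_="log-likelihood")
print(f"L2(Independence) = {stat:.2f} with {df} df, P = {p:.3f}")
# Expected output: L2(Independence) = 6.44 with 1 df, P = 0.011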
We found pronounced differences in customer satisfaction between the two agencies as indicated by each of the 15 survey questions. In all but one of the questions, differences persist even after taking into account differences in the favorability of the outcomes of the claims, case processing times, and whether discrimination was alleged. Nonresponse Analysis. Satisfaction with the Uniformed Services Employment and Reemployment Rights Act of 1994 (USERRA) claims process at the Department of Labor (DOL) and the Office of Special Counsel (OSC) is measured by responses from participants to 15 questions on an exit survey, which participants complete voluntarily and electronically after completing the process. Table 4 shows that the overall response rate to the survey was low (37.2 percent) and that the response rate was higher at OSC (41.6 percent) than at DOL (32.2 percent). Table 4: Survey Response Rates at DOL and OSC: Agency: DOL; Response Status: Non-Responders: 213; 67.8%; Responders: 101; 32.2%; Total: 314; 100.0%. Agency: OSC; Response Status: Non-Responders: 212; 58.4%; Responders: 151; 41.6%; Total: 363; 100.0%. Agency: Total; Response Status: Non-Responders: 425; 62.8%; Responders: 252; 37.2%; Total: 677; 100.0%. L2(Independence) = 6.44 with 1 df, P = .011. Source: GAO analysis of OPM customer satisfaction survey data. GAO-15-77. [End of table] The low overall response rate and the significant difference in response rates across the two agencies are potentially troublesome inasmuch as responders and non-responders may differ with respect to their satisfaction with the investigation of their claims or differ on other characteristics that affect satisfaction. To the extent that such differences exist, estimates of the level of satisfaction in the two agencies and the differences in satisfaction between them may be biased. We do not know whether responders and non-responders differ with respect to their satisfaction with the investigation of their claims, since only the responders provided information about their satisfaction by responding to the survey. However, responders and non-responders can be compared on three characteristics that might affect satisfaction, using data that the agencies provided us--namely, the time it took for their cases to be investigated, whether discrimination was alleged as part of their claim, and whether the claim was settled in favor of the claimant. These comparisons, both within each of the two agencies and overall, are shown in tables 5, 6, and 7. Table 5 shows that overall (or when both agencies are combined, in the bottom panel of the table) responders were more likely to have lengthy processing times (more than 90 days) and less likely to have short processing times (1 to 30 days); this overall difference was statistically significant. While differences between responders and non-responders within each agency were not statistically significant, the tendency for fewer responders than non-responders to have short processing times is evident in each agency. Table 5: Case Processing Time for Responders and Non-Responders, by Agency and Overall: Agency: DOL; Response Status: Non-Responders; Case Processing Time (in days): 1-30: 110; 52.1%; 31-90: 82; 38.9%; 91+: 19; 9.0%; Total: 211; 100.0%.
Response Status: Responders; Case Processing Time (in days): 1-30; Responders: 40; 39.6%; Case Processing Time (in days): 31-90; Responders: 53; 52.5%; Case Processing Time (in days): 91+; Responders: 8; 7.9%; Total: Responders: 101; 100.0%. Response Status: Total; Case Processing Time (in days): 1-30; Total: 150; 48.1%; Case Processing Time (in days): 31-90; Total: 135; 43.3%; Case Processing Time (in days): 91+; Total: 27; 8.7%; Total: 312; 100.0%. L2(Independence) = 5.23 with 2 df, P = .073. Agency: OSC; Response Status: Non-Responders; Case Processing Time (in days): 1-30; Non-Responders: 39; 18.4%; Case Processing Time (in days): 31-90; Non-Responders: 79; 37.3%; Case Processing Time (in days): 91+; Non-Responders: 94; 44.3%; Total: Non-Responders: 212; 100.0%. Response Status: Responders; Case Processing Time (in days): 1-30; Responders: 19; 12.6%; Case Processing Time (in days): 31-90; Responders: 53; 35.1%; Case Processing Time (in days): 91+; Responders: 79; 52.3%; Total: Responders: 151; 100.0%. Response Status: Total; Case Processing Time (in days): 1-30; Total: 58; 16.0%; Case Processing Time (in days): 31-90; Total: 132; 36.4%; Case Processing Time (in days): 91+; Total: 173; 47.7%; Total: 363; 100.0%. L2(Independence) = 3.20 with 2 df, P = .202. Agency: Overall; Response Status: Non-Responders; Case Processing Time (in days): 1-30; Non-Responders: 149; 35.2%; Case Processing Time (in days): 31-90; Non-Responders: 161; 38.1%; Case Processing Time (in days): 91+; Non-Responders: 113; 26.7%; Total: Non-Responders: 423; 100.0%. Response Status: Responders; Case Processing Time (in days): 1-30; Responders: 59; 23.4%; Case Processing Time (in days): 31-90; Responders: 106; 42.1%; Case Processing Time (in days): 91+; Responders: 87; 34.5%; Total: Responders: 252; 100.0%. Response Status: Total; Case Processing Time (in days): 1-30; Total: 208; 30.8%; Case Processing Time (in days): 31-90; Total: 267; 39.6%; Case Processing Time (in days): 91+; Total: 200; 29.6%; Total: 675; 100.0%. L2(Independence) = 11.26 with 2 df, P = .004. Source: GAO analysis of OPM customer satisfaction survey data. GAO-15-77. [End of table] Table 6 shows that discrimination allegations overall and within each agency did not, strictly speaking, differ significantly between responders and non-responders. However, differences approached significance in each agency (.05 < P < .10) and showed that fewer responders than non-responders had alleged discrimination as part of their claim. Table 6: Discrimination Allegations for Responders and Non-Responders, by Agency and Overall: Agency: DOL; Discrimination Alleged: No; Non-Responders: 90; 42.7%; Discrimination Alleged: Yes; Non-Responders: 121; 57.3%; Total: Non-Responders: 211; 100%. Discrimination Alleged: No; Responders: 54; 53.5%; Discrimination Alleged: Yes; Responders: 47; 46.5%; Total: Responders: 101; 100%. Discrimination Alleged: No; Total: 144; 46.2%; Discrimination Alleged: Yes; Total: 168; 53.8%; Total: 312; 100%. L2(Independence) = 3.21 with 1 df, P = .073. Agency: OSC; Discrimination Alleged: No; Non-Responders: 29; 13.7%; Discrimination Alleged: Yes; Non-Responders: 183; 86.3%; Total: Non-Responders: 212; 100%. Discrimination Alleged: No; Responders: 32; 21.2%; Discrimination Alleged: Yes; Responders: 119; 78.8%; Total: Responders: 151; 100%. Discrimination Alleged: No; Total: 61; 16.8%; Discrimination Alleged: Yes; Total: 302; 83.2%; Total: 363; 100%. L2(Independence) = 3.51 with 1 df, P = .061.
Agency: Overall; Discrimination Alleged: No; Non-Responders: 119; 28.1%; Discrimination Alleged: Yes; Non-Responders: 304; 71.9%; Total: Non-Responders: 423; 100%. Discrimination Alleged: No; Responders: 86; 34.1%; Discrimination Alleged: Yes; Responders: 166; 65.9%; Total: Responders: 252; 100%. Discrimination Alleged: No; Total: 205; 30.4%; Discrimination Alleged: Yes; Total: 470; 69.6%; Total: 675; 100%. L2(Independence) = 2.66 with 1 df, P = .103. Source: GAO analysis of OPM customer satisfaction survey data. GAO-15-77. [End of table] Table 7 shows that overall, and within each agency, responders were significantly more likely than non-responders to have their claims decided in their favor. Because of the very sizable differences here, and the somewhat smaller differences between responders and non-responders in processing time and discrimination allegations, in the analyses of the differences in satisfaction we used multivariate models to adjust the agency differences for case processing time, whether discrimination was alleged, and whether the outcome was favorable. In those analyses, however, we used a survey measure derived from the respondents, rather than the indicator provided by the agency, to indicate whether their claim was decided favorably. While the two different indicators are very strongly associated (see table 8), the association is not perfect. We believed it would be more informative to adjust for what respondents thought was the outcome of their claim at the time of the survey rather than what the agency told us was ultimately the outcome. Table 7: Favorable Outcomes for Responders and Non-Responders, by Agency and Overall: Agency: DOL; Favorable Outcome: No; Non-Responders: 184; 87.2%; Favorable Outcome: Yes; Non-Responders: 27; 12.8%; Total: Non-Responders: 211; 100%. Favorable Outcome: No; Responders: 67; 66.3%; Favorable Outcome: Yes; Responders: 34; 33.7%; Total: Responders: 101; 100%. Favorable Outcome: No; Total: 251; 80.4%; Favorable Outcome: Yes; Total: 61; 19.6%; Total: 312; 100%. L2(Independence) = 17.88 with 1 df, P < .001. Agency: OSC; Favorable Outcome: No; Non-Responders: 172; 81.1%; Favorable Outcome: Yes; Non-Responders: 40; 18.9%; Total: Non-Responders: 212; 100%. Favorable Outcome: No; Responders: 106; 70.2%; Favorable Outcome: Yes; Responders: 45; 29.8%; Total: Responders: 151; 100%. Favorable Outcome: No; Total: 278; 76.6%; Favorable Outcome: Yes; Total: 85; 23.4%; Total: 363; 100%. L2(Independence) = 5.81 with 1 df, P = .016. Agency: Overall; Favorable Outcome: No; Non-Responders: 356; 84.2%; Favorable Outcome: Yes; Non-Responders: 67; 15.8%; Total: Non-Responders: 423; 100%. Favorable Outcome: No; Responders: 173; 68.7%; Favorable Outcome: Yes; Responders: 79; 31.3%; Total: Responders: 252; 100%. Favorable Outcome: No; Total: 529; 78.4%; Favorable Outcome: Yes; Total: 146; 21.6%; Total: 675; 100%. L2(Independence) = 21.83 with 1 df, P < .001. Source: GAO analysis of OPM customer satisfaction survey data. GAO-15-77. [End of table] Table 8: Agency Reported Outcome by Respondent Perception of the Outcome: Respondent Perception of the Outcome: Favorable; Agency Reported Outcome: Not Favorable: 14; 18.7%; Favorable: 61; 81.3%; Total: 75; 100.0%.
Respondent Perception of the Outcome: Don't Know; Agency Reported Outcome: Not Favorable: 46; 82.1%; Favorable: 10; 17.9%; Total: 56; 100.0%. Respondent Perception of the Outcome: Not Favorable; Agency Reported Outcome: Not Favorable: 107; 94.7%; Favorable: 6; 5.3%; Total: 113; 100.0%. Respondent Perception of the Outcome: Total; Agency Reported Outcome: Not Favorable: 167; 68.4%; Favorable: 77; 31.6%; Total: 244; 100.0%. Source: GAO analysis of OPM customer satisfaction survey data. GAO-15-77. [End of table] Multivariate Analysis. To discern whether there are differences in participants' satisfaction between the two agencies and whether they persist after taking account of case processing times, whether discrimination was alleged, and whether the outcome of the claim was favorable, we undertook two different analyses. In the first set of analyses we treated responses as if they were interval-ratio measures, scored them from 1 ("strongly disagree" or "very dissatisfied") to 5 ("strongly agree" or "very satisfied"), and averaged the scores across all respondents in each agency for each of the 15 items separately. We then calculated the differences between the two agencies in these average scores across each of the items, and tested the significance of those differences using t-tests (when we estimated raw or unadjusted differences) and F-statistics (when we estimated differences after adjusting, or effectively holding constant, the three aforementioned confounds).[Footnote 25] In the second set of analyses we treated responses as categorical, and because of the small sample size and our desire to look at agency differences before and after adjustment, we collapsed response categories to contrast unfavorable responses (reflecting disagreement or dissatisfaction) with favorable responses (reflecting agreement or satisfaction). We then calculated the odds on responding favorably in the two agencies and odds ratios reflecting the differences between them, and tested the significance of these differences using chi-square statistics. We then reestimated these odds ratios and re-tested the difference between agencies in the odds on responding favorably to each of the 15 items after taking account of the processing times, discrimination allegations, and whether the outcome of the case was favorable or unfavorable to the claimant.[Footnote 26] Table 9 shows the wording of the 15 survey items and the percentages of respondents at the two agencies who responded to them in different ways, using a five-point Likert scale that ranged from "strongly disagree" to "strongly agree" for the first nine items and from "very dissatisfied" to "very satisfied" for the last six items in the table. Table 9: Responses to 15 Satisfaction Items on the USERRA Survey Questionnaire, by Agency: The staff are courteous: Agency: DOL; Strongly Disagree: 9.0%; Disagree: 4.0%; Neither: 5.0%; Agree: 21.0%; Strongly Agree: 61.0%; N: 100. Agency: OSC; Strongly Disagree: 12.8%; Disagree: 10.1%; Neither: 12.8%; Agree: 28.9%; Strongly Agree: 35.6%; N: 149. The staff are competent: Agency: DOL; Strongly Disagree: 9.2%; Disagree: 8.2%; Neither: 14.3%; Agree: 16.3%; Strongly Agree: 52.0%; N: 98. Agency: OSC; Strongly Disagree: 22.9%; Disagree: 21.5%; Neither: 18.8%; Agree: 11.8%; Strongly Agree: 25.0%; N: 144. The staff are professional: Agency: DOL; Strongly Disagree: 9.0%; Disagree: 3.0%; Neither: 5.0%; Agree: 26.0%; Strongly Agree: 57.0%; N: 100.
Agency: OSC; Strongly Disagree: 17.6%; Disagree: 12.8%; Neither: 18.9%; Agree: 20.9%; Strongly Agree: 29.7%; N: 148. The staff provides consistently good service: Agency: DOL; Strongly Disagree: 10.9%; Disagree: 9.8%; Neither: 15.2%; Agree: 13.0%; Strongly Agree: 51.1%; N: 92. Agency: OSC; Strongly Disagree: 28.7%; Disagree: 25.9%; Neither: 16.1%; Agree: 8.4%; Strongly Agree: 21.0%; N: 143. The staff policies and procedures are customer friendly: Agency: DOL; Strongly Disagree: 14.1%; Disagree: 13.0%; Neither: 9.8%; Agree: 18.5%; Strongly Agree: 44.6%; N: 92. Agency: OSC; Strongly Disagree: 23.4%; Disagree: 20.0%; Neither: 21.4%; Agree: 13.8%; Strongly Agree: 21.4%; N: 145. I have adequate access to staff for advice and assistance: Agency: DOL; Strongly Disagree: 12.5%; Disagree: 9.4%; Neither: 12.5%; Agree: 17.7%; Strongly Agree: 47.9%; N: 96. Agency: OSC; Strongly Disagree: 32.0%; Disagree: 23.8%; Neither: 13.6%; Agree: 10.9%; Strongly Agree: 19.7%; N: 147. The staff keep me informed of significant case developments: Agency: DOL; Strongly Disagree: 12.0%; Disagree: 8.0%; Neither: 13.0%; Agree: 20.0%; Strongly Agree: 47.0%; N: 100. Agency: OSC; Strongly Disagree: 37.8%; Disagree: 22.4%; Neither: 9.1%; Agree: 11.9%; Strongly Agree: 18.9%; N: 143. I know whom to contact if I have additional questions: Agency: DOL; Strongly Disagree: 15.0%; Disagree: 6.0%; Neither: 8.0%; Agree: 21.0%; Strongly Agree: 50.0%; N: 100. Agency: OSC; Strongly Disagree: 26.8%; Disagree: 19.5%; Neither: 10.7%; Agree: 21.5%; Strongly Agree: 21.5%; N: 149. The staff responded to my questions in a timely manner: Agency: DOL; Strongly Disagree: 11.0%; Disagree: 4.0%; Neither: 9.0%; Agree: 16.0%; Strongly Agree: 60.0%; N: 100. Agency: OSC; Strongly Disagree: 35.1%; Disagree: 20.3%; Neither: 12.2%; Agree: 13.5%; Strongly Agree: 18.9%; N: 148. Satisfaction with thoroughness of investigation: Agency: DOL; Very Dissatisfied: 22.8%; Dissatisfied: 10.9%; Neither: 12.0%; Satisfied: 12.0%; Very Satisfied: 42.4%; N: 92. Agency: OSC; Very Dissatisfied: 54.7%; Dissatisfied: 15.8%; Neither: 5.8%; Satisfied: 7.9%; Very Satisfied: 15.8%; N: 139. Satisfaction with clarity of written communication: Agency: DOL; Very Dissatisfied: 11.3%; Dissatisfied: 6.2%; Neither: 13.4%; Satisfied: 19.6%; Very Satisfied: 49.5%; N: 97. Agency: OSC; Very Dissatisfied: 29.0%; Dissatisfied: 15.2%; Neither: 16.6%; Satisfied: 18.6%; Very Satisfied: 20.7%; N: 145. Satisfaction with clarity of verbal communication: Agency: DOL; Very Dissatisfied: 11.3%; Dissatisfied: 4.1%; Neither: 15.5%; Satisfied: 19.6%; Very Satisfied: 49.5%; N: 97. Agency: OSC; Very Dissatisfied: 31.7%; Dissatisfied: 12.7%; Neither: 15.5%; Satisfied: 17.6%; Very Satisfied: 22.5%; N: 142. Satisfaction with customer service: Agency: DOL; Very Dissatisfied: 14.0%; Dissatisfied: 7.0%; Neither: 13.0%; Satisfied: 27.0%; Very Satisfied: 39.0%; N: 100. Agency: OSC; Very Dissatisfied: 43.7%; Dissatisfied: 9.9%; Neither: 12.6%; Satisfied: 19.9%; Very Satisfied: 13.9%; N: 151. Satisfaction with investigation of complaint: Agency: DOL; Very Dissatisfied: 23.8%; Dissatisfied: 9.9%; Neither: 10.9%; Satisfied: 18.8%; Very Satisfied: 36.6%; N: 101. Agency: OSC; Very Dissatisfied: 55.6%; Dissatisfied: 13.9%; Neither: 5.3%; Satisfied: 11.9%; Very Satisfied: 13.2%; N: 151. Satisfaction with results of investigation: Agency: DOL; Very Dissatisfied: 30.7%; Dissatisfied: 11.9%; Neither: 14.9%; Satisfied: 18.8%; Very Satisfied: 23.8%; N: 101.
Agency: OSC; Very Dissatisfied: 56.7%; Dissatisfied: 8.7%; Neither: 9.3%; Satisfied: 12.7%; Very Satisfied: 12.7%; N: 150. Source: GAO analysis of OPM customer satisfaction survey data. GAO-15-77. [End of table] The results of the first set of analyses are shown in table 10. The first column of numbers (N) shown in the table indicates the numbers of respondents to each item in the two agencies. The second column of numbers shows the average or mean response to each item. Table 10: Average Satisfaction Scores and Differences in Satisfaction at DOL and OSC before and after Adjusting for Whether the Outcome Was Favorable, Case Processing Time, and Whether Discrimination Was Alleged: The staff are courteous: Agency: DOL; N: 100; Mean: 4.21; Difference (Unadjusted): 0.57; Differences, Adjusted for[A]: OF: 0.46; CPT: 0.58; DA: 0.46. Agency: OSC; N: 149; Mean: 3.64. The staff are competent: Agency: DOL; N: 98; Mean: 3.94; Difference (Unadjusted): 0.99; Differences, Adjusted for[A]: OF: 0.84; CPT: 0.95; DA: 0.90. Agency: OSC; N: 144; Mean: 2.94. The staff are professional: Agency: DOL; N: 100; Mean: 4.19; Difference (Unadjusted): 0.87; Differences, Adjusted for[A]: OF: 0.76; CPT: 0.89; DA: 0.76. Agency: OSC; N: 148; Mean: 3.32. The staff provides consistently good service: Agency: DOL; N: 92; Mean: 3.84; Difference (Unadjusted): 1.17; Differences, Adjusted for[A]: OF: 1.02; CPT: 1.00; DA: 1.03. Agency: OSC; N: 143; Mean: 2.67. The staff policies and procedures are customer friendly: Agency: DOL; N: 92; Mean: 3.66; Difference (Unadjusted): 0.77; Differences, Adjusted for[A]: OF: 0.61; CPT: 0.81; DA: 0.68. Agency: OSC; N: 145; Mean: 2.90. I have adequate access to staff for advice and assistance: Agency: DOL; N: 96; Mean: 3.79; Difference (Unadjusted): 1.17; Differences, Adjusted for[A]: OF: 1.01; CPT: 1.04; DA: 1.09. Agency: OSC; N: 147; Mean: 2.63. The staff keep me informed of significant case developments: Agency: DOL; N: 100; Mean: 3.82; Difference (Unadjusted): 1.30; Differences, Adjusted for[A]: OF: 1.16; CPT: 1.22; DA: 1.25. Agency: OSC; N: 143; Mean: 2.52. I know whom to contact if I have additional questions: Agency: DOL; N: 100; Mean: 3.85; Difference (Unadjusted): 0.94; Differences, Adjusted for[A]: OF: 0.82; CPT: 0.81; DA: 0.88. Agency: OSC; N: 149; Mean: 2.91. The staff responded to my questions in a timely manner: Agency: DOL; N: 100; Mean: 4.10; Difference (Unadjusted): 1.49; Differences, Adjusted for[A]: OF: 1.36; CPT: 1.25; DA: 1.48. Agency: OSC; N: 148; Mean: 2.61. Satisfaction with thoroughness of investigation: Agency: DOL; N: 92; Mean: 3.40; Difference (Unadjusted): 1.26; Differences, Adjusted for[A]: OF: 0.95; CPT: 1.14; DA: 1.17. Agency: OSC; N: 139; Mean: 2.14. Satisfaction with clarity of written communication: Agency: DOL; N: 97; Mean: 3.90; Difference (Unadjusted): 1.03; Differences, Adjusted for[A]: OF: 0.83; CPT: 0.97; DA: 0.95. Agency: OSC; N: 145; Mean: 2.87. Satisfaction with clarity of verbal communication: Agency: DOL; N: 97; Mean: 3.92; Difference (Unadjusted): 1.05; Differences, Adjusted for[A]: OF: 0.87; CPT: 0.92; DA: 0.94. Agency: OSC; N: 142; Mean: 2.87. Satisfaction with customer service: Agency: DOL; N: 100; Mean: 3.70; Difference (Unadjusted): 1.20; Differences, Adjusted for[A]: OF: 1.04; CPT: 1.02; DA: 1.12. Agency: OSC; N: 151; Mean: 2.50. Satisfaction with investigation of complaint: Agency: DOL; N: 101; Mean: 3.35; Difference (Unadjusted): 1.21; Differences, Adjusted for[A]: OF: 1.00; CPT: 1.06; DA: 1.16. Agency: OSC; N: 151; Mean: 2.13. Satisfaction with results of investigation: Agency: DOL; N: 101; Mean: 2.93; Difference (Unadjusted): 0.77; Differences, Adjusted for[A]: OF: 0.52; CPT: 0.71; DA: 0.74. Agency: OSC; N: 150; Mean: 2.16. Source: GAO analysis of OPM customer satisfaction survey data. GAO-15-77. [A] OF = Whether the Outcome Was Favorable; CPT = Case Processing Time; DA = Whether Discrimination Was Alleged; Shading indicates differences that are statistically significant (P < .05). [End of table] The average responses at DOL on all items except the last (satisfaction with the results of the investigation) tended to be favorable, with mean scores above 3.0, the neutral point at which respondents neither agreed nor disagreed, or were neither satisfied nor dissatisfied. At OSC, this was true for only two of the 15 items (the staff are courteous and the staff are professional). Moreover, average scores for every one of the 15 items were significantly higher at DOL than at OSC. The differences (shown in the third column of numbers in the table) ranged from roughly half a point to more than a full point.[Footnote 27] The estimates from our multivariate regression models show that these differences remain sizable and significant even after we take into account (1) whether the outcomes of the different claims in the two agencies were decided favorably or unfavorably, (2) the case processing times, and (3) whether discrimination was alleged. So while the average responses on all items were decidedly higher (reflecting greater satisfaction) when cases were decided in the claimant's favor than when they were not (these analyses are not shown, but are available on request), the differences in satisfaction between agencies were only slightly attenuated when the outcome of the claim was taken into account (or held constant).[Footnote 28] The same was true when the other two factors (case processing times and discrimination allegations) were statistically controlled.
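To illustrate the first set of analyses, the following is a minimal sketch in Python of how scored responses can be averaged by agency and how the agency difference can be re-estimated after adjusting for one confound at a time, as in table 10; the pandas and statsmodels usage, the file name, and the column names are our illustrative assumptions, not the software or data layout actually used.

    # Minimal sketch of the first set of analyses: responses scored from
    # 1 ("strongly disagree" or "very dissatisfied") to 5 ("strongly agree"
    # or "very satisfied"), with the agency difference re-estimated after
    # adjusting for one confound at a time. File and column names are
    # hypothetical; the frame holds one row per respondent for one item.
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("item_responses.csv")

    # Unadjusted: mean score per agency and the difference between them.
    means = df.groupby("agency")["score"].mean()
    print("Unadjusted difference:", means["DOL"] - means["OSC"])

    # Adjusted: one ordinary least squares model per confound; the
    # coefficient on the agency term is the between-agency difference
    # holding that confound constant.
    for confound in ("outcome_favorable", "processing_days", "discrimination_alleged"):
        fit = smf.ols("score ~ C(agency) + " + confound, data=df).fit()
        print(confound)
        print(fit.params)  # agency coefficient = adjusted difference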
The results of the second set of analyses are shown in table 11. In these analyses, the 15 satisfaction items are treated categorically. The table shows (in the first two columns of numbers) the numbers of respondents in each agency who responded unfavorably (by not agreeing or not expressing satisfaction) and favorably (by agreeing or expressing satisfaction) to each item.[Footnote 29] While it is common to make comparisons across groups (in our case, the two agencies) by converting the numbers to percentages and looking at differences in those percentages, we chose instead to calculate odds and to look at their ratios. Odds are estimated by calculating, for each item in each agency, the number of favorable responses relative to the number of unfavorable responses. For example, with respect to the "staff are courteous" item, the odds on agreeing at DOL were 82/18 = 4.56, while the odds on responding favorably at OSC were 96/53 = 1.81. While less common than percentages or probabilities, these odds have an equally straightforward interpretation. They imply that at DOL there were more than four respondents who agreed the staff were courteous for every one who felt otherwise, while at OSC there were slightly fewer than two respondents who agreed the staff were courteous for every one who felt otherwise. The ratio of these two odds (in the "odds ratio" column of the table), which for this item is 4.56/1.81 = 2.52, indicates that the odds on agreeing that staff are courteous were more than twice as high at DOL as at OSC. As can be seen by looking down the "odds ratio" column, the odds on responding favorably are higher at DOL than at OSC for every one of the items, in all cases by at least a factor of two and in some cases by a factor of four or six. These sizable differences are in all cases statistically significant and mirror the findings from the first set of analyses. Additionally, and as we also saw in the first analyses, the large differences reflected by these ratios do not go away when we control for differences across agencies in whether claims were settled favorably, in processing times, and in whether discrimination was alleged. Under the "Odds Ratio, Adjusted for" heading, each ratio is the same odds ratio shown in the column to its left, adjusted by taking account of the differences across agencies in whether claims were decided favorably, in case processing times, and in whether discrimination was alleged.[Footnote 30] So even (for example) when we control for or take account of the fact that respondents whose claims were decided favorably were more likely to agree that "staff are courteous," and even after we allow for whatever differences there are in favorable and unfavorable claims across the two agencies, we find that respondents at DOL were twice as likely as those at OSC to give favorable responses to the item involving "staff are courteous." Similarly, DOL respondents were more likely to give favorable responses than OSC respondents to every other item, by factors ranging from roughly 2 to more than 6. The only item on which respondents did not differ after adjustment was the final item in the table, pertaining to satisfaction with the results of the investigation. For that item, the difference between agencies in respondents expressing satisfaction (odds ratio = 1.84) was not significant after controlling for whether the investigation yielded an outcome that was favorable or unfavorable to the claimant. In sum, there are very big differences in satisfaction between the two agencies indicated by every one of the 15 items. All but one of the differences persists even after we take account of differences between the two agencies in the favorability of the outcomes of the claims, case processing times, and whether discrimination was alleged. The percentages of satisfied claimants in both agencies may be somewhat biased by the low response rates to the survey. But our analyses suggest that the differences between agencies, and the lower overall satisfaction among claimants at OSC, do not appear to be accounted for by differences in the outcomes of the claims, the case processing times, or allegations of discrimination.
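As an illustration of the odds computations just described, the following is a minimal sketch in Python using the "staff are courteous" counts cited above; the logistic-regression step shown in the comments is our hypothetical rendering of the adjustment, with assumed file and column names.

    # Minimal sketch of the odds and odds-ratio computation in the second
    # set of analyses, using the "staff are courteous" counts: 82 favorable
    # and 18 unfavorable responses at DOL; 96 and 53 at OSC.
    odds_dol = 82 / 18                    # about 4.56
    odds_osc = 96 / 53                    # about 1.81
    print(round(odds_dol / odds_osc, 2))  # odds ratio, about 2.52

    # The adjusted odds ratios come from models fit to individual-level
    # responses; a hypothetical logistic regression (column names assumed)
    # would look like:
    # import numpy as np
    # import pandas as pd
    # import statsmodels.formula.api as smf
    # df = pd.read_csv("item_responses.csv")  # favorable coded 0/1
    # fit = smf.logit("favorable ~ C(agency) + outcome_favorable", data=df).fit()
    # print(np.exp(fit.params))  # exponentiated coefficients are odds ratios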
Table 11: Odds and Odds Ratios Indicating Differences in Satisfaction at DOL and OSC Before and After Adjusting for Whether the Outcome Was Favorable, Case Processing Time, and Whether Discrimination Was Alleged: [Refer to PDF for table data: unadjusted and adjusted odds and odds ratios for each of the 15 survey items] Source: GAO analysis of OPM customer satisfaction survey data. GAO-15-77. [A] OF = Whether the Outcome Was Favorable; CPT = Case Processing Time; DA = Whether Discrimination Was Alleged; Shading indicates differences that are statistically significant (P < .05). [End of table] [End of section] Appendix IV: Comments from the Department of Labor: U.S. Department of Labor: Assistant Secretary for Veterans' Employment and Training: Washington, D.C. 20210: November 3, 2014: Yvonne Jones: Director: Strategic Issues: U.S. Government Accountability Office: Dear Ms. Jones: Thank you for the opportunity to review and comment on the Government Accountability Office (GAO) draft report 15-77: "Veterans' Reemployment Rights: Department of Labor Has Higher Performance Than the Office of Special Counsel on More Demonstration Project Measures." The report evaluates the relative performance of the Department of Labor (DOL) and the Office of Special Counsel (OSC) in investigating complaints alleging that a federal employer violated the Uniformed Services Employment and Reemployment Rights Act of 1994 (USERRA). DOL's Veterans' Employment and Training Service (VETS) is proud of its USERRA investigative program. Approximately one in five of the USERRA complaints filed over the past two years involve a federal employer; the remaining 80 percent involve state or private sector employers. As highlighted in the table below, the report found that DOL performed extremely well in the critical areas of customer satisfaction, timeliness, and cost, and compared very favorably in its capacity to investigate and resolve cases. Table: DOL and OSC Performance Metrics: Percentage of Respondents Satisfied with Overall Customer Service: DOL: 66%; OSC: 34%. Average Investigation Time of Closed Cases: DOL: 41 days; OSC: 151 days. Average Case Investigation Cost: DOL: $1,112; OSC: $3,820. Percentage of Cases Resolved: DOL: 97%; OSC: 88%. Source: GAO Analysis of DOL, OSC, and OPM Information. GAO-15-77. [End of table] VETS is particularly gratified by the positive feedback from the veterans and members of the uniformed services whose complaints we investigated. Customer service is integral to our work and an important element in the intensive training provided to our USERRA investigators. Our customer satisfaction scores indicate that our claimants are satisfied with our service throughout the investigation process, even in cases that are not resolved in their favor. VETS believes it is important that federal investigations remain impartial until the merits of a claim are determined.
Our investigators are trained to keep the claimant involved throughout the investigation, explaining the status, process, and critical issues so that claimants can assist investigators in identifying important information and so that the outcome of the investigation does not come as a surprise when claimants receive formal notification. Additionally, our investigators succeed because they are part of a full-service, veteran-oriented organization that is in tune with the needs of veterans broadly, not only with respect to reemployment rights but also with respect to transitioning to the civilian workforce, training and retraining, and securing employment. Because our investigators in their daily work are broadly involved with veteran issues and programs, they are even more effective in resolving USERRA investigations. We concur with GAO's recommendations regarding the customer satisfaction survey. VETS plans to continue a customer satisfaction survey for USERRA claimants in FY 2015, and will take steps to maximize response rates. DOL is committed to continuous improvement of our USERRA program, and have integrated lessons learned in the course of the demonstration project to both federal and non-federal investigations conducted by the agency. Additionally, in FY 2014, VETS invested in an electronic case management system that will be customized and implemented in FY 2015. This investment will allow senior investigators, agency leadership, and the Office of the Solicitor greater oversight of and collaboration with our investigators. We look forward to continuing in this most important mission. Respectfully, Signed by: Keith Kelly: [End of section] Appendix V: Comments from the Office of Special Counsel: Office of Special Counsel: 1730 M Street, NW, Suite 300: Washington, DC 20036-4505: October 30, 2014: Yvonne D. Jones: Director, Strategic Issues: U.S. Government Accountability Office: 441 G Street, N.W. Washington, D.C. 20548: Re: OSC Comments on GAO Draft Report GAO-15-77: Dear Ms. Jones: Congress charged the Government Accountability Office (GAO) with assessing the relative performance of the Office of Special Counsel (OSC) and the Department of Labor (DOL) during the 36-month Demonstration Project established by the Veterans' Benefits Act of 2010, Pub. L. No. 111-275. Under the project, OSC and DOL shared responsibility for investigating and resolving complaints against federal agencies under the Uniformed Services Employment and Reemployment Rights Act (USERRA), which protects veterans and service members from discrimination and provides for their reinstatement to civilian jobs upon completion of duty. This letter constitutes OSC's official comments to GAO's report. Introduction: OSC secured relief for 94 veterans during the period evaluated by GAO. We fought to ensure that service members received compensation and other relief for any unlawful harm they faced due to their military commitments. In thirteen of these cases, the results we obtained led to system-wide changes within agencies to prevent USERRA violations from recurring. GAO's report largely ignores the positive, real-life impact of OSC's efforts to protect service members. While OSC and DOL each demonstrated clear strengths in our respective enforcement efforts, GAO presents an assortment of unreliable data and unsupportable conclusions in forming a subjective characterization that DOL had higher performance than OSC on more program measures.
Importantly, GAO's depiction also fails to take into account OSC's significant advantage in securing relief for veterans--arguably the most important criteria in evaluating this project. Had GAO focused more on actual relief for service members, the title of GAO's report very well could have been: As discussed in detail below, OSC processed more cases and achieved a higher number and percentage of outcomes in favor of service members by a wide margin. OSC obtained over 50 percent more positive case outcomes than DOL, despite first having to establish the necessary infrastructure to receive, investigate, and resolve USERRA cases, which has long existed at DOL. Indeed, OSC achieved these results using a staff of seven employees with an average annual caseload of 28, while DOL had a staff of 31 employees with an average annual caseload of five. OSC Successes during the USERRA Demonstration Project: OSC resolved 366 USERRA complaints and assisted 94 veterans in obtaining relief, including job offers, back pay, promotions, reinstatement, and restoration of benefits. In more than a dozen cases, OSC obtained broad, systemic relief, including changes to agency policies, forms, and practices as well as USERRA training for federal managers and human resources staff. These agency-wide remedies are particularly beneficial because they help the federal government meet its obligation to be a model employer by improving compliance with USERRA and better deterring and preventing future violations. Below are some examples of the favorable results OSC achieved for service members during the project, including by using an expanded mediation program, which achieved positive case resolutions in 18 out of 20 cases mediated.[Footnote 1] * Change in Agency Policy – A Reservist from Florida employed by the Drug Enforcement Administration was improperly required to provide two days of advance notice and a written explanation supporting her leave requests before being permitted to report for military duty. After OSC's investigation, the agency granted the service member's leave requests and, with OSC's assistance, revised its leave and attendance policies to comply with USERRA. * Reinstatement for Guardsman – A National Guardsman and cashier at the Defense Commissary Agency in California was improperly denied reemployment upon returning from a seven-month tour of duty, and told to apply for unemployment benefits. OSC intervened, and the agency agreed to reinstate him to his former position, restore his benefits and seniority, and provide him with back pay. * Job for Injured Veteran – A returning injured Iraq war veteran from New Jersey was not reemployed in his former civilian technician job for the U.S. Army because the Army claimed that it had no record he had left his job for military service. OSC located the service member's former supervisor, confirmed that the service member had given proper notice, and secured a civilian position with appropriate seniority, pay and benefits for the service member. * Restored Job Offer – A National Guardsman was offered a job in Oklahoma with the Department of Homeland Security, but he could not attend the initial training because it conflicted with an upcoming deployment. As a result, the agency rescinded its job offer. OSC resolved the case by obtaining the agency's agreement to reinstate the offer and place the service member in the next available training that did not conflict with his deployment.
* Assistance for Disabled Veteran – A federal employee and National Guardsman from Florida was denied reasonable accommodations for his military-related injury. When his workers' compensation claim was denied, he was forced to use personal leave. During mediation, both agency counsel and the service member's supervisor communicated how much the agency valued the service member and his work. OSC mediated a settlement that included a new job series for the veteran, as well as training, save pay, and moving expenses. * Expanded Promotional Opportunities – While working as a police officer for the U.S. Mint in Pennsylvania, a Reservist was called to active duty for two years. During his absence, the agency issued vacancy announcements for Sergeant positions, but he was not notified or given an opportunity to apply. OSC contacted the agency, which agreed to schedule the service member for the next Sergeant's exam, provide him with priority consideration for the next Sergeant vacancy, and implement a mechanism whereby service members are notified about and permitted to apply for promotional opportunities while they are absent performing military duty. * Increased Pay for Displaced Veterans – Three disabled veterans at the U.S. Postal Service in Texas were improperly not paid the required “out of schedule” premium pay after they were transferred and had their work schedules changed. OSC contacted the agency, which agreed to review the schedule changes and compensation of all affected employees, including several other veterans, to ensure that they each received the pay to which they were entitled. GAO's Subjective and Inaccurate Evaluation of the USERRA Demonstration Project: OSC has significant concerns about the accuracy, emphasis, objectivity, and overall quality of GAO's assessment. Congress asked GAO to evaluate OSC's and DOL's performance in several areas, including case outcomes, costs, customer satisfaction, and capacity to conduct USERRA investigations. As stated, we recognize DOL's strengths in processing cases quickly and believe its larger staff was used to generate efficiencies in this area. However, major flaws in GAO's report raise serious questions about the value of the report as a means of evaluating the strengths and weaknesses of each agency. Our specific concerns are discussed in further detail below. 1. Case Outcomes: GAO's report diminishes and obscures the single-most important factor in evaluating relative performance for federal-sector USERRA cases. The question of how many service members are assisted, and what specifically is accomplished on their behalf, is of critical importance in evaluating relative performance under USERRA. GAO's own quantitative analysis of case outcome data should be straightforward: OSC received and completed more cases, and achieved a higher number and percentage of positive case outcomes for service members during the project. Specifically, OSC received 434 cases and resolved 366; DOL received 319 cases and resolved 308; OSC obtained relief for service members in 94 cases (26%), while DOL did so in 62 cases (20%).[Footnote 2] These numbers are a clear indication of higher performance. Properly measuring performance on case outcomes necessarily has a qualitative aspect as well. 
Unfortunately, GAO abdicated this responsibility, stating, “[W]e did not independently assess the quality of agencies' case investigations to determine if DOL and OSC arrived at the appropriate case outcomes.” GAO's incomplete assessment is troubling given that congressional requesters explicitly asked GAO to compare case outcomes. To assist with this analysis, OSC provided GAO with summaries of the allegations and specific relief obtained for each case it resolved favorably for veterans during the relevant time period. OSC also suggested numerous ways in which the qualitative data could be objectively measured, such as the number and frequency of cases in which systemic corrective actions were achieved, and the number and frequency of cases in which service members received back pay, new job opportunities, promotional opportunities, job training, or other remedies tailored to ensure successful and lasting integration into the civilian workforce. Unfortunately, GAO conducted no such evaluation, refused to incorporate any of the suggested measures, and did not request or receive any comparable data from DOL to adequately assess case outcomes. After GAO indicated it was unable and unwilling to conduct any qualitative assessment of case outcomes, OSC suggested that examples of case resolutions be included to explain the specific relief obtained for veterans in select cases. GAO initially committed to including several summaries, then reneged, and ultimately included one brief case description. GAO's failure to fully address case outcomes may stem from a fundamental misunderstanding of what constitutes a favorable outcome for a service member. In its report, GAO increased DOL's number of positive case outcomes from 55 to 62 by including seven cases that DOL determined had merit, but did not necessarily result in any relief for the service member or a verifiable offer by the agency to take corrective action (“merit-not resolved”). These seven cases represent 11% of DOL's total 62 positive case outcomes reported by GAO; without these cases, DOL's overall corrective action rate is 18% (not 20%), which is the rate GAO used in its earlier draft reports. In contrast, OSC's 94 positive case outcomes (a 26% corrective action rate) include only cases where we either secured full relief for the service member (91 cases) or an offer of full relief from the agency that the service member declined (three cases). Simply identifying a case as having merit does not constitute obtaining a favorable outcome for a service member; indeed, in those cases the agency has not ameliorated or even sought to address the harm to the veteran. By failing to recognize this important distinction, and including as positive resolutions cases in which the service member may not have received anything, GAO has not responded to Congress's mandate to appropriately assess case outcomes. 2. Cost: GAO's analysis is based on incomplete and unverifiable data. OSC raised numerous questions about the reliability of the cost data in GAO's report. Given the wide disparity in relative costs to complete a case, it appears that DOL and OSC were not reporting the same information. However, after repeated assurances from GAO that OSC's questions would be addressed, GAO never did so. As it stands now, the overall value, objectivity, and accuracy of the cost information in GAO's report is highly questionable. 
According to GAO's report, DOL completed its share of federal USERRA claims with the equivalent of one full-time employee in each year covered by the project. Specifically, GAO's report states that DOL completed all of its project casework, over a three-year period, at a total cost of $354,000 (or $118,000/year—the equivalent of one full- time employee). This number does not appear credible, as DOL had a staff of 31 investigators assigned to these cases, and an additional 14 senior investigators overseeing their work.[Footnote 3] The total expenditures reported by GAO are highly unlikely given the large number of DOL staff and supervisors assigned to the project. They are unrealistic when considered in context with the reported information on each employee's overall annual caseload. According to GAO, “the 31 [DOL] investigators who worked on demonstration cases were assigned an average of about 15 cases (demonstration and nondemonstration) per investigator per year.” GAO's report also indicates that project cases represented about one-third of each investigator's annual caseload, or “about five demonstration cases per investigator per year during the demonstration project.” Finally, GAO notes that DOL averaged “21 staff hours per case.” When these aggregate numbers are broken down, the end result is that each DOL investigator worked only 105 hours per year on project cases, or about 5% of the 2,088 hours that a full-time federal employee is expected to work on an annual basis. Moreover, taking GAO's reported cost data at face value, it seems that each DOL investigator works only 315 hours per year on all cases, or about 15% of an average work year of 2,088 hours for a full-time employee in the federal government. As stated, GAO's cost analysis is not credible. Indeed, their reported cost data strongly suggests that each DOL investigator was idle for 85% of their annual work hours. Finally, OSC is perplexed by GAO's conclusion that “neither agency tracked cost data in a way that allows costs to be traced back to individual cases; therefore, we were not able to calculate the total and average amount spent on closed cases.” We agree that each agency's cost data should be traced back to individual cases. This is the best way to verify the total number of hours and costs associated with each case. During the first year of the project, OSC established a procedure that tracked costs to specific cases. OSC provided this cost data to GAO for its 2012 evaluation, which GAO analyzed and deemed reliable in report number GAO-12-860R. Nevertheless, GAO requested in 2012 that OSC change its cost methodology to be more consistent with DOL's system, which cannot track cost data back to individual cases. OSC complied with GAO's request and raised concerns about the verifiability of this method—the same concerns that GAO now seems to raise in this report. Had GAO allowed OSC to continue with its earlier methodology, and required DOL to develop a similar approach, both agencies would have provided better and more verifiable data to GAO, allowing for more credible conclusions in this important area. 3. Customer Satisfaction: GAO's assessment of customer satisfaction is not reliable because it fails to address the significant limitations to the data upon which its conclusions are based. During the course of the project, we received positive feedback from individual service members, their representatives, and federal agencies. 
A sampling of their comments is below: I was extremely pleased with my investigator and happy to have someone so thorough and knowledgeable about the process. Her feedback and understanding of my issues was outstanding and resulted in my claim being resolved! OSC has very dedicated measures and personnel in place offering sound advice and service along every step of the process. Especially good marks to [OSC investigator] who was very courteous and whose knowledge was very useful. OSC is very thorough and effective. I believe that they take each matter seriously and treat each individual with respect and fairness. [OSC investigator] did a wonderful job. The entire process was simple and easy ... I have recommended the process to others and all have been very pleased. Thanks again to both [OSC mediators] for hosting the mediation and making [it] a refreshing, open and frank discussion among reasonable people. The mediation clearly laid the groundwork for [agency] and I to come to an agreement. [From agency representative] The experience with [OSC's mediators] was entirely professional and productive. I think the claimant and the agency did this the right way (with your help) given our relative perspectives and interests. Notwithstanding this positive feedback, a small number of customer satisfaction survey respondents reported lower levels of satisfaction with OSC. However, GAO failed to put these numbers in proper context. Despite an extremely small sample size, low response rate, and several biases in the customer satisfaction survey data, GAO relied heavily on the data to draw dubious conclusions about relative customer satisfaction levels at DOL and OSC, stating that the differences between the two agencies' scores are “pronounced.” In terms of sample size and response rate, there were only 252 (out of a possible 677) respondents to the customer satisfaction survey. Thus, the overall response rate was 37%, far below the Office of Management and Budget's accepted benchmark response rate of 80%, which GAO cited in several meetings with OSC and DOL but fails to highlight in the report. Although GAO admits that the low response rate “can potentially affect the conclusions that can be drawn from the survey,” it does not expand on the significant limitations and potential biases regarding the survey data until the very end of the report, in an appendix: A high response rate increases the likelihood that the survey results reflect the views and characteristics of the target population, whereas a low response rate can be an indicator of potential nonresponse bias, which would be detrimental to the accuracy of the results of a study in a variety of ways. (p. 35) The low overall response rate and the significant difference in response rates across two different agencies is potentially troublesome inasmuch as responders and nonresponders may differ with respect to their satisfaction levels with the investigation of their claims or differ on other characteristics that affect satisfaction. To the extent that such differences exist, estimates of the level of satisfaction in the two agencies and the differences in satisfaction between them may be biased. (Appendix II, p. 49) In addition, the Office of Personnel Management (OPM), which developed and implemented the customer satisfaction survey, expressed its concerns about other biases in the survey data. GAO was provided with OPM's feedback, but failed to include it in the report. 
According to OPM: At DOL, respondents were more likely to have their case settled favorably than nonrespondents. Of all the cases settled within the survey timeframe, only 19% had an outcome that was favorable to the client. However, 35% of survey respondents had an outcome in their favor. The results were [statistically] significant… [Thus], customers with favorable case outcomes are overrepresented in the DOL survey results. The patterns of responding could mean that OSC's lower scores on the Customer Satisfaction Survey compared to DOL may be partially attributable to OSC having a higher percentage of respondents with cases involving retaliation or discrimination, which may lower satisfaction levels, and to DOL's overrepresentation of respondents who had a favorable case outcome, which may inflate satisfaction levels.[Footnote 4] GAO states that it conducted additional analyses to account for potential biases in the customer satisfaction survey data. Nonetheless, GAO includes the raw (not adjusted) customer satisfaction survey scores in the report. The adjusted scores, which reduce the differences between OSC's and DOL's customer satisfaction survey ratings, are included in an appendix to this GAO report at pp. 56-57. GAO's decision to report the raw, unadjusted data is misleading and undermines the report's value as a means of assessing relative customer satisfaction levels at OSC and DOL. While we strive to achieve a high level of customer satisfaction in every USERRA case we investigate and seek to resolve—and we acknowledge the lower level of customer satisfaction in the 37% of responding cases—it is critical that GAO provide the proper context for analyzing this data. Conclusion: Although we remain deeply troubled by the shortcomings of GAO's evaluation, we at OSC are honored to have the opportunity to assist service members in restoring, resuming, and rebuilding their civilian lives following uniformed service. We look forward to further dialogue with Congress and the Department of Labor on these important issues. Sincerely, Signed by: Carolyn N. Lerner: Appendices: Comments from the Office of Special Counsel: Footnotes: [1] To review all 94 positive case resolutions, see Appendix 1, “Summaries of OSC Positive Case Resolutions during the USERRA Demonstration Project,” and Appendix 2, “Summary of OSC Mediated USERRA Settlements during the Demonstration Project.” Both documents were provided to GAO during its evaluation. [2] As explained below, the correct positive case resolution rate for DOL is 18% (or 55 cases), as stated in GAO's original draft report. [3] GAO's original draft report included 14 senior investigators as part of DOL's staff handling cases during the project. However, all references to the additional staff were removed from the final report. [4] From “Final Report for Customer Satisfaction Survey Non-Response Analyses,” U.S. Office of Personnel Management, April 2013, at pp. 4-5 (emphasis added). Appendix 1: U.S. Office of Special Counsel: Summaries of OSC Positive Case Resolutions During the USERRA Demonstration Project: This document provides case summaries of OSC's positive resolutions during the USERRA Demonstration Project, broken down by types of relief: (1) systemic relief and training; (2) reemployment and related benefits; (3) initial hiring and other discrimination; (4) promotion and other injunctive relief; and (5) monetary relief and other benefits. 
Some resolutions involved multiple types of relief, but are listed only once in the section constituting the primary form of relief. We have highlighted in bold particularly illustrative examples of the various types of relief that OSC secured for service members. Note that these case summaries are in addition to the 18 cases resolved through OSC's Alternative Dispute Resolution process, as described in Appendix 2, “Summaries of OSC Mediated USERRA Settlements During the Demonstration Project.” Systemic Relief and Training: Florida: An Air Force Reservist employed by the Drug Enforcement Agency (DEA) was required to provide two days of advance notice and a written explanation supporting her leave requests before being released from her civilian position to perform military duty. After OSC investigated the case, the service member's leave requests were granted, and OSC assisted the agency in revising its leave and attendance policies to comply with USERRA. North Carolina: A Navy Reservist, who was terminated from a civilian position with Marine Corps Community Services during her probationary period, alleged that her Reserve obligations played a role in her termination. Because the Reservist had found other employment and did not wish to return to the agency, the agency agreed to change her termination to a voluntary resignation, to pay her a lump sum to compensate her for her lost income while she found a new job, and to add USERRA awareness training to its annual supervisor training curriculum. Iowa: A member of the Army National Guard was tentatively offered a position on a special project with the Department of Agriculture (USDA). Even though the claimant notified the USDA of his military obligations during the interview process, his new supervisor insisted that he was not permitted to be absent during his first year on the job, and in particular stated that he would have to move his annual training dates if he wanted the position. Although the claimant ultimately accepted a position with a different agency, OSC provided specific written guidance to the USDA regarding its obligations to service members under USERRA during the initial hiring process, and ensured that this information was communicated to the individuals involved in that process. California: A National Guardsman and police officer with the Department of Veterans Affairs (VA) indicated that the agency refused to allow him to use paid leave for his military duty and failed to provide him adequate notice of transfers to different shifts and duty locations. After OSC intervened, the VA agreed to permit the Guardsman to use paid leave for future military duty, provide him with better notice of changes to his schedule, and arrange for USERRA training at his VA facility. South Dakota: The claimant alleged that the Department of the Interior, National Park Service failed to rehire him as a part-time seasonal employee following his service in the Army National Guard. After OSC investigated and determined that the claimant likely would have been rehired for a short period of time had he not performed Guard duty, the agency agreed to settle his complaint by providing him a lump sum payment for lost wages and providing USERRA training for its staff. Indiana: The claimant was a member of the Army Reserve and also worked as a criminal investigator for the U.S. Army. His managers and supervisors constantly required that he provide advance written notice of military service, despite the fact that verbal notice is sufficient under USERRA. 
After OSC became involved, the agency trained its human resources office and other management officials regarding USERRA's guidelines for advance notice and documentation of military service.
New Jersey: The claimant is a member of the Coast Guard Reserve who works as a police officer for the U.S. Mint. When he was recalled to active duty, the agency issued vacancy announcements for Sergeant and Lieutenant positions, but he was not notified or given the opportunity to apply. OSC contacted the agency, which agreed to resolve his complaint by scheduling him for the next Sergeant's promotional exam, providing him with priority consideration for the next Sergeant's vacancy after the exam, and implementing a mechanism whereby service members are notified of and permitted to apply for promotional opportunities while absent due to military duty.
Texas: Three disabled veterans alleged that the U.S. Postal Service improperly failed to pay them “out of schedule” premium pay after transferring them and changing their work schedules. OSC contacted the agency, and it agreed to review the schedule changes and pay of all affected employees, including the claimants and several other veterans, to ensure that they received the pay to which they were entitled. (This summary reflects relief for multiple claimants.)
Reemployment and Related Benefits:
New Jersey: After returning home, an injured Iraq war veteran was not reemployed in his former job as a technician for the U.S. Army because the agency had no record that he had left his job to perform military service. OSC located his former supervisor, who confirmed that the service member had informed him of his military service. OSC then secured the veteran a civilian position with appropriate seniority, pay, and benefits.
Massachusetts: A member of the Army National Guard was a regular letter carrier with the U.S. Postal Service. While deployed, he was reassigned to a new location because his position was being “excessed.” The new location would significantly increase his commute time to work. Upon his return from military service, the claimant discovered that a less senior coworker had been given a temporary position at the original location because he had been notified of that option when his position was “excessed.” Because the claimant was deployed at the time, he was not aware of that option. When OSC contacted the agency, it agreed that if the claimant had been given the opportunity to bid on the temporary position, he would have received it based on his seniority. The agency then placed the claimant in a temporary position at his original location, resolving his complaint.
Puerto Rico: An employee of the Department of Homeland Security (DHS), and member of the National Guard, returned from military service to find that, during her deployment, she was not given proper credit toward a within-grade pay increase or accrual of annual and sick leave. At OSC's request, DHS agreed to adjust her benefits to the correct levels.
Georgia: A U.S. Postal Service worker was denied leave for an upcoming deployment with the Air Force Reserve after providing his supervisor with verbal notice and his military orders. The agency then demanded that he quit his deployment and return to his job or be fired. OSC contacted the agency, which rescinded its demand and made all the necessary changes to ensure that the employee was approved for military leave and would be reemployed properly upon his return.
North Carolina: After a six-month deployment, an Air Force Reservist and police officer with the U.S. Army contacted his supervisor to plan his transition back to work. Shortly thereafter, the agency advised him that he was AWOL and needed to return to work immediately even though he was still well within the 90-day return period provided for under USERRA. OSC resolved the case by having the agency rescind the AWOL charges and reinstate the Reservist.
Georgia: The claimant was employed as a GS-13 technician with the U.S. Air Force Reserve. After a one-year Active Guard/Reserve tour, she sought reemployment in her former position, but the agency notified her that the position was being eliminated. The agency instead placed her in a different GS-13 position that she believed was of a lesser status than her former position, in violation of USERRA. OSC contacted the agency and negotiated a settlement whereby the agency agreed to reassign her to a new GS-13 position of similar status to her former position in a different location, and pay her relocation expenses.
Maryland: The claimant was a member of the Air Force Reserve and an employee at the VA. After an extended deployment, he attempted to contact the agency several times about returning to work, but could not get an answer regarding his reemployment. After OSC intervened, the agency promptly reemployed the claimant, thereby resolving his claim.
Maryland: An air traffic controller (ATC) with the Federal Aviation Administration suffered service-connected injuries during a two-year deployment with the Army Reserve. As a result, she was unable to continue to perform ATC duties and requested assistance in finding an appropriate position to accommodate her disabilities. After the agency told her to find something on her own, she served on a six-month detail. When that detail ended, she found a new, permanent position, but it came with a significant pay cut and extended her time for retirement eligibility. After OSC opened its investigation, the parties executed a settlement agreement whereby the agency assigned the claimant to a higher-rated position, increased her base pay to the level she had in the ATC position, and paid for her to attend a leadership development program.
California: After two deployments, an Army Reservist who worked at a VA facility was reemployed into the same career field and pay grade, but in a position with lesser duties. The claimant also alleged that he was not permitted to make up appropriate contributions to his Thrift Savings Plan (TSP). OSC found that the Reservist was not properly reemployed and that his inability to make up his TSP contributions was in violation of USERRA. At OSC's request, the agency moved the Reservist into an appropriate position and allowed him to make up his TSP contributions.
California: After returning from deployment, a Reservist who worked as a civilian with the U.S. Navy was laid off for budgetary reasons, effective 40 days after her return. OSC informed the agency that USERRA prohibits terminating a service member's employment, except for cause, for six months following a period of service lasting more than 30 days. At OSC's request, the agency agreed to provide the Reservist with back pay for the remainder of the protected period (140 days), give her a lump sum payment for all the paid leave she would have accrued, and allow her to make up contributions to her TSP.
North Carolina: The claimant is an employee of the U.S. Army and member of the Army Reserve.
After completing active duty, she timely requested reemployment, but the agency impermissibly delayed it by two weeks. Shortly thereafter, the claimant returned to active duty for one year. When she later returned to work, she received a counseling statement with her military orders attached, discovered that the agency had automatically paid out 19.5 hours of annual leave despite her request to be placed on unpaid leave, and learned that the agency had imposed a $2,000 debt against her that it could not justify. OSC negotiated a resolution with the agency whereby it agreed to compensate the claimant for the two weeks during which she should have been reemployed, restore her 19.5 hours of annual leave, remove the counseling statement from her personnel file, and waive the $2,000 debt. At the claimant's request, the agency also agreed to work with her on a lateral transfer to a different work unit.
California: At the end of the claimant's duty with the Navy Reserve, he made a timely request for reemployment in his civilian position with the U.S. Navy. However, after initially confirming his requested start date, the agency delayed his reemployment by another six weeks because his pre-service position was no longer available. OSC facilitated a settlement agreement under which the agency agreed to provide the claimant with back pay and restore his seniority and other benefits as of the date he should have been reemployed six weeks earlier.
California: A National Guardsman who was a cashier at the Defense Commissary Agency was improperly denied reemployment upon returning from a seven-month tour of duty, and told to apply for unemployment. OSC intervened, and the agency agreed to reinstate him to his former position, restore his benefits and seniority, and provide him with back pay.
Virginia: An employee of the Office of Personnel Management was a member of the Army Reserve. Upon returning from an active duty deployment to Afghanistan, she had service-connected disabilities that required time off from work for treatment, and her doctor prescribed that she telework full time. When she made a request for accommodation, her supervisor declined approval to telework because she was still “in training.” After OSC's investigation began, the agency addressed her concerns to her satisfaction, and the claim was resolved.
Puerto Rico: A member of the Air National Guard was away from his civilian position with the U.S. Postal Service for approximately nine months for military duty. After being injured, he received treatment in a military medical facility. Following his release, he was cleared to return to work with certain restrictions. He timely applied for reemployment, but was told that agency headquarters would have to approve his reemployment before he could return to work. Shortly after OSC commenced its investigation, the agency reinstated him.
Pennsylvania: The claimant was employed as a chief business officer with the VA before leaving to perform military duty. While he was away on military duty, a coworker sent him a new job announcement indicating that the agency was advertising his position. He soon discovered that the agency had moved him to a different position as a Community Living Center administrator, albeit at the same pay level. However, this position had significantly different duties, and unlike his former position, it had no supervisory authority. After OSC contacted the agency regarding the claimant's allegations, it addressed his concerns to his satisfaction, and the claim was resolved.
Initial Hiring and Other Discrimination:
Minnesota: A member of the Air Force Reserve was a human resources specialist at the VA. After she was appointed to a new position, her direct supervisor made negative comments about her qualifications and military service, refused to act on her requests to transfer to another unit, and placed her on a Performance Improvement Plan (PIP). After OSC intervened, the agency offered to find her a new position under a different supervisor and rescind the PIP, which she agreed to as a satisfactory resolution of her complaint.
Minnesota: A member of the Air Force Reserve was a staffing supervisor for the VA. After being promoted to a position in a new office, she reported to a different supervisor who made disparaging comments about her work performance and military affiliation. After OSC intervened, the parties reached a settlement whereby the claimant moved to a different position within the agency and received a monetary award.
Florida: A National Guardsman and civilian U.S. Air Force pilot alleged that the agency discriminated against him because of his military obligations and disclosures of management wrongdoing. Specifically, the agency removed him from flight status and initiated a Command Directed Investigation (CDI) against him. OSC investigated and, after finding evidence to substantiate his allegations, requested that the agency take corrective action. Because he did not wish to return to the agency after his latest deployment, the agency agreed to restore his flight status and rescind the CDI, correct his personnel records and provide him a neutral job reference, and pay him a lump sum to cover his transition period to a new job.
Puerto Rico: A member of the Air National Guard is employed by the Transportation Security Administration. The claimant has a service-connected disability, for which he requested a reasonable accommodation. His accommodation request was denied, and he was placed on four different periods of light duty, after which the agency placed him on leave without pay and instructed him not to return to work until he was cleared to perform all normal functions of his position. After OSC investigated, he reached an agreement with the agency to return to duty with appropriate accommodations, resolving his claim.
Missouri: The claimant, a member of the Army Reserve, applied for a special agent position with the Department of State. After receiving an offer to attend a training class, he was mobilized and requested a deferral pending his return from military duty. Upon his return, he was placed on a wait list and passed over for new openings, rather than being given priority consideration based on his prior offer and deferral due to his military service. After OSC began investigating, the agency reinstated its job offer, resolving the claim.
Texas: An Army Reservist and special agent with the Bureau of Alcohol, Tobacco, and Firearms alleged that, due to his military obligations, he received a lower performance rating and award than his work merited. OSC investigated and found evidence supporting the claimant's allegations. At OSC's request, the agency conducted a review of his performance appraisal, made revisions, and gave him an additional performance award.
Florida: An Air Force veteran with service-connected disabilities and an investigator for the Federal Aviation Administration alleged that he was denied reasonable accommodations, such as the ability to telework.
After OSC began investigating, he was transferred to a different office with a new supervisor, which resolved the claim to his satisfaction.
Italy: A service member received a tentative job offer for a customs and border clearance agent position with the U.S. Army in Vicenza, Italy. However, after he informed the agency that he was in the middle of a 10-month active duty deployment to Afghanistan, the agency rescinded the job offer. At OSC's request, the agency extended the service member a new employment offer.
Oklahoma: A National Guardsman was offered a job as an immigration enforcement agent with the Department of Homeland Security (DHS), but he could not attend the initial training because it conflicted with an upcoming deployment. As a result, DHS rescinded its offer of employment. OSC resolved the case by obtaining DHS's agreement to reinstate the employment offer and place the service member in the next available training course that did not conflict with his deployment.
Wisconsin: A disabled veteran working for the Department of Agriculture alleged that his supervisor put negative comments in his mid-year performance review due to his veteran status and disclosures of management wrongdoing. OSC contacted the agency, which offered to remove the negative comments from his review.
Georgia: The claimant worked as a civilian attorney for the U.S. Army's Warrior Transition Brigade in Georgia. He was also a member of the Air National Guard. He drilled for the Guard in Ohio because when he relocated for his civilian position, there were no available Guard slots in Georgia. He alleged that his supervisor treated him negatively when the supervisor found out that the claimant was a Guard member, including accusing him of timecard fraud on his drill days, demanding that he provide documentation of his drill duties, and placing him on extended administrative leave to “investigate” the alleged fraud. During OSC's investigation, the claimant entered into a settlement agreement whereby he resigned in exchange for attorneys' fees and a lump sum payment.
Florida: The claimant is a U.S. Navy veteran who worked as a surgeon at the VA. He alleged a hostile work environment based on several instances of co-workers making negative remarks to veterans about their military service. After OSC contacted the agency, the parties reached a settlement under which the claimant agreed to retire from the agency in exchange for monetary compensation.
Texas: A member of the Army Reserve and a civilian employee with the U.S. Army alleged that the agency planned to transfer him from his current work site to a new facility in order to limit the number of service members at any one site. After OSC opened its investigation, the agency withdrew the proposed transfer action, thereby resolving the claim.
Missouri: The claimant was a civilian employee with the Department of Labor. After working at the agency for one year, he gave notice that he would be joining the Army National Guard and would soon depart for 18.5 weeks of training. Thereafter, he alleged that his performance ratings were lowered and that he was not promoted from GS-11 to GS-12. According to the claimant, these actions were motivated by his new Guard membership. During OSC's investigation, the claimant and the agency entered into a settlement agreement through his union, the terms of which are confidential.
California: The claimant was a member of the Navy Reserve who applied for a position with the Department of Veterans Affairs.
After being selected for the position and beginning to take the required steps to complete the hiring process, he was placed on active duty orders for one year. After he notified his prospective supervisor of his upcoming deployment, however, the agency indicated that it was rescinding his job offer. OSC contacted the agency, which stated that it never intended to rescind the job offer, that the email stating such had been sent in error, and that it was committed to hiring the claimant as soon as he returned from his military obligations.
New Mexico: A Marine was deployed overseas when he applied and was tentatively selected for a nuclear transport courier position with the Department of Energy. However, his tentative selection was withdrawn when he was unable to complete a required drug test within 30 days due to his overseas deployment. At OSC's request, the agency agreed to restore the service member's tentative selection for the position and reschedule him for his pre-employment drug testing so that he could proceed with the employment process.
South Carolina: A member of the Air Force Reserve who worked for the VA alleged that the agency discriminated against him based on his Reserve duty by failing to award him performance bonuses and by making him work mandatory overtime during time periods in which he had weekend military obligations. During OSC's investigation, the agency agreed to settle the matter by paying the claimant a lump sum and agreeing to limit his mandatory overtime to better accommodate his military duty.
Germany: The claimant is a disabled veteran employed with the U.S. Army & Air Force Exchange Service. He alleged that he was passed over for a promotion due to anti-military animus by his supervisors. During OSC's investigation, the parties agreed to mediation pursuant to an Equal Employment Opportunity complaint, and the case was settled on confidential terms.
Pennsylvania: After applying for a position with DHS, an Army veteran alleged that the agency disqualified him because he did not reside within the local commuting area, which he alleged was improper because that area was not defined in the job announcement. After OSC began its investigation, the agency agreed to accept his application and consider him for the position.
Germany: A member of the Army Reserve was offered a position with the NATO Special Operations Headquarters in Mons, Belgium. The offer was withdrawn after the claimant notified the agency that she would be on active duty for seven months. OSC intervened, and the agency agreed to reoffer the claimant the position with a later “report to duty” date that was compatible with the end date of her military service.
Florida: An Army veteran employed by the U.S. Postal Service in a temporary position alleged that she had received an unfair performance evaluation based on her veteran status, which prevented her from being re-hired by the agency. After OSC investigated, the claimant's supervisor completed a new evaluation for her, resolving her claim.
Arizona: A member of the Air Force Reserve and an employee with the VA received “Outstanding” performance ratings for several years. After the claimant's performance rating dropped to “Excellent,” and then further to “Fully Successful,” he addressed the issue with the rating official and asked how he could improve his performance.
The official replied that he needed to “prioritize between the military and his VA job” and that if he completed fewer work orders than his peers due to his absences for military duty, he should “make it up on his own time.” OSC contacted the agency regarding these allegations, and it changed his 2013 rating from “Fully Successful” to “Exceptional,” resolving the claim to his satisfaction.
Promotion and Other Injunctive Relief:
North Carolina: A member of the Coast Guard Reserve worked for the Environmental Protection Agency. Following his deployment, he applied and interviewed for a promotion, but was not selected. He alleged that he was asked repeatedly about his military service and that his supervisors assigned him more tasks than coworkers who were not Guard or Reserve members. During OSC's investigation, the parties agreed to mediation pursuant to an Equal Employment Opportunity complaint, and the case was settled on confidential terms.
Minnesota: A member of the Air Force Reserve claimed he was not reemployed at the proper seniority level and pay grade when he returned from active duty to his civilian job at the VA. OSC investigated and determined that the employee would have been promoted had he not been absent, and as a result of OSC's action, the agency retroactively promoted him to the appropriate position.
Florida: A member of the Army Reserve was employed as a GS-11 civilian with the Army Combat Readiness Center. While he was deployed to Afghanistan, all GS-11 employees in his department were promoted to the GS-12 level. Although he was also eligible, he was not considered for the same promotion because the agency failed to notify him of the opportunity while he was deployed. When his deployment ended, he returned to the GS-11 position. After OSC intervened, he was retroactively promoted to the GS-12 level with an effective date of his first workday back from deployment, and received back pay from the same date.
Indiana: A member of the Army Reserve was employed with the U.S. Army Corps of Engineers. During a three-year overseas deployment with the agency, he received a permanent promotion, with a step increase, from his original position as a welder to a construction representative position. Upon his return from military duty, however, neither his promotion nor his step increase had been implemented. As relief, he sought either the construction representative position or another overseas deployment with the agency. During OSC's investigation, the agency offered him another overseas deployment, to which he agreed in resolution of his claim.
Indiana: The claimant, a federal air marshal and Army Reservist, suffered injuries while on active duty, which prevented him from returning to his civilian position upon his release from active duty. At the agency's direction, the claimant applied for and received a disability annuity from the Office of Personnel Management. After convalescing from his injuries and being cleared by his physician to return to work, he requested reemployment with the agency, but was refused. After a period of unemployment, he was hired by the VA. With OSC's assistance, he agreed to settle his claims with the agency in exchange for a lump sum payment to compensate him for the gap in his employment.
Colorado: A Federal Aviation Administration employee was denied a promotion because of his membership in the Air Force Reserve.
OSC negotiated a resolution wherein the agency agreed to retroactively promote the employee, provide him with appropriate back pay, and make the necessary adjustments to his TSP.
Florida: A member of the Army National Guard was a part-time flexible carrier with the U.S. Postal Service. After returning from one year of active duty, the claimant discovered that his replacement had been converted to a full-time permanent carrier position. After OSC opened its investigation, the agency entered into a settlement agreement whereby it converted the claimant to a full-time permanent carrier position with a retroactive start date, and provided him with the associated back pay and benefits.
California: A police officer with the U.S. Army returned from deployment as a Reservist to find that his colleagues had been placed in a new position description and promoted, while he had not. After OSC intervened, the agency agreed to promote him to a higher grade, retroactive to the date he would have been promoted had he not been on active duty; provide him with the back pay associated with the retroactive promotion; and place him in the correct position description and command structure with his colleagues.
Massachusetts: Contrary to USERRA, the U.S. Air Force extended an employee's time-in-grade requirement for a promotion by the length of his absence for Army Reserve duty. OSC's intervention resulted in the agency agreeing to promote the Reservist to the higher grade, retroactive to the date he would have been promoted had he not been deployed, and to provide him with the back pay associated with the retroactive promotion.
Oregon: Upon her return from active duty, an Air Force Reservist was told by the Department of Energy that it would not promote her because of her military service. OSC negotiated a resolution wherein the agency agreed to promote the Reservist to a higher grade, retroactive to the date she would have been promoted had she not been on active duty; issue her corresponding back pay based on the retroactive promotion date; and reassign her to another organization within the agency that would enable her to receive the additional experience and training necessary to be promoted to the next higher grade level.
Iowa: A National Guard member was hired into a new position with the U.S. Postal Service, which required her to complete a 90-day probationary period. When she returned from Guard duty, the agency told her that her probationary period would be extended by the number of days she was absent for such duty. OSC contacted the agency, and it resolved the matter by crediting her Guard duty toward her probationary period.
California: A member of the Coast Guard Reserve and employee with the Food and Drug Administration took two weeks off to fulfill her annual drill responsibilities. Upon her return to work, the claimant's supervisory duties had been greatly reduced and several of her subordinates had been reassigned to different departments. After OSC began its investigation, the claimant's supervisor addressed her concerns, and the case was resolved to her satisfaction.
New York: A member of the Air Force Reserve and an employee at the VA alleged that his within-grade pay increase was improperly delayed by the amount of time he spent on military duty. During OSC's investigation, the agency voluntarily resolved the matter by backdating the claimant's pay increase and providing applicable back pay.
Virginia: A member of the Virginia National Guard and a housekeeper with the VA was promoted to an electrician position, pending a physical exam. Shortly after accepting the position, the claimant took military leave and was unable to complete the required physical exam until the day of his return, one day after his official start date for the new position. The delay in scheduling his physical exam caused his “effective” start date for pay purposes to be delayed. After OSC intervened, the agency agreed to correct the effective date to the date he would have started the new position had he not been prevented from taking the physical exam due to his military service.
Ohio: The claimant is an employee of the VA and member of the Army National Guard. While on military duty, he became eligible for his within-grade pay increase, but the pay increase was not processed because he was absent performing military service. Additionally, he had requested to use a specific number of hours of paid military leave and annual leave to cover his absence, which the agency paid out, but when he returned to work, the agency audited his payroll account and erroneously believed that he had been “overpaid,” generating a debt of roughly $2,000. The agency began garnishing his paychecks to recover the amount. After OSC contacted the agency, it waived the debt, processed his within-grade pay increase and made it retroactive, and provided him with back pay and compensation for the garnished wages.
Monetary Relief and Other Benefits:
Georgia: The claimant was a U.S. Postal Service employee and a member of the Air Force Reserve. Prior to performing two separate periods of military duty, he submitted a leave request slip to use 16 hours of sick leave and 64 hours of paid military leave for various absences from work. However, he was instead charged as being AWOL for both the 16 hours of sick leave and 64 hours of military duty. After OSC became involved, the agency removed the AWOL charge from his records and paid him for the appropriate periods of sick leave and military leave.
Ohio: A member of the Army Reserve employed by the Pentagon Force Protection Agency was selected for a three-year Active Guard and Reserve position. Although the claimant filled out a detailed checklist before his tour began, the agency did not take appropriate actions to stop the claimant's health insurance, life insurance, and TSP loan repayment deductions, resulting in a debt being collected from him. After OSC opened an investigation, the agency resolved his benefits issues to his satisfaction.
Maryland: A member of the Coast Guard Reserve was an employee with the VA. When he was called to active duty, he requested to use his remaining paid military leave and then be placed on military leave without pay. Upon his return, he discovered that the agency had not followed his instructions, and he unsuccessfully attempted to resolve the issue with various agency officials. After OSC intervened, the parties reached a settlement agreement whereby the claimant received proper credit for his military service and monetary compensation.
Oregon: A member of the National Guard was working at a U.S. Army chemical depot that was scheduled to close. As a result of his Guard duty, his job-transfer options were more limited than those of his colleagues, he was denied access to a Priority Placement Program (PPP), and he was scheduled to be discharged several months before the depot actually closed (in contrast to many of his co-workers).
OSC contacted the agency, which agreed to allow the service member to remain employed at the depot until it closed and placed him in the PPP with his colleagues.
New York: A member of the Army Reserve was a supervisory employee for the Transportation Security Administration. He was deployed on active duty, for which he used 15 days of paid military leave. However, the agency later rescinded the leave as being impermissibly issued. The following year, he again was denied paid leave. OSC contacted the agency, which agreed to adjust his leave balance and restore the pay that it had erroneously collected.
Virginia: A commissioned officer with the Army Reserve JAG Corps was employed by the Social Security Administration. After receiving his commission, he went on military duty to satisfy a mandatory six-month training requirement. He requested to use 30 days of paid military leave to cover part of that period, but only 15 days were approved based on a technicality. To make up the difference, he used 15 days of annual leave. After OSC intervened, the agency agreed to permit the use of the additional 15 days of military leave and restore his annual leave.
South Carolina: A veteran with 34 years of combined active duty and Reserve military service was denied full credit for that service toward her federal civil service retirement from her training specialist position with the U.S. Army. As a result, seven months after she retired, her annuity payments were cut in half despite her efforts to ensure that the time was properly credited. After OSC contacted the agency, it agreed to take the necessary steps to restore her full retirement credit and ensure that she again received the full annuity payments to which she is entitled.
California: The claimant was a member of the Army Reserve who worked at the VA. She requested to use three days of paid military leave and two days of annual leave for the days she was performing military duty. However, because she could not produce documentation of her military duty to the agency's satisfaction, she was instead placed on leave without pay. After OSC contacted the agency regarding her allegations, the agency took steps to provide her with the leave to which she was entitled.
Germany: The claimant was performing Reserve duty in Germany while on unpaid leave from his civilian job with the U.S. Border Patrol. During this time, he was offered a civilian position with the U.S. Army in Germany. However, the Army considered him a “local hire” and therefore refused to provide him with a living quarters allowance, even though his presence in Germany was due to his military orders. While OSC was investigating, the Army offered to provide him a living quarters allowance, and the claim was resolved.
North Carolina: An Army Reservist and civilian employee for the U.S. Army was scheduled to be relocated from Georgia to North Carolina, and to receive Relocation Incentive pay after 12 months of service at the new location. After relocating her family and purchasing a residence, she was put on active duty orders for one year. Upon her return, she was initially denied the Relocation Incentive pay. During OSC's investigation, the Acting Commanding General approved her request for Relocation Incentive pay, and her claim was resolved.
California: The claimant was an Army Reservist and a civilian police officer with the U.S. Marine Corps. While he was away on 15 days of Reserve training, his orders were extended by 36 days, exhausting his paid leave at his civilian job.
The agency placed him on regular leave without pay, not leave without pay for military duty, which could adversely affect his seniority and benefits. After OSC opened its investigation, the agency properly characterized the claimant's leave without pay to ensure that he would receive his full entitlements under USERRA.
Texas: The claimant, whose Navy Reserve unit is located in Florida, is employed in Texas by the U.S. Army. Because of the distance between his Reserve unit and his civilian position, the claimant requires significant travel time to perform military duty. Although he provided notice of his military drill dates as well as an airline ticket receipt for his travel to his unit in Florida, the agency only approved military leave for the dates the claimant was performing military duty, and not for his day of travel. After OSC contacted the agency and explained USERRA's guidance regarding travel to and from military duty, the agency credited the claimant with the appropriate military leave and the claim was resolved.
California: A Navy Reservist employed with the Department of Defense went on active duty for six months, and upon his return, the agency improperly coded his absence as regular (non-military) leave without pay. As a result of this coding error, one week each of annual leave and sick leave were deducted from his paid leave balances. After OSC intervened, the agency re-coded his military leave and adjusted his leave balances, resolving his claim.
Alabama: An Air Force Reservist employed with the Defense Contract Management Agency alleged that the agency improperly charged him for health insurance premiums while he was deployed and was garnishing his paycheck to recover the amount of the premiums. During OSC's investigation, the agency resolved the claim by reimbursing him for the erroneous collections.
[End of Appendix 1]
Appendix 2: U.S. Office of Special Counsel: Summaries of OSC Mediated USERRA Settlements During the Demonstration Project:
Below are summaries of the facts and resolutions reached in cases resolved through OSC's USERRA mediation program during the Demonstration Project. The USERRA mediation program was developed through a thorough dispute systems design process. The program generated positive outcomes in 18 of 20 cases; those 18 cases are described below. To protect the confidentiality of federal agency mediations, as required by the Administrative Dispute Resolution Act of 1996, we have omitted the identities of the claimants and the agencies in these mediations.
Mediation Case Summary #1: A member of the Air Force Reserve filed a claim of USERRA discrimination, alleging that the agency refused to afford him rights and benefits of employment given to other employees. The employee asserted that the most blatant example of this related to the storage of his belongings. The employee was on a one-year assignment for his agency when he was deployed to Afghanistan. Before leaving, the employee communicated with the agency to determine how to arrange for storage of his belongings between the time his deployment ended and the training for his next agency assignment began. The agency responded that there was no administrative mechanism in place to provide for the storage of the claimant's belongings and, consequently, the claimant would be required to cover the cost. He noted that other employees transferring from one assignment to another without an intervening deployment would have been entitled to storage of belongings during training.
The claimant requested relief in the form of reimbursement of his storage expenses, movement of his belongings to the agency's storage facility for the remainder of his training period before the next scheduled assignment, and modification of agency policies to accommodate the needs of reservists who are faced with similar challenges. Through mediation, the claimant shared his perspective on the difficulties he encountered with navigating the bureaucracy before, during, and after deployment. The agency acknowledged these challenges and provided perspective on the changes it was making to ensure that employees who are deployed would not face similar challenges upon their return. Settlement was achieved, with the claimant agreeing to withdraw the claim in exchange for the agency reimbursing the claimant for the storage costs incurred upon return from deployment, and disseminating information to the agency's 20,000 employees about a change to agency policy instituting administrative mechanisms to cover the storage costs for military reservists.
Mediation Case Summary #2: A member of the Army Reserve filed a claim of USERRA discrimination, alleging that he paid health insurance premiums while deployed because he was informed that if he did not, he would lose his insurance and would not be able to get it back right away. Additionally, the agency included notices in the claimant's file that indicated that he was absent for an extended period due to illness or injury, rather than deployment. The claimant was concerned that these notices would negatively impact his promotion and career prospects. The claimant requested relief in the form of a refund of the insurance premium payments and the removal of the extended absence forms from his file. Through mediation, the claimant and the agency discussed what had transpired and considered the claimant's concerns about the record in his file. The agency also heard the claimant's concern that he did not want this to happen to any other service members in the future. Settlement was achieved, with the claimant agreeing to withdraw the claim in exchange for the agency removing the documents in the claimant's file regarding extended absence due to illness or injury; reimbursing the claimant for health insurance payments he made while deployed; and changing the agency-wide form so that “extended absence” forms would now include a category for extended absence to perform military service.
Mediation Case Summary #3: A member of the Army Reserve filed a claim of USERRA discrimination. The claimant alleged that his agency violated USERRA when it did not provide him the training and tools that he needed to reintegrate into his job after his deployment. The claimant asserted that because of this, his work performance suffered and he was subjected to reprimands and lowered performance evaluations. The claimant, who was not working for a period while this matter was being addressed, requested relief in the form of a return to work, the removal of reprimands from his file, adjustments to his evaluations, and the provision of training and tools to help him to do his work effectively. The agency shared the claimant's interest in getting him back to work and creating a work environment that promoted success for the employee and quality service for the clientele that the agency serves. With these goals as guidelines, the claimant and the agency brainstormed ideas for a mutually beneficial solution.
Settlement was achieved, with the claimant agreeing to withdraw the claim in exchange for the agency returning the claimant to work under a different supervisor, providing training and tools for the claimant to do his job, providing the claimant a new performance plan, establishing a clean performance record for the claimant, and considering the claimant's within-grade increase within thirty days of the claimant's return to work.
Mediation Case Summary #4: A member of the Air National Guard filed a claim of USERRA discrimination alleging that the agency failed to provide reasonable accommodations for his military-related injury. Although he loved his job, his injury prevented him from continuing. He had requested retraining for another position within the agency that he could do, but his supervisor instead advised him to submit a workers' compensation claim and took the claimant off duty while the claim was pending. Unfortunately, his workers' compensation claim was denied, and the claimant was then forced to use personal leave when his injuries prevented him from performing his duties. In this case, mediation with a collaborative educational component reaped tremendous rewards. The agency believed that since it did not have a job in the claimant's city that would suit him, its USERRA obligation was met. Once the agency's attorney spoke with an OSC USERRA subject matter expert, the agency was comfortable creating a new job in another city within the broader region that the claimant could do with his physical limitations. At mediation, both agency counsel and the claimant's upper-level supervisor communicated how much the agency valued the claimant and his work. Through mediation, the location, pay, and nature of the new job were negotiated to fit the needs of both parties (the agency was under severe budget restrictions; the claimant preferred to stay within the state and did not want to lose pay while training in a new job series). The claimant left the process extremely excited about his new position.
Mediation Case Summary #5: A member of the Navy Reserve filed a claim of USERRA discrimination, asserting that his supervisor approved his leave on the condition that he continue to do his agency work while deployed. The claimant asserted that the most blatant example of this was when his supervisor required him to leave a military base, while on Navy orders, and travel to a neighboring town to perform civilian duties. The claimant requested relief in the form of payment for time spent completing civilian work, which he typically had to complete very late at night or very early in the morning due to the time differences. Through mediation, the claimant and the agency discussed their concerns, reviewed the claimant's documentation to support his claim, and explored creative options to address the claimant's concerns. Both the claimant and the agency wanted to put this matter behind them and move forward in a productive way. Settlement was achieved, with the claimant agreeing to withdraw the claim in exchange for the agency compensating the claimant for his civilian work during his deployment in the amount of $132,733. Given that the supervisor at issue had retired, the claimant was excited to move forward with new management, and expressed contentment with the settlement. The claimant noted that he appreciated the “refreshing, open, and frank discussion” in mediation.
The agency expressed appreciation for the opportunity to work collaboratively with the claimant to resolve the matter early in the process.
Mediation Case Summary #6: A member of the Navy Reserve filed a claim of USERRA discrimination and hostile work environment. The claimant asserted that his first-line supervisor, upon learning of the claimant's impending six-month absence due to military duty, rated the claimant “unacceptable” on a mid-year performance appraisal. The claimant noted that he had never encountered performance issues in the past. He also recounted difficulties concerning his leave requests for military duty. The claimant requested relief in the form of reassignment to another office within the agency (away from his present supervisor) and review and adjustment of the appraisal rating (from “unacceptable” to “meets expectations”). Through mediation, the claimant and the agency discussed each other's concerns, learned more about the applicability of USERRA to the claim, and brainstormed ideas for finding a mutually acceptable resolution. Settlement was achieved, with the claimant agreeing to withdraw the claim in exchange for the agency changing the claimant's performance rating, working with the claimant and an HR representative to create a professional development plan for the claimant, approving the claimant's participation in an upcoming management class, and approving the claimant's detail to another office within the agency.
Mediation Case Summary #7: A member of the Air National Guard filed a claim of USERRA discrimination, asserting that his first-line supervisor lowered his performance ratings in response to a recent deployment to Afghanistan. The claimant had received the highest rating in four of five categories in his previous rating cycle. Yet upon returning to work after deployment, he was rated lower in four of five performance areas, despite receiving no feedback on performance-related concerns in the timeframe between the two reviews. The claimant requested relief in the form of improved communication with the first-line supervisor, a process by which the first-line supervisor would flag performance concerns when they are identified, and justification for the lower performance ratings. Through mediation, the claimant and the agency discussed perspectives on and concerns about the performance review and engaged in open dialogue about how to improve future communication and enhance the working relationship. Settlement was achieved, with the claimant agreeing to withdraw the claim in exchange for the agency reassessing the claimant's performance appraisal and providing justification for his rating in each performance area, alerting the claimant to any future performance concerns in a timely fashion, conducting communication “check-ins” on a regular basis, and instituting a process for the claimant to raise and manage performance concerns with those under his supervision.
Mediation Case Summary #8: A member of the Navy Reserve filed a claim of USERRA discrimination. The claimant asserted that his agency violated USERRA when it terminated his “excepted service” position while he was deployed. Although the claimant signed a Memorandum of Understanding providing that his position was a time-limited “not-to-exceed” appointment, his expectation was that the position would be renewed because the agency had done so the past two years.
The claimant requested relief in the form of reinstatement of his position so that he could continue to add value to the federal workforce. The agency asserted that its hands were tied on reinstatement because the funding for the position (which came from another agency) was no longer available. Through mediation, the claimant and the agency brainstormed ideas for a mutually beneficial solution. Settlement was achieved, with the claimant agreeing to withdraw the claim in exchange for the agency arranging and covering the costs of the claimant's attendance at a week-long Career Seminar for Military Personnel, providing USERRA training for agency personnel, drafting and making available a letter of recommendation for the claimant, and providing a lump sum payment of $60,000 to the claimant.
Mediation Case Summary #9: A member of the U.S. Army Reserve filed a USERRA discrimination claim alleging that since he joined the U.S. Army Reserve he had been denied the opportunity to advance at his job. The claimant alleged that he had not been able to move beyond part-time status for over fifteen years; during this time he had been deployed and on active duty four times. He also requested a retroactive Cost of Living Allowance (COLA). According to the claimant, his managers told him that he had not been promoted because he was deployed when the full-time jobs were announced, and that his absence also prevented him from qualifying for the COLA others received. The claimant also raised issues concerning leave and military leave related to his military service. Mediation in this case entailed a rich discussion between the parties along with USERRA education. OSC mediators worked with the parties to help them understand the USERRA requirements pertaining to promotions and COLA, and the complaint was fully resolved. In particular, the claimant was promoted into a new, full-time job—a goal he had sought for over a decade.
Mediation Case Summary #10: A member of the Air Force Reserve filed a claim of USERRA discrimination. The claimant, who had always received high performance ratings, alleged that the agency lowered his performance rating from “exceeds” to “meets” in four categories, with the only major difference from the prior year's performance period being that the claimant was away on active duty for four months during that year. In mediation, the claimant was able to express a sense of feeling “overlooked” after returning from deployment. After actively contributing as part of his military unit, he was not receiving enough assignments in his civilian job to keep him busy, despite requests for more to do. The claimant requested relief in the form of a more accurate performance rating and a different performance rater for the future. Through mediation, the agency representative was able to communicate that the agency viewed the claimant as a highly effective and valuable employee. The agency acknowledged that it had erred by taking the claimant's absence from the office due to deployment as a negative factor in the performance evaluation rating. After the discussion, the claimant felt comfortable with his supervisor and no longer felt it necessary to change performance raters. Thus settlement, education in USERRA's requirements, and a renewed working relationship were all achieved through the mediation. In addition, the claimant's performance rating was increased from “highly effective” to “exemplary” and he received the corresponding performance award bonus of $425.
Mediation Case Summary #11: A member of the Army National Guard filed a claim of USERRA discrimination and hostile work environment. The claimant alleged that management would not change his work schedule to accommodate his weekend drill schedule without a memorandum from his military unit. In addition, the claimant alleged that his military leave wages from a previous tour of duty were wrongly garnished by the agency. The claimant requested relief in the form of being able to request schedule changes without a memo from his unit, an end to the alleged anti-military attitude in the workplace, and repayment of the military leave wages wrongly garnished. Mediation allowed both the claimant and his manager to spend concentrated time listening to and better understanding each other's concerns. They cleared up misunderstandings, demonstrated a willingness to listen and communicate, and re-established trust. Both parties expressed relief at the dissipation of tension and the repaired working relationship. In the settlement, the agency agreed to make a cash payment of $250, return the wrongfully garnished pay, and arrange for USERRA law training for managers at its facility. Moreover, the claimant and his supervisor agreed upon a process for discussing and integrating his drill and work schedules, with the help of higher-level supervisors if necessary.
Mediation Case Summary #12: A veteran who had returned from serving in the Air Force filed a claim of USERRA discrimination and hostile work environment. The claimant alleged that upon reemployment he was given little to no work, was given a letter of reprimand, and was generally treated poorly by his supervisor. Settlement was achieved, with the claimant agreeing to withdraw the claim in exchange for the agency assigning the claimant to an alternate supervisor, holding disciplinary action in abeyance, providing a letter of referral, providing the claimant with a mentor, and providing supervisory training to the claimant's supervisor.
Mediation Case Summary #13: A member of the Army Reserve filed a claim of USERRA discrimination, alleging that he was not selected for promotion by the agency because he was away on active duty. While the claimant was on active duty, though, other employees and subordinates in similar departmental roles, who were hired around the time the claimant was hired, received promotions to higher grade levels. The claimant requested relief in the form of retroactive promotions with back pay. Through mediation, the claimant and the agency's counsel and management representatives were able to discuss misunderstandings of facts and misapplications of USERRA law in a timely and efficient manner, without a lengthy investigation and prosecution of the matter. Settlement was achieved, with the claimant agreeing to withdraw the claim in exchange for the agency promoting the claimant to a higher position and providing training opportunities to the claimant to enhance his professional development.
Mediation Case Summary #14: A member of the Army National Guard filed a claim of USERRA discrimination, alleging that he was incorrectly charged military leave and annual leave whenever he had a drill weekend. Because of this, the claimant had lost 110 hours of annual leave. Additionally, the claimant asserted that he was not earning the appropriate number of hours of sick leave in each pay period.
The claimant requested that his annual leave for the drill periods in question be restored and that his sick leave balance be reconciled to account for what he should have earned. The claimant also wanted to make sure that this would not happen to anyone in the future. Through conciliation, the agency was educated about leave provisions under USERRA. Settlement was achieved, with the claimant agreeing to withdraw his complaint in exchange for the agency restoring his annual leave and adjusting his sick leave allocation.
Mediation Case Summary #15: A member of the Air Force Reserve filed a claim of USERRA discrimination and hostile work environment. The claimant's primary concerns centered on the nature and level of his job duties. Through mediation, the claimant and the agency representatives were able to discuss the claimant's underlying concern about his career advancement. They discussed ways to increase open communication between them, and the agency agreed to help the claimant identify and discuss with management positions that might be a good match for him. The agency also agreed to advise higher-level supervisors that the claimant's annual appraisal should be based solely on his performance as a civilian employee and must not be affected by his military service. Finally, the agency increased the claimant's annual leave as required by a Presidential directive.
Mediation Case Summary #16: A member of the Navy Reserve filed a claim of USERRA discrimination, asserting that his agency did not reemploy him in his temporary manager position upon his return to work; instead, another employee was still in his prior position. However, the facility where the claimant had worked was in the process of being closed while the complaint was processed. Given this, the claimant requested relief in the form of monetary compensation. Through mediation, the claimant and the agency discussed their concerns and heard from a USERRA subject matter expert to better understand the applicability of USERRA to this matter. Settlement was achieved, with the claimant agreeing to withdraw the claim in exchange for the agency changing the termination date of the claimant's employment (to coincide with the closure of the facility), providing pay and employment benefits consistent with the change of termination date, providing one week of severance pay, and placing the claimant on a reemployment priority list for one year from the date of the agreement.
Mediation Case Summary #17: A member of the National Guard filed a claim of USERRA discrimination, alleging that after he sustained a service-connected disability, his agency did not work with him effectively to find a position to accommodate him. The agency asserted that it did attempt to accommodate the claimant by providing agency job opportunities, but the claimant rejected these opportunities. In the USERRA claim, the claimant requested relief in the form of back pay. Through mediation, the parties discussed the issues and explored the claimant's interests in remaining close to home and in focusing on restoring his health. In light of this discussion, settlement was achieved, with the claimant agreeing to withdraw the claim in exchange for the agency providing him with two additional months of pay, followed by four months of leave without pay, at the end of which the claimant would separate from the agency.
The agency also agreed to provide the claimant with resources and assistance to complete workers' compensation and federal disability retirement applications and to arrange training in USERRA responsibilities (provided by OSC) for key staff in this 45,000-person agency.

Mediation Case Summary #18: A former member of the Army Reserve filed a claim of USERRA discrimination, alleging that his agency refused to provide him with military leave and harassed him regarding his military service. The employee, who retired from the Army with a permanent disability, planned to submit a federal disability retirement package but was concerned that the military leave issue would not be resolved before he left the agency. The claimant sought relief in the form of payment for fifteen days of military leave and, if possible, promotion from a GS-11 to a GS-12. Through the process, the claimant learned about a debt he had incurred because the agency had paid his health care premiums for a period of time. In mediation, the parties explored the applicability of USERRA to the claim and the primary needs of the parties. The claimant was eager to complete his retirement application and focus on the needs of his family. The agency was eager to resolve the dispute and fill the claimant's position at an already overburdened agency facility. OSC mediated a settlement in which the claimant agreed to voluntarily separate from the agency in exchange for the agency paying the claimant a lump sum of $2,656 to cover the health care premium debt (of which the claimant was unaware until the mediation), paying the claimant for fifteen days of military leave, and providing the claimant with assistance on his federal disability retirement application.

[End of section]

Appendix VI: GAO Contact and Staff Acknowledgments:

GAO Contact: Yvonne D. Jones at (202) 512-2717 or at jonesy@gao.gov:

Staff Acknowledgments: In addition to the contact named above, Signora May (Assistant Director); Peter Beck (Analyst in Charge); Dawn Bidne, Karin Fangman, Paul Kinney, Donna Miller, Cynthia Saunders, Douglas Sloane, and Lou V. B. Smith made key contributions to this report.

[End of section]

Related GAO Products:

Transitioning Veterans: Improved Oversight Needed to Enhance Implementation of Transition Assistance Program. GAO-14-144. Washington, D.C.: March 5, 2014.

Veterans' Reemployment Rights: Department of Labor and Office of Special Counsel Need to Take Additional Steps to Ensure Demonstration Project Data Integrity. GAO-12-860R. Washington, D.C.: September 10, 2012.

Veterans' Reemployment Rights: Steps Needed to Ensure Reliability of DOL and Special Counsel Demonstration Project's Performance Information. GAO-11-312R. Washington, D.C.: June 10, 2011.

Servicemember Reemployment: Agencies Are Generally Timely in Processing Redress Complaints, but Improvements Needed in Maintaining Data and Reporting. GAO-11-55. Washington, D.C.: October 22, 2010.

Military Personnel: Improvements Needed to Increase Effectiveness of DOD's Programs to Promote Positive Working Relationships between Reservists and Their Employers. GAO-08-981R. Washington, D.C.: August 15, 2008.

DOD Financial Management: Adjudication of Butterbaugh Claims for the Restoration of Annual Leave or Pay. GAO-08-948R. Washington, D.C.: July 28, 2008.
Military Personnel: Federal Agencies Have Taken Actions to Address Servicemembers' Employment Rights, but a Single Entity Needs to Maintain Visibility to Improve Focus on Overall Program Results. GAO-08-254T. Washington, D.C.: November 8, 2007.

Military Personnel: Considerations Related to Extending Demonstration Project on Servicemembers' Employment Rights Claims. GAO-08-229T. Washington, D.C.: October 31, 2007.

Military Personnel: Improved Quality Controls Needed over Servicemembers' Employment Rights Claims at DOL. GAO-07-907. Washington, D.C.: July 20, 2007.

Office of Special Counsel Needs to Follow Structured Life Cycle Management Practices for Its Case Tracking System. GAO-07-318R. Washington, D.C.: February 16, 2007.

Military Personnel: Additional Actions Needed to Improve Oversight of Reserve Employment Issues. GAO-07-259. Washington, D.C.: February 8, 2007.

Military Personnel: Federal Management of Servicemember Employment Rights Can Be Further Improved. GAO-06-60. Washington, D.C.: October 19, 2005.

U.S. Office of Special Counsel's Role in Enforcing Law to Protect Reemployment Rights of Veterans and Reservists in Federal Employment. GAO-05-74R. Washington, D.C.: October 6, 2004.

[End of section]

Footnotes:

[1] Over each of the next 4 years, the Department of Defense estimates that approximately 170,000 to 185,000 active duty servicemembers will separate from the military and about 60,000 National Guard and Reserve members will be demobilized and deactivated from active duty. GAO, Transitioning Veterans: Improved Oversight Needed to Enhance Implementation of Transition Assistance Program, GAO-14-144 (Washington, D.C.: March 5, 2014).

[2] In addition to those serving in the armed forces and the Army and Air National Guards (when engaged in active duty for training, inactive duty training, or full-time National Guard duty), USERRA covers the commissioned corps of the Public Health Service and other persons designated by the President in time of war or national emergency. Pub. L. No. 103-353, 108 Stat. 3149 (Oct. 13, 1994), codified at 38 U.S.C. §§ 4301-4335. USERRA is the most recent in a series of laws protecting veterans' employment and reemployment rights going back to the Selective Training and Service Act of 1940. Pub. L. No. 783, 54 Stat. 885, 890 (Sept. 16, 1940).

[3] Pub. L. No. 111-275, § 105, 124 Stat. 2864, 2868-70 (Oct. 13, 2010).

[4] OSC is an independent investigative and prosecutorial agency with the primary mission of protecting the employment rights of federal employees and applicants for federal employment.

[5] Pub. L. No. 108-454, § 204, 118 Stat. 3598, 3606-08 (Dec. 10, 2004). Under VBIA, the demonstration project was originally scheduled to end on September 30, 2007, but through a series of extensions ran through December 31, 2007.

[6] See GAO, Military Personnel: Improved Quality Controls Needed over Servicemembers' Employment Rights Claims at DOL, GAO-07-907 (Washington, D.C.: July 20, 2007).

[7] GAO-07-907.

[8] 5 U.S.C. § 2302.

[9] GAO, Veterans' Reemployment Rights: Steps Needed to Ensure Reliability of DOL and Special Counsel Demonstration Project's Performance Information, GAO-11-312R (Washington, D.C.: June 10, 2011).

[10] OPM is acting as survey administrator.

[11] GAO, Veterans' Reemployment Rights: Department of Labor and Office of Special Counsel Need to Take Additional Steps to Ensure Demonstration Project Data Integrity, GAO-12-860R (Washington, D.C.: September 10, 2012).
[12] There are 13 prohibited personnel practices, including discrimination, retaliation, and the granting of an unauthorized preference or improper advantage. 5 U.S.C. § 2302.

[13] GAO-11-312R.

[14] DOL is responsible for investigating claims alleging that a federal agency failed to apply veterans' preference in hiring or during a reduction-in-force.

[15] Six DOL investigators who worked on demonstration project cases are no longer active investigators. In addition, DOL has 81 other investigators (for a total of 106) qualified to investigate USERRA claims, and DOL officials told us these investigators were available to conduct demonstration case investigations, as needed. According to OSC, one of the investigators who worked on demonstration project cases is no longer an active OSC investigator.

[16] OSC did not assign and track its demonstration case assignments in a way that enabled it to report the exact number of cases assigned per investigator per fiscal year. We therefore calculated the number of cases opened per OSC USERRA Unit investigator or attorney as a proxy. These figures were calculated by using case tracking data to identify the number of cases assigned to OSC over time, alongside corresponding employment data, to determine the number of investigators employed during specific time periods. According to OSC officials, OSC's USERRA Unit Chief and one other employee did not regularly receive cases to investigate, but were assigned cases on an ad hoc basis. We included these two employees in our calculations because they did receive cases to investigate.

[17] These caseload averages represent the average caseload if the staffing levels had remained constant for the entire fiscal year. However, because OSC had temporary staff and other staff who joined and left the USERRA Unit, we were unable to calculate the exact average caseload per fiscal year.

[18] As mentioned previously, DOL has 106 investigators who can investigate USERRA claims.

[19] DOL has investigated USERRA cases since the law's enactment in 1994 and previously investigated cases under the predecessor Veterans' Reemployment Rights Act.

[20] Pub. L. No. 104-320, 110 Stat. 3870 (Oct. 19, 1996).

[21] Response rates were calculated based on survey information provided by OPM on July 28, 2014. We received updated case information from each agency after this date. As a result, response rates calculated using the case outcome data presented above may differ from those calculated using the information provided by OPM.

[22] GAO-12-860R.

[23] GAO, Veterans' Reemployment Rights: Steps Needed to Ensure Reliability of DOL and Special Counsel Demonstration Project's Performance Information, GAO-11-312R (Washington, D.C.: June 10, 2011).

[24] GAO, Veterans' Reemployment Rights: Department of Labor and Office of Special Counsel Need to Take Additional Steps to Ensure Demonstration Project Data Integrity, GAO-12-860R (Washington, D.C.: September 10, 2012).

[25] The adjusted estimates of agency differences and the F-statistics associated with them are from ordinary least-squares (OLS) regression models that regressed the scores on each item for all respondents on agency (a dummy variable) and, one at a time, on case processing time, discrimination allegation, and favorability of the outcome.
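As an illustration only, a minimal sketch of this kind of model, written in Python with the pandas and statsmodels libraries, might look like the following; the file name, column names, and covariate coding are hypothetical and are not drawn from GAO's actual analysis:

    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical data: one row per survey respondent. "score" is the
    # respondent's rating on a single survey item, "agency" is a dummy
    # variable (1 = DOL, 0 = OSC), and "days_to_close" stands in for
    # case processing time, one of the three covariates that would be
    # controlled for one at a time.
    df = pd.read_csv("survey_responses.csv")

    # Unadjusted difference: regress the item score on agency alone.
    unadjusted = smf.ols("score ~ agency", data=df).fit()

    # Adjusted difference: add a single covariate to the model.
    adjusted = smf.ols("score ~ agency + days_to_close", data=df).fit()

    # The coefficient on "agency" estimates the difference in mean
    # scores between the agencies; each fitted model also reports an
    # F-statistic.
    print(unadjusted.params["agency"])
    print(adjusted.params["agency"], adjusted.fvalue)

In a sketch like this, comparing the agency coefficient before and after adding each covariate shows how much of the raw difference in satisfaction scores is attributable to that factor.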
[26] The adjusted odds ratios for these analyses, in which responses are treated as categorical, are derived from logistic regression models in which the odds of responding favorably are regressed on agency (again a dummy variable, as above) and, one at a time, on case processing time, discrimination allegation, and favorability of the outcome.

[27] Shading in the table indicates differences that are statistically significant at the 0.05 level.

[28] For example, the unadjusted difference between agencies in the mean score on the "staff are courteous" item was 0.57, while the adjusted difference (that is, the difference after taking the outcome of the claim into account) was 0.46. Many of the differences were slightly smaller after adjusting for these different factors, but every one of them remained statistically significant.

[29] Respondents in the "Not Agree" and "Not Satisfied" categories include both those who disagreed or were dissatisfied and those who were in the "Neither" category.

[30] Because of the small size of the sample, we controlled for these three factors one at a time, in separate logistic regression models.

[End of section]

GAO's Mission:

The Government Accountability Office, the audit, evaluation, and investigative arm of Congress, exists to support Congress in meeting its constitutional responsibilities and to help improve the performance and accountability of the federal government for the American people. GAO examines the use of public funds; evaluates federal programs and policies; and provides analyses, recommendations, and other assistance to help Congress make informed oversight, policy, and funding decisions. GAO's commitment to good government is reflected in its core values of accountability, integrity, and reliability.

Obtaining Copies of GAO Reports and Testimony:

The fastest and easiest way to obtain copies of GAO documents at no cost is through GAO's website [hyperlink, http://www.gao.gov]. Each weekday afternoon, GAO posts on its website newly released reports, testimony, and correspondence. To have GAO e-mail you a list of newly posted products, go to [hyperlink, http://www.gao.gov] and select "E-mail Updates."

Order by Phone:

The price of each GAO publication reflects GAO's actual cost of production and distribution and depends on the number of pages in the publication and whether the publication is printed in color or black and white. Pricing and ordering information is posted on GAO's website, [hyperlink, http://www.gao.gov/ordering.htm]. Place orders by calling (202) 512-6000, toll free (866) 801-7077, or TDD (202) 512-2537. Orders may be paid for using American Express, Discover Card, MasterCard, Visa, check, or money order. Call for additional information.

Connect with GAO:

Connect with GAO on Facebook, Flickr, Twitter, and YouTube. Subscribe to our RSS Feeds or E-mail Updates. Listen to our Podcasts. Visit GAO on the web at [hyperlink, http://www.gao.gov].

To Report Fraud, Waste, and Abuse in Federal Programs:

Contact: Website: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm]; E-mail: fraudnet@gao.gov; Automated answering system: (800) 424-5454 or (202) 512-7470.

Congressional Relations:

Katherine Siggerud, Managing Director, siggerudk@gao.gov: (202) 512-4400: U.S. Government Accountability Office: 441 G Street NW, Room 7125: Washington, DC 20548.

Public Affairs:

Chuck Young, Managing Director, youngc1@gao.gov: (202) 512-4800: U.S. Government Accountability Office: 441 G Street NW, Room 7149: Washington, DC 20548.

[End of document]