This is the accessible text file for GAO report number GAO-11-69 
entitled 'Military And Veterans Disability System: Pilot Has Achieved 
Some Goals, but Further Planning and Monitoring Needed' which was 
released on December 6, 2010. 

This text file was formatted by the U.S. Government Accountability 
Office (GAO) to be accessible to users with visual impairments, as 
part of a longer term project to improve GAO products' accessibility. 
Every attempt has been made to maintain the structural and data 
integrity of the original printed product. Accessibility features, 
such as text descriptions of tables, consecutively numbered footnotes 
placed at the end of the file, and the text of agency comment letters, 
are provided but may not exactly duplicate the presentation or format 
of the printed version. The portable document format (PDF) file is an 
exact electronic replica of the printed version. We welcome your 
feedback. Please E-mail your comments regarding the contents or 
accessibility features of this document to Webmaster@gao.gov. 

This is a work of the U.S. government and is not subject to copyright 
protection in the United States. It may be reproduced and distributed 
in its entirety without further permission from GAO. Because this work 
may contain copyrighted images or other material, permission from the 
copyright holder may be necessary if you wish to reproduce this 
material separately. 

United States Government Accountability Office: 
GAO: 

Report to Congressional Committees: 

December 2010: 

Military And Veterans Disability System: 

Pilot Has Achieved Some Goals, but Further Planning and Monitoring 
Needed: 

GAO-11-69: 

GAO Highlights: 

Highlights of GAO-11-69, a report to congressional committees. 

Why GAO Did This Study: 

Since 2007, the Departments of Defense (DOD) and Veterans Affairs (VA) 
have been testing a new disability evaluation system designed to 
integrate their separate processes and thereby expedite veterans’ 
benefits for wounded, ill, and injured servicemembers. Having piloted 
the integrated disability evaluation system (IDES) at 27 military 
facilities, they are now planning for its expansion military-wide. 
A provision of the National Defense Authorization Act for Fiscal Year 2008
required GAO to report on DOD and VA's implementation of policies on
disability evaluations. This report examines: (1) the results of the 
agencies’ evaluation of the IDES pilot, (2) challenges in implementing 
the IDES pilot to date, and (3) whether DOD and VA’s plans to expand 
the IDES adequately address potential future challenges. GAO analyzed 
data from DOD and VA, conducted site visits at 10 military facilities, 
and interviewed DOD and VA officials. 

What GAO Found: 

In their evaluation of the IDES pilot as of February 2010, DOD and VA 
concluded that it had improved servicemember satisfaction relative to 
the existing “legacy” system and met their established goal of 
delivering VA benefits to active duty and reserve component 
servicemembers within 295 and 305 days, respectively, on average. 
While these results are promising, average case processing times have 
steadily increased since the February 2010 evaluation. At 296 days for 
active duty servicemembers, as of August 2010, processing time for the 
IDES is still an improvement over the 540 days that DOD and VA 
estimated the legacy process takes to deliver VA benefits to members. 
However, the full extent of improvement of the IDES over the legacy 
system is unknown because (1) the 540-day estimate was based on a 
small, nonrepresentative sample of cases and (2) limitations in legacy 
case data prevent a comprehensive comparison of timeliness, as well as 
appeal rates. 

Piloting of the IDES has revealed several implementation challenges 
that have contributed to delays in the process, the most significant 
being insufficient staffing by DOD and VA. Staffing shortages were 
severe at a few pilot sites that experienced caseload surges. For 
example, at one of these sites, due to a lack of VA medical staff, it 
took 140 days on average to complete one of the key features of the 
pilot—the single exam—compared with the agencies’ goal to complete 
this step of the process in 45 days. The single exam posed other 
challenges that contributed to process delays, such as exam summaries 
that did not contain sufficient information for VA to determine the 
servicemember’s benefits and disagreements between DOD and VA medical 
staff about diagnoses for servicemembers’ medical conditions. Cases 
with these problems were returned for further attention, adding time 
to the process. Pilot sites also experienced logistical challenges, 
such as incorporating VA staff at military facilities and housing and 
managing personnel going through the process. 

As DOD and VA prepare to expand the IDES worldwide, they have made 
preparations to address a number of these challenges, but these 
efforts have yet to be tested, and not all challenges have been 
addressed. To address staffing shortages and ensure timely processing, 
VA is developing a contract for additional medical examiners, and DOD 
and VA are requiring local staff to develop written contingency plans 
for handling surges in caseloads. However, the agencies lack 
strategies for meeting some key challenges, such as ensuring enough 
military physicians to handle anticipated workloads. They also do not 
have a comprehensive monitoring plan for identifying problems as they 
occur—such as staffing shortages and insufficiencies in medical exams—
in order to take remedial actions as early as possible. 

What GAO Recommends: 

GAO is making several recommendations to improve DOD and VA’s planning 
for expansion of the new disability evaluation system, including 
developing a systematic monitoring process and ensuring that adequate 
staff is in place. DOD and VA generally concurred with GAO’s 
recommendations and provided technical comments that GAO incorporated 
into the report as appropriate. 

View [hyperlink, http://www.gao.gov/products/GAO-11-69] or key 
components. For more information, contact Daniel Bertoni at (202) 512-
7215 or bertonid@gao.gov. 

[End of table] 

Contents: 

Letter: 

Background: 

Pilot Evaluation Results Are Promising, but the Degree of Improvement 
Achieved Is Unknown: 

Pilot Sites Experienced Several Challenges: 

DOD and VA Expansion Plans Address Some Though Not All Challenges: 

Conclusions: 

Recommendations for Executive Action: 

Agency Comments and Our Evaluation: 

Appendix I: Objectives, Scope, and Methodology: 

Appendix II: IDES Pilot Processing Times for Reserve Component 
Servicemembers: 

Appendix III: Comments from the Department of Defense: 

Appendix IV: Comments from the Department of Veterans Affairs: 

Appendix V: GAO Contact and Staff Acknowledgments: 

Tables: 

Table 1: Military Treatment Facilities Piloting the IDES: 

Table 2: Percentage of Legacy Cases with Referral and Appeal Dates, by 
Military Service: 

Table 3: Percentage of Legacy Cases with Data Used for DOD 
Comparisons, by Military Service: 

Table 4: Selected Characteristics of IDES Pilot Sites Visited: 

Table 5: Percentage of Legacy Cases with Data Used for Comparison of 
Time in Active Duty, by Military Service: 

Figures: 

Figure 1: Overview of the Legacy and IDES Processes: 

Figure 2: Timeliness Goals for the Steps of the IDES Process: 

Figure 3: Average Case Processing Times and Changes in Active Caseload 
by Location at Least 1 Year After Implementation and in August 2010: 

Figure 4: Active Duty IDES Case Processing Times by Service, as of 
August 29, 2010: 

Figure 5: Percentage of IDES Active Duty Cases Completed in 295 Days 
or Less by Service, as of February 2010: 

Figure 6: Army Servicemember IDES and Legacy Appeal Rates, as of early 
2010: 

Figure 7: Single Exam Processing Time for Active Duty Servicemembers 
at IDES Pilot Sites, as of August 29, 2010: 

Figure 8: MEB Processing Times for Active Duty Servicemembers at IDES 
Pilot Sites, as of August 29, 2010: 

Figure 9: Informal PEB Processing Times for Active Duty Servicemembers 
in the IDES Pilot, by Military Service, as of August 29, 2010: 

Figure 10: Average Cases per DOD Board Liaison at IDES Pilot Sites: 

Figure 11: Average Number of Days to Deliver VA Benefits for Reserve 
Component Servicemembers, by Military Service, as of August 29, 2010: 

Figure 12: Percentage of Cases Meeting 305-Day Goal for Delivery of VA 
Benefits to Reserve Component Servicemembers, by Military Service, as 
of February 2010: 

Figure 13: Average Number of Days to Complete Single Exams for Reserve 
Component Servicemembers, by IDES Pilot Site, as of August 29, 2010: 

Figure 14: Average Number of Days to Complete MEB Documentation for 
Reserve Component Servicemembers, by IDES Pilot Site, as of August 29, 
2010: 

Figure 15: Average Number of Days to Complete the Informal PEB for 
Reserve Component Servicemembers, by Military Service, as of August 
29, 2010: 

Abbreviations: 

DOD: Department of Defense: 

IDES: integrated disability evaluation system: 

IT: information technology: 

MEB: medical evaluation board: 

NDAA: National Defense Authorization Act: 

PEB: physical evaluation board: 

PTSD: posttraumatic stress disorder: 

VA: Department of Veterans Affairs: 

VBA: Veterans Benefits Administration: 

VHA: Veterans Health Administration: 

VTA: Veterans Tracking Application: 

WWCTP: Office of the Deputy Under Secretary of Defense for Wounded 
Warrior Care & Transition Policy: 

[End of section] 

United States Government Accountability Office: 
Washington, DC 20548: 

December 6, 2010: 

Congressional Committees: 

As of October 2010, over 40,000 servicemembers had been wounded in the 
wars in Iraq and Afghanistan. After receiving medical treatment, 
many wounded servicemembers must navigate a complex disability 
evaluation system that begins with the Department of Defense (DOD) 
determining whether they are medically fit to continue their military 
service. If they are found unfit, servicemembers continue through the 
system to obtain a determination of their eligibility for military 
disability benefits. Once servicemembers are discharged from the 
military, they may also be eligible to receive disability benefits 
from the Department of Veterans Affairs (VA), but they must first 
undergo an entirely separate VA disability evaluation process. A 
series of articles in 2007 by The Washington Post concerning 
conditions at Walter Reed Army Medical Center, and subsequent reports 
from numerous high-level commissions and review groups, highlighted 
problems with the DOD and VA disability evaluation systems.[Footnote 
1] These included long delays, duplication in DOD and VA processes, 
confusion among servicemembers, and distrust of systems regarded as 
adversarial by servicemembers and veterans. 

In response to these concerns, DOD and VA jointly designed a new 
disability evaluation system that integrates DOD and VA processes, 
with the goal of expediting the delivery of benefits to 
servicemembers. DOD and VA began pilot testing the integrated 
disability evaluation system (IDES) in November 2007 at three 
Washington, D.C., area military treatment facilities and, by March 
2010, added 24 more facilities to the pilot. DOD and VA are now 
planning to expand the piloted system to 28 additional facilities, as 
a first step toward replacing the military's existing--or "legacy"--
disability evaluation system with the IDES worldwide. 

In January 2008, Congress enacted the National Defense Authorization 
Act for Fiscal Year 2008 (NDAA 2008) requiring DOD and VA, to the 
extent feasible, to jointly develop and implement a comprehensive 
policy on improvements to the care, management, and transition of 
recovering servicemembers, including improvements to the agencies' 
disability evaluation systems.[Footnote 2] The NDAA 2008 also required 
GAO to report on the progress DOD and VA have made in developing and 
implementing the comprehensive policy.[Footnote 3] In agreement with 
cognizant congressional staff, we reviewed DOD and VA's progress in 
implementing policies related to their disability evaluation systems, 
focusing on the agencies' joint pilot of the IDES. Specifically, we 
examined: (1) the results of DOD and VA's evaluation of the pilot, (2) 
challenges in implementing the piloted system to date, and (3) DOD and 
VA plans to expand the piloted system and whether those plans 
adequately address potential challenges. 

To examine DOD and VA's evaluation of the IDES pilot, we identified 
the goals that DOD and VA expected the pilot to achieve and reviewed 
their assessment of whether those goals were met. As part of this 
work, we assessed the reliability of two types of data that DOD and VA 
planned to use as the basis of their pilot evaluation--case data from 
both the pilot and legacy disability evaluation systems and data from 
surveys DOD conducted to gauge servicemember satisfaction. We 
obtained the case data and survey data as of early 2010, the same 
cutoff dates that DOD and VA used for their pilot evaluation.[Footnote 
4] To identify challenges in implementing the piloted system to date, 
we visited 10 of the 27 military treatment facilities participating in 
the pilot.[Footnote 5] We selected these 10 facilities to obtain 
perspectives from sites in different military services and 
geographical regions and with varying caseloads and organizational 
structures. For all of the research objectives, we conducted 
interviews with key officials involved in the pilot at DOD, VA, and 
each of the military services. Furthermore, we analyzed pilot case 
data and reviewed reports, guidance, plans, and other documents. We 
also reviewed relevant federal laws and regulations. We conducted this 
performance audit from November 2009 to December 2010, in accordance 
with generally accepted government auditing standards. Those standards 
require that we plan and perform the audit to obtain sufficient, 
appropriate evidence to provide a reasonable basis for our findings 
and conclusions based on our audit objectives. We believe that the 
evidence obtained provides a reasonable basis for our findings and 
conclusions based on our audit objectives. 

The NDAA 2008 also requires us to certify whether we had timely access 
to sufficient information to make informed judgments on the matters 
covered by our report.[Footnote 6] We were provided sufficient 
information in a timely manner to make informed judgments on the audit 
objectives covered in this report. 

Background: 

The DOD Legacy Disability Evaluation System: 

The military's legacy disability evaluation process begins at a 
military treatment facility when a physician identifies a condition 
that may interfere with a servicemember's ability to perform his or 
her duties.[Footnote 7] On the basis of medical examinations and the 
servicemember's medical records, a medical evaluation board (MEB) 
identifies and documents any conditions that may limit a 
servicemember's ability to serve in the military. 

The servicemember's case is then evaluated by a physical evaluation 
board (PEB) to make a determination of fitness or unfitness for duty. 
Each of the services conducts this process for its servicemembers. The 
Army has three PEBs, which are located at Fort Sam Houston, Texas; 
Walter Reed Army Medical Center in Washington, D.C.; and Fort Lewis, 
Washington. The Navy and Air Force each have one PEB: the Navy's is 
located at the Washington Navy Yard in Washington, D.C., and the Air 
Force's is located in San Antonio, Texas. The PEB process begins with 
an "informal" PEB--an administrative review of the case file by PEB 
adjudicators without the presence of the servicemember. If the 
servicemember is found to be unfit due to medical conditions incurred 
in the line of duty, the informal PEB assigns the servicemember a 
combined percentage rating for those unfit conditions, and the 
servicemember is discharged from duty. Disability ratings range from 0 
(least severe) to 100 percent (most severe) in increments of 10 
percent. Depending on the overall disability rating and number of 
years of active duty or equivalent service, the servicemember found 
unfit with compensable conditions is entitled to either monthly 
disability retirement benefits or lump sum disability severance pay. 
[Footnote 8] 

Servicemembers have opportunities to appeal the results of their 
disability evaluations. If servicemembers are dissatisfied with the 
informal PEB's decisions, they may request a hearing with a "formal" 
PEB. If they then disagree with the formal PEB's findings, they can, 
under certain conditions, appeal to the reviewing authority of the 
PEB.[Footnote 9] 

As servicemembers navigate DOD's disability evaluation system, they 
interact with staff who play key roles in supporting them through the 
process. Military physicians involved in the MEB process play a 
fundamental role because they are responsible for documenting in the 
disability evaluation case file the medical conditions that may limit 
a servicemember's ability to serve in the military. To prepare this 
documentation, military physicians may require that servicemembers 
obtain additional medical evidence from specialty physicians, such as 
a psychiatrist. Throughout the MEB and PEB processes, board liaisons 
serve a key role by explaining the process to servicemembers and 
constructing the case files. The liaisons inform servicemembers of 
board results and of deadlines at key decision points in the process. 
The military also provides legal counsel to advise and represent 
servicemembers going through the disability evaluation process, 
although servicemembers may retain their own representative at their 
own expense. 

The VA Disability Claims Process: 

In addition to receiving disability benefits from DOD, veterans with 
service-connected disabilities may receive compensation from VA for 
lost earnings capacity. In contrast to DOD's disability evaluation 
system, which evaluates only medical conditions affecting 
servicemembers' fitness for duty, VA evaluates all medical conditions 
claimed by the veteran, whether or not they were previously evaluated 
by the military services' medical evaluation process. Although a 
servicemember may file a VA claim while still in the military, he or 
she can only obtain disability compensation from VA as a veteran. 

VA's disability compensation claims process starts when a veteran 
submits a claim to VA's Veterans Benefits Administration (VBA). The 
claim lists the medical conditions that the veteran believes are 
service-connected. For each claimed condition, VA must determine if 
credible evidence is available to support the veteran's contention of 
service connection. A service representative assists the veteran in 
gathering the relevant evidence to evaluate the claim, which may 
include the veteran's military service records and treatment records 
from VA medical facilities and private medical service providers. 
Also, if necessary for reaching a decision on a claim, VBA arranges 
for the veteran to receive a medical examination conducted by 
clinicians (including physicians, nurse practitioners, or physician 
assistants) certified to perform the exams under VA's Compensation and 
Pension program. Once a claim has all of the necessary evidence, a VA 
rating specialist evaluates the claim and determines whether the 
claimant is eligible for benefits. If so, the rating specialist 
assigns a percentage rating. If VA finds that a veteran has one or 
more service-connected disabilities with a combined rating of at least 
10 percent, the agency will pay monthly compensation. The veteran can 
claim additional benefits over time, for example, if a service-
connected disability worsens or surfaces later. 

The Integrated Disability Evaluation System: 

In November 2007, DOD and VA began piloting the IDES, a joint 
disability evaluation system to eliminate duplication in their 
separate systems and to expedite receipt of VA benefits for wounded, 
ill, and injured servicemembers. The IDES merges DOD and VA processes, 
so that servicemembers begin their VA disability claim while 
undergoing their DOD disability evaluation rather than afterward, 
making it possible for them to receive VA disability benefits shortly 
after leaving military service. Specifically, the IDES: 

* merges DOD and VA's separate exam processes into a single exam 
process conducted to VA standards. This single exam--which may involve 
more than one medical examination (for example, by different 
specialists)--in conjunction with the servicemembers' medical records, 
is used by military service PEBs to make a determination of 
servicemembers' fitness for continued military service, and by VA as 
evidence of service-connected disabilities. The single exam may be 
performed by medical staff working for VA, DOD, or a private 
provider contracted with either agency. 

* consolidates DOD and VA's separate rating phases into one VA rating 
phase. If the informal PEB has determined that a servicemember is 
unfit for duty, VA rating specialists prepare two ratings--one for the 
conditions that DOD determined made a servicemember unfit for duty, 
which DOD uses to provide military disability benefits, and the other 
for all service-connected disabilities, which VA uses to determine VA 
disability benefits. Ratings for the IDES are prepared by rating 
specialists at VA's Baltimore and Seattle regional offices. 

* provides VA case managers to perform outreach and nonclinical case 
management and explain VA results and processes to servicemembers. 

By consolidating DOD and VA's separate medical exams and ratings, the 
IDES eliminates several steps from the existing "legacy" systems (see 
figure 1). 

Figure 1: Overview of the Legacy and IDES Processes: 

[Refer to PDF for image: illustration] 

Legacy process: 

Actions performed by Department of Defense (DOD): 

1. Service member referred to disability system. 

2. Military medical providers conduct medical exam. 

3. Medical Evaluation Board (MEB) identifies conditions that may make 
member unfit for duty. 

4. Physical Evaluation Board (PEB) assesses servicemember’s fitness 
for duty. 

5. If found unfit, PEB rates the unfitting conditions to determine 
benefits. 

6. Service member discharged with DOD benefits if eligible. 

Actions performed by Veterans Affairs (VA): 

7. Veteran files claim for benefits with VA. 

8. VA providers examine veteran. 

9. VA rates all of veteran’s service-connected conditions. 

10. Veteran receives VA benefits if eligible. 

IDES process: 

Actions performed by DOD and VA: 

1. Service member referred to disability system. 

2. Medical providers conduct medical exam to VA standards[A]. 

3. Medical Evaluation Board (MEB) identifies conditions that may make 
member unfit for duty. 

4. Physical Evaluation Board (PEB) assesses service member’s fitness 
for duty. 

5. If found unfit, VA rates the conditions to determine both DOD and 
VA benefits. 

6. Service member receives both DOD and VA benefits shortly after 
discharge. 

Sources: GAO analysis of DOD and VA policies. 

Note: Under the legacy system, steps 1, 2, and 3 are not necessarily 
performed in this order. For example, a Navy official told us that 
under the legacy system, the servicemember is referred into the 
disability evaluation system when the MEB completes the documentation 
identifying the conditions that may make a member unfit for duty. With 
regard to step 7, servicemembers may file a claim with VA while still 
in the military, but they can obtain disability compensation from VA 
only as veterans. With regard to step 8, the exams may be conducted by 
VA clinicians or by private-sector physicians contracted with VA. 

[A] In the IDES process, the medical exam performed to VA standards 
can be conducted by VA, DOD, or private-sector providers contracted 
with either agency. 

[End of figure] 

In designing the IDES, DOD and VA established goals to provide VA 
benefits to active duty servicemembers within 295 days of being 
referred into the system, and to reserve component members within 305 
days.[Footnote 10] Within the overall 295- and 305-day goals, they 
also established timeliness goals for the specific steps of the IDES 
process (see figure 2). 

Figure 2: Timeliness Goals for the Steps of the IDES Process: 

[Refer to PDF for image: timeline] 

IDES goal (in days): 

Service member referred to the IDES: 

MEB phase: 

10 days: DOD board liaison meets with servicemember, compiles medical 
and personnel records (30 for Reserves); 

10 days: VA case manager meets member, files VA claim (30 for 
Reserves); 

45 days: VA, DOD, or contracted providers perform medical exam; 

35 days: MEB identifies potentially unfitting conditions. 

PEB phase: 

15 days: Informal PEB determines fitness for duty; 

15 days: VA completes ratings; 

30 days: Member may appeal fitness decision to formal PEB; 

15 days: Member may appeal rating decision to VA; 

30 days: Member may appeal formal PEB decision to military department; 

15 days: Administrative processing throughout PEB phase; 

End of PEB phase. 

45 days: Servicemember separates from military; 

30 days: VA issues benefits letter. 

Total: 295 days (305 for Reserves[A]). 

Sources: GAO analysis of DOD and VA policies and guidance. 

[A] DOD guidance allows 40 more days for reserve component members 
than for active duty members in completing the first two steps of the 
process to provide for employer notification, establish orders for 
active duty, and compile medical records. However, DOD and VA's 
goal for total IDES processing time is only 10 days longer for reserve 
component members than for active duty members. 

[End of figure] 

DOD and VA first piloted the IDES at 3 Washington, D.C., area military 
treatment facilities, beginning in November 2007 (see table 1). They 
added 18 military facilities to the pilot in fiscal year 2009 and 6 in 
fiscal year 2010. DOD and VA stated that expansion to additional sites 
was intended to assess the IDES system in a variety of geographic 
areas and to test the agencies' capacity to handle additional 
caseload. According to DOD, the 27 pilot sites represented almost half 
of the servicemembers in the military services' disability evaluation 
systems. 

Table 1: Military Treatment Facilities Piloting the IDES: 

Military service: Air Force (6); 
Initial pilot sites (3): Malcolm Grow Medical Center, Andrews Air 
Force Base (MD); 
Phase 1 expansion--fiscal year 2009 (18): 
Elmendorf Air Force Base (AK); 
MacDill Air Force Base (FL); 
Nellis Air Force Base (NV); 
Travis Air Force Base (CA); 
Vance Air Force Base (OK); 
Phase 2 expansion--fiscal year 2010 (6): [Empty]. 

Military service: Army (15); 
Initial pilot sites (3): Walter Reed Army Medical Center (Washington, 
D.C.); 
Phase 1 expansion--fiscal year 2009 (18): 
Fort Belvoir (VA); 
Fort Carson (CO); 
Fort Drum (NY); 
Fort Meade (MD); 
Fort Polk (LA); 
Fort Richardson (AK); 
Fort Sam Houston (TX); 
Fort Stewart (GA); 
Fort Wainwright (AK); 
Phase 2 expansion--fiscal year 2010 (6): 
Fort Benning (GA); 
Fort Bragg (NC); 
Fort Hood (TX); 
Fort Lewis (WA); 
Fort Riley (KS). 

Military service: Navy[A] (6); 
Initial pilot sites (3): National Naval Medical Center (MD); 
Phase 1 expansion--fiscal year 2009 (18): 
Camp Lejeune (NC); 
Camp Pendleton (CA); 
Naval Hospital Bremerton (WA); 
Naval Medical Center San Diego (CA); 
Phase 2 expansion--fiscal year 2010 (6): Naval Medical Center 
Portsmouth (VA). 

Total number of pilot sites: 27. 

Source: DOD. 

Note: Numbers in parentheses indicate numbers of IDES sites. 

[A] Navy IDES pilot sites serve both Navy and Marine Corps 
servicemembers, since the Marine Corps is within the Department of the 
Navy. 

[End of table] 

Pilot Evaluation Results Are Promising, but the Degree of Improvement 
Achieved Is Unknown: 

DOD and VA's Evaluation Shows That the Pilot Is Achieving Some of Its 
Goals: 

In their planning documents for the IDES pilot, DOD and VA stated that 
they were basing their evaluation of the effectiveness of the IDES 
pilot on whether it has achieved three key goals relative to the 
legacy process: increased servicemember satisfaction, improved case 
processing time, and a reduction in servicemember appeal rates. They 
also examined IDES program costs. To determine whether 
they have achieved their goals, the agencies surveyed servicemembers 
in the IDES pilot and legacy systems and are using a data system--
called the Veterans Tracking Application (VTA)--that enables them to 
track case processing time and appeals. They have been monitoring 
their progress on these goals through weekly reports. 

In August 2010, DOD and VA officials issued an interim report to 
Congress summarizing their evaluation results to date. In this report, 
the agencies concluded that servicemembers who went through the IDES 
pilot were more satisfied than those who went through the legacy 
system, and that the IDES process met the agencies' goals of 
delivering VA benefits to active duty servicemembers within 295 days 
and to reserve component servicemembers within 305 days. Specifically, 
they reported that, as of February 2010, the IDES process took an 
average of 274 days to complete for active duty servicemembers and 281 
days for reserve component members who, according to the interim 
report, comprise 15 percent of IDES participants. Furthermore, they 
concluded that the IDES pilot has achieved a faster processing time 
than the legacy system, which they estimated to be 540 days.[Footnote 
11] 

While overall results were promising, data presented in the report had 
some limitations, and the report itself did not include certain 
analyses. For example, DOD officials told us that the 540-day estimate 
for the legacy process was based upon a review of a small and 
nonrepresentative sample of legacy cases during the agencies' "table 
top" planning exercise in August 2007.[Footnote 12] In addition, 
although DOD officials told us that they planned to compare average 
processing times of pilot cases with a broader sample of legacy cases, 
and to determine whether fewer servicemembers are appealing the 
findings of informal PEBs and formal PEBs in the pilot compared with 
the legacy system, the interim report did not include these 
comparisons. Furthermore, in their planning documents for the IDES 
pilot, DOD and VA indicated that they were establishing a goal to 
deliver VA benefits to 80 percent of members in the IDES pilot within 
the 295- and 305-day 
time frames. However, their interim report did not discuss whether 
this goal was met. 

Our review of DOD and VA's data and weekly reports generally confirms 
the agencies' findings as of early 2010. However, while the agencies 
have largely met their goal to increase servicemember satisfaction 
and, as of February 2010, met their timeliness goal, case processing 
times have since increased steadily as the caseload has grown. In 
addition, not all of the service branches are achieving the same 
results. 

* Servicemember satisfaction: Our review of the survey data that DOD 
used for the interim report (as of February 2010), as well as a recent 
weekly report, indicates that, on average, servicemembers in the IDES 
process have had higher satisfaction levels than those who went 
through the legacy process. In addition, a higher percentage of 
servicemembers who went through the IDES process felt that the process 
was fair compared with those who went through the legacy system. 
However, servicemembers in the Air Force who went through the IDES 
pilot indicated less satisfaction with the process than those who went 
through the legacy system, though Air Force members represented a 
small proportion of pilot cases--about 7 percent of those enrolled in 
the pilot.[Footnote 13] We reviewed the agencies' survey methodology 
and generally found their survey design and conclusions to be sound 
(see appendix I for further information on our review). 

* Average case processing times: The agencies have been meeting their 
295-day and 305-day timeliness goals for much of the past 2 years, but 
more recent weekly reports indicate that case processing time has been 
increasing and that the agencies are now missing their goal for active 
duty members.[Footnote 14] As of August 29, 2010, the agencies missed 
the goal for active duty servicemembers by 1 day, while still meeting 
the 305-day goal for reserve component members by 7 days. Processing 
times have increased along with caseload, which grew from about 5,750 
active cases in February 2010 to about 9,650 in August 2010. We 
reviewed the 
reliability of the VTA data upon which the agencies based their 
analyses and generally found these data to be sufficiently reliable 
for purposes of these analyses.[Footnote 15] 

The increases in overall case processing time and caseloads mirror the 
trends at individual sites. For each pilot site, case processing times 
have generally increased as workloads have increased. For example, 
figure 3 shows the case processing times 1 year or more after 
implementation and in August 2010 for the first seven pilot sites. 

Figure 3: Average Case Processing Times and Changes in Active Caseload 
by Location at Least 1 Year After Implementation and in August 2010: 

[Refer to PDF for image: horizontal bar graph] 

Processing time: 

Location: Andrews (Air Force); 
About 1 year after implementation[A]: 292 days; 
August 29, 2010[B]: 355 days; 
Change in active caseload: Increased by 23 cases. 

Location: Walter Reed (Army); 
About 1 year after implementation[A]: 250 days; 
August 29, 2010[B]: 324 days; 
Change in active caseload: Decreased by 13 cases. 

Location: Fort Belvoir (Army); 
About 1 year after implementation[A]: 238 days; 
August 29, 2010[B]: 298 days; 
Change in active caseload: Increased by 44 cases. 

Location: Fort Meade (Army); 
About 1 year after implementation[A]: 257 days; 
August 29, 2010[B]: 283 days; 
Change in active caseload: Increased by 44 cases. 

Location: Fort Stewart (Army); 
About 1 year after implementation[A]: 203 days; 
August 29, 2010[B]: 258 days; 
Change in active caseload: Increased by 144 cases. 

Location: Bethesda (Navy); 
About 1 year after implementation[A]: 286 days; 
August 29, 2010[B]: 380 days; 
Change in active caseload: Increased by 135 cases. 

Location: San Diego (Navy); 
About 1 year after implementation[A]: 236 days; 
August 29, 2010[B]: 316 days; 
Change in active caseload: Increased by 413 cases. 

Sources: GAO presentation of weekly report data from DOD and VA. 

[A] For the oldest pilot locations--Walter Reed Army Medical Center, 
Andrews Air Force Base, and Bethesda Naval Medical Center--the first 
average case processing times shown were as of May 31, 2009, which is 
more than 1 year after these sites began implementing the pilot in 
November 2007, because this was the first month that the agencies 
reported processing times by location. The first processing date for 
all other sites comes from the weekly report closest to 1 year after 
each site began implementing the pilot. (The implementation dates 
were: October 1, 2008, for Fort Belvoir and Fort Meade; October 31, 
2008, for Naval Medical Center San Diego; and November 30, 2008, for 
Fort Stewart). 

[B] The end date for the changes in active caseload is August 22, 
2010, and differs by 1 week from the end date of August 29, 2010, used 
for the average case processing time because the data used in these 
two analyses are presented in different appendices to the weekly 
reports that rotate each week. 

[End of figure] 

Of the four military services, only the Army and Navy were achieving 
the 295- and 305-day goals on average, as of February 2010, and only 
the Army was achieving these goals as of August 2010. Because the Army 
accounts for a large proportion of cases (approximately 60 percent of 
IDES pilot cases that have completed the whole process), it has 
lowered the overall average processing time to near or below the 
established goals. Figure 4 shows the average case processing times 
for active duty servicemembers, by service, as of August 2010. (See 
appendix II for reserve component members.) 

Figure 4: Active Duty IDES Case Processing Times by Service, as of 
August 29, 2010: 

[Refer to PDF for image: horizontal bar graph] 

Service goal: 295 days. 

Service: Air Force; 
Elapsed time in days: 339. 

Service: Army; 
Elapsed time in days: 266. 

Service: Navy; 
Elapsed time in days: 340. 

Service: Marine Corps; 
Elapsed time in days: 334. 

Service: All; 
Elapsed time in days: 296. 

Sources: GAO presentation of weekly report data from DOD and VA. 

[End of figure] 

As of February 2010, the agencies also had not met the goal of 
processing 80 percent of all pilot cases within targeted time frames. 
Specifically, about 60 percent of active duty pilot cases had been 
completed within 295 days, according to our analysis of the agencies' 
case data intended for their interim report. Further, none of the four 
military services had achieved this goal: the Army had the highest 
rate of cases meeting the goal (66 percent), while the Air Force had 
the lowest, with only 42 percent of cases processed within the time 
frame (see figure 5 for active duty and appendix II for reserve 
component). 

Figure 5: Percentage of IDES Active Duty Cases Completed in 295 Days 
or Less by Service, as of February 2010: 

[Refer to PDF for image: horizontal bar graph] 

Service goal: 80%. 

Service: Air Force; 
Cases meeting goal: 42%. 

Service: Army; 
Cases meeting goal: 66%. 

Service: Navy; 
Cases meeting goal: 57%. 

Service: Marine Corps; 
Cases meeting goal: 53%. 

Service: All; 
Cases meeting goal: 60%. 

Sources: GAO analysis of pilot case data from DOD and VA. 

[End of figure] 

Extent of Improvement Over the Legacy System Is Unknown Due to Gaps in 
Legacy Data: 

DOD and VA planned to compare the case processing times of 
servicemembers in the IDES pilot and servicemembers who, between 
fiscal years 2005 and 2009, were enrolled in the legacy system at 
pilot sites prior to pilot implementation, but significant gaps in the 
legacy case data preclude reliable comparisons. DOD compiled the 
legacy case data from each of the military services and the VA, but 
the military services each had slightly different disability 
evaluation processes, used different data systems, and did not track 
the same information. As a result, information needed to conduct a 
comparison is not available for all services. For example, the Navy, 
Marine Corps, and Air Force legacy data do not have information on 
when the servicemember was referred into the disability evaluation 
system and, as a result, case-processing time for the legacy system 
DOD-wide cannot be known.[Footnote 16] DOD officials said they planned 
to estimate legacy case processing time by approximating the dates 
that servicemembers in the Navy, Marine Corps, and Air Force were 
referred into the disability evaluation process, but their methodology 
was based on a limited number of Army cases (see appendix I for 
further information). In addition, for legacy cases across all 
military services, VA was not able to provide data on the date VA 
benefits were delivered, so total case processing time from referral 
to delivery of VA benefits cannot be measured. However, while legacy 
case data are not sufficiently reliable for comparison with the IDES 
overall, the Army's legacy data appear to be reliable on some key 
processing dates, making some limited comparisons possible. Our 
analysis of Army legacy data suggests that, under the legacy process, 
active duty Army cases took 369 days to complete the DOD legacy 
process and reach the VA rating phase--though this figure does not 
include time to complete the VA rating and provide the benefits to 
servicemembers--compared with 266 days to deliver VA benefits to 
servicemembers under the pilot, according to the agencies' August 
weekly report.[Footnote 17] However, Army comparisons cannot be 
generalized to the other services. 

The agencies also planned to compare servicemembers' appeal rates in 
the pilot and legacy systems, but similar gaps in the legacy data 
preclude a comparison DOD-wide. For example, the legacy data that DOD 
compiled did not contain data on appeals of informal PEB decisions to 
the formal PEB in the Navy and Marines, and consequently the rate of 
appeals across the military in the legacy system is unknown. While the 
Army's appeals data appear to be more reliable, potentially making 
some limited comparisons possible, the agencies' method for comparing 
pilot appeals with legacy appeals has limitations. DOD officials told 
us they are planning to compare the proportion of informal PEB 
decisions that were appealed to a formal PEB hearing in the pilot and 
legacy systems. However, this comparison will not take into account 
that, under the legacy system, servicemembers could appeal the 
informal PEB's decision for either of two reasons--dissatisfaction 
with the fitness decision or with the disability rating the PEB 
assigned--whereas in the IDES, servicemembers can appeal the informal 
PEB decision to a formal PEB only if they are dissatisfied with the 
fitness decision. Under the IDES, servicemembers 
who disagree with the disability rating can appeal to VA for a rating 
reconsideration. By not including appeals to VA for rating 
reconsiderations, the agencies may overestimate the decrease in 
appeals in the IDES pilot. For example, our analysis of data as of 
early 2010 for the Army indicates that Army members in the pilot 
appealed 7.5 percent of informal PEB decisions. However, when appeals 
to VA are factored in, 13 percent of Army members in the pilot filed 
an appeal, which is the same proportion as in the legacy system (see 
figure 6). 

Figure 6: Army Servicemember IDES and Legacy Appeal Rates, as of early 
2010: 

[Refer to PDF for image: horizontal bar graph] 

Legacy[A]: Cases with an appeal; 
Informal PEB appeals: 12.9%. 

IDES pilot[B]: Cases with an appeal; 
Informal PEB appeals: 7.5%; 
VA rating reconsiderations: 5.6%; 
Total: 13.1%. 

Sources: GAO analysis of legacy data and pilot case data provided by 
DOD and VA. 

[A] Legacy data are as of January 2010 and cover servicemembers who were 
referred to the disability evaluation system between fiscal years 2005 
and 2009 at 21 military treatment facilities selected as IDES pilot 
sites. 

[B] Pilot case data are as of February 2010 for servicemembers who were 
referred to the IDES beginning November 2007. 

[End of figure] 

In addition to evaluating the three goals, DOD and VA initially 
planned a cost-benefit analysis of the IDES program but have only 
completed an analysis of costs. According to data provided to us in 
August 2010, DOD projects that costs directly associated with 
implementing the IDES will be $63 million greater per year when 
compared with the legacy system, after full expansion of the IDES. 
[Footnote 18] In October 2010, VA reported to us total IDES cost 
estimates of approximately $50 million for fiscal year 2011--about $33 
million for VBA, which provides VA case managers and rating staff to 
the IDES, and $17 million for the Veterans Health Administration 
(VHA), which provides medical staff to perform the single exams. 
[Footnote 19] These analyses did not quantify the value of potential 
benefits created by the pilot--for example, the time savings from DOD 
physicians no longer needing to perform disability examinations, which 
allows them to perform other duties. 

Pilot Sites Experienced Several Challenges: 

As DOD and VA tested the IDES at different facilities and added 
caseload to the pilot, they encountered several challenges that led to 
delays in certain phases of the process. Among these were insufficient 
staffing, challenges in conducting the single exams, and logistical 
challenges related to integrating VA staff and to housing and managing 
servicemembers going through the IDES. DOD and VA were able 
to address some, but not all, of these challenges as they arose. 

DOD and VA Did Not Sufficiently Staff Many Key Positions in the IDES 
Pilot: 

DOD and VA have not provided sufficient numbers of staff in many of 
the IDES locations, affecting their ability to complete certain phases 
of the IDES process within the goals they established. Officials at 
most of the 10 pilot sites we visited said they have experienced 
staffing shortages to at least some extent, with a few sites--Fort 
Carson and Fort Stewart, in particular--experiencing severe shortages. 

VA or contract examiners: At three pilot sites we visited--Fort 
Carson, Fort Polk, and Fort Stewart--local officials said that a lack 
of VA or VA contractor staff who could perform the required single 
medical exams led to bottlenecks in the process.[Footnote 20] For 
example, as of August 2010, exams at Fort Carson have taken an average 
of 140 days to complete for active duty servicemembers, according to 
the agencies' data--far short of the agencies' goal of completing 
single medical exams within 45 days (see figure 7; see also appendix 
II for 
processing times for reserve component members). Across all pilot 
sites, exams have taken 68 days to complete for active duty 
servicemembers, on average, with 8 of the 27 pilot sites meeting the 
45-day goal.[Footnote 21] The different sites we visited faced 
shortages of different types of examiners. For instance, Fort 
Carson's IDES process was particularly hampered by a lack of mental 
health specialists; in contrast, VA officials serving the Fort Polk 
pilot site said they had sufficient specialists to perform specialty 
medical exams but did not have enough examiners to complete general 
medical exams. 

Figure 7: Single Exam Processing Time for Active Duty Servicemembers 
at IDES Pilot Sites, as of August 29, 2010: 

[Refer to PDF for image: vertical bar graph] 

Service goal: 45 days; 
Average for all sites: 68 days. 

Air Force: 

Site: Elmendorf; 
Elapsed time in days: 74. 

Site: Andrews; 
Elapsed time in days: 58. 

Site: Nellis; 
Elapsed time in days: 57. 

Site: Vance; 
Elapsed time in days: 52. 

Site: MacDill; 
Elapsed time in days: 40. 

Site: Travis; 
Elapsed time in days: 33. 

Army: 

Site: Fort Carson; 
Elapsed time in days: 140. 

Site: Fort Stewart; 
Elapsed time in days: 94. 

Site: Fort Richardson; 
Elapsed time in days: 81. 

Site: Fort Wainwright; 
Elapsed time in days: 80. 

Site: Walter Reed; 
Elapsed time in days: 73. 

Site: Fort Sam Houston; 
Elapsed time in days: 70. 

Site: Fort Belvoir; 
Elapsed time in days: 65. 

Site: Fort Polk; 
Elapsed time in days: 62. 

Site: Fort Drum; 
Elapsed time in days: 57. 

Site: Fort Lewis; 
Elapsed time in days: 56. 

Site: Fort Benning; 
Elapsed time in days: 44. 

Site: Fort Hood; 
Elapsed time in days: 39. 

Site: Fort Meade; 
Elapsed time in days: 38. 

Site: Fort Riley; 
Elapsed time in days: 38. 

Site: Fort Bragg; 
Elapsed time in days: 30. 

Navy[A]: 

Site: Bethesda; 
Elapsed time in days: 57. 

Site: Camp Lejeune; 
Elapsed time in days: 49. 

Site: Portsmouth; 
Elapsed time in days: 48. 

Site: San Diego; 
Elapsed time in days: 46. 

Site: Camp Pendleton; 
Elapsed time in days: 45. 

Site: Bremerton; 
Elapsed time in days: 42. 

Marine Corps: 

Site: Camp Lejeune; 
Elapsed time in days: 62. 

Site: Camp Pendleton; 
Elapsed time in days: 59. 

Site: Bethesda; 
Elapsed time in days: 58. 

Site: Portsmouth; 
Elapsed time in days: 50. 

Site: San Diego; 
Elapsed time in days: 47. 

Site: Bremerton; 
Elapsed time in days: 44. 

Sources: GAO presentation of weekly report data from DOD and VA. 

[A] This figure shows processing times separately for servicemembers 
in the Navy and Marine Corps at the six Navy IDES pilot sites. 

[End of figure] 

Military physicians: At some of the pilot sites we visited, local DOD 
officials felt there were not enough physicians to quickly complete 
and document determinations of whether servicemembers' medical 
conditions may limit their ability to serve in the military. As a 
result, the sites had difficulty achieving the agencies' goal to 
complete the MEB determinations within 35 days. Across all sites, the 
MEB determination has taken an average of 61 days to complete for 
active duty servicemembers, with 8 of the 27 sites meeting the 35-day 
goal, as of August 2010.[Footnote 22] A few sites we visited were far 
from achieving the 35-day goal, such as Fort Belvoir, where MEB 
determinations averaged 101 days to complete for active duty 
servicemembers (see figure 8 and appendix II for processing times for 
reserve component members). Only the Army, which has physicians 
dedicated to disability evaluation, has established a caseload target 
for MEB physicians--120 servicemembers per physician, but Army 
officials were not able to provide us with data on the extent to which 
pilot sites met this target. The Navy and Air Force have not 
established caseload targets for their physicians; their MEB 
determinations are prepared by physicians who perform other 
responsibilities, such as clinical treatment or supervision. 

Figure 8: MEB Processing Times for Active Duty Servicemembers at IDES 
Pilot Sites, as of August 29, 2010: 

[Refer to PDF for image: vertical bar graph] 

Service goal: 35 days; 
Average for all sites: 61 days. 

Air Force: 

Site: MacDill; 
Elapsed time in days: 106. 

Site: Andrews; 
Elapsed time in days: 85. 

Site: Nellis; 
Elapsed time in days: 58. 

Site: Travis; 
Elapsed time in days: 43. 

Site: Vance; 
Elapsed time in days: 33. 

Site: Elmendorf; 
Elapsed time in days: 30. 

Army: 

Site: Fort Richardson; 
Elapsed time in days: 109. 

Site: Fort Meade; 
Elapsed time in days: 100. 

Site: Fort Belvoir; 
Elapsed time in days: 101. 

Site: Walter Reed; 
Elapsed time in days: 76. 

Site: Fort Stewart; 
Elapsed time in days: 74. 

Site: Fort Carson; 
Elapsed time in days: 69. 

Site: Fort Sam Houston; 
Elapsed time in days: 65. 

Site: Fort Lewis; 
Elapsed time in days: 56. 

Site: Fort Hood; 
Elapsed time in days: 52. 

Site: Fort Polk; 
Elapsed time in days: 47. 

Site: Fort Drum; 
Elapsed time in days: 39. 

Site: Fort Riley; 
Elapsed time in days: 37. 

Site: Fort Benning; 
Elapsed time in days: 35. 

Site: Fort Wainwright; 
Elapsed time in days: 34. 

Site: Fort Bragg; 
Elapsed time in days: 33. 

Navy[A]: 

Site: Camp Lejeune; 
Elapsed time in days: 95. 

Site: Camp Pendleton; 
Elapsed time in days: 73. 

Site: Bethesda; 
Elapsed time in days: 72. 

Site: San Diego; 
Elapsed time in days: 31. 

Site: Portsmouth; 
Elapsed time in days: 24. 

Site: Bremerton; 
Elapsed time in days: 20. 

Marine Corps: 

Site: Camp Lejeune; 
Elapsed time in days: 89. 

Site: Camp Pendleton; 
Elapsed time in days: 66. 

Site: Bethesda; 
Elapsed time in days: 62. 

Site: San Diego; 
Elapsed time in days: 33. 

Site: Portsmouth; 
Elapsed time in days: 23. 

Site: Bremerton; 
Elapsed time in days: 14. 

Sources: GAO presentation of weekly report data from DOD and VA. 

[A] This figure shows processing times separately for servicemembers 
in the Navy and Marine Corps at the six Navy IDES pilot sites. 

[End of figure] 

DOD PEB adjudicators: Officials with the Air Force and Navy PEBs, who 
determine a servicemember's fitness for duty, also expressed concerns 
about understaffing, though their concerns are not related to the IDES 
alone since they are currently reviewing cases in both the legacy 
system and the IDES pilot. Air Force PEB officials noted that they had 
a substantial backlog of disability evaluation system cases awaiting a 
fitness decision, though they recently added adjudicators to reduce 
the backlog. Navy PEB officials also expressed concerns that lack of 
sufficient staff has made it difficult to process cases in a timely 
manner. At the time of our review, none of the services were meeting 
the agencies' goal for informal PEBs to complete their fitness 
decisions within 15 days, and the Air Force, Navy, and Marine Corps 
were far from reaching it. (See figure 9 for processing times for 
active duty servicemembers and appendix II for reserve component 
processing times.) At 23 days, the Army, which has three PEBs, fell 
slightly short of meeting the goal. However, we could not determine 
case processing times at each Army PEB because the agencies' weekly 
monitoring report presents data by military service but not by 
individual PEB. In addition, Air Force and Army PEB officials informed 
us that they had at times prioritized IDES pilot cases over legacy 
cases. 
As a result, DOD's data for those services may underestimate the 
amount of time the informal PEB would have taken if IDES cases had not 
received priority. 

Figure 9: Informal PEB Processing Times for Active Duty Servicemembers 
in the IDES Pilot, by Military Service, as of August 29, 2010: 

[Refer to PDF for image: horizontal bar graph] 

Service goal: 15 days. 

Service: Air Force; 
Elapsed time in days: 64. 

Service: Army; 
Elapsed time in days: 23. 

Service: Navy; 
Elapsed time in days: 83. 

Service: Marine Corps[A]; 
Elapsed time in days: 97. 

Sources: GAO presentation of weekly report data from DOD and VA. 

[A] The Navy PEB determines fitness decisions for servicemembers in 
the Marine Corps. 

[End of figure] 

VA rating staff: Officials at the Baltimore rating office--one of the 
two VA offices that conduct disability ratings for the IDES pilot-- 
expressed significant concerns that they were understaffed, in part 
due to staff turnover. DOD and VA data show that, overall, the VA 
rating offices are not meeting the agencies' goal to complete ratings 
within 15 days, taking 39 days on average for active duty 
servicemembers and 42 days for reserve component members.[Footnote 23] 
We could not determine case processing times at each individual VA 
rating office, since DOD and VA's weekly monitoring reports do not 
provide processing times for the rating phase by office. The weekly 
reports also do not provide data on caseloads at each office. Although 
the Baltimore office currently has fewer rating staff than Seattle, VA 
officials said that it has prepared ratings for the majority of IDES 
pilot cases, based on the way in which VA has allocated cases between 
the two offices. The Baltimore office handles cases for the Air Force, 
Navy, Marines, and 5 of the 15 Army pilot sites, while the Seattle 
office conducts ratings for the remaining 10 Army pilot sites. 
[Footnote 24] VA officials said that to address staffing shortages in 
Baltimore, they have assigned staff from other VA offices to assist 
the Baltimore office. 

VA case managers: DOD and VA have set a target for each VA case 
manager to handle no more than 30 cases at a time, but two sites we 
visited--Fort Carson and Fort Stewart--appeared to be far from these 
targets. At Fort Carson, three VA case managers told us they were 
handling about 900 cases when we visited in April 2010, for a caseload 
ratio of roughly 1:300. At the time of our visit in June 2010, Fort 
Stewart had over 750 active cases with two VA case managers, for a 
caseload ratio of approximately 1:375. Although local officials we 
spoke with at both sites told us that the numbers of VA case managers 
were insufficient, an official at VA's central office told us that VA 
bases staffing of case managers on the number of new (not pending) 
cases each month, and the agencies' data indicate that the average 
number of new cases per VA case manager has been about 25 at each 
site. The VA official said that local case managers likely felt 
understaffed because of other process inefficiencies. In addition, the 
official told us VA can reassign staff from other VA programs to 
assist case managers at IDES pilot sites as needed. At some of the 
other pilot sites we visited, local officials also told us they had 
concerns at times about the numbers of VA case managers available to 
handle the site's caseload, but VA was able to add staff. VA case 
managers at two Air Force sites we visited--Travis and Vance Air Force 
Bases--indicated that their caseloads were manageable. We were unable 
to independently determine the extent to which VA is meeting its 
caseload target because VA does not collect national data on actual 
caseloads per case manager. 

DOD board liaisons: At most of the sites we visited, local officials 
expressed concerns about insufficient numbers of DOD board liaisons, 
who serve as servicemembers' DOD case managers. DOD guidance has been 
inconsistent on the caseload target for DOD board liaisons. While 
DOD's operations manual for the IDES pilot sets a caseload target of 
at most 30 cases per board liaison, guidance on the general disability 
evaluation system sets the target at a maximum of 20 cases per 
liaison. DOD and VA's documents related to planning for IDES expansion 
indicate that DOD is striving for a 1:20 caseload target in the IDES. 
However, 19 of the 27 pilot sites did not meet the 1:30 caseload 
target, and 23 did not meet the 1:20 target (see figure 10). 

Figure 10: Average Cases per DOD Board Liaison at IDES Pilot Sites: 

[Refer to PDF for image: vertical bar graph] 

Goals: 
1:30 caseload; 
1:20 caseload. 

Air Force: 

Site: Travis; 
Cases per DOD board liaison: 49. 

Site: Nellis; 
Cases per DOD board liaison: 30. 

Site: Elmendorf; 
Cases per DOD board liaison: 26. 

Site: MacDill; 
Cases per DOD board liaison: 22. 

Site: Andrews; 
Cases per DOD board liaison: 14. 

Site: Vance; 
Cases per DOD board liaison: 5. 

Army: 

Site: Fort Lewis; 
Cases per DOD board liaison: 152. 

Site: Fort Bragg; 
Cases per DOD board liaison: 84. 

Site: Fort Meade; 
Cases per DOD board liaison: 76. 

Site: Fort Carson; 
Cases per DOD board liaison: 70. 

Site: Fort Polk; 
Cases per DOD board liaison: 56. 

Site: Fort Sam Houston; 
Cases per DOD board liaison: 56. 

Site: Fort Richardson; 
Cases per DOD board liaison: 46. 

Site: Fort Stewart; 
Cases per DOD board liaison: 45. 

Site: Fort Hood; 
Cases per DOD board liaison: 44. 

Site: Fort Belvoir; 
Cases per DOD board liaison: 37. 

Site: Fort Wainwright; 
Cases per DOD board liaison: 35. 

Site: Fort Drum; 
Cases per DOD board liaison: 35. 

Site: Fort Riley; 
Cases per DOD board liaison: 35. 

Site: Walter Reed; 
Cases per DOD board liaison: 27. 

Site: Fort Benning; 
Cases per DOD board liaison: 10. 

Navy[A]: 

Site: Camp Lejeune; 
Cases per DOD board liaison: 86. 

Site: Camp Pendleton; 
Cases per DOD board liaison: 60. 

Site: Portsmouth; 
Cases per DOD board liaison: 41. 

Site: San Diego; 
Cases per DOD board liaison: 38. 

Site: Bremerton; 
Cases per DOD board liaison: 35. 

Site: Bethesda; 
Cases per DOD board liaison: 12. 

Sources: GAO presentation of data from the Departments of the Air 
Force, second quarter, fiscal year 2010; Army, May 2010; and Navy, 
October 2010. 

[A] These Navy military treatment facilities also serve members in the 
Marine Corps. 

[End of figure] 

Local DOD and VA officials attributed staffing shortages to higher 
than anticipated caseloads and difficulty finding qualified staff in 
rural areas. At several of the pilot sites we visited, officials said 
that caseloads were higher than the initial estimates on which they 
had based staffing levels. DOD officials said that they had based 
caseload estimates on a 1-year history of caseload at each site. While 
some sites have added staff as caseloads increased, others, such as 
Fort Polk, located in central Louisiana, have had difficulty finding 
qualified staff, particularly physicians, in this rural area.[Footnote 
25] 

Two of the pilot sites we visited--Fort Carson and Fort Stewart--were 
particularly challenged to provide staff in response to surges in 
caseload, which occurred when Army units were preparing to deploy to 
combat zones. Through the Army's predeployment medical assessment 
process, large numbers of servicemembers were determined to be unable 
to deploy due to a medical condition and were referred to the IDES 
within a short period of time, overwhelming the staff.[Footnote 26] 
These two sites were unable to quickly increase staffing levels, 
particularly clinicians performing the single exam. The VA medical 
center conducting the single medical exams for Fort Carson experienced 
turnover among its examiners at the same time that the caseload 
surged, while at Fort Stewart, the contractor performing the single 
medical exams had difficulties finding qualified physicians in a rural 
area of Georgia. To address caseload surges, examiners were reassigned 
from other locations to the pilot sites. For example, VA officials 
told us they assigned examiners from other VA medical centers to the 
Fort Carson IDES and established a contract with a private-sector 
provider to complete the exams that VA examiners would normally have 
performed for veterans in the area claiming VA disability 
compensation. At Fort Stewart, the contractor told us that they had 
reassigned examiners from their Atlanta clinic to Fort Stewart. 

Insufficiency of Exam Summaries and Disagreements about Medical 
Diagnoses and Ratings Can Prolong Case Processing Time: 

Issues related to the completeness and clarity of single exam 
summaries were an additional cause of delays in the VA rating phase of 
the IDES process. Officials from VA rating offices said that some exam 
summaries did not contain information necessary to make a rating or 
fitness decision, or were unclear as to the examiners' diagnoses and 
conclusions. As a result, VA rating office staff must ask the examiner 
to clarify the summary or add information and, in some cases, redo the 
exam, adding time to the process. In addition, VA rating staff told us 
that it is sometimes unclear whom they should contact if they identify 
insufficiencies in an exam summary, and that finding the appropriate 
person also adds time. However, the extent to which insufficient exam 
summaries caused delays in the IDES process is unknown because DOD and 
VA's VTA system does not track whether an exam summary had to be 
returned to the examiner or whether the insufficiency was resolved. 
Due to these 
limitations, VA officials told us that VA rating staff have created 
logs of outstanding insufficient exams and sent them to VA examiners 
to correct. 

VA officials attributed the problems with exam summaries to several 
factors, including the difficulty of conducting exams for IDES pilot 
cases, which may entail evaluating many complex medical conditions and 
may involve several physicians and specialists. In addition, VA 
officials indicated that, at sites with exam backlogs, such as at Fort 
Carson, it may be difficult for examiners to ensure quality when they 
are trying to complete exams quickly. Furthermore, VA staff noted that 
some errors were common, such as missing information for 
musculoskeletal conditions and traumatic brain injury, suggesting that 
some examiners may not be aware of the information required for 
certain types of medical conditions. Finally, while examiners are 
supposed to receive the servicemember's complete medical records prior 
to the date of the exam, some VA examiners also told us that they did 
not receive the records in time for the exam in some cases, or the 
records were not well-organized. As a result, they lacked key 
information, such as the servicemember's medical history and results 
of laboratory tests. According to the agencies' operations manual for 
the IDES pilot, the DOD board liaison should compile the complete 
medical records within 10 days of an active duty servicemember being 
referred to the IDES, but some DOD officials we spoke with said that 
it is sometimes difficult to obtain all of the records, particularly 
when servicemembers have received treatment from private-sector 
physicians.[Footnote 27] 

In addition, while the single exam in the IDES eliminates duplicative 
exams performed by DOD and VA in the legacy system, it raises the 
potential for disagreements about diagnoses of 
servicemembers' conditions, with implications for their disability 
ratings, as well as processing times. DOD officials we spoke with in 
our interviews and site visits also said that their physicians 
sometimes disagree with VA medical diagnoses, particularly for mental 
health conditions, and this has extended processing times for some 
cases. In addition, since medical diagnoses are a basis for VA's 
disability ratings, DOD may subsequently disagree with the ratings VA 
completed for determining DOD disability benefits. The number of cases 
with disagreements about diagnoses and ratings, and the extent to 
which they have increased processing time, are unknown because the VTA 
system does not track when a case has had such disagreements. However, 
officials at 4 of the 10 pilot sites we visited said that military 
physicians have disagreed with VA diagnoses in at least some cases. In 
addition, PEB officials in two of the three military services--the 
Army and the Navy--said that they have sometimes disagreed with the 
rating VA produced for determining DOD disability benefits. 

An example can illustrate the implications of differences in 
diagnoses. Officials at Army pilot sites informed us about cases in 
which a military physician had treated members for a mental condition, 
such as anxiety or depressive disorder. However, when the members went 
to see the VA examiners for their single exam, the examiners diagnosed 
them with posttraumatic stress disorder (PTSD). When such cases were 
sent to the PEB, it returned them to the MEB because it was unclear to 
the PEB which conditions should be the basis of its decision on the 
servicemembers' fitness for duty. The cases then languished because 
the military physicians experienced difficulties resolving the 
discrepancy with the VA diagnosis. 

To address such processing delays, the Army issued guidance in 
February 2010 stating that MEB physicians should review all of the 
medical records (including the results of the single exam) and 
determine whether to revise their diagnoses. If after doing so the MEB 
physician maintains that their original diagnosis is accurate, they 
should write a memorandum summarizing the basis of their decision, and 
the PEB should accept the MEB's diagnosis. Some Army officials we 
spoke with believe that this guidance has been helpful for enabling 
cases to move forward when there are differences in diagnoses. The 
other services do not have written guidance on how to address 
differences in diagnoses, though Navy officials told us that they have 
provided verbal guidance to their physicians, and Air Force officials 
said they have not had cases with significant disagreements about 
diagnoses. 

In some cases, due to the differences in diagnoses, DOD has also 
disagreed with the rating that VA prepared for DOD disability 
benefits, particularly in cases involving servicemembers with mental 
health conditions.[Footnote 28] For example, Army and Navy officials 
told us about cases in which the PEB found the servicemember unfit due 
to a mental condition, such as major depression, and asked VA to 
complete a rating for this condition. However, VA returned a rating 
for occupational and social impairment caused by PTSD, since the 
examiner had diagnosed the member with PTSD. DOD requires a rating for 
only the conditions for which the member was found unfit for duty 
because it can only provide disability benefits for those conditions. 
However, according to VA regulations for rating mental disorders, VA 
does not rate each mental health condition individually; rather, VA 
bases its rating on the degree to which the combination of symptoms of 
mental disorders causes occupational and social impairment.[Footnote 
29] As such, when rating mental health conditions for IDES cases, VA 
officials said that rating specialists would consider both the 
symptoms of mental conditions diagnosed by DOD physicians and those 
identified by the VA examiner. Both Army and Navy PEB officials said 
that they generally accept VA ratings in these cases, even though the 
rating is not for the unfitting conditions alone. However, they noted 
that, if they feel the VA rating is in error, there is no guidance on 
how disagreements about servicemembers' ratings should be resolved. 
Army and Navy officials said that they may return the case to VA and 
informally request that VA reconsider the case, though Navy PEB 
officials said that they are hesitant to do so because it may further 
delay the case. 

DOD and VA officials attributed disagreements about diagnoses to 
several factors. They noted that VA examiners may not have received or 
reviewed the servicemembers' medical records prior to the exam, and 
therefore may not be aware of the medical conditions for which the 
members had been previously diagnosed and treated. In addition, DOD 
and VA identify conditions for different purposes in the disability 
evaluation system. While DOD identifies conditions that make a 
servicemember unable to perform their duties, VA identifies all 
service-connected conditions. As such, VA examiners are likely to 
identify a broader set of conditions than DOD's physicians. In 
addition, local officials we spoke with in some of our site visits 
said that servicemembers may be more willing to disclose all of their 
medical conditions to VA than to DOD because VA could potentially 
compensate them for all of the conditions. Furthermore, VA officials 
noted that servicemembers' health conditions may have changed between 
the time DOD physicians identified the conditions and VA performed the 
exam. Finally, DOD and VA officials said that differences in opinions 
about diagnoses are common among physicians, particularly in the 
mental health field. For example, they noted that it can be 
difficult to distinguish PTSD from anxiety, depression, and other 
mental health conditions.[Footnote 30] 

Pilot Sites Faced Various Logistical Challenges Integrating VA Staff: 

DOD and VA officials at several pilot sites said that they experienced 
some logistical challenges integrating VA staff at the military 
facilities. At a few sites, it took time for VA staff to receive 
common access cards needed to access the military facilities and to 
use the facilities' computer systems. During the time that VA staff 
did not have access cards, they were unable to access VA computer 
systems, such as those for establishing the VA claim, requesting 
exams, and viewing exam results, via DOD's network. 

In addition, DOD and VA staff noted several difficulties using the 
agencies' multiple information technology (IT) systems to process 
cases. While the agencies both use the VTA system to manage cases, VA 
also has IT systems for completing certain tasks, and the military 
services also have their own case tracking systems. This causes DOD 
and VA staff to have to enter the same data multiple times into 
different IT systems. In addition, some VA staff working on military 
bases reported that using the military services' computer systems to 
access VA systems has significantly slowed down computer processing 
speeds. Finally, DOD and VA staff cannot directly access each other's 
systems, making it more cumbersome for case managers to determine the 
status of servicemembers' cases. For example, without access to VA's 
system for managing exams, DOD board liaisons cannot readily provide 
servicemembers with information on when or where their exams are 
scheduled and must contact VA case managers to obtain the information. 
A few sites we visited were able to address some IT issues. For 
example, at Fort Polk, VA officials said they were adding a new 
telecommunications line to provide faster computer processing speeds 
for their staff. 

In addition, VA physicians working at military facilities need to be 
credentialed by DOD before they can begin working on base, which 
involves verification of their education, license, and clinical 
history. Some VA officials said that this process could take 1 month 
or longer to complete.[Footnote 31] 

Extended Periods in the Military Disability Evaluation Process Posed 
Housing and Other Challenges at Some Pilot Sites: 

Although many DOD and VA officials we interviewed at central offices 
and pilot sites felt that the IDES process expedited the delivery of 
VA benefits to servicemembers, several also indicated that it may 
increase the amount of time servicemembers are in the military's 
disability evaluation process. Data on legacy cases are not 
sufficiently reliable to determine whether this is the case military-
wide, but Army data appear to be sufficiently reliable to allow for 
some limited analysis. Our analysis of Army pilot and legacy data as 
of early 2010 shows that compared with legacy cases, active duty cases 
in the pilot took on average 39 more days to reach the end of the PEB 
phase--the last step of the DOD disability evaluation process before 
servicemembers begin transitioning from military service or, if found 
fit, back to duty. For reserve component cases in the Army, IDES pilot 
cases took on average 17 more days to reach the end of the PEB phase, 
compared with legacy cases. It was not possible to conduct this 
analysis for the other military services because their legacy data 
lacked information on when servicemembers were referred into the 
disability evaluation system. 

Some DOD officials noted that the increased time that servicemembers 
are in the military's disability evaluation process means that they 
must be cared for and managed for a longer period. Officials in our 
site visits and interviews said that some pilot sites have had 
challenges housing servicemembers in the IDES, in part due to 
servicemembers being in the process longer. For some servicemembers in 
the disability evaluation system, the military services may move them 
to temporary medical units or, for those needing longer-term medical 
care or complex case management, to special medical units such as a 
Warrior Transition Unit in the Army or Wounded Warrior Regiment in the 
Marine Corps.[Footnote 32] However, these units were full at a few 
pilot sites we visited, or members in the IDES did not meet the 
criteria for entering the special medical units. Where servicemembers 
remain with their units while going through the disability evaluation 
system, the units cannot replace them with able-bodied members. 
Officials at Fort Carson said that this created a challenge for combat 
units. Because most servicemembers in the IDES did not meet the 
criteria for entering Warrior Transition Units, combat units had to 
find another organizational unit to take charge of members in the IDES 
so they could replace them with soldiers ready and able to deploy to 
combat areas. In addition, officials at Naval Medical Center San Diego 
and Fort Carson said that some members are not gainfully employed by 
their units and, left idle while waiting to complete their disability 
evaluation process, are more likely to engage in negative behavior, 
potentially resulting in their being discharged due to misconduct and 
a forfeiture of disability benefits.[Footnote 33] We were unable to 
assess the extent or cause of this problem because the VTA system that 
tracks servicemembers in the IDES does not capture sufficient detail 
on reasons for servicemembers dropping out of the IDES, or which 
organizational unit(s) the servicemember was assigned to while in the 
IDES. DOD officials also noted that servicemembers benefit from 
continuing to receive their salaries and benefits while their case 
undergoes scrutiny by two agencies, though some also acknowledged that 
these additional salaries and benefits create costs for DOD. 

DOD and VA Expansion Plans Address Some Though Not All Challenges: 

DOD and VA Have Incorporated Many Lessons Learned into Their Planning 
for Worldwide Expansion of the IDES but Lack Concrete Plans for 
Addressing Some Challenges: 

DOD and VA plan to expand the IDES to sites worldwide on an ambitious 
timetable--to 113 sites during fiscal year 2011, a pace of about 1 
site every 3 days. Expansion is scheduled to occur in four stages, 
beginning with 28 sites in the southeastern and western United States 
by the end of December 2010.[Footnote 34] 

DOD and VA have many efforts under way to prepare for IDES expansion. 
At each site, local DOD and VA officials are expected to work together 
to prepare for implementation. This includes completing a site 
assessment matrix--a checklist of information DOD and VA officials at 
each site should obtain and preparations they should make. While most 
pilot sites had used a site assessment matrix to prepare for IDES 
implementation, the agencies completed a significant revision of the 
matrix in August 2010, and they now request additional information and 
documentation to address areas where prior IDES sites had experienced 
challenges. In addition, while during the pilot phase local DOD and VA 
officials were encouraged to develop written agreements on IDES 
procedures, the matrix now requests that a written agreement be 
completed prior to implementing the IDES. Finally, senior-level local 
DOD and VA officials will be expected to sign the site assessment 
matrix to certify that a site is ready for IDES implementation. This 
differs from the pilot phase where, according to DOD and VA officials, 
some sites implemented the IDES without having been fully prepared. In 
addition, in September 2010, the military services and VA held 
preimplementation training conferences for local DOD and VA staff. At 
the time of our review, the first 28 expansion sites were completing 
their site assessment matrices. 

Through the new site assessment matrix and other initiatives under 
way, DOD and VA are addressing several of the challenges identified in 
the pilot phase. These include ensuring sufficient exam and case 
management staff, being prepared to deal with surges in caseloads, 
addressing exam sufficiency issues, and making adequate logistical 
arrangements. 

Ensuring sufficient exam resources: The matrix asks whether a site can 
complete single exams within the IDES' 45-day time frame and within 
DOD's TRICARE access standards.[Footnote 35] The matrix asks for 
detailed information, such as who will conduct the exams (VA, VA 
contractor, or military providers), where the exams will be conducted, 
and VA's anticipated overall volume of disability compensation and 
pension exams in the area. In addition to the matrix, VA has several 
initiatives under way to increase resources and expedite exams. VA 
plans to award a new contract under which it can acquire examiners for 
sites that do not have sufficient staff to perform exams, such as 
sites located where VA does not have medical facilities or in rural 
areas where VA has had difficulty hiring staff. VA has also recently 
changed its exam policy so that exams performed by nurse practitioners 
or physician assistants certified to perform disability exams no 
longer have to be cosigned by a physician, which is expected to 
expedite completion of more exam reports. 

Ensuring sufficient VA rating staff: VA officials said that they have 
hired new staff to replace those that recently left the Baltimore 
rating office and anticipate hiring a small number of additional 
staff. Based on caseload projections, they expect that, once the 
additional staff are hired, the Baltimore office will be close to 
having sufficient rating staff. Although VA officials said that the 
Baltimore office conducted ratings for a majority of cases during the 
IDES pilot phase, they have projected that the workload will be 
divided almost evenly between the Baltimore and Seattle offices once 
the IDES is fully expanded worldwide. 

Ensuring sufficient DOD PEB adjudicators: Air Force officials informed 
us they added adjudicators for the informal PEB and have since 
eliminated their case backlog. They are currently adding adjudicators 
for the formal PEB. Navy PEB officials also said that they are adding 
adjudicators through activation of reserve component personnel for 
special work and expected that they would be in place by November 2010. 

Ensuring sufficient case management staffing: The site assessment 
matrix also asks whether local facilities will have sufficient trained 
DOD board liaison staff to meet a 1:20 caseload ratio and sufficient 
VA case managers to meet a 1:30 caseload ratio. In addition, according 
to DOD officials, each of the military services is increasing its 
board liaison staffing levels to achieve 1:20 caseload ratios. VA 
officials said that they plan to hire an additional 73 case managers. 

Coping with caseload surges: The matrix asks sites to provide a longer 
and more detailed caseload history--a 2-year, month-by-month history-- 
as opposed to the 1-year history on which DOD based its caseload 
projections during the pilot phase. In addition, the matrix asks 
sites to anticipate any surges in caseloads, such as those due to 
seasonal trends. Sites are also expected to provide a written 
contingency plan for dealing with caseload surges. In addition, the 
matrix asks sites to develop a system for communicating updates, such 
as information on expected caseload surges, to stakeholders. VA 
officials also said that the Army has agreed to keep them better 
informed of deployments that could result in caseload surges. Further, 
VA officials noted that they are developing a plan for addressing the 
additional need for examiners during surges, through which VA offices 
with lower demand for disability exams would send examiners to an IDES 
site experiencing a surge in exam workloads. 

Ensuring the sufficiency of single exams: The site assessment matrix 
asks sites whether all staff who will conduct exams are trained to VA 
standards and certified by VA to conduct disability compensation and 
pension exams. In addition, VA has begun the process of revising its 
exam templates, to better ensure that examiners include the 
information needed for a VA disability rating decision and enable them 
to complete their exam reports in less time. Finally, a VA official 
stated that VA is examining whether it can add capabilities to the VTA 
system that would enable staff to identify where problems with exams 
have occurred and track the progress of their resolution. For sites 
that choose to have military physicians perform the single exams, VA 
officials said that they have provided materials to DOD from their 
national training program, and DOD has made these materials accessible 
on its Web site. To help improve the ability of DOD board liaisons to 
obtain servicemembers' medical and personnel records prior to the 
exam, DOD officials said that they are revising their policies to 
require reserve component units to provide the records when a reserve 
member is referred to the IDES. 

Ensuring adequate logistics at IDES sites: The site assessment matrix 
asks sites whether they have the logistical arrangements needed to 
implement the IDES, including necessary facilities, IT, and 
transportation for servicemembers to exam locations. For example, the 
matrix asks whether the military treatment facility will address the 
needs of VA staff for access cards, identification badges, and 
security clearances, and whether all VA medical providers will be 
credentialed and privileged to practice at the DOD facility. In terms 
of IT, the matrix asks whether DOD sites will enable VA staff access 
to VA information systems needed to perform their duties. The matrix 
also asks sites to identify IT contacts from both VA and DOD so that 
they may work together to resolve IT problems. Furthermore, DOD and VA 
are developing a general memorandum of agreement on IDES information 
sharing. This agreement is intended to enable DOD and VA staff access 
to each other's IT systems, for example, to allow DOD staff to track 
the status of VA exams. DOD officials also said that they are 
developing two new IT solutions. According to officials, one system 
currently being tested would help military treatment facilities better 
manage their cases. Another IT solution, still at a preliminary stage 
of development, would integrate the VTA with the services' case 
tracking systems so as to reduce multiple data entry. 

However, in some areas, DOD and VA's efforts to prepare for IDES 
expansion do not fully address challenges or are not yet complete. 

Ensuring sufficient military physician staffing: While DOD and VA are 
taking steps to address shortages of examiners, case managers, and 
adjudicators, they do not yet have strategies or plans to address 
potential shortages of military physicians for completing MEB 
determinations. For example, the site assessment matrix does not 
include a question about the sufficiency of military providers to 
handle expected numbers of MEB cases at the site, or ask sites to 
identify strategies for ensuring sufficient military physicians if 
there is a caseload surge or staff turnover. 

Ensuring sufficient housing and organizational oversight for IDES 
participants: Although the site assessment matrix asks sites whether 
they will have sufficient temporary housing available for 
servicemembers going through the IDES, the matrix requires only a yes 
or no response and does not ensure that sites will have conducted a 
thorough review of their housing capacity prior to implementing the 
IDES. For example, sites are not asked about the capacity of their 
medical hold units or special units for wounded servicemembers, or to 
identify other options if their existing units do not have sufficient 
capacity for their projected IDES caseload. In addition, the site 
assessment matrix does not address whether sites have plans for 
ensuring that IDES participants are gainfully employed or sufficiently 
supported by their organizational units. 

Addressing differences in diagnoses: According to a DOD official, as 
part of its revision of its IDES operations manual, DOD is currently 
developing guidance on how staff should address differences in 
diagnoses between military physicians and VA examiners, and between 
military PEBs and VA disability rating staff. DOD anticipated issuing 
the new guidance in September 2010, but at the time of our review had 
not yet done so. In addition, a VA official stated that VA is 
developing new procedures for identifying cases with potential for 
multiple mental health diagnoses and will ask VA examiners to review 
the servicemembers' medical records and reconcile differing diagnoses. 
However, since the new guidance and procedures are still being 
developed, we cannot determine whether they will resolve discrepancies 
or disagreements. Significantly, DOD and VA do not have a mechanism 
for tracking disagreements about diagnoses and ratings, and 
consequently, may not be able to determine whether the guidance 
sufficiently addresses the discrepancies or whether it requires 
further revision. 

DOD and VA Lack a Mechanism for Monitoring Problems That May Emerge 
with Full Implementation: 

As DOD and VA move quickly to implement the IDES worldwide, they have 
some mechanisms in place to monitor challenges that may arise in the 
IDES. DOD officials said that they expect to continue holding 
postimplementation "hotwash" meetings, in which they review individual 
sites' implementation. In addition, DOD and VA will continue to 
regularly collect and report data on caseloads, processing times, and 
servicemember satisfaction. Furthermore, the new site assessment 
matrix asks sites to develop plans for VA and DOD local staff to meet 
weekly for the first 60 to 90 days after implementing the IDES, then 
no less than monthly to address any identified challenges. VA 
officials also said that they will continue to prepare an annual 
report on challenges in the IDES. To prepare this report, they 
will obtain input and data from local DOD and VA officials. 

However, DOD and VA do not have a system-wide monitoring mechanism to 
help ensure that steps they took to address challenges are sufficient 
and to identify problems on a more timely basis. For example, they do 
not collect data centrally on staffing levels relative to caseload. 
Consequently, despite efforts to acquire additional staff, as local 
sites experience staffing turnover in the future, DOD and VA central 
offices may not become aware that a site is short-staffed until their 
monitoring reports show lengthy processing times. As a result, DOD and 
VA may be delayed in taking corrective action, since it takes time to 
assess what types of staff are needed at a site and to hire or 
reassign staff. In addition, without information on when or how often 
other problems occur, such as insufficient exam summaries or 
disagreements about diagnoses, DOD and VA managers may not be able to 
target additional training or guidance where needed. Furthermore, 
while DOD and VA report data on processing times by phase of the 
process, military treatment facility, and military service, their 
monitoring reports do not show processing times or caseloads for each 
VA rating office and each of the five PEBs (three Army and one each 
for the Navy and Air Force), limiting their ability to identify 
whether specific rating or PEB offices are experiencing challenges. 

DOD and VA also lack mechanisms or forums for systematically sharing 
information on challenges as well as best practices. For example, 
while the site assessment matrix indicates that sites are expected to 
hold periodic meetings to identify local challenges, DOD and VA have 
not established a process for local sites to systematically report 
those challenges to DOD and VA management and for lessons learned to 
be systematically shared system-wide. During the pilot phase, VA 
surveyed pilot sites on a monthly basis about challenges they faced in 
completing single exams. Such a practice has the potential to provide 
useful feedback if extended to other IDES challenges. 

Conclusions: 

By merging two duplicative disability evaluation systems, the IDES 
shows promise for expediting the delivery of VA benefits to 
servicemembers leaving the military due to a disability. 
Servicemembers who proceed through the process are able to leave the 
military with greater financial security, since they receive 
disability benefits from both agencies shortly after discharge. 
Further, having both DOD and VA personnel involved in reviewing each 
disability evaluation may result in more thorough scrutiny of cases 
and more informed decisions on behalf of servicemembers. 

However, piloting of the system at 27 sites has revealed several 
significant challenges that require careful management attention and 
oversight before DOD and VA expand the system military-wide. DOD and 
VA are currently taking steps to address many of these challenges, and 
the agencies have developed a site implementation process that 
encourages local DOD and VA officials to identify and resolve local 
challenges prior to transitioning to the new system. However, given 
the agencies' ambitious implementation schedule--more than 100 sites 
in a year--it is unclear whether all of these challenges will be fully 
addressed before DOD and VA deploy the integrated system to 
additional military facilities. For example, it is unclear whether 
sites will have sufficient military physicians to complete key steps 
of the process in a timely manner. Insufficient staffing of any one 
part of the process is likely to lead to bottlenecks, delaying not 
only servicemembers' receipt of disability benefits, but also their 
separation from the military and reentry into civilian life. In 
addition, DOD's preparations of sites for the IDES do not ensure that 
military facilities have adequate capacity or plans for housing and 
providing organizational oversight over servicemembers in the IDES, 
who potentially could remain at the locations for extended periods of 
time. Furthermore, while integrating VA medical exams into DOD's 
disability evaluation system eliminates duplicative exams, it raises 
the potential for disagreements about diagnoses of 
servicemembers' conditions, with implications for servicemembers' 
disability ratings and their DOD disability compensation. While DOD is 
developing guidance to address such disagreements, it is important 
that the agencies have a thorough understanding of how often and why 
these disagreements occur and continually review whether their new 
guidance adequately addresses this issue so as to be able to make 
improvements where needed. 

Successful implementation of any program requires effective 
monitoring. DOD and VA currently have mechanisms to track numbers of 
cases processed, timeliness, and servicemember satisfaction, but they 
do not routinely monitor factors--such as staffing levels relative to 
caseload, disagreements about diagnoses, and insufficient exam 
summaries--that can delay the process. In addition, they do not 
monitor timeliness and caseloads for some of the key IDES offices, 
namely each VA rating office and each PEB. Ultimately, the success or 
failure of the IDES will depend on DOD and VA's ability to 
sufficiently staff local sites, the VA rating offices, and the PEBs, 
and to resolve other challenges not only at the initiation of the 
transition to IDES but also on an ongoing, long-term basis. By not 
monitoring staffing and other risk factors, DOD and VA may not be able 
to ensure that their efforts to address these factors are sufficient 
or to identify problems as they emerge and take immediate steps to 
address them before they become major problems. 

Recommendations for Executive Action: 

To ensure that the IDES is sufficiently staffed and that military 
treatment facilities are prepared to house personnel in the IDES, we 
recommend that the Secretary of Defense direct the military services 
to conduct thorough assessments prior to each site's implementation of 
the IDES of the following three issues: 

* the adequacy of staffing of military physicians for completing MEB 
determinations at military treatment facilities; contingency plans 
should be developed to address potential staffing shortfalls, for 
example, due to staff turnover or caseload surges; 

* the availability of housing for servicemembers in the IDES at 
military facilities; alternative housing options should be identified 
if sites do not have adequate capacity; and: 

* the capacity of organizational units to absorb servicemembers 
undergoing the disability evaluation; plans should be in place to 
ensure servicemembers are appropriately and constructively engaged. 

To improve their agencies' ability to resolve differences about 
diagnoses of servicemembers' conditions, and to determine whether 
their new guidance sufficiently addresses these disagreements, we 
recommend that the Secretaries of Defense and Veterans Affairs take 
the following two actions: 

* conduct a study to assess the prevalence and causes of such 
disagreements; and: 

* establish a mechanism to continuously monitor disagreements about 
diagnoses between military physicians and VA examiners and between 
PEBs and VA rating offices. 

To enable their agencies to take early action on problems at IDES 
sites postimplementation, we recommend that the Secretaries of Defense 
and Veterans Affairs develop a system-wide monitoring mechanism to 
identify challenges as they arise in all DOD and VA facilities and 
offices involved in the IDES. This system could include: 

* continuous collection and analysis of data on DOD and VA staffing 
levels, sufficiency of exam summaries, and diagnostic disagreements; 

* monitoring of available data on caseloads and case processing time 
by individual VA rating office and PEB; and: 

* a formal mechanism for agency officials at local DOD and VA 
facilities to communicate challenges and best practices to DOD and VA 
headquarters offices. 

Agency Comments and Our Evaluation: 

We provided a draft of this report to DOD and VA for review and 
comment. The agencies provided written comments, which are reproduced 
in appendixes III and IV. DOD and VA generally concurred with our 
recommendations. Each agency also provided technical comments, which 
we incorporated as appropriate. 

DOD concurred with our recommendation to ensure that, before the IDES 
is implemented at each new site, a thorough assessment be done of the 
site's staffing adequacy, the availability of housing for 
servicemembers in the IDES, and the capacity of organizational units 
to appropriately and constructively engage servicemembers in the IDES. 
However, DOD stated that the IDES site assessment matrix addresses 
plans to ensure that servicemembers are gainfully employed while in 
the IDES. We changed our report to more clearly indicate that the site 
assessment matrix does not, in fact, address such plans. We believe 
that specifically identifying this in the matrix could help local DOD 
officials, including servicemembers' unit commanders, focus on 
ensuring gainful employment or other support. 

DOD concurred, and VA concurred in principle, with our recommendation 
to study and establish mechanisms to monitor diagnostic differences. 
VA identified a plan to study the prevalence and causes of diagnostic 
differences and determine by July 1, 2011, whether mechanisms are 
needed. DOD stated that it expects, as diagnostic differences are 
monitored and studied, that the agencies will address and resolve many 
of the issues identified in our report. We agree that the planned 
study could yield valuable insights on how to resolve diagnostic 
differences but emphasize that continuous monitoring of such 
differences over a period of time may be needed to assess the extent 
and nature of such differences, as well as the success of any actions 
to address them. 

Both agencies concurred with our recommendation to develop monitoring 
mechanisms to help them take early actions on problems that may arise 
at IDES sites postimplementation. VA stated that the VTA system 
currently has data that can be monitored by PEB and VA rating site, 
and DOD said its weekly monitoring report could be modified to present 
these data. Also, VHA plans to monitor the IDES exam workload, 
including numbers of exam requests compared with forecasts, exam 
timeliness, and insufficient exams. Implementation is scheduled for 
December 31, 2010. In terms of identifying site implementation 
problems for quick resolution, DOD stated that the military services 
bring sites' challenges and best practices to the Disability Advisory 
Council, a DOD body that includes VA representatives, which is being 
re-chartered as part of the Benefits Executive Council, a subgroup of 
the VA-DOD Joint Executive Council. VA and DOD's plans appear 
promising and consistent with our recommendations, provided that they 
allow for 
ongoing monitoring of site staffing levels and create a systematic way 
for local DOD and VA staff to communicate their challenges or best 
practices, enabling the agencies to identify and address problems at 
an early stage. 

We are sending copies of this report to the appropriate congressional 
committees, the Secretary of Defense, the Secretary of Veterans 
Affairs, and other interested parties. The report is also available at 
no charge on the GAO Web site at [hyperlink, http://www.gao.gov]. 

If you or your staff members have any questions about this report, 
please contact me at (202) 512-7215 or at bertonid@gao.gov. Contact 
points for our Offices of Congressional Relations and Public Affairs 
may be found on the last page of this report. Staff members who made 
key contributions to this report are listed in appendix V. 

Signed by: 

Daniel Bertoni: 
Director, Education, Workforce, and Income Security Issues: 

List of Committees: 

The Honorable Carl Levin:
Chairman:
The Honorable John McCain:
Ranking Member:
Committee on Armed Services:
United States Senate: 

The Honorable Daniel Akaka:
Chairman:
The Honorable Richard Burr:
Ranking Member:
Committee on Veterans' Affairs:
United States Senate: 

The Honorable Daniel Inouye:
Chairman:
The Honorable Thad Cochran:
Ranking Member:
Subcommittee on Defense:
Committee on Appropriations:
United States Senate: 

The Honorable Tim Johnson:
Chairman:
The Honorable Kay Bailey Hutchison:
Ranking Member:
Subcommittee on Military Construction, Veterans Affairs, and Related 
Agencies:
Committee on Appropriations:
United States Senate: 

The Honorable Ike Skelton:
Chairman:
The Honorable Howard P. "Buck" McKeon:
Ranking Member:
Committee on Armed Services:
House of Representatives: 

The Honorable Bob Filner:
Chairman:
The Honorable Steve Buyer:
Ranking Member:
Committee on Veterans' Affairs:
House of Representatives: 

The Honorable Norman Dicks:
Chairman:
The Honorable C. W. Bill Young:
Ranking Member:
Subcommittee on Defense:
Committee on Appropriations:
House of Representatives: 

The Honorable Chet Edwards:
Chairman:
The Honorable Zach Wamp:
Ranking Member:
Subcommittee on Military Construction, Veterans Affairs, and Related 
Agencies:
Committee on Appropriations:
House of Representatives: 

[End of section] 

Appendix I: Objectives, Scope, and Methodology: 

In conducting our review of the integrated disability evaluation 
system (IDES) piloted by the Departments of Defense (DOD) and Veterans 
Affairs (VA), our objectives were to examine (1) the results of DOD 
and VA's evaluation of the IDES pilot, (2) challenges in implementing 
the piloted system to date, and (3) DOD and VA plans to expand the 
piloted system and whether those plans adequately address potential 
challenges. We conducted this performance audit from November 2009 to 
December 2010, in accordance with generally accepted government 
auditing standards. Those standards require that we plan and perform 
the audit to obtain sufficient, appropriate evidence to provide a 
reasonable basis for our findings and conclusions based on our audit 
objectives. We believe that the evidence obtained provides a 
reasonable basis for our findings and conclusions based on our audit 
objectives. 

Review of DOD and VA's Evaluation of the IDES Pilot: 

To address objective 1, we reviewed DOD and VA policy guidance, 
reports, and analysis plans to determine how the agencies are 
evaluating the pilot's effectiveness and to obtain information on 
their results. We also reviewed the relevant requirements of the 
National Defense Authorization Act for Fiscal Year 2008 as they 
pertain to this review. In addition, we interviewed officials 
responsible for the 
evaluation at DOD's Office of the Deputy Under Secretary of Defense 
for Wounded Warrior Care & Transition Policy (WWCTP), DOD's Defense 
Manpower Data Center, and two organizations that DOD has contracted 
with to perform the evaluation--Booz Allen Hamilton and Westat. We 
then tested the reliability of the data the agencies are using for 
their evaluation--data from surveys of servicemembers, IDES case data 
from the Veterans Tracking Application (VTA) system, and legacy case 
data that DOD's WWCTP obtained from the military services. Finally, we 
conducted some analyses of IDES and legacy case data for the Army to 
compare the two systems on timeliness and appeal rates, using elements 
of the data that we found to be reliable, but these comparisons have 
limitations and are not generalizable to other military services. The 
sections below describe our data reliability work and our analysis of 
Army data in further detail. 

Review of Satisfaction Survey Data Reliability: 

DOD and VA have been surveying servicemembers going through the IDES 
pilot, and a comparison group of veterans who went through the 
standard "legacy" disability evaluation system, to determine whether 
the IDES pilot has improved servicemember satisfaction. The agencies 
survey all servicemembers in the IDES pilot at three points in time--
following their completion of the medical evaluation board (MEB) phase 
of the disability evaluation process, completion of the physical 
evaluation board (PEB), and during the transition phase. To create a 
comparison group, the agencies sampled veterans who have been through 
the legacy system at current pilot sites. Their sampling methods were 
designed to ensure that the pilot and legacy groups were of comparable 
size and had similar proportions of servicemembers found unfit for 
duty. DOD and VA are analyzing the differences between the pilot and 
legacy groups' average responses on four "survey composites," or 
general categories composed of several survey questions: overall 
experience, fairness, DOD board liaison officer customer service, and 
VA case manager customer service. 

We reviewed the reliability of surveys DOD and VA are using to obtain 
information on satisfaction levels by examining their survey design 
and analysis. To do so, we interviewed officials at DOD's Defense 
Manpower Data Center and Westat responsible for implementing the 
survey, as well as officials at WWCTP and Booz Allen Hamilton 
responsible for designing the survey and analyzing the survey data. We 
also reviewed the survey instruments, response rates, data analysis 
plans, analysis results, and survey data as of February 28, 2010. We 
found DOD's survey methodology--and the data derived using that 
methodology--to be reliable for purposes of comparing servicemembers' 
satisfaction levels in the IDES and legacy disability evaluation 
systems. 

Pilot and Legacy Case Data Reliability Tests: 

DOD and VA are collecting data on IDES pilot cases through the VTA and 
are using these data to conduct ongoing monitoring of case processing 
times and appeal rates, with the results presented in weekly reports. 
VA manages VTA, but evaluation of the data is primarily conducted by 
staff at DOD's WWCTP and Booz Allen Hamilton. 

For their August 2010 interim report to Congress, DOD staff created a 
data set used to compare pilot and legacy processing times and appeal 
rates. This data set included IDES pilot cases as of February 28, 
2010, with the earliest case started in November 2007. The data set 
also included data, as of January 31, 2010, on legacy cases started 
between fiscal years 2005 and 2009 at the first 21 sites operating the 
IDES pilot, prior to pilot implementation. The agencies also matched 
legacy case data from each of the military services with VA data, in 
order to capture additional processing time it took for servicemembers 
to navigate the VA disability claims process. Because the data set was 
created from February 2010 pilot data, it only included about one-
third of the IDES pilot cases that were completed as of August 29, 
2010. The February 2010 data set included cases from 17 of the 27 
current pilot sites, and 7 of the 17 sites--including some of the 
pilot sites with the largest caseloads such as Fort Carson and Camp 
Lejeune--had fewer than 20 completed cases each when the data set was 
created. 

To assess whether the data DOD and VA are using for their monitoring 
and evaluation are reliable, we obtained the early 2010 data set that 
the agencies planned to use for their evaluation report to Congress. 
We restricted our reliability assessments to the specific variables 
that the agencies used in their analyses. Following steps detailed 
below, we found that the IDES pilot case data were sufficiently 
reliable for our analyses, but that the legacy case data were 
incomplete with respect to data elements key to measuring case 
processing time and appeal rates. 

To assess the reliability of the agencies' IDES pilot data, we 
interviewed VA database managers responsible for VTA, reviewed VTA 
manuals and guidance, conducted electronic tests of the data and, for 
a small, random sample of cases, checked the data against case files. 

* Through our interviews and document reviews, we concluded that the 
agencies have sufficient internal controls to give reasonable 
assurance that the data are complete. 

* Our electronic testing of the data generally found low rates of 
missing data and errors in completed IDES cases. In these tests, we 
considered a data element to be sufficiently reliable for purposes of 
using in our report if 15 percent or less of the data were missing or 
had errors. Using this standard, we determined that one data element 
for IDES cases--the date that servicemembers separated from the 
military--was not reliable, because: (1) it was missing in 19 percent 
of completed cases and (2) in cases where the date was present, more 
than 30 percent appeared to have errors (for example, the date was 
before a step of the process that it should have followed). 

* We also conducted a trace-to-file process to determine whether date 
fields in the VTA system were an accurate reflection of the 
information in the IDES case files. Specifically, we compared 12 date 
fields in the VTA against a random sample of paper files for 54 
completed cases: 24 from the three Army PEBs, 10 from the Air Force 
PEB, and 20 from the Navy PEB (10 Navy cases and 10 Marine Corps). 
[Footnote 36] In comparing these dates, we allowed for a 10 percent 
discrepancy in dates--i.e., a difference of 2 to 10 days, depending on 
the date and phase of the process[Footnote 37]--to allow for the 
possibility that dates may have been entered into the database after 
an event took place. The trace-to-file process resulted in an overall 
accuracy rate of 84 percent. For five data elements key to DOD and 
VA's evaluation of the IDES pilot, we found that VTA dates reflected 
dates in the case files 85 percent of the time or better. For six key 
data elements--i.e., the end dates of the exam and MEB phases, the 
start of the PEB phase, the date a VA rating request was made, the 
date of the final disposition, and the date servicemembers received VA 
benefits--the VTA dates matched case file dates between 70 and 85 
percent of the time. Although we considered these dates sufficiently 
reliable to include in this report, these dates should be interpreted 
with more caution. The separation date was accurate less than 70 
percent of the time and did not meet our standards of reliability. 
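
The electronic tests and trace-to-file checks described above can be 
summarized in a short sketch. The following illustration (in Python, 
using the pandas library and hypothetical column names; it is not the 
code used for this review) shows how a missing-or-error rate for a 
data element and a date-match rate within a tolerance might be 
computed: 

import pandas as pd 

RELIABILITY_THRESHOLD = 0.15  # 15 percent missing/error standard used in this review 

def missing_or_error_rate(cases, field, preceding_field): 
    # Share of completed cases where the field is missing or falls before 
    # a step of the process that it should have followed. 
    missing = cases[field].isna() 
    out_of_order = cases[field] < cases[preceding_field] 
    return float((missing | out_of_order).mean()) 

def trace_to_file_match_rate(vta_dates, file_dates, tolerance_days): 
    # Share of sampled cases where the VTA date is within the allowed 
    # tolerance of the date recorded in the paper case file. 
    diff_days = (vta_dates - file_dates).abs().dt.days 
    return float((diff_days <= tolerance_days).mean()) 

# A data element would be treated as reliable only if its 
# missing_or_error_rate is at or below RELIABILITY_THRESHOLD. 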

To assess the reliability of the legacy data that the agencies planned 
to compare the IDES pilot against, we tested the data electronically, 
and found that data for key dates and appeals indicators had 
significant gaps because the services did not collect the same 
information for legacy cases that was collected for pilot cases (see 
table 2). For example, only Army cases had information on when 
servicemembers were referred to the MEB process. In addition, the 
legacy data did not include the date on which servicemembers received 
VA benefits--which is necessary for measuring the full length of the 
legacy process. Without sufficient data on the beginning (when 
servicemembers were referred into the system) or end of the process 
(when they received VA benefits), we concluded that the full case 
processing time in the legacy system cannot be known. We also 
concluded that comparisons could not be made between the legacy and 
IDES pilot on appeal rates because only Army and Air Force cases had 
information on whether servicemembers appealed the informal PEB 
decisions. 

Table 2: Percentage of Legacy Cases with Referral and Appeal Dates, by 
Military Service: 

Key data elements: Date of referral; 
Army: 100.0%; 
Navy: 0.0%; 
Marine Corps: 0.0%; 
Air Force: 0.0%; 
All: 62.8%. 

Key data elements: Informal PEB appeal; 
Army: 89.2%; 
Navy: 0.0%; 
Marine Corps: 0.0%; 
Air Force: 80.3%; 
All: 62.4%. 

Source: GAO analysis of DOD legacy case data. 

[End of table] 

GAO's Review of the Agencies' Comparison of Pilot and Legacy Data: 

In addition to reviewing the reliability of the IDES pilot and legacy 
data, we reviewed how DOD and VA are using the data for their 
comparisons of the two disability evaluation systems. Through 
interviews with officials at DOD's WWCTP and Booz Allen Hamilton and 
documents they provided us, we understand that DOD planned to address 
gaps in the legacy data by: (1) approximating the referral dates in 
Air Force, Marine Corps, and Navy cases using Army data and (2) using 
dates when cases were ready to be rated by VA to approximate the end 
of the process. Specifically, to approximate referral dates, they said 
they would use the average time for Army cases between when the 
servicemember was referred and when the MEB documentation identifying 
the servicemember's potentially unfitting medical conditions (i.e., 
the narrative summary) was completed, which they calculated to be 60 
days. For Navy and Marine Corps cases, they then subtracted 60 days 
from the date of the narrative summary to estimate a referral date 
and, for Air Force cases, they did so from the date of the MEB 
decision. However, because only 11 percent of Army legacy cases had a 
narrative summary date, the estimate of 60 days is based on a small 
number of cases (see table 3). To address the lack of data on the date 
VA benefits were delivered, DOD planned to use the date that VA 
determined a case was ready to be rated to approximate the end of the 
process, though this would underestimate the length of time it took to 
deliver VA benefits in the legacy process. 

Table 3: Percentage of Legacy Cases with Data Used for DOD 
Comparisons, by Military Service: 

Key data elements: Narrative summary date; 
Army: 10.5%; 
Navy: 99.9%; 
Marine Corps: 100.0%; 
Air Force: 0.0%; 
All: 35.9%. 

Key data elements: Date VA ready to rate; 
Army: 88.8%; 
Navy: 87.5%; 
Marine Corps: 86.7%; 
Air Force: 87.6%; 
All: 88.2%. 

Source: GAO analysis of DOD and VA legacy case data. 

[End of table] 

GAO Analysis of IDES Case Data: 

For objective 1, we presented information on average processing time 
in the IDES, both overall and by military service, using data reported 
by DOD and VA in their weekly monitoring reports. Where 
information was not available in the weekly reports, we conducted our 
own analysis using the early 2010 data set that DOD and VA intended to 
use for their report to Congress. Specifically, we used these data to 
determine the proportion of pilot cases meeting the 295-day goal for 
active duty servicemembers and the 305-day goal for reserve 
servicemembers. 
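
The calculation we performed with the early 2010 data set amounts to a 
simple proportion, sketched below in Python. The field names 
(component, days_to_benefits) are hypothetical stand-ins for the pilot 
case records. 

# Goals for delivering VA benefits, in days, as established by DOD and VA.
GOALS = {"active": 295, "reserve": 305}

def share_meeting_goal(cases, component):
    """Return the percentage of completed cases at or under the goal
    for the given component ("active" or "reserve")."""
    subset = [c for c in cases
              if c["component"] == component
              and c.get("days_to_benefits") is not None]
    if not subset:
        return None
    met = sum(1 for c in subset if c["days_to_benefits"] <= GOALS[component])
    return 100.0 * met / len(subset)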

In addition, although limitations in the legacy data preclude reliable 
comparisons between the IDES pilot and legacy systems for all the 
military services, the Army legacy data on when servicemembers were 
referred into the legacy process were sufficiently complete to make 
some limited 
comparisons. Specifically, we analyzed Army legacy data to determine 
how long the legacy process took, on average, between when 
servicemembers were referred to the process and when VA was ready to 
conduct the disability rating. We limited our analysis to cases in 
which a VA claim was filed between 2006 and 2009 because data on when 
VA was ready to conduct the rating were missing for a substantial 
number of cases in which the VA claim was filed in 2005 or 2010. We 
compared this legacy average with the total pilot case processing time 
through delivery of VA benefits, but we noted that the legacy average 
does not account for the time VA needs to complete the rating and 
deliver the benefits. We also analyzed Army data on appeals to 
illustrate the limitations of DOD's plan to compare only appeals of 
informal PEB decisions in the pilot and legacy systems, without taking 
into account appeals of rating decisions to VA. We conducted this 
analysis using the legacy data and pilot case data as of early 2010 
because DOD and VA's weekly reports do not contain information on 
appeals to VA. 
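
The Army legacy calculation described above can be sketched as 
follows, again in Python with hypothetical field names (service, 
claim_year, referral_date, va_ready_to_rate_date); the actual data 
structures in the legacy extract may differ. 

from statistics import mean

def army_legacy_average_days(cases):
    """Average days from referral to VA ready-to-rate for Army legacy
    cases whose VA claims were filed between 2006 and 2009."""
    durations = [
        (c["va_ready_to_rate_date"] - c["referral_date"]).days
        for c in cases
        if c["service"] == "Army"
        and 2006 <= c["claim_year"] <= 2009
        and c.get("referral_date") is not None
        and c.get("va_ready_to_rate_date") is not None
    ]
    return mean(durations) if durations else None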

Identifying Challenges in Implementing the IDES at Pilot Sites: 

To identify challenges in implementing the IDES during the pilot 
phase, we visited 10 of the 27 military treatment facilities 
participating in the pilot. At the site visits, we interviewed 
officials involved in implementing the IDES from both DOD and VA, 
including military facility commanders and administrators, DOD board 
liaisons, military physicians involved in MEB determinations, DOD 
legal staff, VA case workers, VA or contract examiners, and 
administrators at VA medical clinics and VA regional offices. We 
selected the 10 facilities to obtain perspectives from sites that 
represented different military services and geographic regions and 
that varied in disability evaluation caseloads and in how their single 
exams were conducted (by DOD, VA, or a VA contractor) (see table 4). 

Table 4: Selected Characteristics of IDES Pilot Sites Visited: 

Military treatment facility: 71st Medical Group, Vance Air Force Base; 
Military service: Air Force; 
Geographic region: Central; 
IDES caseload[A]: 21; 
Entity performing single exam: VA contractor. 

Military treatment facility: David Grant Medical Center, Travis Air 
Force Base; 
Military service: Air Force; 
Geographic region: West; 
IDES caseload[A]: 112; 
Entity performing single exam: VA. 

Military treatment facility: Bayne Jones Army Community Hospital, Fort 
Polk; 
Military service: Army; 
Geographic region: South; 
IDES caseload[A]: 518; 
Entity performing single exam: VA. 

Military treatment facility: Dewitt Army Community Hospital, Fort 
Belvoir; 
Military service: Army; 
Geographic region: East; 
IDES caseload[A]: 268; 
Entity performing single exam: VA. 

Military treatment facility: Evans Army Community Hospital, Fort 
Carson; 
Military service: Army; 
Geographic region: Central; 
IDES caseload[A]: 1,341; 
Entity performing single exam: VA. 

Military treatment facility: Walter Reed Army Medical Center; 
Military service: Army; 
Geographic region: East; 
IDES caseload[A]: 936; 
Entity performing single exam: VA. 

Military treatment facility: Winn Army Community Hospital, Fort 
Stewart; 
Military service: Army; 
Geographic region: South; 
IDES caseload[A]: 1,209; 
Entity performing single exam: VA contractor. 

Military treatment facility: Naval Hospital Camp Lejeune; 
Military service: Navy; 
Geographic region: South; 
IDES caseload[A]: 1,214; 
Entity performing single exam: VA and VA contractor. 

Military treatment facility: Naval Hospital Camp Pendleton; 
Military service: Navy; 
Geographic region: West; 
IDES caseload[A]: 537; 
Entity performing single exam: VA contractor. 

Military treatment facility: Naval Medical Center San Diego; 
Military service: Navy; 
Geographic region: West; 
IDES caseload[A]: 1,447; 
Entity performing single exam: VA. 

Sources: GAO interviews and data from DOD and VA. 

[A] Caseload size as of August 22, 2010. 

[End of table] 

We also interviewed various offices at DOD and VA involved in 
implementing the IDES pilot. At DOD, this included WWCTP; Office of 
the Assistant Secretary of Defense for Health Affairs; Office of the 
Assistant Secretary of Defense for Reserve Affairs; Air Force Physical 
Disability Division; Army Physical Disability Agency; Navy Physical 
Evaluation Board; Office of the Air Force Surgeon General; Army 
Medical Command; and Navy Bureau of Medicine and Surgery. At VA, we 
interviewed officials in the Veterans Benefits Administration, 
Veterans Health Administration, and VA/DOD Collaboration Service. 

Furthermore, we reviewed relevant documents, including DOD and VA 
policies and guidance and records of "hotwash" meetings, which DOD and 
VA held shortly after implementing the IDES at pilot sites to identify 
implementation successes and challenges. We also reviewed data on 
processing times for the single exams, MEB determinations, informal 
PEB decisions, and VA ratings, as reported in the agencies' weekly 
monitoring reports. In addition, we reviewed relevant federal laws and 
regulations. 

To determine whether the IDES process extended the time that 
servicemembers remained in military service, we analyzed the legacy 
and pilot case data from the early 2010 data set, but we identified 
several limitations with the data. As noted earlier, the date 
servicemembers separated from the military was missing for 19 percent 
of completed IDES pilot cases. Further, as shown in table 5, only Air 
Force cases contained data on the separation date in the legacy data. 
As also noted earlier, only the Army legacy data contained information 
on when servicemembers were referred into the legacy process. As a 
result, for Army cases, we compared the average length of time it took 
cases to reach a final PEB decision in the legacy and pilot systems, 
since this date was sufficiently complete in both the legacy and pilot 
data. The PEB decision is the last phase of the disability evaluation 
process before a servicemember either begins to transition out of 
military service or, if found fit, returns to their unit. 

Table 5: Percentage of Legacy Cases with Data Used for Comparison of 
Time in Active Duty, by Military Service: 

Key data elements: Date of separation; 
Army: 0.0%; 
Navy: 0.0%; 
Marine Corps: 0.0%; 
Air Force: 96.5%; 
All: 7.6%. 

Key data elements: Date of final disposition; 
Army: 89.2%; 
Navy: 99.9%; 
Marine Corps: 100.0%; 
Air Force: 100.0%; 
All: 93.2%. 

Source: GAO analysis of DOD and VA legacy case data. 

[End of table] 

Examining DOD and VA's Plans for Expanding the IDES: 

To identify the agencies' preparations for worldwide expansion of the 
IDES, we reviewed documents on DOD and VA's expansion strategy, their 
site assessment matrix, and weekly monitoring reports which, beginning 
in July 2010, tracked key implementation time frames, both nationally 
and at individual military treatment facilities. Our interviews with 
officials involved in the pilot at DOD, VA, and each of the military 
services also provided us with information on the agencies' expansion 
plans. We also reviewed relevant federal laws and regulations. 

We determined the adequacy of the agencies' planning efforts by 
assessing whether their plans addressed the challenges we had 
identified in objective 2. We also determined whether the plans 
incorporated internal controls described in GAO's Standards for 
Internal Control in the Federal Government and best practices for 
program implementation identified in academic literature.[Footnote 38] 

[End of section] 

Appendix II: IDES Pilot Processing Times for Reserve Component 
Servicemembers: 

The figures below show case processing times in the IDES pilot for 
reserve component servicemembers. Figure 11 shows the average number 
of days it took to complete the process--i.e., to deliver VA benefits 
to reserve component servicemembers, as of August 2010. Figure 12 
shows the percentage of cases that met the DOD and VA goal to deliver 
VA benefits within 305 days, as of February 2010. Figures 13-15 show 
the average length of time it took, as of August 2010, to complete 
phases of the IDES process--i.e., the single exam, the MEB 
documentation, and the informal PEB decision, respectively--each of 
which has taken longer, on average, than the goals established by DOD 
and VA. 

Figure 11: Average Number of Days to Deliver VA Benefits for Reserve 
Component Servicemembers, by Military Service, as of August 29, 2010: 

[Refer to PDF for image: horizontal bar graph] 

Service goal: 305 days. 

Service: Air Force; 
Elapsed time in days: 376. 

Service: Army; 
Elapsed time in days: 285. 

Service: Navy; 
Elapsed time in days: 321. 

Service: Marine Corps; 
Elapsed time in days: 368. 

Service: All; 
Elapsed time in days: 298. 

Sources: GAO presentation of weekly report data from DOD and VA. 

[End of figure] 

Figure 12: Percentage of Cases Meeting 305-Day Goal for Delivery of VA 
Benefits to Reserve Component Servicemembers, by Military Service, as 
of February 2010: 

[Refer to PDF for image: horizontal bar graph] 

Service goal: 80%. 

Service: Air Force; 
Percentage of cases meeting goal: 33%. 

Service: Army; 
Percentage of cases meeting goal: 67%. 

Service: Navy; 
Percentage of cases meeting goal: 71%. 

Service: Marine Corps; 
Percentage of cases meeting goal: 52%. 

Service: All; 
Percentage of cases meeting goal: 65%. 

Sources: GAO analysis of pilot case data from DOD and VA. 

[End of figure] 

Figure 13: Average Number of Days to Complete Single Exams for Reserve 
Component Servicemembers, by IDES Pilot Site, as of August 29, 2010: 

[Refer to PDF for image: vertical bar graph] 

Average for all sites: 64 days. 
Service goal: 45 days. 

Air Force: 

Site: Andrews; 
Elapsed time in days: 61. 

Site: Elmendorf; 
Elapsed time in days: 59. 

Site: Nellis; 
Elapsed time in days: 57. 

Site: MacDill; 
Elapsed time in days: 44. 

Site: Travis; 
Elapsed time in days: 36. 

Army: 

Site: Fort Wainwright; 
Elapsed time in days: 136. 

Site: Fort Carson; 
Elapsed time in days: 109. 

Site: Fort Stewart; 
Elapsed time in days: 93. 

Site: Fort Richardson; 
Elapsed time in days: 87. 

Site: Walter Reed; 
Elapsed time in days: 70. 

Site: Fort Sam Houston; 
Elapsed time in days: 67. 

Site: Fort Belvoir; 
Elapsed time in days: 65. 

Site: Fort Drum; 
Elapsed time in days: 63. 

Site: Fort Lewis; 
Elapsed time in days: 62. 

Site: Fort Polk; 
Elapsed time in days: 59. 

Site: Fort Benning; 
Elapsed time in days: 46. 

Site: Fort Riley; 
Elapsed time in days: 45. 

Site: Fort Meade; 
Elapsed time in days: 39. 

Site: Fort Bragg; 
Elapsed time in days: 32. 

Site: Fort Hood; 
Elapsed time in days: 31. 

Navy[A]: 

Site: Camp Pendleton; 
Elapsed time in days: 88. 

Site: Bethesda; 
Elapsed time in days: 71. 

Site: Portsmouth; 
Elapsed time in days: 57. 

Site: San Diego; 
Elapsed time in days: 53. 

Site: Bremerton; 
Elapsed time in days: 47. 

Site: Camp Lejeune; 
Elapsed time in days: 26. 

Marine Corps: 

Site: Camp Pendleton; 
Elapsed time in days: 64. 

Site: Camp Lejeune; 
Elapsed time in days: 60. 

Site: Bethesda; 
Elapsed time in days: 57. 

Site: San Diego; 
Elapsed time in days: 46. 

Site: Bremerton; 
Elapsed time in days: 29. 

Sources: GAO presentation of weekly report data from DOD and VA. 

[A] This figure shows processing times separately for servicemembers 
in the Navy and Marine Corps at the six Navy IDES pilot sites. DOD and 
VA's data indicate that, as of August 2010, there had not yet been any 
Marine Corps reserve component servicemembers who completed the single 
exam in the IDES pilot at Naval Medical Center Portsmouth. 

[End of figure] 

Figure 14: Average Number of Days to Complete MEB Documentation for 
Reserve Component Servicemembers, by IDES Pilot Site, as of August 29, 
2010: 

[Refer to PDF for image: vertical bar graph] 

Average for all sites: 76 days. 
Service goal: 35 days. 

Air Force: 

Site: MacDill; 
Elapsed time in days: 177. 

Site: Andrews; 
Elapsed time in days: 92. 

Site: Elmendorf; 
Elapsed time in days: 55. 

Site: Travis; 
Elapsed time in days: 53. 

Site: Nellis; 
Elapsed time in days: 51. 

Army: 

Site: Fort Richardson; 
Elapsed time in days: 124. 

Site: Fort Belvoir; 
Elapsed time in days: 109. 

Site: Fort Meade; 
Elapsed time in days: 100. 

Site: Walter Reed; 
Elapsed time in days: 90. 

Site: Fort Sam Houston; 
Elapsed time in days: 69. 

Site: Fort Polk; 
Elapsed time in days: 66. 

Site: Fort Stewart; 
Elapsed time in days: 63. 

Site: Fort Hood; 
Elapsed time in days: 58. 

Site: Fort Carson; 
Elapsed time in days: 53. 

Site: Fort Lewis; 
Elapsed time in days: 44. 

Site: Fort Benning; 
Elapsed time in days: 38. 

Site: Fort Bragg; 
Elapsed time in days: 33. 

Site: Fort Drum; 
Elapsed time in days: 31. 

Site: Fort Wainwright; 
Elapsed time in days: 28. 

Navy[A]: 

Site: Camp Lejeune; 
Elapsed time in days: 79. 

Site: Bethesda; 
Elapsed time in days: 78. 

Site: Camp Pendleton; 
Elapsed time in days: 47. 

Site: San Diego; 
Elapsed time in days: 23. 

Site: Bremerton; 
Elapsed time in days: 22. 

Marine Corps: 

Site: Camp Lejeune; 
Elapsed time in days: 111. 

Site: Bethesda; 
Elapsed time in days: 67. 

Site: Camp Pendleton; 
Elapsed time in days: 60. 

Site: San Diego; 
Elapsed time in days: 34. 

Site: Bremerton; 
Elapsed time in days: 12. 

Sources: GAO presentation of weekly report data from DOD and VA. 

[A] This figure shows processing times separately for servicemembers 
in the Navy and Marine Corps at the six Navy IDES pilot sites. DOD and 
VA's data indicate that, as of August 2010, there had not yet been any 
Marine Corps reserve component servicemembers who completed the MEB 
phase in the IDES pilot at Naval Medical Center Portsmouth. 

[End of figure] 

Figure 15: Average Number of Days to Complete the Informal PEB for 
Reserve Component Servicemembers, by Military Service, as of August 
29, 2010: 

[Refer to PDF for image: horizontal bar graph] 

Service goal: 15 days. 

Service: Air Force; 
Elapsed time in days: 76. 

Service: Army; 
Elapsed time in days: 26. 

Service: Navy; 
Elapsed time in days: 80. 

Service: Marine Corps; 
Elapsed time in days: 106. 

Sources: GAO presentation of weekly report data from DOD and VA. 

[A] The Navy PEB determines fitness decisions for servicemembers in 
the Marine Corps. 

[End of figure] 

[End of section] 

Appendix III: Comments from the Department of Defense: 

Office Of The Under Secretary Of Defense: 
Personnel And Readiness: 
4000 Defense Pentagon: 
Washington, D.C. 20301-4000: 

November 17, 2010: 

Mr. Daniel Bertoni: 
Director, Education, Workforce, and Income Security: 
U.S. Government Accountability Office: 
441 G Street, N.W. 
Washington, DC 20548: 

Dear Mr. Bertoni, 

This is the Department of Defense (DoD) response to the GAO Draft 
Report, GAO-11-69, "Military And Veterans Disability System: Pilot Has 
Achieved Some Goals, but Further Planning and Monitoring Needed," 
dated October 22, 2010 (GAO Code 130971). 

The Department appreciates the opportunity to collaborate with the GAO 
in identifying areas within the administration of the Military and 
Veterans Disability Evaluation System for emphasis to better support 
our wounded, ill, or injured Service members as they recover and 
return to duty or prepare to leave military service. 

Each Military Department has processes and organizations in place to 
support the needs of Service members proceeding through the Disability 
Evaluation System. 

The Department concurs with the recommendations contained in the draft 
report except as noted. Specific comments are provided in the 
attachment to this letter. 

Sincerely, 

Signed by: 

John R. Campbell: 
Deputy Under Secretary of Defense: 
Wounded Warrior Care and Transition Policy: 

Attachments: As stated: 

[End of letter] 

GAO Draft Report Dated December 2010: 
GAO-11-69 (GAO Code 130971): 

"Military And Veterans Disability System: Pilot Has Achieved Some 
Goals, But Further Planning And Monitoring Needed" 

Department Of Defense Comments To The GAO Recommendations: 

Recommendation 1: 

To ensure that the IDES is sufficiently staffed and that military 
treatment facilities are prepared to house personnel in the IDES, we 
recommend that the Secretary of Defense direct the military services 
to conduct a thorough assessment prior to each site's implementation 
of the IDES of: 

* the adequacy of staffing of MEB physicians at military treatment 
facilities; contingency plans should be developed to address potential 
staffing shortfalls, for example, due to staff turnover or caseload 
surges. 

* the availability of housing for Service members in the IDES at 
military facilities; alternate housing options should be identified if 
sites do not have adequate capacity. 

* the capacity of organizational units to absorb Service members 
undergoing the disability evaluation; plans should be in place to 
ensure Service members are appropriately and constructively engaged. 

DoD Response: Concur with comments/clarification of staffing 
terminology. 

* The Department has concern about the use of the term "MEB 
physician." MEB physician is not a recognized clinical specialty. 
Physicians who participate in the disposition of a medical evaluation 
board may be considered a MEB physician. However, the only specialty 
provider required on a medical evaluation board is a psychiatrist, if 
the case involves a review of a mental condition. Thus, any assessment 
of adequacy of staffing should include availability of all qualified 
providers available to review cases as part of a medical evaluation 
board. The availability of psychiatrists should be assessed separately. 

* The Department concurs with requiring the military services to 
identify alternative housing options should more space for IDES 
participants be required. 

* The Department concurs with the draft recommendation that plans 
should be in place to ensure Service members are appropriately and 
constructively engaged. As noted in the draft report, the site 
assessment matrix does address plans for ensuring that IDES 
participants are gainfully employed by their organizational units; it 
is the units' responsibility to follow the requirements of the site 
assessment matrix. 

Recommendation 2: 

To improve their agencies' ability to resolve differences about 
diagnoses of Service members' conditions, and to determine whether 
their new guidance sufficiently addresses these disagreements, we 
recommend that the Secretaries of Defense and Veterans Affairs: 

* conduct a study to assess the prevalence and causes of such 
disagreements; and; 

* establish a mechanism to continuously monitor disagreements about 
diagnoses between MEB physicians and VA examiners and between PEBs and 
VA rating offices. 

DoD Response: Concur with comments. 

* The Department concurs with the draft GAO recommendation to conduct 
a study to assess the prevalence and cause of disagreements between 
the Military Department physicians and the Department of Veterans 
Affairs physicians. 

* The Department concurs with the draft GAO recommendation to 
establish a mechanism to continuously monitor disagreements about 
diagnoses between Military Department physicians and the Department of 
Veterans Affairs physicians. As noted in the comments to 
Recommendation 1, the term MEB physician is not a recognized clinical 
specialty, as such, the Department prefers Military Department 
physician. The Department and the Department of Veterans Affairs both 
consult the Veterans Affairs rating schedule when making a 
determination of disability. As the two departments study and monitor 
disagreements we will address and resolve many of the issues outlined 
in the report. 

Recommendation 3: 

To enable their agencies to take early action on problems at IDES 
sites post-implementation, we recommend that the Secretaries of 
Defense and Veterans Affairs develop a system-wide monitoring 
mechanism to identify challenges as they arise in all DoD and VA 
facilities and offices involved in the IDES. This system could include: 

* continuous collection and analysis of data on DoD and VA staffing 
levels, sufficiency of exam summaries, and diagnostic disagreements; 

* monitoring of available data on caseloads and case processing time 
by individual VA rating office and PEB; and; 

* a formal mechanism for agency officials at local DoD and VA 
facilities to communicate challenges and best practices to DoD and VA 
headquarters offices. 

DoD Response: Concur with comments. 

* The Department concurs with the draft GAO recommendation to 
continuously collect and analyze data on staffing levels, sufficiency 
of exam summaries, and diagnostic disagreements. 

* The Department concurs with the draft GAO recommendation to monitor 
available data on caseloads and case processing time by individual 
rating office and PEB. Currently, the Department tracks caseloads and 
processing times through the Veterans Tracking Application (VTA), and 
a weekly report is provided to stakeholders. The report allows for 
continuous monitoring and can be modified to enable tracking by rating 
office and PEB. 

* The Department concurs with the draft GAO recommendation for a 
formal mechanism for agency officials at local DoD and VA facilities 
to communicate challenges and best practices to DoD and VA 
headquarters offices. The Department and the Department of Veterans 
Affairs jointly participate in the Disability Advisory Council (DAC). 
One of the objectives of the DAC is to identify best practices, 
address inconsistencies in policy, address problems and issues in the 
administration of the IDES and to provide a forum for developing, 
planning and implementing future improvements. Military Department 
representatives on the DAC bring the challenges and best practices 
from local DoD and VA facilities to the DAC. The DAC is being re-
chartered as the Disability Evaluation System Benefits Executive 
Council Working Group under the auspices of the Benefits Executive 
Council (BEC). 

[End of section] 

Appendix IV: Comments from the Department of Veterans Affairs: 

Department Of Veterans Affairs: 
Washington DC 20420: 

November 18, 2010: 

Mr. Daniel Bertoni: 
Director, Education, Workforce, and Income Security Issues: 
U.S. Government Accountability Office: 
441 G Street, NW: 
Washington, DC 20548: 

Dear Mr. Bertoni: 

The Department of Veterans Affairs (VA) has reviewed the Government
Accountability Office's (GAO) draft report, Military And Veterans 
Disability System: Pilot has Achieved Some Goals but Further Planning 
and Monitoring Needed (GAO-11-69); VA generally agrees with GAO's 
conclusions, concurs with one GAO recommendation, and concurs in 
principle with one GAO recommendation. 

The enclosure specifically addresses each of GAO's recommendations and 
provides comments on the draft report. VA appreciates the opportunity 
to comment on your draft report. 

Sincerely, 

Signed by: 

John R. Gingrich: 
Chief of Staff: 

Enclosure: 

[End of letter] 

Enclosure: 

Department of Veterans Affairs (VA) Comments to Government 
Accountability Office (GAO) Draft Report: Military And Veterans 
Disability System: Pilot has Achieved Some Goals but Further Planning 
and Monitoring Needed (GAO-11-69). 

GAO Recommendation 2: To improve their agencies' ability to resolve 
differences about diagnoses of servicemembers' conditions, and to 
determine whether their new guidance sufficiently addresses these 
disagreements, we recommend that the Secretaries of Defense and 
Veterans Affairs: 

* Conduct a study to assess the prevalence and causes of such 
disagreements; and; 

* Establish a mechanism to continuously monitor disagreements about 
diagnoses between MEB physicians and VA examiners and between PEBs and 
VA rating offices. 

VA Comment: Concur in principle. The Department of Veterans Affairs 
(VA) will study the prevalence and causes of the variations. Based on 
the results of this study, a determination will be made, no later than 
July 1, 2011, as to what, if any, mechanisms need to be put in place. 

GAO Recommendation 3: To enable their agencies to take early action on 
problems at IDES sites post-implementation, we recommend that the 
Secretaries of Defense and Veterans Affairs develop a system-wide 
monitoring mechanism to identify challenges as they arise in all DOD 
and VA facilities and offices involved in the IDES. This system could 
include: 

* Continuous collection and analysis of data on DoD and VA staffing 
levels, sufficiency of exam summaries, and diagnostic disagreements; 

* Monitoring of available data on caseloads and case processing time 
by individual VA rating office and PEB; and; 

* A formal mechanism for agency officials at local DoD and VA 
facilities to communicate challenges and best practices to DoD and VA 
headquarters offices. 

VA Comment: Concur. The Veterans Health Administration (VHA) has 
developed a new, focused monitoring of performance that has been 
implemented at IDES sites. Additionally, VHA has introduced system 
changes and workload-recording practices that will make it 
significantly easier to more closely monitor DES examination 
activities, distinct from other compensation and pension examination 
workload. In particular, VHA will monitor requests (versus forecast), 
examination timeliness, examination insufficiencies, and examination 
termination reason (e.g., no-show, incorrect examination, etc.). VHA 
will additionally closely monitor the effect the IDES examination 
workload has on all C&P examination workload. This will be implemented 
by December 31, 2010. 

VA and DoD currently track caseloads and processing times through the 
Veterans Tracking Application (VTA), and a weekly report is provided 
to stakeholders. Through this reporting mechanism, workload will 
continue to be monitored and Military Services Coordinator staffing 
levels are adjusted as caseload fluctuates. VA Regional Offices are 
required to have a written contingency plan in place to address 
unexpected spikes or projected increases in caseload. 

Data specific to each Physical Evaluation Board and Rating Office are 
currently available in VTA; these data are monitored to identify trends 
and outliers. VA will establish additional workload controls that can 
be monitored through the VETSNET Operation Reports. The controls will 
be distinct for the preliminary rating and the final rating. This 
provides a system of management of the sub-phases and provides more 
accurate reporting on timeliness. By March 31, 2011, VTA will be 
enhanced with new functionality to identify cases that require 
additional development, including cases involving insufficient exam 
summaries. 

Collaborative processes are being established between VA and DoD that 
will facilitate the establishment of best practices for the expanded 
program. 

[End of section] 

Appendix V: GAO Contact and Staff Acknowledgments: 

GAO Contact: 

Daniel Bertoni, (202) 512-7215, bertonid@gao.gov: 

Staff Acknowledgments: 

Michele Grgich (Assistant Director), Yunsian Tai (Analyst-in-Charge), 
Jeremy Conley, and Greg Whitney made significant contributions to this 
report. Walter Vance and Vanessa Taylor provided assistance with 
research methodology and data analysis. Bonnie Anderson, Rebecca 
Beale, Mark Bird, Brenda Farrell, Valerie Melvin, Patricia Owens, and 
Randall Williamson provided subject matter expertise. Susan Bernstein 
and Kathleen van Gelder provided writing assistance. James Bennett 
provided graphics assistance. Roger Thomas provided legal counsel. 

[End of section] 

Footnotes: 

[1] These studies include Independent Review Group, Rebuilding the 
Trust: Report on Rehabilitative Care and Administrative Processes at 
Walter Reed Army Medical Center and National Naval Medical Center 
(Arlington, Va.: April 2007); Task Force Report to the President: 
Returning Global War on Terror Heroes (April 2007); President's 
Commission on Care for America's Returning Wounded Warriors, Serve, 
Support, Simplify (July 2007); Department of the Army, Office of the 
Inspector General, Report on the Army Physical Disability Evaluation 
System (Washington, D.C.: Mar. 6, 2007); GAO, Military Disability 
System: Increased Supports for Servicemembers and Better Pilot 
Planning Could Improve the Disability Evaluation Process, [hyperlink, 
http://www.gao.gov/products/GAO-08-1137] (Washington, D.C.: Sept. 24, 
2008). 

[2] Pub. L. No. 110-181, §§ 1601-1676, 122 Stat. 3, 430. 

[3] Pub. L. No. 110-181, § 1615(d), 122 Stat. 3, 447. We previously 
reported that DOD and VA have completed the requirements established 
in the NDAA 2008 for developing policy to improve the medical and 
physical disability evaluation of recovering servicemembers. GAO, 
Recovering Servicemembers: DOD and VA Have Jointly Developed the 
Majority of Required Policies but Challenges Remain, [hyperlink, 
http://www.gao.gov/products/GAO-09-728] (Washington, D.C.: July 8, 
2009). 

[4] The data we received for the legacy disability evaluation system 
is as of January 31, 2010. The data we received for the IDES pilot is 
as of February 28, 2010. The survey data we received on participants 
who went through the legacy system and those who went through the IDES 
pilot was as of February 28, 2010. 

[5] The IDES pilot sites we visited were: (1) Bayne Jones Army 
Community Hospital, Fort Polk, Louisiana; (2) David Grant Medical 
Center, Travis Air Force Base, California; (3) Dewitt Army Community 
Hospital, Fort Belvoir, Virginia; (4) Evans Army Community Hospital, 
Fort Carson, Colorado; (5) Naval Hospital Camp Lejeune, North 
Carolina; (6) Naval Hospital Camp Pendleton, California; (7) Naval 
Medical Center San Diego, California; (8) Walter Reed Army Medical 
Center, Washington, D.C.; (9) Winn Army Community Hospital, Fort 
Stewart, Georgia; and (10) Vance Air Force Base, Oklahoma. 

[6] Pub. L. No. 110-181, § 1615(d)(1), 122 Stat. 3. 

[7] A physician is required to identify a condition that may cause the 
member to fall below retention standards after the member has received 
the maximum benefit of medical care. 

[8] Servicemembers may receive monthly disability retirement benefits 
if they have at least 20 years of active duty or equivalent service, 
or if they have less than 20 years of active duty or equivalent 
service and a 30 percent or higher disability rating. Servicemembers 
may receive lump sum disability severance if they have fewer than 20 
years of active duty or equivalent service, and they have a 
compensable disability rated at 20 percent or lower. Servicemembers 
who separate from the military with a DOD disability rating of 30 
percent or higher receive health care benefits for life, regardless of 
their years of military service. Servicemembers may also be found to 
have an unstable disability, in which case they may be placed on the 
Temporary Disability Retired List. 

[9] For more detailed descriptions of the disability evaluation 
system, see GAO, Military Disability System: Improved Oversight Needed 
to Ensure Consistent and Timely Outcomes for Reserve and Active Duty 
Service Members, [hyperlink, http://www.gao.gov/products/GAO-06-362] 
(Washington, D.C.: Mar. 31, 2006) and [hyperlink, 
http://www.gao.gov/products/GAO-08-1137]. 

[10] The Army Reserve, the National Guard, the Air Force Reserve, the 
Air National Guard, the Navy Reserve, and the Marine Corps Reserve 
constitute DOD's reserve component. 

[11] DOD and VA also concluded in their interim report that disability 
ratings in IDES pilot cases have been higher than in legacy cases, and 
more servicemembers in the pilot were eligible for monthly disability 
benefits rather than lump sum severance, compared to the legacy, but 
they noted that these changes were primarily due to the enactment of 
NDAA 2008, rather than the IDES pilot. NDAA 2008 required DOD to apply 
VA's standards when rating disabilities. 

[12] During the table top exercise, a sample of complete legacy cases 
was used in a simulation exercise to test the relative merits of four 
pilot options. For further information on the table top exercise, see 
GAO, DOD and VA: Preliminary Observations on Efforts to Improve Care 
Management and Disability Evaluations for Servicemembers, [hyperlink, 
http://www.gao.gov/products/GAO-08-514T] (Washington, D.C.: Feb. 27, 
2008). 

[13] Weekly monitoring reports from February 2010 (the cutoff date for 
survey data analyzed for DOD and VA's interim report) and August 2010 
show lower satisfaction levels among Air Force servicemembers. 

[14] The weekly monitoring reports present cumulative case processing 
times, i.e., average case processing times for all cases completed as 
of that given week. 

[15] Our data reliability assessment included interviews regarding 
internal controls, electronic testing, and a trace-to-file process, 
where we matched a small number of randomly sampled case file dates 
against the dates that had been entered into the VTA. For the trace-to-
file process, the overall accuracy rate was 84 percent, and all but 
one date were 70 percent accurate or better and deemed sufficiently 
reliable for reporting purposes. See appendix I for details on our 
data reliability assessment. 

[16] DOD officials stated that, under the legacy system, the Navy, 
Marine Corps, and Air Force considered a case to be referred into the 
disability evaluation system when a physician documented the 
conditions that may render a servicemember unable to perform their 
duties. Under the IDES process, the servicemember is formally referred 
into the disability evaluation system before the documentation is 
prepared. 

[17] Reserve component Army cases took 389 days to reach the VA rating 
phase under the legacy process, compared with 285 days to deliver VA 
benefits under the pilot. Reserve component cases made up 48 percent 
of legacy cases and 23 percent of pilot cases. 

[18] As part of their analysis of costs, DOD estimated that costs of 
servicemembers' disability benefits will increase by approximately 
$960 million per year. However, they noted that these additional costs 
are due to requirements in the National Defense Authorization Act of 
2008 mandating the use of VA's rating standards in the disability 
evaluation system, which tend to result in higher benefits than the 
rating standards that DOD had previously used. DOD stated that these 
increased costs would be realized under the legacy system as well. 

[19] VHA estimated costs of about $33 million, but anticipates being 
reimbursed by DOD for about half of these costs through a cost-sharing 
agreement. 

[20] At Fort Stewart, a private-sector provider performs the single 
exams through a contract with VA. At Fort Carson and Fort Polk, the 
exams are conducted by VA medical staff. A VA contractor also conducts 
single exams at Camp Lejeune (NC), Camp Pendleton (CA), Fort Lewis 
(WA), Naval Hospital Bremerton (WA), and Vance Air Force Base (OK). 

[21] The 8 pilot sites that met the 45-day goal for completing single 
exams include 2 Air Force sites, 5 Army sites, and 1 Navy site that 
met the 45-day goal for servicemembers in both the Navy and Marine 
Corps. One additional site (Camp Pendleton) met the 45-day goal for 
Navy members but did not meet it for Marine Corps members. 

[22] These 8 sites include 2 Air Force sites, 3 Army sites, and 3 Navy 
sites that met the 35-day goal for both servicemembers in the Navy and 
Marine Corps. 

[23] A VA official said that these averages may not include all cases 
completed as of August 2010, due to system design issues with the VTA 
system. In our review of the VTA data as of February 2010, we found 
that in the approximately 1,100 cases that had completed the full IDES 
process up to that date, about 10 percent of the cases were missing 
the date of the VA rating determination. However, the VA official 
estimated that, as of October 2010, data may be missing for about one-
third of the 6,000 cases for which the VA rating offices have 
completed ratings. According to the VA official, VTA was updated in 
September 2010 to address these issues. 

[24] The rating offices are aligned with DOD's PEBs. The Baltimore 
rating office rates cases adjudicated by the Air Force PEB, Navy PEB, 
and the Army's PEB at Walter Reed Army Medical Center, Washington, 
D.C. The Seattle rating office rates cases adjudicated by the Army's 
PEBs at Fort Sam Houston, TX, and Fort Lewis, WA. 

[25] VA officials told us that they have recently engaged a contractor 
to perform exams for Fort Polk. 

[26] In prior work on the Army's predeployment medical assessment 
process, GAO recommended that the Army develop an enforcement 
mechanism to ensure that soldiers are properly referred to and 
complete the MEB prior to deployment, move forward with plans for an 
electronic processing system, and provide servicemembers and their 
families with an independent ombudsman. See GAO, Military Personnel: 
Army Needs to Better Enforce Requirements and Improve Record Keeping 
for Soldiers Whose Medical Conditions May Call for Significant Duty 
Limitations, [hyperlink, http://www.gao.gov/products/GAO-08-546] 
(Washington, D.C.: June 10, 2008). At the time of our review, these 
recommendations were still in process. 

[27] For reserve component servicemembers, the IDES operations manual 
states that the DOD board liaison should compile the complete medical 
records within 30 days of their referral to the IDES. 

[28] In our interviews, DOD officials also mentioned cases in which 
DOD's PEB disagreed about VA's rating for fibromyalgia and sleep apnea. 

[29] For example, VA would rate mental health conditions that cause 
occasional decrease in work efficiency at 30 percent, while it would 
rate conditions that cause deficiencies in most areas (such as work, 
school, family relations, judgment, thinking, or mood) at 70 percent. 
See 38 C.F.R. 4.125-4.130. 

[30] Some DOD and VA officials also indicated that diagnostic 
disagreements reflect a greater level of scrutiny on behalf of 
servicemembers. 

[31] In its comments on a draft of our report, VA informed us that VHA 
is starting discussions with DOD and The Joint Commission (a nonprofit 
organization that evaluates and accredits health care organizations) 
on streamlining certain processes, including simplifying the 
credentialing process. 

[32] The Air Force and Navy do not have comparable special medical 
units, although they have temporary medical hold units. For further 
information on the Army's Warrior Transition Units, see GAO, Army 
Health Care: Progress Made in Staffing and Monitoring Units that 
Provide Outpatient Case Management, but Additional Steps Needed, 
[hyperlink, http://www.gao.gov/products/GAO-09-357] (Washington, D.C.: 
Apr. 20, 2009). 

[33] Officials at Naval Medical Center San Diego were particularly 
concerned about the length of the process for recruits in basic 
training. Under the legacy system, there had been an expedited 
disability evaluation process for military recruits injured during 
basic training. At IDES pilot sites, recruits went through the longer 
IDES process. DOD is currently developing an expedited IDES process 
for recruits. 

[34] DOD and VA had originally planned for 34 sites to implement the 
IDES by the end of December 2010. However, the Army postponed 
implementation at 6 sites. 

[35] DOD's TRICARE Prime access standards are based on the maximum 
time a beneficiary should have to wait for an appointment, and the 
provider's distance from the beneficiary's residence. For example, the 
standard for routine care is an appointment within 7 calendar days, 
and a provider not more than 30 minutes' travel time from the 
beneficiary's residence. 

[36] We had originally requested files for 30 Army cases, 10 from each 
Army PEB. However, one PEB had only completed 4 IDES cases at that 
time, so they provided us with those 4 cases. 

[37] Specifically, we allowed for a 10 percent discrepancy in dates, 
which fluctuated depending on the length of the process phase. For 
example, for the Final Disposition date in the Transition phase, we 
allowed for a discrepancy of 5 days, which is 10 percent of the 45-day 
goal for that stage of the process, rounded up. 

[38] GAO, Standards for Internal Control in the Federal Government, 
[hyperlink, http://www.gao.gov/products/GAO/AIMD-00.21.3.1] 
(Washington, D.C.: November 1999); Dennis P. Slevin and Jeffrey K. 
Pinto, "Balancing Strategy and Tactics in Project Implementation," 
Sloan Management Review 33 (fall 1987). 

[End of section] 

GAO's Mission: 

The Government Accountability Office, the audit, evaluation and 
investigative arm of Congress, exists to support Congress in meeting 
its constitutional responsibilities and to help improve the performance 
and accountability of the federal government for the American people. 
GAO examines the use of public funds; evaluates federal programs and 
policies; and provides analyses, recommendations, and other assistance 
to help Congress make informed oversight, policy, and funding 
decisions. GAO's commitment to good government is reflected in its core 
values of accountability, integrity, and reliability. 

Obtaining Copies of GAO Reports and Testimony: 

The fastest and easiest way to obtain copies of GAO documents at no 
cost is through GAO's Web site [hyperlink, http://www.gao.gov]. Each 
weekday, GAO posts newly released reports, testimony, and 
correspondence on its Web site. To have GAO e-mail you a list of newly 
posted products every afternoon, go to [hyperlink, http://www.gao.gov] 
and select "E-mail Updates." 

Order by Phone: 

The price of each GAO publication reflects GAO’s actual cost of
production and distribution and depends on the number of pages in the
publication and whether the publication is printed in color or black and
white. Pricing and ordering information is posted on GAO’s Web site, 
[hyperlink, http://www.gao.gov/ordering.htm]. 

Place orders by calling (202) 512-6000, toll free (866) 801-7077, or
TDD (202) 512-2537. 

Orders may be paid for using American Express, Discover Card,
MasterCard, Visa, check, or money order. Call for additional 
information. 

To Report Fraud, Waste, and Abuse in Federal Programs: 

Contact: 

Web site: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm]: 
E-mail: fraudnet@gao.gov: 
Automated answering system: (800) 424-5454 or (202) 512-7470: 

Congressional Relations: 

Ralph Dawn, Managing Director, dawnr@gao.gov: 
(202) 512-4400: 
U.S. Government Accountability Office: 
441 G Street NW, Room 7125: 
Washington, D.C. 20548: 

Public Affairs: 

Chuck Young, Managing Director, youngc1@gao.gov: 
(202) 512-4800: 
U.S. Government Accountability Office: 
441 G Street NW, Room 7149: 
Washington, D.C. 20548: