This is the accessible text file for GAO report number GAO-04-873 
entitled 'Military Education: DOD Needs to Develop Performance Goals 
and Metrics for Advanced Distributed Learning in Professional Military 
Education' which was released on July 30, 2004.

This text file was formatted by the U.S. Government Accountability 
Office (GAO) to be accessible to users with visual impairments, as part 
of a longer term project to improve GAO products' accessibility. Every 
attempt has been made to maintain the structural and data integrity of 
the original printed product. Accessibility features, such as text 
descriptions of tables, consecutively numbered footnotes placed at the 
end of the file, and the text of agency comment letters, are provided 
but may not exactly duplicate the presentation or format of the printed 
version. The portable document format (PDF) file is an exact electronic 
replica of the printed version. We welcome your feedback. Please E-mail 
your comments regarding the contents or accessibility features of this 
document to Webmaster@gao.gov.

This is a work of the U.S. government and is not subject to copyright 
protection in the United States. It may be reproduced and distributed 
in its entirety without further permission from GAO. Because this work 
may contain copyrighted images or other material, permission from the 
copyright holder may be necessary if you wish to reproduce this 
material separately.

Report to the Ranking Minority Member, Committee on Armed Services, 
House of Representatives: 

July 2004: 

MILITARY EDUCATION: 

DOD Needs to Develop Performance Goals and Metrics for Advanced 
Distributed Learning in Professional Military Education: 

GAO-04-873: 

GAO Highlights: 

Highlights of GAO-04-873, a report to the Ranking Minority Member, 
Committee on Armed Services, House of Representatives

Why GAO Did This Study: 

As part of its transformation to prepare the armed forces to meet 
current and future challenges, the Department of Defense (DOD) is 
expanding its use of advanced distributed learning (ADL) techniques in 
senior- and intermediate-level officer professional military education 
(PME) (see the figure in the PDF version). ADL instruction does not 
require an
instructor’s presence, and it facilitates the use of varied learning 
management systems. To date, the application of ADL has been targeted 
to nonresident students. To determine whether DOD uses a systematic 
process for evaluating the results of ADL application, GAO was asked to 
examine DOD’s metrics for assessing program effectiveness, to compare 
DOD’s criteria for converting courses to ADL with those of 
private-sector institutions, and to identify the challenges to ADL 
implementation. 

What GAO Found: 

DOD does not have specific performance goals and metrics with which to 
assess ADL effectiveness in PME. Furthermore, although GAO and a 
private-sector organization have established frameworks for assessing 
the effectiveness of educational programs by focusing on metrics for 
learning outcomes—that is, the knowledge, skills, and abilities that 
students attain through learning activities—DOD’s oversight focuses 
instead on educational inputs such as facilities, student to faculty 
ratios, and student body composition. Since ADL is still a new and 
evolving tool, systematic evaluative processes have not yet been 
required. Without clear goals and an effective process for evaluating 
the results of ADL application, DOD cannot ensure that its program is 
achieving an appropriate return on investment and other goals.

The criteria for converting PME courses and curricula to ADL vary by 
school and by military service, are based on subjective choices as to 
which content is suited for online delivery, and are focused solely on 
nonresident programs. The private sector similarly lacks systematic 
criteria in its use of ADL. However, DOD’s implementation of ADL 
programs for PME compares favorably with private-sector institutions.

Cultural, technological, and resource challenges affect ADL 
implementation. For example, some military policies reflect a lower 
estimation of the value of nonresident PME, and many respondents to a 
survey of ADL students and alumni indicated that its quality and 
achievement of outcomes did not compare favorably, in their view, with 
those of resident education programs. The technological challenges of 
balancing computer access with network security, along with resource 
challenges of funding and increased burdens on limited administrative 
staff, are additional concerns.

[See PDF for figure]

[End of figure]

What GAO Recommends: 

GAO recommends that the Secretary of Defense promote (1) the 
development of specific performance effectiveness goals for ADL in PME 
schools and (2) the use of ADL technologies to capture data and 
establish metrics for learning outcomes. DOD partially concurred with 
the first 
recommendation and fully concurred with the second. DOD supports the 
use of specific effectiveness goals for PME, but believes such goals 
are not appropriate for any specific delivery method.

www.gao.gov/cgi-bin/getrpt?GAO-04-873.

To view the full product, including the scope and methodology, click on 
the link above. For more information, contact Neal P. Curtin at (757) 
552-8100 or curtinn@gao.gov.

[End of section]

Contents: 

Letter: 

Results in Brief: 

Background: 

DOD Does Not Have Specific Metrics for Assessing Performance Goals or 
Learning Outcomes: 

ADL Conversion Varied by School and by Service Based on Subjective 
Assessments of Content Suitability: 

Cultural, Technological, and Resource Barriers and Challenges Affect 
ADL Implementation in PME Programs: 

Conclusions: 

Recommendations for Executive Action: 

Agency Comments and Our Evaluation: 

Appendixes: 

Appendix I: Scope and Methodology: 

Appendix II: Methodology for Our Survey of Nonresident PME Students and 
Graduates: 

The Study Population: 

Developing the Survey: 

The Sample Design and Administration: 

Sampling Error: 

Nonsampling Error and Data Quality: 

Appendix III: Survey Responses: 

Introduction: 

Frame of Reference: 

Appendix IV: ADL Applications and Additional Features 
of Nonresident Programs: 

U.S. Army War College: 

Naval War College: 

Air Command and Staff College: 

Joint Forces Staff College: 

Additional Features of Nonresident PME Programs: 

Appendix V: Comments from the Department of Defense: 

GAO's Comment: 

Tables: 

Table 1: Enrollment Statistics for Resident and Nonresident Students 
for Each Senior- and Intermediate-Level PME School for the Academic Year 
2003-2004: 

Table 2: Disposition of Sample: 

Table 3: Levels-of-Learning Definitions: 

Abbreviations: 

ACSC: Air Command and Staff College: 

ADL: advanced distributed learning: 

DOD: Department of Defense: 

JFSC: Joint Forces Staff College: 

NWC: Naval War College: 

PME: professional military education: 

USAWC: U.S. Army War College: 

United States General Accounting Office:

Washington, DC 20548:

Letter July 30, 2004: 

The Honorable Ike Skelton: 
Ranking Minority Member: 
Committee on Armed Services: 
House of Representatives: 

Dear Mr. Skelton: 

As part of its transformation to prepare the armed forces to meet 
current and future challenges, the Department of Defense (DOD) is 
transforming the way it trains to favor more rapid and responsive 
deployment. A significant element of its training transformation 
strategy is the application of advanced distributed learning (ADL), a 
technique of instruction that does not require an instructor's 
presence; can use more than one form of media; and emphasizes the use 
of reusable content, networks, and learning management systems. DOD has 
been expanding its use of ADL in its program of professional military 
education (PME). PME provides military officers with a wide array of 
college-level academic courses in both resident and nonresident 
settings; to date, the application of ADL has been targeted to 
nonresident students. As a new tool, ADL is being examined to determine 
whether DOD is applying a systematic performance evaluation approach, 
particularly in light of the increased rate at which servicemembers are 
being deployed worldwide. Without clear goals and an effective process 
for evaluating the results of ADL application, DOD cannot ensure that 
its program is achieving an appropriate return on investment and other 
goals.

We were asked to review DOD's use of ADL in senior- and intermediate-
level officer PME, and specifically: 

1. to examine the metrics DOD uses to assess the effectiveness of ADL 
in PME,

2. to determine what processes and criteria DOD uses to select the 
courses or curricula it converts to ADL and how these criteria compare 
with those of other institutions in meeting ADL objectives in 
nonresident education, and: 

3. to identify what barriers and challenges exist for implementing ADL 
in PME.

We also reviewed and assessed the policies and guidance of several DOD 
offices responsible for providing oversight for PME activities. These 
offices included the Under Secretary of Defense for Personnel and 
Readiness and the Joint Staff's Joint Education Branch. We also studied 
experience in the private education sector and in other parts of the 
government in measuring the effectiveness of education programs. In 
addition, we surveyed 437 current students and graduates of senior- and 
intermediate-level PME programs to obtain their perspectives on their 
PME experience. Appendixes I and II describe our scope and methodology 
in more detail.

We conducted our review from March 2003 through June 2004 in accordance 
with generally accepted government auditing standards.

Results in Brief: 

DOD does not have specific performance goals or metrics with which to 
assess the effectiveness of ADL in PME, and its oversight activities 
focus on educational inputs rather than on learning outcomes. While DOD 
believes ADL has had a positive impact, its views are based on 
anecdotal information; clear goals and an effective process for 
evaluating results of ADL implementation are absent. Although numerous 
organizations have roles in providing oversight of PME activities, with 
several specifically responsible for ensuring that PME meets general 
standards of accreditation, DOD accreditation activities, like those in 
the private sector, focus primarily on educational process inputs--for 
example, facilities or student to faculty ratios. But we and a private-
sector organization have established guidelines and frameworks for 
assessing the effectiveness of educational programs that stress a focus 
on measurable outcomes--that is, the knowledge, skills, and abilities a 
student acquires from a course. Furthermore, ADL has a unique ability 
to capture, retain, store, and document interactions in an online 
environment, thus providing the opportunity to demonstrate student 
skill improvements and to customize performance metrics. However, we 
found no evidence to indicate that DOD is using this ability.

The processes for converting PME courses and curricula to ADL vary 
by school and by military service, and they feature a mixture of 
in-house and contractor approaches. PME schools generally focus their 
ADL applications on nonresident education programs, and they tend to 
convert an entire curriculum as a package rather than in a modular, 
course-by-course manner. No systematic criteria inform PME schools' 
decisions about which courses or curricula to convert to ADL. Instead, 
schools make individual, subjective choices as to which content is best 
suited for online rather than another delivery method. Notably, we 
found that nonmilitary educational institutions also lack systematic 
criteria when converting courses or curricula to ADL. DOD's approaches 
are in fact consistent with mainstream practice, and in some cases, 
compare favorably with the best implementations.

Numerous cultural, technological, and resource challenges affect ADL 
implementation in PME programs, some of which may affect ADL expansion or 
maintenance. Cultural issues include concerns by PME school officials 
about ADL's acceptance as an appropriate learning method and the 
appropriate extent of its use for nonresident education. In our survey, 
nonresident students expressed concerns about the quality of their 
courses, regardless of nonresident delivery method, as compared with 
those taken in residence. Technological challenges, particularly those 
concerning the optimal balance between student access (computer 
availability and freedom of information) and network security 
(protection of sensitive information and use of military installation 
firewalls), remain to be addressed. With respect to resources, there 
are concerns about ADL's ability to compete for limited funding and 
about the potentially burdensome administrative impact on nonresident 
program staff.

To better assess the effectiveness of ADL in professional military 
education, we recommend that DOD promote (1) the development of 
specific performance effectiveness goals for ADL in PME schools and 
(2) the use of ADL technologies to capture data and establish metrics 
for learning outcomes.

In commenting on a draft of this report, DOD partially concurred with 
our first recommendation and fully concurred with the second. DOD 
supports the use of specific effectiveness goals for PME, but believes 
such goals are not appropriate for any specific delivery method. DOD 
stated that current accreditation practices are already promoting the 
data collection capabilities of ADL technologies for assessing multiple 
delivery methods.

Background: 

Each military service has separate PME schools for senior- and 
intermediate-level officers. As defined by the Joint Staff's Officer 
Professional Military Education Policy,[Footnote 1] the senior-level 
schools, typically for O-5 and O-6 ranked officers, focus on 
warfighting within the context of strategy. The intermediate-level 
schools, typically for O-4 ranked officers, focus on warfighting within 
the context of operations.[Footnote 2] (See table 1 for a list of PME 
schools and enrollment totals.) 

The senior- and intermediate-level PME schools are not alike in terms of 
program offerings for resident and nonresident students. As indicated 
in table 1, while all senior-level PME schools offer resident programs, 
only the Army War College and the Air War College have analogous 
nonresident programs. Also as indicated in table 1, all intermediate-
level PME schools offer resident and nonresident programs.

DOD has approximately 39,318 students enrolled in its senior- and 
intermediate-level PME schools. The vast majority of these enrollees 
are nonresident students. Of the total enrolled, approximately 3,788, 
or 10 percent, are taking course work as resident students; the rest, 
or 90 percent, are nonresident enrollees.

Table 1: Enrollment Statistics for Resident and Nonresident Students 
for Each Senior- and Intermediate-Level PME School for the Academic Year 
2003-2004: 

PME institutions: Joint Senior-Level Schools: National War College; 
Resident students: 200; 
Nonresident students: N/A.

PME institutions: Joint Senior-Level Schools: Industrial College of the 
Armed Forces; 
Resident students: 309; 
Nonresident students: N/A.

PME institutions: Joint Combined-Level School: Joint Forces Staff 
College; 
Resident students: 229; 
Nonresident students: N/A.

PME institutions: Senior-Level Schools: Air War College; 
Resident students: 265; 
Nonresident students: 6,100.

PME institutions: Senior-Level Schools: Army War College; 
Resident students: 340; 
Nonresident students: 654.

PME institutions: Senior-Level Schools: College of Naval Warfare-Naval 
War College; 
Resident students: 209; 
Nonresident students: N/A.

PME institutions: Senior-Level Schools: Marine Corps War College; 
Resident students: 16; 
Nonresident students: N/A.

PME institutions: Intermediate-Level Schools: Air Command and Staff 
College; 
Resident students: 587; 
Nonresident students: 12,069.

PME institutions: Intermediate-Level Schools: Army Command and General 
Staff College; 
Resident students: 1,183; 
Nonresident students: 10,000[A].

PME institutions: Intermediate-Level Schools: College of Naval Command 
and Staff-Naval War College; 
Resident students: 256; 
Nonresident students: 1,799[B].

PME institutions: Intermediate-Level Schools: Marine Corps Command and 
Staff College; 
Resident students: 194; 
Nonresident students: [C].

PME institutions: Intermediate-Level Schools: Marine Corps College of 
Continuing Education; 
Resident students: [C]; 
Nonresident students: 4,908.

Total; 
Resident students: 3,788; 
Nonresident students: 35,530. 

Source: DOD.

Note: N/A = school without a nonresident component.

[A] According to Army Command and General Staff College officials, the 
nonresident student total fluctuates and could be plus or minus 2,000 
students on any given day.

[B] Naval War College's nonresident programs are offered at the 
intermediate level (equivalent to the Naval Command and Staff College) 
through its College of Distance Education.

[C] The Marine Corps College of Continuing Education is the nonresident 
component of the Marine Corps Command and Staff College.

[End of table]
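
As an arithmetic check, the roughly 10 percent resident and 90 percent 
nonresident shares cited above can be recomputed from table 1. The 
following minimal Python sketch is illustrative only; it simply 
transcribes the table's enrollment counts: 

# Recompute DOD-wide enrollment shares from the table 1 figures.
resident = [200, 309, 229, 265, 340, 209, 16, 587, 1183, 256, 194]
nonresident = [6100, 654, 12069, 10000, 1799, 4908]

total_resident = sum(resident)                # 3,788
total_nonresident = sum(nonresident)          # 35,530
total = total_resident + total_nonresident    # 39,318

print(f"Resident: {total_resident} ({total_resident / total:.0%})")
print(f"Nonresident: {total_nonresident} ({total_nonresident / total:.0%})")

Running the sketch reproduces the totals of 3,788 resident and 35,530 
nonresident students, or about 10 and 90 percent of the 39,318 total.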

Nonresident PME exists to serve a larger population than resident 
institutions can support. It provides alternative learning-style options 
for officers not selected for residence or unable to participate in 
residence due to operational commitments. The military services have had 
nonresident PME programs 
for many years. The Naval War College (NWC) has had a department for 
correspondence courses since 1914. The U.S. Army War College (USAWC) 
has provided a nonresident course offering since 1968. The Air Force's 
nonresident programs were created in 1947 for its senior-level PME 
school and 1948 for its intermediate-level PME school.

Paper-based correspondence is the traditional nonresident PME delivery 
mode. Students complete correspondence courses individually with 
limited faculty contact. Course materials and submissions are exchanged 
between students and faculty primarily by mail. PME schools have 
implemented other delivery modes, including seminars conducted at 
remote sites by PME faculty and CD-ROM distribution. Increasingly, PME 
schools are using ADL[Footnote 3] techniques in their nonresident 
program offerings.

Several ADL applications are currently in use at senior- and 
intermediate-level PME schools, and all of them are focused on 
nonresident programs. They are offered at the U.S. Army War College, 
the Naval War College, and the Air Command and Staff College (ACSC). A 
planned ADL offering for reserve component staff is under development 
at the Joint Forces Staff College. See appendix IV for details on these 
programs. The addition of an ADL application for the Army Command and 
General Staff College nonresident PME course is anticipated for fiscal 
year 2005.

DOD Does Not Have Specific Metrics for Assessing Performance Goals or 
Learning Outcomes: 

DOD does not have specific performance goals and metrics to assess the 
effectiveness of ADL in PME. While DOD believes ADL has had a positive 
impact, its views are based on anecdotal information, rather than a 
systematic performance measurement. Thus, DOD cannot determine whether 
ADL is meeting performance goals in comparison to other delivery 
methods. Although numerous organizations are providing oversight of PME 
activities, with several specifically responsible for ensuring that PME 
meets general standards of accreditation, these organizations do not 
focus on student learning outcomes--that is, the knowledge, skills, and 
abilities a student acquires from a course. Instead, DOD accreditation 
activities, like those in the private sector, focus primarily on 
educational process inputs, such as quality of facilities and student to 
faculty ratios. We and a private-sector organization have recently 
established guidelines and frameworks for assessing the effectiveness 
of educational programs that stress a focus on measurable outcomes. ADL 
is a new and evolving tool for which systematic evaluation requirements 
have not been established. ADL has a unique ability to capture, retain, 
store, and document interactions in an online environment, which 
provides the opportunity to demonstrate student skill improvements, and 
thus to customize performance metrics. However, we have found no 
evidence to indicate that DOD is utilizing this ability.

Numerous Organizations Have Roles in Providing Oversight of PME 
Activities: 

Numerous oversight organizations review PME activities, with several 
organizations specifically designed to ensure that PME conforms to 
general standards of accreditation. The preeminent mechanism for 
oversight is the Joint Chiefs of Staff's Process for Accreditation of 
Joint Education. The process is designed to provide oversight and 
assessment of PME institutions for purposes of strengthening and 
sustaining Joint Professional Military Education.[Footnote 4] It is a 
peer-review process involving a self-study component and a team 
assessment. The review sequence includes certification, accreditation, 
and reaffirmation of accreditation status. Accreditation can currently 
be granted for up to 5 years, and all PME programs with current ADL 
applications are Joint Staff-accredited. The Joint Staff also sponsors 
the Military Education Coordinating Council, an advisory body composed 
of high-ranking PME leadership. The purpose of the Council is to 
address key issues of interest for joint military education, to promote 
cooperation and collaboration among member institutions, and to 
coordinate joint education initiatives.

The military services have responsibility for the service PME 
institutions in terms of managing PME content and quality and 
conducting all levels within the guidelines of the military educational 
framework. Consistent with Title 10 of the United States Code, the 
Secretary of Defense requires that each PME institution periodically 
review and revise curriculum to strengthen focus on joint matters and 
on preparing officers for joint duty assignments.

PME is also reviewed by other internal and external organizations. Each 
PME institution has a Board of Visitors/Advisors that provides guidance 
over PME activities. The Board of Visitors/Advisors is composed of 
military and/or civilian academic officials who are nominated by PME 
schools and appointed by service secretaries to provide advice on 
educational and institutional issues. Service PME institutions have 
other internal and external advisory committees that perform activities 
such as providing advice, communicating feedback from major commands, 
and conducting curriculum review. Service Inspector General offices 
have conducted periodic reports and assessments on PME schools. The 
military services' education and training commands also provide 
oversight of PME activities, though not day-to-day administration. 
Additionally, private-sector regional accreditation agencies assess 
senior- and intermediate-level PME programs. Their accrediting 
activities generally guide the Joint Staff's review process.

Performance-Effectiveness Metrics for ADL Implementation Are Lacking: 

PME schools have not established, and oversight organizations have not 
reviewed, specific goals or metrics of performance effectiveness for 
ADL implementation. As was stated in our recently issued guide for 
establishing a framework for assessing training and development efforts 
in the federal government, "it is increasingly important for agencies 
to be able to evaluate training and development programs and 
demonstrate how these efforts help develop employees and improve the 
agencies' performance."[Footnote 5] The Sloan Consortium--a private-
sector organization that maintains a repository of information on 
distance education--views metrics as crucial for assessing program 
effectiveness. For example, metrics can (1) demonstrate that the 
"learning effectiveness" of nonresident education is at least as good 
as that of its resident counterpart, (2) identify cost comparisons that 
can be used to develop better strategic plans, and (3) provide 
information on student retention and completion rates. As was stated in 
our report on oversight for the military academies, such elements 
embody the principles of effective management, in which achievements 
are tracked in comparison with plans, goals, and objectives, and the 
differences between actual performance and planned results are 
analyzed.[Footnote 6]
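
For illustration, the following minimal Python sketch shows one way that 
metrics of the second and third kinds, cost comparisons and completion 
rates, might be computed. All cohort names and figures below are 
hypothetical and are not DOD data: 

# Illustrative only: completion-rate and cost-per-completion metrics of
# the kind the Sloan Consortium describes. Figures are hypothetical.

def completion_rate(enrolled, completed):
    return completed / enrolled

def cost_per_completion(program_cost, completed):
    return program_cost / completed

cohorts = {
    "resident":        {"enrolled": 300,  "completed": 285,  "cost": 4_500_000},
    "ADL nonresident": {"enrolled": 6000, "completed": 4200, "cost": 9_000_000},
}

for name, c in cohorts.items():
    rate = completion_rate(c["enrolled"], c["completed"])
    unit_cost = cost_per_completion(c["cost"], c["completed"])
    print(f"{name}: completion {rate:.0%}, cost per completion ${unit_cost:,.0f}")

Computed side by side for resident and nonresident cohorts, such 
measures would give decision makers the comparative evidence this report 
finds lacking.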

PME schools identified advantages of ADL over other means of delivery, 
but the advantages appeared to be anecdotally derived. PME school 
officials stated that ADL has resulted in quality improvements in PME 
delivery, especially when compared with paper-based correspondence. 
These advantages include (1) better facilitation of student and faculty 
interaction; (2) increased flexibility in modifying course material; 
(3) reductions in time required to complete programs; (4) better 
leveraging of resources for administrative support; and 
(5) establishment of learning management systems that monitor student 
progress and produce management reports. But there was no indication 
that evidence for these advantages was based on an evaluative effort to 
compare ADL with paper-based correspondence courses. Since PME schools 
have not detailed a comprehensive process 
for evaluating ADL benefits over paper-based correspondence, it cannot 
be determined whether ADL is meeting performance goals based on 
appropriate returns on investment, student retention, student access to 
courses, or other goals that schools use to measure program 
effectiveness.

Additionally, we did not observe any oversight agency focus on specific 
metrics of ADL effectiveness. According to Joint Staff officials, they 
perform reviews of nonresident programs as part of their accreditation 
activities. However, their reports focus on the nonresident program as 
a whole and not on particular methods of delivery. ADL is a new and 
evolving tool, and a systematic assessment of these applications has 
not yet been required. The three regional accreditation agencies that 
review PME schools with ADL implementations vary in their nonresident 
program evaluation policies.[Footnote 7] One agency stated 
that nonresident programs are not separately evaluated, although the 
programs may be included within the scope of the institution's existing 
accreditation. Another agency stated that additional procedures must be 
performed before nonresident programs are included within the scope of 
the institution's accreditation. The third agency required schools to 
evaluate their nonresident programs to ensure comparability to resident 
programs. In addition, we have not observed any Office of the Secretary 
of Defense or Board of Visitors/Advisors reviews in relation to ADL 
effectiveness for nonresident PME.

While we did not observe measures of effectiveness specifically geared 
toward ADL applications, PME schools with ADL applications did perform 
program effectiveness assessments for nonresident education by way of 
student satisfaction assessments as part of the Joint Staff 
accreditation process. These assessments used in-course student 
surveys, graduate surveys, and supervisory surveys to obtain feedback 
as part of a systematic approach to instructional design and to update 
and improve curriculum offerings.

* USAWC performs surveys of students, alumni, and general officers with 
USAWC graduates in their commands. Students are surveyed for each 
course regarding particular aspects of the course and general degrees 
of satisfaction. A survey of alumni is conducted every 2 years. A 
general officer survey, designed to assess general officer impressions 
of alumni, will now be conducted annually instead of every 3 years, as 
in the past. Prior feedback from general officer surveys reported that 
the curriculum should emphasize application of strategic thinking to 
national security issues. USAWC also performs internal course 
evaluations as part of its curriculum assessment process. USAWC faculty 
members are required to undergo training to provide a degree of 
standardization in instruction and evaluation. This standardization, 
especially for evaluation, is more stringent for nonresident education. 
USAWC can conduct trend analyses for student performance and student 
satisfaction to determine statistical significance.

* NWC uses student and alumni surveys to assess the academic program's 
effectiveness. Depending on the department, student assessments include 
daily session critiques, lecture critiques, end-of-course critiques, 
major exercise critiques, and exam critiques. Alumni are sent 
questionnaires 2 years after graduation asking for feedback on their 
educational experience. All academic departments conduct an extensive 
analysis of various student surveys to determine areas of the 
curriculum that are not meeting student needs so that these areas can 
be improved. Surveys are based on standards promulgated by accrediting 
agencies and external organizations to help objectively measure 
institutional excellence. Resident and nonresident student programs are 
measured in the same way, since a single faculty is responsible for 
both. Peer 
evaluation of faculty members is used to sustain teaching method 
quality.

* ACSC uses internal and external evaluations at all phases of its 
curriculum development process. It conducts end-of-course surveys that 
focus on delivery and educational support and end-of-year surveys for 
students to provide feedback about whether they believed the school 
(1) prepared them to lead commands, (2) accomplished its mission, 
(3) was institutionally effective, and (4) was beneficial to 
professional development. Surveys are also given to graduates and 
graduate supervisors to obtain perspectives on whether the 
school (1) accomplished its mission and was institutionally effective; 
(2) enhanced graduates' ability to think operationally and critically; 
(3) prepared graduates to assume leadership duties; and (4) made the 
experience valuable to professional development.

Metrics for Learning Outcomes Are Lacking: 

Student learning outcomes, as stated by the Council for Higher 
Education Accreditation--a national association representing 
accrediting organizations--are "properly defined in terms of the 
knowledge, skills, and abilities that a student has attained at the 
end (or as a result) of his or her engagement in a particular set of 
higher education experiences."[Footnote 8] PME schools generally are 
not assessed for student learning outcomes as a means of determining 
program effectiveness. The Joint Staff's accreditation organization 
responsible for assessing PME schools has primarily focused on inputs 
to the educational process. As detailed in its policy, its educational 
standard assessment and self-study requirements focus on internal 
aspects such as organizational structure, facilities, curricula, 
student to faculty ratios, student body composition/mix, and faculty 
qualifications. However, as stated in our recently published guide for 
assessing training and development programs, the focus on evaluating 
activities and processes takes away from evaluating training and 
development's contribution to improved performance, reduced costs, or 
greater capacity to meet new and emerging transformation 
challenges.[Footnote 9] The Joint Staff has identified the usefulness 
of student learning outcomes and is currently in the process of 
developing student learning outcomes for PME and procedures to include 
them in the accreditation process.

Our recently published report on distance education states that there 
is increased interest in using outcomes more extensively as a means of 
ensuring quality in all forms of education, including nonresident 
education.[Footnote 10] The Council for Higher Education Accreditation 
has issued guidelines on nonresident education and campus-based 
programs that call for greater attention to student learning outcomes, 
and the congressionally appointed Web-based Education 
Commission[Footnote 11] has also called for greater attention to 
student outcomes. The Commission said that a primary concern related to 
program accreditation is that "quality assurance has too often measured 
educational inputs (e.g., number of books in the library, etc.) rather 
than student outcomes."

Private-sector educational institutions are just beginning to emphasize 
the evaluation of learning outcomes as a viable measure of program 
effectiveness. For example, the University of Maryland University 
College, a school with a large distance education program that serves a 
large number of military personnel, is piloting a 
project to identify and measure learning outcomes in five general 
areas--writing efficiency and oral communication, technology fluency, 
information literacy, quantitative literacy, and scientific literacy. 
The university will use knowledge captured by its distance education 
database to serve as a basis for this determination.

Accrediting agencies and our recent report on training and development 
program assessments are also emphasizing the evaluation of learning 
outcomes as a measure of program effectiveness. Some of the regional 
agencies that accredit programs at the senior- and intermediate-level 
PME schools generally recognize the importance of student learning 
outcomes and have instituted practices that reflect some aspects of a 
systematic, outcome-based approach called for in 
[Hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-04-279].[Footnote 12] 
However, these agencies vary in the extent to which standards and 
policies address student learning outcomes for distance education. Our 
training and development assessment guide states that agencies need 
credible information on how training and development programs affect 
organizational performance, and that decision makers will likely want 
to compare the performance of these programs with that of other 
programs. Furthermore, programs lacking outcome metrics will be unable 
to demonstrate how they contribute to results.

We surveyed nonresident PME current students and graduates to obtain 
their perspectives on the achievement of PME learning objectives and 
PME's impact on their career objectives. (See appendix III for 
presentation of survey results.) Because we only surveyed nonresident 
students, we could not compare the results with those of resident 
students. However, we believe the data can be useful for DOD to 
consider in its continuing study of program effectiveness.

ADL Can Be Used to Capture Valuable Measurement Data: 

ADL has a unique ability to capture, retain, store, and document 
interactions in an online environment, which provides the opportunity 
to demonstrate student skill improvements, and thus to customize 
performance metrics. Since work is done on a computer, various data 
points are automatically collected as a student works, including the 
time spent, specific pages of the text visited, use of online help, and 
communication with others. University of Maryland University College 
officials pointed out ADL's unique ability when compared with other 
delivery methods to retain, capture, store, and document baseline data 
that can be used as the basis for performance metrics. These officials 
said they would use such data in designing performance measures for 
learning outcomes. However, we found no evidence to indicate that DOD 
is using this ability. DOD may be missing an opportunity to enhance its 
ability to measure effectiveness.
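
For illustration, the minimal Python sketch below shows the kind of 
interaction log an ADL learning management system could capture 
automatically and a per-student roll-up that could anchor outcome 
metrics. All event kinds, field names, and sample entries are 
hypothetical and are not drawn from any DOD or university system: 

# Illustrative only: raw ADL interaction events summarized per student.
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Event:
    student: str
    kind: str     # "page_view", "help_lookup", or "post"
    detail: str   # page id, help topic, or discussion thread
    seconds: int  # time spent on the interaction

def summarize(events):
    """Roll raw interaction events up into per-student activity measures."""
    totals = defaultdict(lambda: {"time": 0, "pages": set(), "help": 0, "posts": 0})
    for e in events:
        t = totals[e.student]
        t["time"] += e.seconds
        if e.kind == "page_view":
            t["pages"].add(e.detail)
        elif e.kind == "help_lookup":
            t["help"] += 1
        elif e.kind == "post":
            t["posts"] += 1
    return totals

log = [
    Event("officer_a", "page_view", "lesson-3.2", 540),
    Event("officer_a", "help_lookup", "citation-format", 60),
    Event("officer_a", "post", "strategy-thread", 300),
]
for student, t in summarize(log).items():
    print(student, t["time"], len(t["pages"]), t["help"], t["posts"])

Measures of this sort correspond directly to the data points noted 
above: time spent, pages visited, use of online help, and communication 
with others.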

ADL Conversion Varied by School and by Service Based on Subjective 
Assessments of Content Suitability: 

The processes for converting PME courses to ADL varied by school and by 
military service, and they feature a mixture of in-house and contractor 
approaches. PME schools generally focus their ADL applications on 
nonresident programs to improve their efficiency and effectiveness. In 
most cases, conversion decisions were made in collaboration with school 
academic boards. PME schools did not identify any systematic criteria 
as the basis for their decisions as to which courses or curricula to 
convert to ADL. They subjectively focused ADL conversion on the 
suitability of content for Web-based applications. Curriculum 
conversions were made because of a DOD-wide need (1) to improve access 
to a diverse officer corps and (2) to increase the efficiency of 
educational delivery. Since nonmilitary educational institutions also 
lack systematic criteria for converting courses or curricula to ADL for 
nonresident education, DOD's approaches are in fact consistent with 
mainstream practice, and in some cases, compare favorably with best 
practices in nonmilitary education.

DOD Processes and Criteria for ADL Conversion Varied: 

The processes for converting PME courses to ADL varied by school and by 
military service, and they feature a mixture of in-house and contractor 
approaches. However, the conversions were focused on the schools' 
entire nonresident programs. USAWC's and ACSC's ADL applications were 
developed and managed by in-house staff and faculty. USAWC and ACSC 
used staff consisting of instructional designers and courseware 
developers interacting with respective faculty to develop courses. 
NWC's ADL application combined the use of contractor and in-house 
support. Contractor staff created Web-based applications for two of the 
three courses in NWC's curriculum. NWC officials learned enough from 
contractor efforts to create a Web-based application for the remaining 
course with in-house staff. In all cases, the ADL applications were 
applied to the entire nonresident curriculum and, in most cases, 
were preceded by reviews and final decisions from the schools' academic 
boards.

The PME schools did not identify any systematic criteria that inform 
their selection of courses for conversion to ADL. Rather, they made 
subjective decisions as to the appropriate parts of the curriculum that 
should be delivered online based on content suitability. While USAWC's 
application is delivered mostly through the Internet, print media 
delivers a portion of the course content as well. USAWC's application 
also includes two 2-week resident components in which students are 
brought together to achieve learning objectives best suited to resident 
instruction. These objectives include verbal communication, 
interaction in live settings, interpersonal skills used in direct 
relationships, and other skills that are important components of the 
resident experience. USAWC's application also uses asynchronous 
"threaded discussions," in which faculty members initiate online 
discussions with students on various academic topics. NWC's and ACSC's 
applications, which do not include resident components, blend content 
delivery using print media, CD-ROM, and the Internet. In NWC's 
application, print media is used for material that is mostly text and 
requires limited interactive capabilities; CD-ROMs are delivered to 
students for material that is not routinely updated; and students are 
assigned to cohort teams that allow online interactive opportunities 
and group discussions. In ACSC's application, almost all nonresident 
materials are provided to students on CD-ROMs as well as on the 
Internet to allow as much flexibility as possible to complete 
nonresident courses.

Generally, PME school officials stated that ADL conversions were made 
because of nonresident PME's need (1) to respond to multiple learning 
styles in a diverse officer corps, (2) to increase the efficiency of 
educational delivery, and (3) to improve the quality of the educational 
offering. Additionally, NWC saw ADL as a means of potentially affecting 
retention and increasing enrollment. USAWC saw ADL as a means of 
responding to student demands for a more efficient and relevant 
educational experience. ACSC saw ADL as an improved means for 
delivering course material to increasing numbers of deployed officers.

Nonmilitary ADL Conversion Decisions Were Similar: 

In nonmilitary applications of ADL, we observed processes and criteria 
for conversion decisions that were similar to DOD's. Course conversions 
followed analogous processes, with conversion decisions driven by 
unsystematic factors such as interest and student enrollment. 
University of Maryland University College officials stated that their 
conversion process includes a dedicated staff of instructional 
designers and subject matter experts (usually faculty) who convert 
courses to distance-learning content within an established framework to 
ensure that standards are maintained. Their criteria for conversion 
focus on high student demand and high levels of interest, and on course 
work that requires less "hands-on" training, such as business and 
information technology courses.
practices supports the observation that the lack of systematic criteria 
is consistent with mainstream practice in ADL adoption for nonresident 
education.

DOD's approaches for course conversion are thus consistent with 
mainstream practice, and in some cases, compare favorably with the best 
practices in nonmilitary education. For example, NWC's ADL application 
received the Crystal Award in 2002 from the Association for Educational 
Communications and Technology based on "innovative and creative use of 
the medium, instructional value and relevance, soundness of 
instructional strategy, quality of production, and evidence of 
successful outcomes." As of June 2004, USAWC's nonresident education 
program is fully accredited by the Middle States Commission on Higher 
Education for the awarding of Master of Strategic Studies degrees. 
USAWC is the first military institution to achieve degree-granting 
authority for its nonresident ADL-based program.

Cultural, Technological, and Resource Barriers and Challenges Affect 
ADL Implementation in PME Programs: 

PME schools identified a number of cultural, technological, and 
resource challenges that affect ADL implementation and may affect 
future maintenance or expansion of ADL efforts. Cultural issues such as 
the appropriate extent of ADL incorporation, general perceptions about 
nonresident education, and limited ADL research in military education 
affect the degree of ADL implementation. Technological trade-offs and 
nonresident program resourcing also affect continued ADL efforts.

Cultural: 

PME officials question the appropriate extent to which ADL should be 
used in nonresident education and how closely it can, or should, enable 
nonresident education to approximate resident education. It is 
generally recognized that resident programs are better at providing 
acculturation,[Footnote 13] interactive skills, and simulations that 
are critical for professional development of officers, and that there 
are challenges in providing such aspects in nonresident education. But 
some observers believe that nonresident education should not be 
compared to resident education and that nonresident education 
represents a vital broadening experience in its own right. In addition, 
there are indications that an ADL approach could significantly enrich 
nonresident content by excluding teaching methods that do not work in 
residence, allowing students the flexibility to focus on material that 
requires further study without class disruption, and serving as the 
basis for applications that can be used to upgrade teaching methods in 
resident programs.

ADL implementation could also be affected by certain negative 
perceptions concerning nonresident education that are held by students, 
and in some cases, reflected in policy. Our survey of nonresident PME 
current students and recent graduates indicated that about 49 percent 
of current students and 48 percent of graduates believe they are not as 
well prepared as are their resident student counterparts, regardless of 
nonresident delivery method.[Footnote 14] While not universal across 
the services, we observed instances of military education policies that 
reinforce the perception that nonresident education is not as desirable 
as resident education. The Air Force's PME Instruction AFI36-2301 
states that "ideally, all officers will attend PME in residence," and 
that limited resources restrict resident attendance to the "best 
qualified." Furthermore, "completing nonresident PME programs will not 
affect eligibility for resident PME programs." Indeed, we were told of 
instances where officers, after completing the nonresident PME program, 
subsequently enrolled in the corresponding senior- or intermediate-level 
course in residence.

The extent of ADL implementation in nonresident education is affected 
by the role that PME completion plays in promotional consideration. 
Programs maintained to foster promotional consideration--that is, 
"personnel-oriented"--might not be compatible with programs 
emphasizing the learning outcomes brought about by ADL--that is, 
"education-oriented." Our survey shows that promotional 
considerations, rather than learning outcomes, are the focus for 
students in nonresident education. An estimated 73 percent of current 
students and 84 percent of recent graduates listed as their predominant 
reason for participating in nonresident PME a desire to improve their 
promotional chances or to meet promotional requirements. In addition, 
of an estimated 22 percent of PME graduates who were promoted to a 
higher rank after PME completion, an estimated 88 percent stated that 
PME contributed to a "great" or "very great extent" in their 
promotions.[Footnote 15] But ADL adoption goals should focus more on 
learning outcomes than on simply "checking the boxes" for promotional 
opportunity enhancement.

PME officials state that there are concerns that ADL advantages could 
be oversold to the point that ADL may be used to supersede resident 
programs and shift the burden of PME administration. DOD officials, 
already viewing ADL delivery methods as attractive from a cost savings 
perspective, are eager to expand such programs, even at the expense 
of resident programs. However, PME officials believe that ADL expansion 
should be considered only after completely understanding its impact on 
military operations and recognizing the resident program's role as the 
basis for a nonresident program. In addition, PME officials noted that 
ADL could be used as a means of shifting the burden from the school to 
the student and the student's commands without providing appropriate 
command support, compensation, or resources.

While ADL research exists for military training courses, there is only 
limited research on ADL's impact on military education, especially in 
terms of its impact on learning that requires interactive elements. A 
DOD official stated that studies have repeatedly found no significant 
difference between distance learning instruction (usually 
teleconferencing) and classroom instruction. However, officials believe 
that more work should be done to determine whether ADL improves learning 
in military education and that these studies should focus on 
collaborative learning environments and the extent to which they 
translate online. In addition, further efforts should examine commercial 
education studies (undergraduate and graduate education) and their 
transfer to military education.

Technological: 

PME school officials have stated that decisions are needed on trade-
offs between increased demands for student access (more servers, more 
bandwidth, or reduced firewalls) and the maintenance for network 
security. Such decisions are complicated by what is viewed as a lack of 
standardization in DOD and within individual services on security 
requirements.

In our nonresident survey, approximately 19 percent of current students 
and 16 percent of recent graduates experienced computer/Internet 
related problems affecting their PME experience. Some identified 
problems included servers, bandwidth issues, and security firewalls. 
PME schools considered using ".edu" domains to make courseware more 
available to students because ".mil" domains interact poorly with 
systems outside of military organizations. However, such moves would be 
expensive and would conflict with increasing requirements to reduce the 
number of servers and personnel needed to operate these systems. Other 
reported problems involved limited bandwidth; while such problems can be 
overcome, they require time and money to resolve. Firewalls maintained 
for security purposes have caused schools to limit nonresident students' 
access to library resources because of perceived security threats.

Resources and Funding: 

Most ADL efforts at PME schools were fielded independently with limited 
budgets and staffing. USAWC's and ACSC's ADL applications were 
developed and supported with in-house staff responsible for managing 
resident and nonresident programs, and the applications were fielded 
independently within the services. PME officials stated that PME 
schools' ability to fund ADL applications is limited because DOD gives 
higher priority to its training and operational activities. An emerging 
funding issue involves the use of copyrighted material in ADL 
applications: the increasing cost of using copyrighted material in 
course work could limit course flexibility and methodologies.

New technologies such as ADL create new requirements for faculty with 
greater technical expertise, and for more equipment and infrastructure, 
than traditional programs. Faculty members must be skilled at teaching 
in both online and classroom settings. PME schools are beginning to 
observe that they must offer faculty opportunities to teach courses in 
multiple media or risk losing qualified faculty to competitors.

Conclusions: 

Although PME schools receive oversight from a number of organizations, 
we observed that neither the schools nor the oversight agencies had 
focused on (1) establishing specific performance effectiveness goals 
for ADL implementation or (2) measuring learning outcomes as a means of 
evaluating program effectiveness. The Joint Staff's accreditation 
reports on nonresident education do not detail performance goals for 
any particular delivery method. The military services, which have 
primary responsibility for PME oversight, view the accreditation 
process provided by the Joint Staff as the primary means of ensuring 
the effectiveness of nonresident education. DOD is not alone in this 
problem--the lack of metrics for performance effectiveness and learning 
outcomes is pervasive throughout all educational institutions. Our 
prior efforts indicate that most public and private institutions lack a 
framework with which to assess implementation of training and 
development efforts. However, agencies need clear information on how 
training and development efforts affect organizational performance, and 
decision makers will likely want to compare the performance of these 
programs with that of other programs. Without clear goals and an 
effective process for evaluating the results of ADL application, DOD 
cannot ensure that ADL is achieving appropriate return on investment, 
student retention, student access, and other goals in comparison with 
prior efforts. Furthermore, programs lacking outcome metrics will be 
unable to demonstrate how they contribute to results. Moreover, by not 
capturing and using student data that are uniquely available through 
ADL techniques, DOD is missing the opportunity to develop the basis for 
effectiveness metrics and knowledge about learning outcomes.

Recommendations for Executive Action: 

We recommend that the Secretary of Defense direct the Under Secretary 
of Defense for Personnel and Readiness, in concert with the Joint 
Staff, service headquarters, and the PME schools, to take the following 
two actions: 

* promote the development of specific performance effectiveness goals 
for ADL and: 

* promote the use of ADL technologies to capture data to provide 
knowledge about learning outcomes.

Agency Comments and Our Evaluation: 

DOD partially concurred with our first recommendation and fully 
concurred with the second. DOD supports the use of specific 
effectiveness goals for PME, but believes such goals are not 
appropriate for any specific delivery method. While we acknowledge 
DOD's concerns with focusing on a specific delivery method, we believe 
that ADL is unlike other means of delivery because of its potential to 
modernize the educational experience and because its use is rapidly 
expanding in other areas of PME. We believe it would be worthwhile for 
DOD to know specifically how well ADL performs, especially in 
comparison with other delivery methods, in order to better understand 
its appropriate use for PME. DOD concurred with our second 
recommendation and stated that current accreditation practices are 
already promoting the data collection capabilities of ADL technologies 
for assessing multiple delivery methods. DOD's comments are included in 
this report as appendix V. DOD also provided technical changes, which 
we incorporated as appropriate.

We are sending copies of this report to congressional members as 
appropriate. We will also send copies to the Secretary of Defense; the 
Secretaries of the Air Force, Army, and Navy; the Commandant of the 
Marine Corps; and the Chairman of the Joint Chiefs of Staff. We will 
make copies available to others on request. In addition, this report 
will be available at no charge on the GAO Web site at 
http://www.gao.gov.

If you or your staff have any questions, please call me on (757) 552-
8100 or Clifton Spruill, Assistant Director, on (202) 512-4531. Major 
contributors to this report were Arnett Sanders, Maewanda Michael-
Jackson, Jean Orland, David Dornisch, Terry Richardson, and Cheryl 
Weissman.

Sincerely yours,

Signed by: 

Neal P. Curtin: 
Director, Defense Capabilities and Management: 

[End of section]

Appendix I: Scope and Methodology: 

We reviewed the Department of Defense's (DOD) implementation of advanced 
distributed learning (ADL) in senior- and intermediate-level 
professional military education (PME) to determine processes and 
criteria used for converting courses, metrics to assess ADL 
effectiveness and its fulfillment of learning objectives, and barriers 
and challenges in ADL implementation. We collected, reviewed, and 
analyzed relevant program information and conducted interviews with DOD 
officials responsible for ADL implementation in PME programs and for PME 
oversight. We initially obtained data from the 
Principal Deputy Under Secretary of Defense for Personnel and Readiness 
to identify PME programs with ADL applications. A review of the data 
indicated that there were three existing programs. We identified, 
interviewed, and obtained data from officials from PME schools with ADL 
applications. Those schools were: 

* U.S. Army War College, Carlisle, Pennsylvania;

* Naval War College, Newport, Rhode Island; and: 

* Air Command and Staff College, Montgomery, Alabama.

We also interviewed and obtained data from officials at the Joint 
Forces Staff College in Norfolk, Virginia, on their pending ADL 
application.

We also interviewed and obtained data from agencies within DOD 
responsible for oversight of PME activities. Those agencies included: 

* The Joint Chiefs of Staff's Operational Plans and Joint Force 
Development Directorate (J-7), Joint Doctrine, Education, and Training 
Division, Joint Education Branch, Arlington, Virginia;

* The Office of the Under Secretary of Defense for Personnel and 
Readiness, Deputy to the Under Secretary for Readiness, Office of 
Readiness and Training Policy and Programs, Arlington, Virginia;

* The Office of the Under Secretary of Defense for Personnel and 
Readiness, Deputy to the Under Secretary for Military Personnel Policy, 
Arlington, Virginia;

* Department of the Army, Office of the Deputy Chief of Staff, 
Operations and Plans, Arlington, Virginia;

* Department of the Navy, Office of the Chief of Naval Operations, 
Personal Development and Accessions Division, Washington, D.C.;

* Department of the Air Force, Office of the Deputy Chief of Staff for 
Personnel, Learning, and Force Development, Arlington, Virginia;

* U.S. Marine Corps Combat Development Center, Training and Education 
Command, Quantico, Virginia;

* U.S. Army War College Board of Visitors, Carlisle, Pennsylvania;

* Naval War College Board of Advisors, Newport, Rhode Island; and: 

* Air University Board of Visitors, Montgomery, Alabama.

To determine whether DOD has sufficient metrics to assess ADL 
effectiveness, we 
provided PME program officials with a detailed list of questions that 
included those relating to effectiveness and learning objectives. We 
reviewed written responses, if provided, and followed up with site 
visits and correspondence with oversight agencies to clarify or obtain 
additional information if necessary. We also obtained and analyzed data 
from a survey of nonresident PME current students and graduates, which 
included questions designed to obtain perceptions on program 
effectiveness. Details of the survey methodology are presented in 
appendix II.

To determine processes and criteria DOD used for ADL conversion, we 
provided PME program officials with a detailed list of questions that 
included those relating to process and criteria decisions. We reviewed 
written responses, if provided, and followed up with site visits to 
clarify or obtain additional information if necessary. To determine 
whether criteria were consistent with those of other institutions 
performing distance education, we researched prior literature on this 
topic and conducted a site visit to the University of Maryland 
University College in Adelphi, Maryland. The school was identified in 
our prior reports on distance education as having a program with a 
large distance education population, as well as educating a significant 
number of military officers. We also contacted and received data from 
the Sloan Consortium, an organization designed to encourage 
collaborative sharing of knowledge and effective practices to improve 
online education.

To determine barriers and challenges to ADL implementation, we provided 
PME program officials with a detailed list of questions that included 
those relating to barriers and challenges. We reviewed written 
responses, if provided, and followed up with site visits and 
correspondence with DOD oversight agencies to clarify or obtain 
additional information if necessary. We also obtained and analyzed data 
from a survey of nonresident PME current students and graduates, which 
included questions designed to obtain perceptions on barriers and 
challenges in completing PME courses. Details of the survey methodology 
are presented in appendix II.

[End of section]

Appendix II: Methodology for Our Survey of Nonresident PME Students and 
Graduates: 

To obtain military officers' perspectives on nonresident PME in terms 
of impact on careers, achievement of learning objectives, and obstacles 
and challenges, we conducted a statistically representative survey of 
current nonresident senior- and intermediate-level PME students and 
graduates of these schools from April 1999 to March 2003, roughly the 
period coinciding with initial ADL implementation at several senior- 
and intermediate-level schools. We present the survey questions and 
response results in appendix III.

The Study Population: 

The population for the nonresident PME survey consisted of current 
students and graduates who fulfilled the following criteria: 

1. Respondents were identified as enrolled in a senior- or 
intermediate-level nonresident PME program of study from April 1999 to 
March 2003. We decided on this time period to ensure that our 
respondents would have begun their programs after Web-based PME had 
been clearly established as a mode of instruction or have been in PME 
long enough to have meaningful responses to our questions.

2. Respondents participated in a senior- or intermediate-level 
nonresident PME program of study, as opposed to individual PME courses 
undertaken via continuing education programs.

3. Respondents are currently active in the U.S. military services or 
reserves, excluding U.S. civilians; U.S. Coast Guard members; and 
international members, either military or civilian.

4. Respondents participated (i.e., were currently enrolled or had 
graduated) in a nonresident PME program at one of the six senior- and 
intermediate-level PME schools: U.S. Army War College, Army Command and 
General Staff College, Air War College, Air Command and Staff College, 
Naval War College, and Marine Corps College of Continuing Education.

The survey asked respondents about PME's impact on furthering career 
objectives, their achievement of learning objectives, and obstacles and 
challenges of the programs. Specific questions concerned students' 
satisfaction with their overall program, with various modes of program 
delivery, and with the technologies used; students' time and duty 
management concerns; and reasons for participation in nonresident PME.

Developing the Survey: 

To develop areas of inquiry for the survey, we reviewed our previous 
work related to distance education and PME. We reviewed a series of 
survey questionnaires developed by us and by DOD. We used these sources 
and our own analysis to develop an initial set of questions. We further 
developed and refined the questionnaire by obtaining and incorporating 
written comments regarding the initial questions from administrators 
and other representatives of the senior- and intermediate-level PME 
schools.

In addition to an internal expert technical review by our Survey 
Coordination Group, we pretested the survey with five individuals whose 
personal characteristics corresponded to our eligibility criteria. We 
identified pretest subjects through our contacts who were current 
military personnel or who knew military personnel, and through our PME 
points of contact.

The Sample Design and Administration: 

We conducted the survey between January and April of 2004 on a random 
sample of 437 current students and graduates of nonresident PME 
programs using a self-administered Web-based questionnaire. We drew the 
names of our respondents from an overall population data set we 
constructed that combined separate data sets received from each of the 
senior- and intermediate-level PME schools. For each data set, we 
requested the officer's name, school attended, month and year of 
initial enrollment, component (defined as either active duty or 
reservist), and mode of PME delivery. We requested e-mail addresses 
and, if needed, phone numbers from potential respondents after they 
were drawn from the population data sets. We stratified our sample by 
component in order to better understand any differences between these 
components.
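
For illustration only, such a stratified draw can be expressed in a 
few lines of Python; the roster below and its component labels are 
hypothetical stand-ins, and only the target stratum sizes (219 active, 
218 reserve) come from this appendix.

import random

random.seed(42)  # fixed seed so the illustration is reproducible

# Hypothetical combined roster: (officer name, component)
population = [("officer_%d" % i, "active" if i % 2 == 0 else "reserve")
              for i in range(12000)]

# Target sample sizes per stratum, mirroring the 219/218 split above
targets = {"active": 219, "reserve": 218}

sample = []
for component, size in targets.items():
    stratum = [p for p in population if p[1] == component]
    sample.extend(random.sample(stratum, size))

print(len(sample))  # 437 selectees in total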

We activated the survey Web site and informed our sample respondents 
of the Web site address and their logon names and passwords by e-mail 
on January 30, 2004. To maximize the response rate, we sent five 
follow-up e-mail reminders to nonrespondents in February and March 
2004. At the time of the third mailing, we also telephoned many of the 
nonrespondents to encourage them to complete the survey. We ended data 
collection activities on April 30, 2004.

Of the 437 selectees included in our sample, we received 273 useable 
questionnaires. We defined a useable questionnaire as one from a 
respondent who completed the survey and was not identified as out of 
scope. During the survey, we 
deemed 67 of the 437 to be outside the scope of our survey after 
determining that they did not meet at least one of our eligibility 
criteria. Disregarding these 67 responses, our overall response rate 
was 73.8 percent (273/370). Table 2 shows the final disposition of the 
sample (the 437 respondent accounts activated) by strata.

Table 2: Disposition of Sample: 

Stratum: Active Duty; 
Sample: 219; 
Useable: 136; 
Out of scope: 31; 
Number of nonrespondents: 52.

Stratum: Reserve Duty; 
Sample: 218; 
Useable: 137; 
Out of scope: 36; 
Number of nonrespondents: 45.

Source: GAO.

[End of table]

The estimates we make in this report are the result of weighting the 
survey responses to account for effective sampling rates in each 
stratum. These weights reflect both the initial sampling rate and the 
response rate for each stratum. As with many surveys, our estimation 
method assumes that nonrespondents would have answered like the 
respondents.
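
To make the weighting concrete, the sketch below computes a two-part 
weight (the inverse sampling rate times a nonresponse adjustment) in 
Python. The sampled, useable, and out-of-scope counts come from table 
2; the stratum population totals are hypothetical placeholders, since 
this report does not publish them.

strata = {
    "active":  {"pop": 9000, "sampled": 219, "useable": 136, "oos": 31},
    "reserve": {"pop": 9000, "sampled": 218, "useable": 137, "oos": 36},
}

for name, s in strata.items():
    base = s["pop"] / s["sampled"]        # inverse sampling rate
    eligible = s["sampled"] - s["oos"]    # in-scope selectees
    adjustment = eligible / s["useable"]  # nonresponse adjustment
    weight = base * adjustment
    print("%s: weight per respondent = %.1f" % (name, weight))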

Sampling Error: 

For the estimates we present in this report, we are 95 percent 
confident that the results we would have obtained had we studied the 
entire population are within +/-10 or fewer percentage points of our 
estimates (unless otherwise noted). Because we surveyed a sample of 
recent nonresident PME students, our results are estimates of student 
and graduate characteristics and thus are subject to sampling errors. 
Our confidence in the precision of the results from this sample is 
expressed in 95 percent confidence intervals, which are expected to 
include the actual results for 95 percent of the samples of this type. 
We calculated confidence intervals for our results using methods 
appropriate for a stratified probability sample.
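
As an illustration of this calculation, the following Python sketch 
applies the standard variance formula for a proportion estimated from 
a stratified probability sample; the stratum population shares, 
observed proportions, and respondent counts are hypothetical, not 
results from this survey.

import math

# Hypothetical strata: (population share W_h, observed proportion p_h,
# number of respondents n_h)
strata = [
    (0.5, 0.60, 136),
    (0.5, 0.50, 137),
]

# Weighted point estimate: p = sum over strata of W_h * p_h
p = sum(w * ph for w, ph, _ in strata)

# Variance of a stratified proportion, ignoring the finite population
# correction: sum over strata of W_h**2 * p_h * (1 - p_h) / (n_h - 1)
var = sum(w ** 2 * ph * (1 - ph) / (n - 1) for w, ph, n in strata)
half_width = 1.96 * math.sqrt(var)  # 95 percent confidence half-width

print("estimate: %.3f +/- %.3f" % (p, half_width))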

Nonsampling Error and Data Quality: 

We conducted in-depth pretesting of the questionnaire to minimize 
measurement error. However, the practical difficulties in conducting 
surveys of this type may introduce other types of errors, commonly 
known as nonsampling errors. For example, measurement errors can be 
introduced if (1) respondents have difficulty interpreting a particular 
question, (2) respondents have access to different amounts of 
information in answering a question, or (3) errors in data processing 
occur. We took extensive steps to minimize such errors in developing 
the questionnaire, collecting the data, and editing and analyzing the 
information. The Web-based data management system we used provides a 
systematic means of processing, transferring, and analyzing data that 
also protects against nonsampling errors. In addition, we 
performed tests to ensure the reliability and usefulness of the data 
provided by the PME schools. These included computer analyses to 
identify inconsistencies both within and across the data sets and other 
errors in the data sets from which we developed our overall sampling 
frame. We also interviewed agency officials knowledgeable about the 
data. We determined that the data were sufficiently reliable for the 
purposes of this report.
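
The kind of computer analysis described above can be sketched briefly 
in Python with the pandas library; the column names and records below 
are hypothetical examples, not the schools' actual data layouts.

import pandas as pd

# Hypothetical roster combining records from several school data sets
rosters = pd.DataFrame({
    "name": ["Smith", "Jones", "Smith", "Lee"],
    "school": ["USAWC", "NWC", "ACSC", "NWC"],
    "enroll_date": ["2000-06", "2001-09", "2000-06", "1998-01"],
})

# Flag the same officer appearing in more than one school's data set
dupes = rosters[rosters.duplicated(subset=["name", "enroll_date"],
                                   keep=False)]

# Flag enrollment dates outside the study window (April 1999-March 2003)
dates = pd.to_datetime(rosters["enroll_date"], format="%Y-%m")
out_of_window = rosters[(dates < "1999-04") | (dates > "2003-03")]

print(dupes)
print(out_of_window)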

[End of section]

Appendix III: Survey Responses: 

Introduction: 

Welcome DoD Distance Learning Student or Graduate.

The U.S. General Accounting Office (GAO), an agency of the U.S. 
Congress, has been requested to review various aspects of nonresident 
professional military education (PME). Part of that effort is to 
evaluate whether the appropriate decisions are being made with regard 
to nonresident education for intermediate- and senior-level military 
officers. As part of this effort, we are assessing opinions of 
nonresident PME graduates and current students on (1) the achievement 
of PME learning objectives, (2) obstacles and challenges in completing 
PME, and (3) PME's impact on furthering career objectives.

The survey should take approximately 10 minutes to complete.

Participation in this survey is voluntary but encouraged. Your 
responses will be confidential and the results of the survey will be 
reported in aggregate form only.

Before choosing an answer, please read the full question and all 
response choices carefully.

Thank you in advance for your participation.

Frame of Reference: 

Please consider only the following when answering the questions on this 
survey: 

1. Nonresident Professional Military Education programs of study that 
you have participated in, as opposed to individual PME courses 
undertaken via continuing education programs.

AND: 

2. Nonresident Professional Military Education programs of study that 
you began in April 1999 or after.

Our survey of nonresident Professional Military Education students was 
divided into two main parts, one with questions appropriate for current 
students and one with similar questions worded slightly differently for 
graduates. There are also questions on demographics to which both 
current students and graduates responded. Survey questions and 
responses for graduates are indicated in italics and those for current 
students are in plain text.

The information provided here represents weighted data. For information 
on weighting, see appendix II.

Except where noted by the following, all percentage estimates have 95% 
confidence intervals within +/-10 percentage points: 

[A] Confidence interval exceeds +10 percentage points: 

[B] Confidence interval exceeds +25 percentage points and estimate is 
unreliable: 

Questions 1 and 31 are intentionally omitted because they contained 
instructions telling respondents which questions to answer.

Survey of Nonresident Professional Military Education Graduates and 
Current Students: 

Q2. The name of the program in which you are currently enrolled: 

Q32. From which school did you graduate?

Current Students; 
Air Command and Staff College Nonresident Program (percent): 30%; 
Air War College Nonresident Program (percent): 25%; 
Army Command and General Staff College Nonresident Program (percent): 
31%; 
Army War College Distance Education Program (percent): 1%; 
Marine Corps College of Continuing Education (percent): 10%; 
Naval College of Distance Education (percent): 2%. 

Graduates; 
Air Command and Staff College Nonresident Program (percent): 40%; 
Air War College Nonresident Program (percent): 27%; 
Army Command and General Staff College Nonresident Program (percent): 
27%; 
Army War College Distance Education Program (percent): 2%; 
Marine Corps College of Continuing Education (percent): 1%; 
Naval College of Distance Education (percent): 3%. 

[End of table]

Q3. In what month and year did you begin your PME program?

Q33. In what month and year did you begin your PME program (if you 
graduated from more than one program, answer for the most recent one)?

Current Students; 
1999 (percent): 1%; 
2000 (percent): 3%; 
2001 (percent): 14%; 
2002 (percent): 30%; 
2003 (percent): 47%; 
Other (percent): 5%. 

Graduates; 
1999 (percent): 15%; 
2000 (percent): 26%; 
2001 (percent): 34%; 
2002 (percent): 21%; 
2003 (percent): 5%; 
Other (percent): 0%. 

[End of table]

Q4. What mode of instruction have you used most often in your 
nonresident Professional Military Education program?

Q34. What mode of instruction did you use most often in your 
nonresident Professional Military Education program?

Current Students; 
Seminar or Classroom Instruction (percent): 24%; 
Web- Based Correspondence (percent): 25%; 
Paper-Based Correspondence (percent): 51%. 

Graduates; 
Seminar or Classroom Instruction (percent): 42%; 
Web-Based Correspondence (percent): 18%; 
Paper-Based Correspondence (percent): 40%. 

[End of table]

Q5. In what month and year do you expect to complete your PME program 
of study?

Q35. In what month and year did you complete your Professional 
Military Education program?

Current Students; 
2003 (percent): 1%; 
2004 (percent): 68%; 
2005 (percent): 21%; 
2006 (percent): 4%; 
2007 (percent): 0%; 
Other (percent): 6%. 

Graduates; 
2003 (percent): 1%; 
2004 (percent): 10%; 
2005 (percent): 18%; 
2006 (percent): 31%; 
2007 (percent): 39%; 
Other (percent): 1%. 

[End of table]

Q6. In a typical week, approximately how many hours did you spend in 
Professional Military Education-related activities, including 
preparation, study, working on-line, and time in class?

Q36. In a typical week, approximately how many hours did you spend in 
Professional Military Education-related activities, including 
preparation, study, working on-line, and time in class?

Current Students; 
Mean: 5.8 hours. 

Graduates; 
Mean: 8.4 hours. 

[End of table]

Q7. Does the military or your employer afford you time during your 
work-week for Professional Military Education?

Q37. Did the military or your employer afford you time during your 
work-week for Professional Military Education?

Current Students; 
Yes (percent): 23%; 
No (percent): 77%. 

Graduates; 
Yes (percent): 42%; 
No (percent): 58%. 

[End of table]

Q8. How many hours do you work in paid employment in a typical work-
week (outside of Professional Military Education-related activities)?

Q38. During the period of time that you were completing your 
Professional Military Education program, how many hours did you work 
in paid employment in a typical work-week (outside of Professional 
Military Education-related activities)?

Current Students; 
Mean: 52.0 hours. 

Graduates; 
Mean: 47.5 hours. 

[End of table]

Q9. Listed below are various reasons why someone would participate in 
a nonresident PME program. What is your greatest reason for 
participating in a nonresident Professional Military Education program?

Q39. Listed below are various reasons why someone would participate in 
a nonresident PME program. What was your greatest reason for 
participating in a nonresident Professional Military Education program? 

To develop professionally; 
Current Students (percent): 20%; 
Graduates (percent): 13%. 

To gain access to better assignments; 
Current Students (percent): 1%; 
Graduates (percent): 0%. 

To gain knowledge in my field or in fields of interest to me; 
Current Students (percent): 2%; 
Graduates (percent): 0%. 

To improve my chances of, or meet the requirements for, promotion; 
Current Students (percent): 73%; 
Graduates (percent): 84%. 

To network with other officers; 
Current Students (percent): 1%; 
Graduates (percent): 0%. 

To obtain college credit; 
Current Students (percent): 0%; 
Graduates (percent): 0%. 

Other; 
Current Students (percent): 4%; 
Graduates (percent): 3%. 

[End of table]

Q10. To this point in time, how satisfied or dissatisfied are you with 
your Professional Military Education program?

Q40. Overall, how satisfied or dissatisfied are you with the 
Professional Military Education program from which you graduated?

Current Students; 
Very satisfied (percent): 16%; 
Somewhat satisfied (percent): 35%; 
Neither satisfied nor dissatisfied (percent): 21%; 
Somewhat dissatisfied (percent): 21%; 
Very dissatisfied (percent): 8%. 

Graduates; 
Very satisfied (percent): 20%; 
Somewhat satisfied (percent): 45%; 
Neither satisfied nor dissatisfied (percent): 15%; 
Somewhat dissatisfied (percent): 15%; 
Very dissatisfied (percent): 5%. 

[End of table]

Q11. To what extent, if any, has your Professional Military Education 
program benefited your military career to this point in time?

Q41. To what extent, if any, has graduation from your Professional 
Military Education program benefited your military career to this point 
in time?

Current Students; 
Very great extent (percent): 1%; 
Great extent (percent): 21%; 
Moderate extent (percent): 42%; 
Little extent (percent): 20%; 
No extent (percent): 16%. 

Graduates; 
Very great extent (percent): 15%; 
Great extent (percent): 21%; 
Moderate extent (percent): 30%; 
Little extent (percent): 22%; 
No extent (percent): 12%. 

[End of table]

Q12. To what extent, if any, do you believe the knowledge you are 
acquiring in your Professional Military Education program will improve 
your effectiveness in future assignments?

Q42. To what extent, if any, do you believe the knowledge you acquired 
in your Professional Military Education program has improved your 
effectiveness in job assignments?

Current Students; 
Very great extent (percent): 5%; 
Great extent (percent): 22%; 
Moderate extent (percent): 42%; 
Little extent (percent): 25%; 
No extent (percent): 6%. 

Graduates; 
Very great extent (percent): 9%; 
Great extent (percent): 16%; 
Moderate extent (percent): 39%; 
Little extent (percent): 24%; 
No extent (percent): 12%. 

[End of table]

No Parallel Question for Current Students: 

Q43. Have you been promoted to a higher rank since you completed your 
Professional Military Education program?

Graduates; 
Yes (percent): 78%; 
No (percent): 22%. 

[End of table]

No Parallel Question for Current Students: 

Q44. To what extent, if any, do you believe that completion of your 
Professional Military Education program contributed to your promotion?

Graduates; 
Very great extent (percent): 53%[A]; 
Great extent (percent): 34%[A]; 
Moderate extent (percent): 3%[A]; 
Little extent (percent): 9%[A]; 
No extent (percent): 0%. 

[End of table]

Q13. To what extent, if any, does your Professional Military Education 
program enable you to acquire the knowledge you are expected to obtain?

Q45. To what extent, if any, did your Professional Military Education 
program enable you to acquire the knowledge you were expected to 
obtain?

Current Students; 
Very great extent (percent): 5%; 
Great extent (percent): 25%; 
Moderate extent (percent): 43%; 
Little extent (percent): 21%; 
No extent (percent): 6%. 

Graduates; 
Very great extent (percent): 9%; 
Great extent (percent): 28%; 
Moderate extent (percent): 40%; 
Little extent (percent): 19%; 
No extent (percent): 4%. 

[End of table]

Q14. Was/is any part of your Professional Military Education program 
taken through seminar/classroom-based instruction?

Q46. Was any part of your Professional Military Education program taken 
through seminar/classroom-based instruction?

Current Students; 
No (percent): 67%; 
Yes (percent): 33%. 

Graduates; 
No (percent): 50%; 
Yes (percent): 50%. 

[End of table]

Q15. Overall, how would you rate the quality of this seminar/classroom-
based learning?

Q47. Overall, how would you rate the quality of this seminar/classroom-
based learning?

Current Students; 
Excellent (percent): 8%[A]; 
Very good (percent): 66%[A]; 
Good (percent): 11%[A]; 
Fair (percent): 13%[A]; 
Poor (percent): 3%[A]. 

Graduates; 
Excellent (percent): 18%[A]; 
Very good (percent): 35%[A]; 
Good (percent): 31%[A]; 
Fair (percent): 14%; 
Poor (percent): 3%. 

[End of table]

Q16. Was/is any part of your Professional Military Education program 
taken through paper-based correspondence?

Q48. Was any part of your Professional Military Education program taken 
through paper-based correspondence?

Current Students; 
No (percent): 36%; 
Yes (percent): 64%. 

Graduates; 
No (percent): 40%; 
Yes (percent): 60%. 

[End of table]

Q17. Overall, how would you rate the quality of this paper-based 
correspondence learning?

Q49. Overall, how would you rate the quality of this paper-based 
correspondence learning?

Current Students; 
Excellent (percent): 3%; 
Very good (percent): 27%[A]; 
Good (percent): 40%[A]; 
Fair (percent): 26%[A]; 
Poor (percent): 4%. 

Graduates; 
Excellent (percent): 6%; 
Very good (percent): 31%; 
Good (percent): 33%; 
Fair (percent): 20%; 
Poor (percent): 10%. 

[End of table]

Q18. Was/is any part of your Professional Military Education program 
taken through the World-Wide Web or Internet?

Q50. Was any part of your Professional Military Education program taken 
through the World-Wide Web or Internet?

Current Students; 
No (percent): 64%; 
Yes (percent): 36%. 

Graduates; 
No (percent): 60%; 
Yes (percent): 40%. 

[End of table]

Q19. Overall, how would you rate the quality of this web/Internet-
based learning?

Q51. Overall, how would you rate the quality of this web/Internet-
based learning?

Current Students; 
Excellent (percent): 17%[A]; 
Very good (percent): 22%[A]; 
Good (percent): 40%[A]; 
Fair (percent): 15%[A]; 
Poor (percent): 5%[A]. 

Graduates; 
Excellent (percent): 8%[A]; 
Very good (percent): 39%[A]; 
Good (percent): 41%[A]; 
Fair (percent): 10%[A]; 
Poor (percent): 2%. 

[End of table]

Q20. How easy or difficult has it been for you to use web/Internet-
based learning?

Q52. How easy or difficult was it for you to use web/Internet-based 
learning?

Current Students; 
Very easy (percent): 34%[A]; 
Somewhat easy (percent): 21%[A]; 
Neither easy nor difficult (percent): 32%[A]; 
Somewhat difficult (percent): 8%[A]; 
Very difficult (percent): 5%[A]. 

Graduates; 
Very easy (percent): 40%[A]; 
Somewhat easy (percent): 37%[A]; 
Neither easy nor difficult (percent): 10%[A]; 
Somewhat difficult (percent): 12%[A]; 
Very difficult (percent): 0%. 

[End of table]

Q21. How easy or difficult have you found interaction with faculty 
during your web/Internet-based learning?

Q53. How easy or difficult did you find interaction with faculty during 
your web/Internet -based learning?

Current Students; 
Very easy (percent): 26%[A]; 
Somewhat easy (percent): 11%[A]; 
Neither easy nor difficult (percent): 35%[A]; 
Somewhat difficult (percent): 11%[A]; 
Very difficult (percent): 16%[A]. 

Graduates; 
Very easy (percent): 18%[A]; 
Somewhat easy (percent): 48%[A]; 
Neither easy nor difficult (percent): 29%[A]; 
Somewhat difficult (percent): 5%[A]; 
Very difficult (percent): 0%. 

[End of table]

Q22. How easy or difficult have you found interaction with other 
students during your web/Internet-based learning?

Q54. How easy or difficult did you find interaction with other 
students during your web/Internet-based learning?

Current Students; 
Very easy (percent): 16%[A]; 
Somewhat easy (percent): 31%[A]; 
Neither easy nor difficult (percent): [B]; 
Somewhat difficult (percent): 16%[A]; 
Very difficult (percent): 11%[A]. 

Graduates; 
Very easy (percent): 15%[A]; 
Somewhat easy (percent): 41%[A]; 
Neither easy nor difficult (percent): 26%[A]; 
Somewhat difficult (percent): 15%[A]; 
Very difficult (percent): 4%[A]. 

[End of table]

Q23. How well does the courseware/course software work on the computer 
equipment to which you have access for taking web/Internet-based 
learning?

Q55. How well did the courseware/course software work on the computer 
equipment to which you had access for taking web/Internet-based 
learning?

Current Students; 
Excellent (percent): 8%[A]; 
Very good (percent): 41%[A]; 
Good (percent): 32%[A]; 
Fair (percent): 19%[A]; 
Poor (percent): 0%. 

Graduates; 
Excellent (percent): 21%[A]; 
Very good (percent): 43%[A]; 
Good (percent): 28%[A]; 
Fair (percent): 5%; 
Poor (percent): 2%. 

[End of table]

Q24. How reliable is your network access for taking web/Internet-based 
learning (e.g., ability to connect to upload and download assignments)?

Q56. How reliable was your network access for taking web/Internet-
based learning (e.g., ability to connect to upload and download 
assignments)?

Current Students; 
Very reliable (percent): 41%[A]; 
Somewhat reliable (percent): 40%[A]; 
As reliable as unreliable (percent): 14%[A]; 
Somewhat unreliable (percent): 3%[A]; 
Very unreliable (percent): 3%[A]. 

Graduates; 
Very reliable (percent): 52%[A]; 
Somewhat reliable (percent): 45%[A]; 
As reliable as unreliable (percent): 2%; 
Somewhat unreliable (percent): 2%; 
Very unreliable (percent): 0%. 

[End of table]

Q25. Compared to resident Professional Military Education students in 
the school in which you are enrolled, of the following options, do you 
believe you are prepared?

Q57. Compared to resident Professional Military Education program 
graduates of your school, of the following options, do you believe you 
are prepared? 

Current Students; 
better than resident students. (percent): 2%; 
as well as resident students. (percent): 27%; 
worse than resident students. (percent): 49%; 
Don't know (percent): 23%. 

Graduates; 
better than resident students. (percent): 3%; 
as well as resident students. (percent): 30%; 
worse than resident students. (percent): 48%; 
Don't know (percent): 19%. 

[End of table]

Q26. Overall, what has been the primary challenge, if any, affecting 
your Professional Military Education program?

Q58. Overall, what was the primary challenge, if any, affecting your 
Professional Military Education program?

Computer/Internet-related problem; 
Current Students (percent): 3%; 
Graduates (percent): 0%. 

Deployment cycle; 
Current Students (percent): 12%; 
Graduates (percent): 5%. 

Domestic circumstances; 
Current Students (percent): 12%; 
Graduates (percent): 10%. 

Maintaining focus; 
Current Students (percent): 16%; 
Graduates (percent): 12%. 

Present job duties; 
Current Students (percent): 45%; 
Graduates (percent): 53%. 

Not applicable, I am not experiencing & have not experienced any 
challenges to this point in time; 
Current Students (percent): 6%; 
Graduates (percent): 17%. 

Other; 
Current Students (percent): 8%; 
Graduates (percent): 3%. 

[End of table]

Q27. Have you experienced any computer/Internet-related problems 
affecting your Professional Military Education program?

Q59. Did you experience any computer/Internet-related problems 
affecting your Professional Military Education program?

Current Students; 
No (percent): 81%; 
Yes (percent): 19%. 

Graduates; 
No (percent): 84%; 
Yes (percent): 16%. 

[End of table]

Q28. What specific computer/Internet-related problems have you 
incurred?

Q60. What specific computer/Internet-related problems have you 
incurred?

Current Students: Yes (percent); 
Graduates: Yes (percent). 

a. Bandwidth/ network speed; 
Current Students: 63%[A]; 
Graduates: [B]. 

b. Inadequate uploading or downloading ability/lack of high-speed 
internet equipment; 
Current Students: 57%[A]; 
Graduates: [B]. 

c. Inadequate technical support; 
Current Students: [B]; 
Graduates: [B]. 

d. Defective/ incompatible equipment; 
Current Students: 25%[A]; 
Graduates: [B]. 

e. Lack of computer skills; 
Current Students: 5%[A]; 
Graduates: [B]. 

f. Lack of network availability/access to internet; 
Current Students: 52%[A]; 
Graduates: [B]. 

g. Security/Firewall issues; 
Current Students: 26%[A]; 
Graduates: [B]. 

h. Other; 
Current Students: [B]; 
Graduates: [B]. 

[End of table]

Q29. At any point during your Professional Military Education program, 
have you had to defer/disenroll from your studies?

Q61. At any point during your Professional Military Education program, 
did you have to defer/disenroll from your studies?

Current Students; 
No (percent): 66%; 
Yes (percent): 34%. 

Graduates; 
No (percent): 86%; 
Yes (percent): 14%. 

[End of table]

Q30. What was the primary reason you had to defer/disenroll from your 
studies?

Q62. What was the primary reason you had to defer/disenroll from your 
studies?

Open-ended comments not shown here.  

Q63. What is the highest degree or level of school that you have 
completed?

High school or equivalent; 
Current Students (percent): 0%; 
Graduates (percent): 0%. 

1 or more years of college, no degree; 
Current Students (percent): 0%; 
Graduates (percent): 0%. 

Associate's degree; 
Current Students (percent): 1%; 
Graduates (percent): 0%. 

Bachelor's degree; 
Current Students (percent): 42%; 
Graduates (percent): 25%. 

Master's degree; 
Current Students (percent): 44%; 
Graduates (percent): 61%. 

Doctoral or professional school degree; 
Current Students (percent): 10%; 
Graduates (percent): 12%. 

Other; 
Current Students (percent): 3%; 
Graduates (percent): 2%. 

[End of table]

Q64. In what branch of the military do you serve?

Air Force; 
Current Students (percent): 54%; 
Graduates (percent): 66%. 

Army; 
Current Students (percent): 33%; 
Graduates (percent): 30%. 

Marines; 
Current Students (percent): 11%; 
Graduates (percent): 1%. 

Navy; 
Current Students (percent): 2%; 
Graduates (percent): 3%. 

[End of table]

Q65. What duty capacity best describes you during the majority of your 
Professional Military Education program?

Non-Active Duty; 
Current Students (percent): 43%; 
Graduates (percent): 26%. 

Active Duty; 
Current Students (percent): 57%; 
Graduates (percent): 74%. 

[End of table]

Q66. What component best describes you during the majority of your 
Professional Military Education program?

Active Component; 
Current Students (percent): 46%; 
Graduates (percent): 67%. 

Reserve Component; 
Current Students (percent): 54%; 
Graduates (percent): 33%. 

[End of table]

Q67. Are you a member of the National Guard?

Yes; 
Current Students (percent): 35%[A]; 
Graduates (percent): 45%[A]. 

No; 
Current Students (percent): 65%[A]; 
Graduates (percent): 55%[A]. 

[End of table]

Q68. What military occupational category best describes you during the 
majority of your Professional Military Education program?

Administrative; 
Current Students (percent): 12%; 
Graduates (percent): 9%. 

Engineering & Maintenance Officers; 
Current Students (percent): 10%; 
Graduates (percent): 13%. 

General Officers & Executives; 
Current Students (percent): 4%; 
Graduates (percent): 7%. 

Health Care Officers; 
Current Students (percent): 18%; 
Graduates (percent): 13%. 

Intelligence Officers; 
Current Students (percent): 6%; 
Graduates (percent): 1%. 

Scientists & Professionals; 
Current Students (percent): 9%; 
Graduates (percent): 10%. 

Supply & Procurement & Allied Officers; 
Current Students (percent): 7%; 
Graduates (percent): 8%. 

Tactical Operation Officers; 
Current Students (percent): 28%; 
Graduates (percent): 31%. 

Non-Occupational; 
Current Students (percent): 0%; 
Graduates (percent): 1%. 

Not Listed Above/Other; 
Current Students (percent): 6%; 
Graduates (percent): 7%. 

[End of table]

Q69. What was your rank when you began your Professional Military 
Education program?

O-2; 
Current Students (percent): 1%; 
Graduates (percent): 1%. 

O-3; 
Current Students (percent): 18%; 
Graduates (percent): 18%. 

O-4; 
Current Students (percent): 65%; 
Graduates (percent): 59%. 

O-5; 
Current Students (percent): 13%; 
Graduates (percent): 21%. 

O-6; 
Current Students (percent): 2%; 
Graduates (percent): 1%. 

Other; 
Current Students (percent): 2%; 
Graduates (percent): 0%. 

[End of table]

Q70. What is your current rank?

O-2; 
Current Students (percent): 0%; 
Graduates (percent): 0%. 

O-3; 
Current Students (percent): 4%; 
Graduates (percent): 3%. 

O-4; 
Current Students (percent): 70%; 
Graduates (percent): 59%. 

O-5; 
Current Students (percent): 24%; 
Graduates (percent): 27%. 

O-6; 
Current Students (percent): 3%; 
Graduates (percent): 7%. 

O-7; 
Current Students (percent): 0%; 
Graduates (percent): 1%. 

Other; 
Current Students (percent): 0%; 
Graduates (percent): 3%. 

[End of table]

Q71. If you have any other comments related to your PME education, 
training, assignments, distance learning, or any other matters related 
to this questionnaire, please note them here.  

Open-ended comments not shown here. 

[End of section]

Appendix IV: ADL Applications and Additional Features of Nonresident 
Programs: 

We observed three current ADL applications at senior- and 
intermediate-level PME schools. These schools have geared their ADL efforts 
to their nonresident programs. The programs vary from service to 
service in terms of enrollment, structure, duration, and credits 
received for graduation. We also observed features of nonresident 
programs that affect the nature of their ADL applications.

U.S. Army War College: 

The U.S. Army War College (USAWC), the Army's senior-level PME school, 
initiated its Web-based nonresident education program in April 1999. 
The program moved online in an evolutionary process; until the spring 
of 2002, students received both text and online versions. Since 
the spring of 2002, all nonresident students have received their 
education via a combination of ADL technology and appropriate text. 
Nonresident students are board selected to participate in the program. 
It is a 2-year Web-based program that is the only delivery method 
offered to nonresident students. The program has a "blended" component, 
whereby 2 of its 12 courses are taken in residence at USAWC. Also, 
distance courses are presented to students as a group or cohort; that 
is, students enroll at the beginning of the nonresident school year and 
must complete a sequenced load of 5 courses during the first year, 
followed by an additional 5 courses during the second year. The 
resident courses are of 2-week duration and are conducted at the end of 
each academic year. The nonresident program is designed to parallel the 
resident program, and graduates from both programs are awarded Master's 
Degrees in Strategic Studies.

Naval War College: 

The Naval War College's (NWC) nonresident education programs are 
concentrated in its College of Distance Education, its only nonresident 
college and one of five colleges under the NWC umbrella. The College of 
Distance Education, an intermediate-level PME school, offers several 
nonresident options. The fleet seminar program has existed in various 
forms at the school since 1974; the Web-enabled correspondence program 
has been operating fully since October 2002; and the CD-ROM-based 
correspondence program, effective in April 2004, was designed to 
replace the phased-out paper-based correspondence course. Nonresident 
options are open to all officers and qualified personnel. The Web-based 
course can be completed in 18-24 months. While there is no formal 
resident portion to this course, students are assigned to cohort teams 
to facilitate team and faculty communication. This nonresident course 
is closely aligned with the resident course, and graduates are allowed 
to obtain graduate credit hours. In the case of several seminars of the 
fleet seminar program, students can apply for admission to a program of 
graduate study leading toward a Master of Arts degree in National 
Security and Strategic Studies.

Air Command and Staff College: 

The Air Command and Staff College (ACSC), the Air Force's 
intermediate-level PME school, implemented its nonresident program in 
its present form in September 1999. There are two methods for 
completing the nonresident program: by seminar or by correspondence. 
The ACSC nonresident program is open to all officers and qualified 
personnel. The most recent version of the program consists of six 
courses organized into two semesters. The seminar method, which can 
take up to 11 months to complete, meets weekly in groups of typically 
3 to 18 students and is led by assigned seminar leaders to facilitate 
group discussion. The correspondence program, a 
self-study program delivered through a balanced mix of paper, CD-ROM, 
and Web-based materials, requires up to 18 months to complete. 
Students may move between the two methods, but they must 
achieve a minimum score of 70 percent on each of the six examinations 
and must complete four interactive Web-based exercises. The nonresident 
programs are designed to mirror resident programs, and there are 
multiple versions in use by ACSC nonresident students. These programs 
do not award master's degrees, but the American Council on Education 
recommends up to 21 semester hours of graduate credit for course 
completion.

Joint Forces Staff College: 

National Defense University's Joint Forces Staff College (JFSC) is 
piloting an Advanced Joint Professional Military Education course for 
senior- and intermediate-level reserve officers. Initially launched 
in September 2003, it is designed to last 38 weeks. The course 
consists of 35 weeks of Web-based education and 3 weeks of resident 
education, with 1 week of residency occurring after the first 8 weeks 
of Web-based education and the final 2 weeks at the end of the 38-week 
period. JFSC, already 
responsible for the resident Joint PME Phase II course used to complete 
the education process for joint specialty officers, was tasked to 
develop a Joint PME course for reserve officers in response to the 
fiscal year 1999 National Defense Authorization Act and the Joint Staff 
Guidance in May 2000. While there is no joint specialty officer 
requirement for reserve officers, JFSC was required to prepare reserve 
officers for joint duty assignments by providing a course similar in 
content to its resident Joint PME course, and to do so by utilizing 
current distance learning applications.

Additional Features of Nonresident PME Programs: 

There are additional features of PME programs that affect the nature of 
their ADL applications. Those features include: 

* Student Board Selection--Nonresident students are admitted to the 
PME schools either through an annual board selection process or 
through open admissions. Only USAWC selects its nonresident students; 
the other programs with ADL applications have open-admission policies.

* Joint Professional Military Education--A significant portion of the 
PME curriculum involves study of joint service issues along with 
service-specific issues. Officers who successfully complete senior- or 
intermediate-level PME course work are awarded Joint PME Phase I 
credit, which is required for those who wish to serve as joint 
specialty officers. All nonresident programs with ADL applications 
grant Joint PME Phase I credit.

* Service Promotion Impact--PME officials stated that PME program 
completion and other forms of higher education are factors used in 
consideration for promotion and vary among the services. Generally, the 
Air Force requires completion of a corresponding PME level of study 
before a candidate is considered for the next promotion level. The 
Army, while not as strict as the Air Force, places a high value on PME 
and graduate education in promotion decisions. The Navy, placing a 
higher premium on operational experience, currently is less inclined to 
recognize PME as a credential for promotion.

* Learning Objectives Between Programs--PME officials stated that, as 
outlined by Joint Staff policies, learning objectives for nonresident 
courses are required to be the same as for resident courses, regardless of 
the method of delivery. PME schools have instituted internal control 
processes to ensure the achievement of learning objectives for all 
programs, irrespective of delivery method. Generally, PME schools apply 
similar evaluation systems and criteria to both resident and 
nonresident programs.

* Levels-of-Learning--PME schools teach to differing achievement levels 
across and within the services, and they have designed their curricula 
accordingly. School officials refer to these achievement levels as 
"levels-of-learning" based on a taxonomy defined in the Joint Staff 
policy. (See table 3 for a detailed definition of levels-of-learning 
designations.) For the schools with ADL applications, their desired 
levels of learning for nonresident programs may or may not be 
equivalent to those of the corresponding resident programs: 

* USAWC--Synthesis/Analysis (same as for resident program).

* NWC--Application (resident program calls for synthesis/analysis).

* ACSC--Comprehension (resident program calls for synthesis/analysis).

* JFSC (Planned)--Application (same as for resident program).

Table 3: Levels-of-Learning Definitions: 

Levels of learning: Knowledge; 
Definitions: The ability to remember previously learned material. This 
level involves recall of a wide range of material, from specific facts 
to complete theories, but all that is required is bringing to mind 
appropriate information. Terminology for achievement: defines, 
describes, identifies, labels, lists, matches, names, outlines, 
reproduces, selects, and states.

Levels of learning: Comprehension; 
Definitions: The ability to grasp the meaning of material. Translating 
material from one form to another, interpreting material, or estimating 
future trends may show this level. Terminology for achievement: 
converts, defends, distinguishes, estimates, explains, extends, 
generalizes, gives examples, infers, paraphrases, predicts, rewrites, 
summarizes, translates, and understands.

Levels of learning: Value; 
Definitions: The internalization and consistent display of a behavior. 
The levels of valuing consist of acceptance of a value, preference for 
a value, and commitment (conviction).

Levels of learning: Application; 
Definitions: The ability to use learned material in new and concrete 
situations. This level includes application of rules, methods, 
concepts, principles, laws, and theories. Terminology for achievement: 
changes, computes, demonstrates, discovers, manipulates, modifies, 
operates, predicts, prepares, produces, relates, shows, solves, and 
uses.

Levels of learning: Analysis; 
Definitions: The ability to break down material into its component 
parts so that its organizational structure may be understood. This 
level includes identification of the parts, analysis of the 
relationships between parts, and recognition of the organizational 
principles involved. Terminology for achievement: breaks down, 
diagrams, differentiates, discriminates, distinguishes, illustrates, 
infers, outlines, points out, selects, separates, and subdivides.

Levels of learning: Synthesis; 
Definitions: The ability to put parts together to form a new whole. 
This level involves production of unique communications, a plan of 
operations, or a set of abstract relations. Terminology for 
achievement: categorizes, combines, compiles, composes, creates, 
devises, designs, explains, generates, modifies, organizes, plans, 
rearranges, reconstructs, relates, reorganizes, revises, rewrites, 
summarizes, tells, and writes.

Levels of learning: Evaluation; 
Definitions: The ability to judge the value of material for a given 
purpose. Judgments are to be based on defined internal (organizational) 
or external (relevance to the purpose) criteria. Criteria are subject 
to value judgments. Terminology for achievement: appraises, criticizes, 
discriminates, explains, justifies, interprets, and supports.

Source: DOD.

Note: These terms, listed in increasing levels of achievement, are used 
to define the Joint PME learning objectives for PME schools.

[End of table]

[End of section]

Appendix V: Comments from the Department of Defense: 

OFFICE OF THE UNDER SECRETARY OF DEFENSE:

4000 DEFENSE PENTAGON WASHINGTON, D.C. 20301-4000:

PERSONNEL AND READINESS:

JUL 23 2004:

Mr. Neal P. Curtin:
Director, Defense Capabilities and Management: 
U.S. Government Accountability Office: 
Washington, D.C. 20548:

Dear Mr. Curtin:

This is the Department of Defense (DoD) response to the Government 
Accountability Office Draft Report GAO-04-873 "MILITARY EDUCATION: DoD 
Needs to Develop Performance Goals and Metrics for Advanced Distributed 
Learning in Professional Military Education," July 2, 2004 (GAO Code 
350327). I would like to make you aware of my assessment that the 
utility and validity of the current report are problematic. I have 
outlined my concerns below.

This GAO report was initiated based on a request from Representative 
Skelton to assess the impact of Advanced Distributed Learning (ADL) on 
officer education. The current draft does not reflect the intent and 
does not acknowledge or address most of the identified objectives of 
the February 2003 engagement letter. The altered focus on assessment of 
ADL programs and the development of specific ADL performance goals and 
metrics do not correlate to the original intent and objectives.

The Department appreciates the opportunity to comment on this draft. Of 
the two recommendations, we partially concur with the first and concur 
with the second. The Department's comments to the GAO draft 
recommendations are enclosed. Technical comments on the entire draft 
report were provided separately.

Sincerely,

Signed by:

Paul W. Mayberry: 
Deputy Under Secretary: 
Readiness:

Enclosure: As stated:

GAO DRAFT REPORT - DATED JULY 2, 2004 GAO CODE 350327/GAO-04-873:

"MILITARY EDUCATION: DoD Needs to Develop Performance Goals and Metrics 
for Advanced Distributed Learning in Professional Military Education":

DEPARTMENT OF DEFENSE COMMENTS TO THE RECOMMENDATIONS:

RECOMMENDATION 1: The GAO recommended that the Secretary of Defense 
direct the Under Secretary of Defense for Personnel and Readiness, in 
concert with the Joint Staff, Service headquarters, and the 
professional military education schools, promote the development of 
specific performance effectiveness goals for advanced distributed 
learning.

(Page 15-16/GAO Draft Report):

DOD RESPONSE: Partially Concur.

The Department does support the use of specific performance 
effectiveness goals for professional military education. However, 
development of specific performance effectiveness goals for any 
specific delivery method, such as advanced distributed learning (ADL), 
is not appropriate. Educational outcomes are based on common standards, 
as defined in the Officer Professional Military Education Policy, 
regardless of delivery method.

RECOMMENDATION 2: The GAO recommended that the Secretary of Defense 
direct the Under Secretary of Defense for Personnel and Readiness to 
promote the use of advanced distributed learning technologies to 
capture data to provide knowledge about learning outcomes. (Page 16/GAO 
Draft Report):

DOD RESPONSE: Concur.

Metrics for learning outcomes are established by the schools and ADL 
technologies can capture data that can be used to evaluate the metrics. 
Current accreditation practices are already promoting the data-
collection capabilities of ADL technologies for assessing multiple 
delivery methods. 

GAO's Comment: 

The following is GAO's comment on the letter from the Department of 
Defense dated July 23, 2004.

1. When we initiated this engagement in February 2003, a key objective 
was to determine (1) the assumptions for DOD's decision to move 
officer senior- and intermediate-level service schools from 1-year 
residency to shorter periods by using ADL and (2) which courses and 
schools would be 
affected. Immediately after fieldwork commenced, however, DOD informed 
us that it was no longer actively pursuing that approach. In April 
2003, after consulting with the congressional requester, we informed 
our DOD point of contact that we would pursue the engagement's 
remaining objectives.

FOOTNOTES

[1] Department of Defense, Chairman of the Joint Chiefs of Staff 
Instruction 1800.01A, Officer Professional Military Education Policy, 
December 2000. 

[2] There are also senior- and intermediate-level schools sponsored by 
DOD through its National Defense University. These schools are designed 
to educate officers on joint matters. The senior-level schools are the 
National War College, the Industrial College of the Armed Forces, and the 
Joint and Combined Warfighting School-Senior at the Joint Forces Staff 
College. The intermediate-level school is the Joint and Combined 
Warfighting School-Intermediate at the Joint Forces Staff College.

[3] Advanced distributed learning, as defined by DOD's April 1999 ADL 
Strategic Plan and May 2000 ADL Implementation Plan, expands distance 
learning by emphasizing computer-based instruction; common standards; 
and use of reusable content, networks, and learning management systems 
in an "anytime, anyplace" environment.

[4] Joint Professional Military Education is a Joint Chiefs of Staff-
approved body of objectives, policies, procedures, and standards 
supporting the educational requirements for joint officer management. 
Joint Professional Military Education is a portion of PME that supports 
fulfillment of the educational requirements of joint officer 
management.

[5] U.S. General Accounting Office, Human Capital: A Guide for 
Assessing Strategic Training and Development Efforts in the Federal 
Government, GAO-04-546G (Washington, D.C.: March 2004).

[6] U.S. General Accounting Office, Military Education: DOD Needs to 
Enhance Performance Goals and Measures to Improve Oversight of Military 
Academies, GAO-03-1000 (Washington, D.C.: Sept. 10, 2003).

[7] The Middle States Association of Colleges and Schools is the 
regional accrediting agency for the U.S. Army War College. The New 
England Association of Schools and Colleges is the regional accrediting 
agency for the Naval War College. The Southern Association of Colleges 
and Schools is the regional accrediting agency for the Air Command and 
Staff College.

[8] Council for Higher Education Accreditation, Statement of Mutual 
Responsibilities for Student Learning Outcomes: Accreditation, 
Institutions, and Programs (Washington, D.C.: Sept. 2003).

[9] See GAO-04-546G.

[10] U.S. General Accounting Office, Distance Education: Improved Data 
on Program Costs and Guidelines on Quality Assessment Needed to Inform 
Federal Policy, GAO-04-279 (Washington, D.C.: Feb. 26, 2004).

[11] The Congress established the Web-based Education Commission to 
prepare a report to the President and the Congress that contains 
recommendations for legislation and administrative actions, including 
those pertaining to the appropriate federal role in determining the 
quality of educational software products. Members of the Commission 
included senators, representatives, and leaders from postsecondary 
institutions.

[12] See GAO-04-279.

[13] Acculturation is defined as a developmental activity that involves 
the adoption of customs, protocols, and doctrine. The acculturation 
process is designed to prepare officers for shared leadership positions 
while reinforcing total force partnerships.

[14] The percentages reported here are based on a sample of current 
students and graduates and are estimates. All percentage estimates from 
the survey reported have margins of error of plus or minus 
10 percentage points or less, unless otherwise noted.

[15] The confidence interval for this figure was +9 percent and -17 
percent.

GAO's Mission: 

The Government Accountability Office, the investigative arm of 
Congress, exists to support Congress in meeting its constitutional 
responsibilities and to help improve the performance and accountability 
of the federal government for the American people. GAO examines the use 
of public funds; evaluates federal programs and policies; and provides 
analyses, recommendations, and other assistance to help Congress make 
informed oversight, policy, and funding decisions. GAO's commitment to 
good government is reflected in its core values of accountability, 
integrity, and reliability.

Obtaining Copies of GAO Reports and Testimony: 

The fastest and easiest way to obtain copies of GAO documents at no 
cost is through the Internet. GAO's Web site (www.gao.gov) contains 
abstracts and full-text files of current reports and testimony and an 
expanding archive of older products. The Web site features a search 
engine to help you locate documents using key words and phrases. You 
can print these documents in their entirety, including charts and other 
graphics.

Each day, GAO issues a list of newly released reports, testimony, and 
correspondence. GAO posts this list, known as "Today's Reports," on its 
Web site daily. The list contains links to the full-text document 
files. To have GAO e-mail this list to you every afternoon, go to 
www.gao.gov and select "Subscribe to e-mail alerts" under the "Order 
GAO Products" heading.

Order by Mail or Phone: 

The first copy of each printed report is free. Additional copies are $2 
each. A check or money order should be made out to the Superintendent 
of Documents. GAO also accepts VISA and Mastercard. Orders for 100 or 
more copies mailed to a single address are discounted 25 percent. 
Orders should be sent to: 

U.S. Government Accountability Office
441 G Street NW, Room LM
Washington, D.C. 20548: 

To order by Phone: 

Voice: (202) 512-6000: 

TDD: (202) 512-2537: 

Fax: (202) 512-6061: 

To Report Fraud, Waste, and Abuse in Federal Programs: 

Contact: 

Web site: www.gao.gov/fraudnet/fraudnet.htm
E-mail: fraudnet@gao.gov

Automated answering system: (800) 424-5454 or (202) 512-7470: 

Public Affairs: 

Jeff Nelligan, managing director,
NelliganJ@gao.gov
(202) 512-4800

U.S. Government Accountability Office,
441 G Street NW, Room 7149
Washington, D.C. 20548: