This is the accessible text file for GAO report number GAO-03-52 
entitled 'Defense Acquisitions: Factors Affecting Outcomes of Advanced 
Concept Technology Demonstrations' which was released on December 02, 
2002.



This text file was formatted by the U.S. General Accounting Office 

(GAO) to be accessible to users with visual impairments, as part of a 

longer term project to improve GAO products’ accessibility. Every 

attempt has been made to maintain the structural and data integrity of 

the original printed product. Accessibility features, such as text 

descriptions of tables, consecutively numbered footnotes placed at the 

end of the file, and the text of agency comment letters, are provided 

but may not exactly duplicate the presentation or format of the printed 

version. The portable document format (PDF) file is an exact electronic 

replica of the printed version. We welcome your feedback. Please E-mail 

your comments regarding the contents or accessibility features of this 

document to Webmaster@gao.gov.



Report to the Subcommittee on Readiness and Management Support, 

Committee on Armed Services, 

U.S. Senate:



United States General Accounting Office:



GAO:



December 2002:



DEFENSE ACQUISITIONS:



Factors Affecting Outcomes of Advanced Concept Technology 

Demonstrations:






GAO-03-52:



GAO HIGHLIGHTS:

Highlights of GAO-03-52, a report to the Subcommittee on Readiness and 

Management Support, Committee on Armed Services, U.S. Senate:



DEFENSE ACQUISITIONS

Factors Affecting Outcomes of Advanced Concept Technology 
Demonstrations:



Why GAO Did This Study

The Advanced Concept Technology Demonstration (ACTD) program was

started by the Department of Defense (DOD) as a way to get new

technologies that meet critical military needs into the hands of users

faster and for less cost. GAO was asked to examine DOD’s process for

structuring and executing ACTDs.



What GAO Found

Since the ACTD program was started in 1994, a wide range of products

have been tested by technology experts and military operators in

realistic settings--from unmanned aerial vehicles, to friend-or-foe

detection systems, to biological agent detection systems, to advanced

simulation technology designed to enhance joint training. Many of

these have successfully delivered new technologies to users. In fact,

21 of 24 projects we examined that were found to have military utility

delivered at least some technologies to users that meet military needs.



Though the majority of the projects we examined transitioned

technologies to users, there are factors that hamper the ACTD process.

For example:



* Technology has been too immature to be tested in a realistic

setting, leading to cancellation of the demonstration.

* Military services and defense agencies have been reluctant to fund

acquisition of ACTD-proven technologies, especially those focusing on

joint requirements, because of competing priorities.

* ACTDs’ military utility may not have been assessed consistently.



Some of the barriers we identified can be addressed through efforts

DOD now has underway, including an evaluation of how the ACTD process

can be improved; adoption of criteria to ensure technology is

sufficiently mature; and placing more attention on the end phase of

the ACTD process. Other barriers, however, will be much more difficult

to address in view of cultural resistance to joint initiatives and the

requirements of DOD’s planning and funding process.



[See PDF for Image]

[End of Figure]



What GAO Recommends

We are recommending that DOD strengthen its criteria for assessing the

military utility of ACTD projects; consider ways to ensure funding is

provided for acquisitions; and have the Secretary weigh in on decisions

on whether to transition technologies that are tested under the

program.



DOD generally concurred with the recommendations on improving military

utility assessments and ensuring timely funding for the transition of

successful ACTD projects. DOD partially concurred with our

recommendation on obtaining high-level concurrence on any decision 

not to transition ACTD projects addressing joint requirements.



To view the full report, including the scope

and methodology, click on the link above.

For more information, contact Katherine Schinasi at (202) 512-4841 

or schinasik@gao.gov.



Contents:



Letter:



Results in Brief:



Background:



Twenty-one of 24 Projects Transitioned at Least Some Technologies to 

Users:



Some Factors Can Hamper the ACTD Process:



Initiatives Are Underway to Improve ACTD Outcomes:



Conclusions:



Recommendations for Executive Action:



Agency Comments and Our Evaluation:



Scope and Methodology:



Appendix I: Technology Readiness Levels and Their Definitions:



Appendix II: Comments from the Department of Defense:



Table:



Table 1: Summary of Outcomes:



Figures:



Figure 1: ACTD Process:



Figure 2: Technologies Tested in Military Operations in Urban Terrain 

ACTD:



Figure 3: Illustration of Factors Influencing Outcomes:



Abbreviations:



ACTD  Advanced Concept Technology Demonstration

DOD   Department of Defense

GCCS  Global Command and Control System

OSD   Office of the Secretary of Defense

TRL   Technology Readiness Level:



United States General Accounting Office:



Washington, DC 20548:



December 2, 2002:



The Honorable Daniel Akaka

Chairman

The Honorable James Inhofe

Ranking Minority Member

Subcommittee on Readiness and Management Support

Committee on Armed Services

United States Senate:



The Advanced Concept Technology Demonstration (ACTD) program was 

initiated by the Department of Defense (DOD) in 1994 as a way to get 

new technologies that meet critical military needs into the hands of 

users faster and at less cost than the traditional acquisition process. 

Under its traditional process, which takes an average of 10 to 15 years 

to develop a product, DOD explores various weapon concepts, defines 

what the specific weapon system will look like, refines plans through 

systems development and demonstration, and then produces the equipment 

in larger-scale quantities. By contrast, under the ACTD process, which 

takes an average of 2 to 6 years, military operators and developers 

test prototypes, which have already been developed and matured, in 

realistic settings. If they find these items to have military utility, 

DOD may choose to buy additional quantities or just use items remaining 

after the demonstration. If users find these items do not have utility, 

DOD may reject them altogether--an outcome that enables DOD to save 

time and money.



A key distinction between the traditional acquisition process and the 

ACTD process is that the ACTD process is intentionally set up to be 

much more flexible and streamlined. Decisions to move from stage to

stage are less formal, and the process itself is managed by a set of

guidelines, which contain advice and suggestions, as opposed to formal 

directives and regulations. This was done to encourage innovation and 

creativity as well as participation from the services and the defense 

agencies on projects that have joint applications.



You requested that we examine DOD’s process for structuring and 

executing ACTDs, particularly with respect to DOD’s ability to 

transition promising technologies to military users. In doing so, we 

reviewed 24 of the 99 projects that have been undertaken so far. Of the 

24 projects reviewed, 21 had transitioned at least some technologies 

found to have military utility to users as acquisition programs, 

residual items, or both. Among these were the Predator and Global Hawk 

unmanned aerial vehicles, devices to combat weapons of mass 

destruction, weapons and equipment for use in urban combat, and various 

information systems tools and decision aids.



Results in Brief:



Though the majority of the projects we examined had transitioned 

technologies to users, we found that there are opportunities for DOD to 

improve the ACTD process. These include (1) ensuring that candidate

technologies are mature enough to be tested in a realistic setting,

that military services and defense agencies sustain their commitment to

projects, especially those focusing on joint requirements, and that

appropriate expertise is employed for carrying out demonstrations and

transitions; and (2) developing specific criteria to evaluate

demonstration results. Such actions would enable the ACTD process to 

produce better candidates and help DOD to prevent delays and funding 

gaps.



DOD recognizes that the ACTD process could be improved. In response, it 

has adopted criteria that should help ensure technologies are 

sufficiently mature for the demonstrations. It is strengthening 

guidance so that projects can be planned and managed better. To 

maximize outcomes, DOD still needs to strengthen assessments of 

military utility and ensure that projects are adequately funded through 

the transition. We are making recommendations to DOD to address both 

issues.



In commenting on a draft of this report, DOD generally concurred with 

our recommendations on improving military utility assessments and on 

ensuring timely funding for the transition of successful ACTD projects. 

DOD partially concurred with our recommendation on obtaining high-level 

concurrence on any decision not to transition ACTD projects addressing 

joint requirements.



Background:



The ACTD process is intended to be much more flexible and streamlined 

than DOD’s formal acquisition process and in turn to save time and 

money. Under the ACTD program, prototypes are developed and provide 

users with the opportunity to demonstrate and assess the prototypes’ 

capabilities in realistic operational scenarios. From these 

demonstrations, users can refine operational requirements, develop an 

initial concept of operations, and determine the military utility of 

the technology before deciding whether additional units should be 

purchased. Not all projects are selected for transition into the normal 

acquisition process. Specifically, potential users can conclude that 

the technology (1) does not have sufficient military utility and that 

acquisition is not warranted or (2) has sufficient utility but that 

only the residual assets of the demonstration are needed and no 

additional procurement is necessary. Separate technologies within one 

project may even have varied outcomes.



DOD’s traditional approach to developing and buying weapons--which 

takes an average of 10 to 15 years--is marked by four phases: exploring 

various weapon concepts, defining what the specific weapon system will 

look like, refining plans through systems development and 

demonstration, and producing the equipment in larger-scale quantities 

and operating and supporting it in the field. Before a program can 

proceed to each phase, defense officials review its progress to 

evaluate the ability to meet performance goals and whether risk is 

under control.



The ACTD process is marked by three phases: selection of the projects, 

demonstration of the technologies, and residual use of prototypes and/

or the transition of them to acquisition programs if the services or 

defense agencies decide to acquire more. The selection process begins 

via a data call to both the research and development and warfighting 

communities. The “Breakfast Club,” a panel of technology experts from 

various organizations, reviews the potential candidates. Candidates 

selected by this panel are submitted to the Joint Requirements 

Oversight Council for prioritization and then to the Under Secretary of 

Defense for Acquisition, Technology and Logistics for a final 

selection. Decisions to move from stage to stage are less formal than

in the traditional acquisition process, and the process is managed by a set of

Office of the Secretary of Defense (OSD) guidelines, which contain 

advice and suggestions, as opposed to formal directives and 

regulations. While ACTD teams are to prepare management plans for the 

projects that spell out roles and responsibilities, objectives, and 

approaches, these plans are supposed to be flexible, short (less than 

25 pages), and high level. Figure 1 illustrates the major phases of the 

ACTD process.



Figure 1: ACTD Process:



[A] This phase had been shortened for fiscal year 2003 and 2004 

candidates.



[See PDF for Image]

[End of Figure]



The ACTD demonstration phase typically lasts an average of 2 to 4 

years, with an added 2-year residual phase. According to OSD, this 

provides ample time to develop fieldable prototypes and to allow users 

to evaluate them. For less complex systems or systems that are 

available quickly (e.g., commercial-off-the-shelf systems), the time 

line may be significantly shorter. Similarly, for very complex systems 

that require extensive integration and developmental testing, more time 

may be required. A key to keeping the time frame short, according to 

DOD, is beginning the demonstration with mature technology. This 

prevents delays associated with additional development and rework.



The ACTD process places the highest priority on addressing joint 

military needs, although some ACTDs focus on service specific 

capabilities. For example, DOD has found that combat identification 

systems across the services needed to be enhanced to reduce fratricide 

so that systems belonging to individual services and components, and 

even allies, could work together more effectively. As a result, it 

undertook an ACTD project that tested new technology designed to 

improve the capability of combat forces to positively identify hostile, 

friendly, and neutral platforms during air-to-surface and surface-to-

surface operations. Another ACTD project was designed to demonstrate 

the capability to conduct joint amphibious mine countermeasure 

operations. Recently, some ACTD programs have focused on enhancing 

homeland security with domestic agencies. For example, DOD is now 

testing a command and control system that will allow emergency 

personnel first responding to the scene of an attack to talk to each 

other and have better situational awareness.



ACTDs are funded by a variety of sources, including the office within 

OSD with the oversight responsibility for the ACTD program and the 

military services or defense agencies responsible for conducting the 

demonstrations and/or the transitions. In fiscal year 2001, a total of 

$546 million was budgeted for ACTDs--$120 million from OSD and $426 

million from the services and agency partners. Participating combatant 

commands provide additional resources through their support of 

training, military exercises, and other resources. Funding to acquire 

and maintain additional units comes from service and agency budgets.



Twenty-one of 24 Projects Transitioned at Least Some Technologies to 

Users:



Of the 24 projects we reviewed, 21 transitioned at least some 

technologies to users, meaning that users found that these had some 

level of military utility and that a military service or a defense 

agency chose to accept and fund their transition in the form of 

residual assets or as an acquisition.



* For 13 of these projects, the services or agencies decided to acquire 

more of the items tested, and as a result, transitioned the items into 

formal acquisition programs. Two of the 13 had no residual assets in 

use.



* For 8 projects, the services/agencies decided not to acquire 

additional items, but to continue using the residual assets.



* Three projects had no residual assets and no acquisition planned.



However, some of these projects experienced mixed outcomes--e.g., some 

technologies may have ended up in residual use while others were 

acquired or rejected altogether or the lead military service may have 

rejected the technology while other components decided to acquire it. 

For example:



* The Counterproliferation I project consisted of a variety of 

technologies, including sensors, targeting systems, and advanced 

weapons, designed to find and destroy nuclear, biological, and chemical 

facilities. The technologies were used in military operations in 

Kosovo. For example, an improved infrared sensor that can assess bomb 

damage to facilities was accepted by the Air Force as an upgrade to its 

standard targeting pod. Two other technologies--a hard target-

penetrating bomb and a fuzing[Footnote 1] system--have transitioned to 

production and are expected to achieve initial operational capability 

in fiscal year 2003. However, the project’s weapon-borne sensor 

technology did not prove to be mature enough and was dropped from the 

ACTD prior to any demonstrations.



* The Link-16 project demonstrated interoperability between the 

Link-16 communications link and other variable message format systems 

to improve situational awareness, interdiction, surveillance, and close 

air support. No service has adopted it for formal acquisition, but some 

regional combatant commanders and lower-level commands have purchased 

additional systems. Since the system was not adopted across DOD, its 

utility could not be optimized.



* The Military Operations in Urban Terrain project field-tested 128 

items designed to enhance operations in urban environments--such as 

attacking and clearing buildings of enemy troops. Of these, 32 

technologies were determined to have merit and were kept as residual 

items to be further evaluated. Some of these have already transitioned 

or are planned for transition to acquisition programs, including a 

door-breaching round, a man-portable unmanned aerial vehicle, elbow and 

kneepads, explosive cutting tape, ladders, body armor, and flexible 

restraining devices.



Figure 2: Technologies Tested in Military Operations in Urban Terrain 

ACTD:



[See PDF for Image]

[End of Figure]



Notes: SOF Personal Equipment Advanced Requirements (SPEAR), 

Unmanned Aerial Vehicle (UAV).



Table 1: Summary of Outcomes:



Moved into acquisition 

(in whole or in part): Battlefield Awareness and Data Dissemination; 

Technologies to enhance sharing of intelligence and other data; Used 

residuals: Adaptive Course of Action; Technologies to facilitate crisis 

planning (e.g., enabling simultaneous viewing of battle plans as they 

develop); No residual or acquisition: Consequence Management; 

Technologies to detect and model biological warfare agents.



Moved into acquisition 

(in whole or in part): Unattended Ground Sensors; Sensors to enhance 

capabilities to detect, locate, identify, and report time-critical 

targets; Used residuals: Common Spectral Measurement and Signature 

Exploitation; Technologies to show tactical utility of measurement and 

signature intelligence; No residual or acquisition: Joint Modular 

Lighter; Modular causeway system.



Moved into acquisition 

(in whole or in part): Counterproliferation I; Technologies to help 

detect and respond to nuclear, biological, and chemical threats; Used 

residuals: Information Assurance: Automated Intrusion Detection 

Environment; Technologies to assess attacks on computer networks; No 

residual or acquisition: Miniature Air Launched Decoy; Small

air-launched decoy system to suppress enemy air defense systems.



Moved into acquisition 

(in whole or in part): Small Unit Logistics; Software for logistics 

mission planning; Used residuals: Joint Logistics; Software to support 

logistics planning; No residual or acquisition: [Empty].



Moved into acquisition 

(in whole or in part): Human Intelligence and Counterintelligence 

Support Tools; Off-the-shelf technology to support intelligence 

operations; Used residuals: Precision/Rapid Counter Multiple Rocket 

Launcher; Technologies designed to facilitate strikes against North 

Korean long-range artillery; No residual or acquisition: [Empty].



Moved into acquisition 

(in whole or in part): Joint Countermine[A]; Technologies to facilitate 

amphibious mine countermeasure operations; Used residuals: Navigation 

Warfare; Jamming, antijamming, and other electronic technologies; No 

residual or acquisition: [Empty].



Moved into acquisition 

(in whole or in part): Military Operations in Urban Terrain; 

Technologies to assist operations in urban environments; Used 

residuals: Personnel Recovery Mission Software; Software to facilitate 

personnel recovery operations; No residual or acquisition: [Empty].



Moved into acquisition 

(in whole or in part): Predator; Medium altitude endurance unmanned 

aerial vehicle; Used residuals: Link 16; Software to facilitate sharing 

of tactical information across military services; No residual or 

acquisition: [Empty].



Moved into acquisition 

(in whole or in part): Portal Shield; Technologies to detect and 

identify biological attacks on air bases or ports; Used residuals: 

[Empty]; No residual or acquisition: [Empty].



Moved into acquisition 

(in whole or in part): Rapid Force Projection Initiative[A]; Long-range 

precision sensors, weapon systems, munitions, and digital 

communications systems designed to defeat an enemy armored force; Used 

residuals: [Empty]; No residual or acquisition: [Empty].



Moved into acquisition 

(in whole or in part): Global Hawk; High-altitude, long endurance 

unmanned aerial vehicle; Used residuals: [Empty]; No residual or 

acquisition: [Empty].



Moved into acquisition 

(in whole or in part): Synthetic Theater of War; Simulation 

technologies to support joint training and mission rehearsals; Used 

residuals: [Empty]; No residual or acquisition: [Empty].



Moved into acquisition 

(in whole or in part): Combat Identification[A]; Technologies to 

identify friendly and hostile forces; Used residuals: [Empty]; No 

residual or acquisition: [Empty].



[A] One of three projects that did not also have residual assets in 

use.



Source: GAO’s analysis.



[End of table]



Some Factors Can Hamper the ACTD Process:



Though the majority of the projects we examined transitioned 

technologies to users, we identified a range of factors that hampered 

this process. Specifically:



* The technology has been too immature to be tested in a realistic 

setting, leading to possible cancellation of the demonstration.



* The military services and defense agencies have been reluctant to 

fund acquisition of ACTD-proven technologies, especially those focusing 

on joint requirements, because of competing priorities.



* Appropriate expertise has not been employed for demonstrations and 

transitions.



* Transition for software projects has not been adequately planned.



* DOD lacks specific criteria to evaluate demonstration results, which 

may cause acquisition decisions to be based on too little knowledge.



At times, top-level support can overcome these barriers. But more 

systemic improvements focused on transition planning and funding 

commitment could reduce the need for high-level intervention. Figure 3 

highlights the specific factors we identified.



Figure 3: Illustration of Factors Influencing Outcomes:



[See PDF for Image]

[End of Figure]



Technology Maturity:



Because ACTDs are often conducted during large-scale, force-on-force 

military exercises, any new systems being tested must be dependable, 

able to perform as intended, and available on schedule in order not to 

negatively affect the exercises. As such, DOD has stressed that new 

technologies proposed for ACTDs should be “mature,” that is, they 

should have already been demonstrated to perform successfully at the 

subsystem or component level.



The technology of the ACTDs in our sample was not always mature. In 

some cases, problems were fairly basic, such as a technology having 

an inadequate power supply or being too heavy and bulky to carry out its 

intended operation. In other cases, technologies had not reached a 

point where they could be tested in a realistic setting, forcing users 

to forego certain parts of a test. For example:



* The Joint Countermine project tested 15 technologies, including 

detection systems and clearance/breaching systems. During 

demonstration, users found that detection technologies had unacceptably 

high false alarm rates and a mine and heavy obstacle clearing device 

was simply too heavy, bulky, slow, and difficult to operate remotely. 

Moreover, several systems could not be demonstrated on their intended 

platforms, or even associated with a suitable substitute platform. 

Further, a number of critical operational sequences, such as launch/

recovery, ordnance handling, and system reconfiguration, had not been 

demonstrated. As a result, only some technologies in this project have 

transitioned.



* The Consequence Management project examined 15 technologies designed 

to identify and respond to a biological warfare threat. During 

demonstration, users found that some of the items used to collect 

samples failed to operate and did not have sufficient battery 

capability and that switches broke. None of the other technologies 

performed flawlessly, and limitations such as size and weight made it 

apparent that they were not field ready. None of the technologies from 

this project entered into the acquisition process, nor did DOD continue 

to use any of the residual assets.[Footnote 2]



* Technologies supporting the Joint Modular Lighter System, a project 

testing a modular causeway system, failed during the demonstration 

because they had not been properly designed to withstand real world sea 

conditions. Consequently, the ACTD was concluded without a 

demonstration.



* The Navigation Warfare project, which focused on validating 

technologies for electronic warfare countermeasures, was terminated 

after DOD found that some of the technologies for the project could not 

be demonstrated. Some of the jamming technologies associated with this 

project are still being evaluated.



The technical maturity of software is also vital to successful 

demonstrations. If software is not able to work as intended, a 

project’s demonstration may be limited as a consequence. For this 

reason, one ACTD operations manager stressed that software technologies 

should be as mature as possible at the start of the ACTD. One ACTD 

included in our review experienced problems with software immaturity 

going into demonstration. Because software technologies in the 

Battlefield Awareness and Data Dissemination ACTD were not mature, 

certain planned exercises could not be concluded.



Before fiscal year 2002, OSD’s guidance only generally described the 

expectations for technology maturity and OSD did not use a consistent, 

knowledge-based method for measuring technology maturity of either 

hardware or software technologies. Specifically, OSD officials 

selecting the ACTDs used simple ranking schemes to capture the degree 

of technical risk after consulting with subject area experts. The 

results of these efforts were not usually documented. Studies conducted 

by the Congressional Budget Office in 1998 and DOD’s Inspector General 

in 1997 also found that without guidelines on how to assess maturity, 

DOD officials defined mature technology in widely contrasting ways.



In the last year, OSD has changed its guidance to address this problem. 

Specifically, it now requires technology maturity to be assessed using 

the same criteria--technology readiness levels (TRLs)--that DOD uses to 

assess technical risk in its formal acquisition programs.[Footnote 3] 

This change is discussed in more detail later in this report.



Sustaining Commitment:



Although OSD provides start-up funding for ACTDs, the military services 

and defense agencies are ultimately responsible for financing the 

acquisition and support of equipment or other items that may result 

from an ACTD. At times, however, the military services did not want to 

fund the transition process. This reluctance either slowed down the 

acquisition process or resulted in no additional procurements. Projects 

that were particularly affected by this reluctance included those that 

tested unmanned aerial vehicles and software applications for enhancing 

the performance of a system to defeat enemy artillery. In other cases, 

DOD leaders stepped in to support the projects since there was a strong 

need for the technology and/or an extremely successful demonstration.



For example:



* The Predator is a medium-altitude unmanned aerial vehicle used for 

reconnaissance that progressed from a concept to a three-system 

operational capability in less than 30 months. The Predator ACTD was 

initiated in 1995. Since then, the Predator has been deployed in a 

range of military operations, most recently in the war in Afghanistan. 

Twelve systems, each containing four air vehicles, are being procured. 

The Air Force was designated as the lead service for the ACTD, even 

though it had shown no interest in this or other unmanned aerial 

vehicle programs. A transition manager was never assigned to this 

project. The Defense Airborne Reconnaissance Office was also reluctant 

to field and support the system beyond the test-bed phase. Further, at 

one point, the project almost ran out of funds before its end. 

Nevertheless, the Joint Staff directed the Air Force to accept the 

system from the Army and the Navy, which had acted as co-lead services 

throughout the demonstration phase.



* The Global Hawk is a high-altitude unmanned aerial vehicle designed 

for broad-area and long-endurance reconnaissance and intelligence 

missions. It has also been successfully used in recent military 

missions. The Air Force was also reluctant to fund this program. 

Nevertheless, eventually the Air Force had to accept the system since 

the system answered a critical need identified during the Gulf War, was 

considered to be a success in demonstration, and received support from 

the President, the Secretary of Defense, and the Congress.



In at least one case, the Precision/Rapid Counter Multiple Rocket 

Launcher ACTD, DOD did not overcome this reluctance and, in turn, missed out 

on an opportunity to acquire important warfighting capabilities with 

joint applications. This project successfully demonstrated improved 

capability in rocket launch detection, command and control, and 

counterfire necessary for countering the threat from North Korean 

multiple rocket artillery with a system called the Automated Deep 

Operations Coordination System (ADOCS). Following the demonstration, 

the Army--the lead service for the project--decided not to formally 

acquire technologies since it was pursuing a similar development 

program. Meanwhile, the Navy, the Air Force, and the United States 

Forces, Korea, have acquired and deployed their own unique versions of 

the software.



The military services may not want to fund technologies focusing on 

meeting joint requirements either because they do not directly affect 

their individual missions and/or because there are other service-

specific projects that the services would prefer to fund. At the same 

time, OSD officials told us that they lack a mechanism for ensuring 

that decisions on whether to acquire items with proven military utility 

are made at the joint level, and not merely by the gaining 

organizations, and that these acquisitions receive the proper priority. 

DOD’s Joint Requirements Oversight Council, which is responsible for 

validating and prioritizing joint requirements, plays a role in 

deciding which ACTD nominees are selected for demonstration, but it 

does not have a role in the transition decision process, and is not 

currently concerned with transition outcomes. Moreover, no other DOD 

organization appears to have been given authority and responsibility 

for decisions regarding joint acquisition, integration, and support 

issues.



Another factor hindering transition funding has been the lack of 

alignment of the ACTD transition process with the DOD planning process. 

The planning process requires the services/agencies to program funds 

for technology transition long before those assuming 

transition responsibilities know whether a candidate technology is 

useful to them. Consequently, at times, the services/agencies had to 

find funds within their own budgets to fund the transition.



ACTD Management:



Not involving staff with the appropriate expertise to carry out 

demonstrations and transition planning--in all phases of 

the ACTD process--may also affect ACTD outcomes. OSD’s guidance 

recommends that ACTDs use Integrated Product Teams to organize and 

conduct ACTDs. Integrated Product Teams bring together different skill 

areas (such as engineering, purchasing, and finance). By combining 

these areas of expertise into one team, there is no need to have 

separate groups of experts work on a product sequentially. We have 

reported in the past that this practice improved both the speed and 

quality of the decision-making process in developing weapon systems. 

[Footnote 4] Conversely, not involving the acquisition, test, and 

sustainment communities precludes the opportunity for OSD to understand 

during the demonstrations the significant issues that will arise after 

transition. In some cases, ACTD projects did not employ a “transition 

manager” as called for by OSD’s guidance. This manager, working for the 

service or the agency leading the demonstration, is to prepare the 

transition plan and coordinate its execution. When a manager was not 

designated, these duties often fell to a technical manager, who was 

primarily responsible for planning, coordinating, and directing all 

development activities through the demonstration. One ACTD--the Human 

Intelligence and Counterintelligence Support Tools--experienced high 

turnover in the “operational manager” position. Specifically, it had 

five different operational managers over its life. The operational 

manager, who represents the ACTD sponsoring command, is responsible for 

planning and organizing demonstration scenarios and exercises, defining 

a concept of operations for the ACTD, assessing whether the project has 

military utility, and making recommendations based on that assessment.



In addition to not involving the right people, at times ACTDs simply 

did not anticipate issues important to a successful transition early in 

the process. OSD’s guidance calls on teams to prepare a transition 

strategy that includes a contracting strategy and addresses issues such 

as interoperability, supportability, test and evaluation, 

affordability, funding, requirements, and acquisition program 

documentation. The guidance also suggests that the transition strategy 

anticipate where in the formal acquisition process the item would enter 

(e.g., low rate initial production or system development and 

demonstration) or even whether the item could be acquired informally, 

for example, through small purchases of commercially available 

products. In any case, the lead service is responsible for 

determining the transition timing, nature, and funding methodology. In 

two ACTDs, a transition strategy was never developed. Both of these 

projects ended up transitioning only as residual assets.



The 1998 Congressional Budget Office study identified similar problems 

with transition planning. The study specifically noted that while DOD 

calls for each management plan to include some discussion of possible 

acquisition costs, few plans did so. The Congressional Budget Office 

asserted that this was probably because so little was known about a 

project’s future at its start. Even when more was known later in the 

demonstration, however, plans remained sketchy.



Software Challenges:



Software technologies present special planning challenges for 

transition. Because of the fast-paced nature of advanced technology, it 

is critical to move software ACTD projects through the demonstration 

and transition phases quickly so that they are not outdated by the time 

they are acquired or integrated into existing software programs and 

databases. At the same time, transition might be slowed by 

incompatibilities between the operating systems or languages of the 

ACTD candidate technologies and those of the intended host. 

Integration can be difficult because newer applications, particularly 

commercial-off-the-shelf systems, may be built to different technical 

standards or use different languages or supporting programs.



It was apparent in several ACTDs that there were technical difficulties 

in integrating the new technologies into their intended platforms. For 

example, the Adaptive Course of Action project tested software tools 

intended to enhance DOD’s Global Command and Control System (GCCS) 

specifically by facilitating near real-time collaborative joint 

planning by multiple participants during crisis action planning. In 

this case, transition has been slowed and may not occur at all 

because the software module cannot be easily integrated into GCCS 

(partially due to its use of a different database program) and DOD has 

not analyzed other functionality and security issues associated with 

adding the new module. In another project, Battlefield Awareness and 

Data Dissemination, which focused on providing a synchronized, 

consistent battlespace description to warfighters, the transition had a 

mixed outcome. One collection of software applications was successfully 

transitioned to GCCS, but the transition of others was not as 

successful. The software application that was successfully integrated 

was an update of existing GCCS applications and the developers of the 

software had good working relationships with GCCS managers. The 

software that experienced problems was not as compatible.



Military Utility Assessments:



Another factor potentially affecting the outcomes of ACTDs is the lack 

of specific criteria for making assessments of military utility. These 

assessments evaluate the technologies of ACTD projects after the 

demonstrations. It is important that OSD have some assurance that the 

assessments are fact-based, thorough, and consistent, because military 

users base their transition recommendations on them. OSD’s guidance 

calls for measures of 

effectiveness and performance to help gauge whether an item has 

military utility. It defines measures of effectiveness as high-level 

indicators of operational effectiveness or suitability and measures of 

performance as technical characteristics that determine a particular 

aspect of effectiveness or suitability. But the guidance does not 

suggest how detailed the measures should be, what their scope should 

be, or what format they should take. Consequently, we found that the 

scope, content, and quality of military utility assessments varied 

widely. For some of the ACTDs we reviewed, no documentation on military 

utility could be found. Without more specific criteria, customized for 

each ACTD, there is a risk that decisions on whether to acquire an item 

will be based on unsound data.



Initiatives Are Underway to Improve ACTD Outcomes:



DOD has undertaken several initiatives to improve the ACTD process, 

including adopting criteria to ensure technology is sufficiently 

mature; evaluating how the ACTD process can be improved; and placing 

more attention on transition planning and management (rather than on 

simply the selection and demonstration phases) through additional 

guidance, training, and staffing. These initiatives target many of the 

problems that can hinder success; however, DOD has not addressed the 

need to establish specific criteria for assessing the military utility 

of each of the candidate technologies and to establish a mechanism to 

ensure funding is made available for the transition.



Specifically, DOD headquarters, commands, military services, and a 

defense agency have undertaken the following efforts.



* OSD has adopted the same TRL criteria for fiscal year 2003 ACTD 

projects that DOD uses for assessing technical risks in its formal 

acquisition programs. These criteria apply to hardware as well as 

software. Adhering to this standard should help DOD to determine 

whether a gap exists between a technology’s maturity and the maturity 

demanded for the ACTD. TRLs measure readiness on a scale of one to 

nine, starting with paper studies of the basic concept, proceeding with 

laboratory demonstrations, and ending with a technology that has proven 

itself on the intended item. According to a senior OSD official, 

projects must be rated at least at TRL 5 when they enter the 

demonstration phase. This means that the basic technological components 

of the item being demonstrated have been integrated with reasonably 

realistic supporting elements so that the technology can be tested in a 

simulated environment. An example would be when initial hand-built 

versions of a new radio’s basic elements are connected and tested 

together. We reviewed submissions for the final 16 fiscal year 2003 

ACTD candidates and found that actual and projected TRLs of each 

technology ranged from 4 to 9.[Footnote 5] According to a senior OSD 

official, during the review of fiscal year 2003 candidates, some 

technologies with a TRL rating of 4 were accepted for 

demonstration because the need for them was compelling.



* In early 2002, OSD reviewed the ACTD process to examine current ACTDs 

for relevance in a changing military environment, to identify ways to 

ensure that projects add value, and to enhance transition. 

The results of this review included recommendations for additional 

discipline and informational requirements in the ACTD candidate 

selection phase, increased program management focus on the execution 

phase, and more emphasis on management oversight.



* OSD has also designated a staff member to manage transition issues 

and initiated a training program for future ACTD managers. This 

training will emphasize technology transition planning and execution.



* To enhance future technology transitions, OSD has taken action to 

better align the ACTD selection and the DOD planning and programming 

process. Moreover, OSD has issued new guidance for the fiscal year 2004 

ACTD candidates that calls on the gaining military or defense agencies 

to identify funds specifically for the demonstration and the 

transition, appoint a dedicated transition manager, and develop a 

transition plan before OSD will approve future ACTD candidates.



* The combatant commanders, military services, and a defense agency are 

also strengthening their guidance for conducting ACTDs. For example, 

the U.S. European Command has updated its guidance and the U.S. Joint 

Forces Command has developed detailed guidance for selecting and 

managing ACTDs. Additionally, the U.S. Pacific Command has developed 

definitive policies, procedures, and responsibilities for sponsoring 

and co-sponsoring ACTD programs. The U.S. Special Operations Command 

issued a policy memorandum for ACTD participation. The Army has begun 

development of an ACTD tracking system. It is also requiring ACTD 

candidate submissions to include TRL and other quantitative 

information. The Air Force has drafted both a policy directive and an 

instruction regarding ACTDs. The four services have begun meetings 

among themselves to discuss and review their future ACTD candidates. 

The Defense Information Systems Agency is also engaged in an effort to 

improve the transition of software technologies to users of systems 

such as GCCS.



Collectively, these efforts target many of the factors that can impede 

the ACTD process. However, OSD has not yet taken steps to develop 

specific criteria for assessing whether each of the ACTD candidates 

meets military needs. More guidance in this regard, particularly with 

respect to the scope and depth of these assessments and the need to 

document their results, can help to make sure (1) decisions are based 

on sound information and (2) items that could substantially enhance 

military operations are acquired. Moreover, while OSD is requiring 

services and agencies to identify funds for demonstration and 

acquisition early in the process, it does not have a mechanism for 

ensuring that this funding will be provided. As a result, DOD may 

continue to experience difficulty in getting the services to fund 

projects that meet joint needs but do not necessarily fit in with their 

own unique plans.



Conclusions:



The ACTD process has achieved some important, positive results in terms 

of developing and fielding new technologies to meet critical military 

needs quickly and more cost-effectively. DOD recognizes that further 

improvements are needed to increase opportunities for success. Its 

efforts to strengthen assessments of technology readiness and 

management controls--combined with more consistent, fact-based 

assessments of military utility--should help ensure that the ACTD 

program will produce better candidates. However, DOD’s initiatives will 

be challenging to implement since they require decision makers to 

balance the need to preserve creativity and flexibility within the ACTD 

process against the need for structure and management control. 

Moreover, to fully capitalize on the improvements being made, DOD needs 

to ensure that the services sustain their commitment to projects, 

especially those shown to meet critical joint military needs. This will 

also be a challenge because it will require DOD to overcome the 

services’ and agencies’ cultural resistance to joint initiatives and its 

lack of a programming and funding process for joint acquisitions. A 

good starting point may be to require the 

services and agencies to designate funding for ACTD transition 

activities and to have the Secretary of Defense weigh in on decisions 

on whether to continue to acquire technologies that are tested and 

proven under the ACTD program.



Recommendations for Executive Action:



To ensure that transition decisions are based on sufficient knowledge, 

we recommend that the Secretary of Defense develop and require the use 

of specific criteria for assessing the military utility of each of the 

technologies and concepts that are to be demonstrated within each ACTD. 

The criteria should at a minimum identify measurement standards for 

performance effectiveness and address how results should be reported in 

terms of scope, format, and desired level of detail.



To ensure funding of the transition and its aftermath, we recommend 

that the Secretary of Defense explore the option of requiring the 

services or defense agencies to develop a category within their budgets 

specifically for ACTD transition activities, including procurement and 

follow-on support.



To ensure that transition decisions reflect DOD’s priorities, we 

recommend that the Secretary of Defense require that the lead service 

or defense agency obtain the concurrence of the Secretary’s designated 

representative on any decision not to transition an ACTD that is based 

on joint requirements and determined to be militarily useful.



Agency Comments and Our Evaluation:



In commenting on a draft of this report, DOD generally concurred with 

the first two recommendations and outlined the actions to be taken to 

(1) define ACTD measurement standards and reporting formats for 

military utility assessments, and (2) work with the services to enhance 

their ability to enable follow-on transition and support of ACTD 

products. DOD partially concurred with our recommendation on the 

transition of militarily useful technology intended to address joint 

requirements. DOD stated that it would work to provide more information 

to the Joint Staff on specific ACTD results and evaluate quarterly 

meetings between the service acquisition executives and the Under 

Secretary of Defense for Acquisition, Technology and Logistics as a 

possible forum to raise issues on specific ACTDs. These actions may not 

address the intent of the recommendation, which is to provide the joint 

warfighter the opportunity to influence DOD’s investment decisions. 

The ACTD program offers a good opportunity in the DOD acquisition 

system to evaluate equipment and concepts in the joint warfighting 

environment. However, while ACTDs often start based on a joint 

requirement, that perspective and priority may change when it comes to 

transition issues. For DOD’s actions to address this 

condition, the joint perspective should be better represented 

in ACTD transition decisions. DOD’s comments are reprinted in appendix II.



Scope and Methodology:



Between fiscal year 1995 and 2002, DOD initiated 99 ACTDs. As we began 

our review, 46 of these had completed their demonstration phase or had 

been canceled. We reviewed 24 of these in detail. We could not review 

the remainder to the same level of detail because their military 

utility assessments were incomplete or not available and because we did 

not choose to present information on those projects that were highly 

classified. To assess the results of the completed ACTDs, we examined 

each project’s military utility assessment documents, final program 

reports, lessons learned reports, and other pertinent ACTD documents, 

such as the program acquisition strategies. We interviewed operational 

and technical managers and other knowledgeable program officials at the 

unified combatant commands, defense agencies, and the services to 

discuss the phases of each ACTD project and its transition status.



Specifically, we interviewed officials at the Science and Technology 

Office of the United States Pacific Command, Camp Smith, Hawaii; the 

European Command, Stuttgart, Germany; the Central Command, Tampa, 

Florida; the Special Operations Command, Tampa, Florida; the Joint 

Forces Command, Norfolk, Virginia; the Air Combat Command, Hampton, 

Virginia; the Army Training and Doctrine Command, Hampton, Virginia; 

and the Marine Corps Warfighting Lab, Quantico, Virginia. We also 

contacted ACTD officials at the Program Executive Office of the Air 

Base and Port Biological Program Office, Falls Church, Virginia; the 

Defense Information Systems Agency, Falls Church, Virginia; the Defense 

Advanced Research Projects Agency, Arlington, Virginia; the Defense 

Threat Reduction Agency, Fort Belvoir, Virginia; and the Defense 

Intelligence Agency, Arlington, Virginia.



To determine the factors that affected the transition outcomes of 

completed ACTD projects, we met with the operational and technical 

managers for each ACTD as well as other knowledgeable program officials 

and the designated ACTD representatives from each of the services. We 

compared information gathered on the individual ACTDs to discern those 

factors that were salient in a majority of the cases. In order to 

better understand ACTD program guidance, funding, and management that 

can affect transition outcomes, we spoke with relevant officials within 

the office of the Deputy Undersecretary of Defense, Advanced Systems 

and Concepts (DUSD (AS&C)), including staff responsible for funding and 

transition issues, and the Executive Oversight Manager for each ACTD. 

We also discussed ACTD management and transition issues with 

representatives of the DUSD (AS&C), Comptroller; the Joint Staff; and 

the Director, Defense Research and Engineering; the Defense Advanced 

Research Projects Agency; and the Defense Information Systems Agency. 

We did not conduct a detailed review of the users’ acceptance or 

satisfaction with the items produced by the ACTD process.



We conducted our review between October 2001 and October 2002 in 

accordance with generally accepted government auditing standards.



We are sending copies of this report to the Chairmen and Ranking 

Minority Members of the Subcommittee on Defense, Senate Committee on 

Appropriations; the House Committee on Armed Services; and the 

Subcommittee on Defense, House Committee on Appropriations; and the 

Secretaries of Defense, the Army, the Navy, and the Air Force. We are 

also sending copies to the Director, Office of Management and Budget. 

In addition, this report will be made available at no charge on the GAO 

Web site at http://www.gao.gov.



If you or your staff have questions concerning this report, please 

contact me at (202) 512-4841. Others who made key contributions to this 

report include William Graveline, Tony Blieberger, Cristina Chaplain, 

Martha Dey, Leon Gill, and Nancy Rothlisberger.



Katherine V. Schinasi

Director, Acquisition and Sourcing Management:

Signed by Katherine V. Schinasi:



[End of section]



Appendix I: Technology Readiness Levels and Their Definitions:






Technology readiness level: 1. Basic principles observed and reported.; 

Description: Lowest level of technology readiness. Scientific research 

begins to be translated into applied research and development. Examples 

might include paper studies of a technology’s basic properties.



Technology readiness level: 2. Technology concept and/or application 

formulated.; Description: Invention begins. Once basic principles are 

observed, practical applications can be invented. The application is 

speculative and there is no proof or detailed analysis to support the 

assumption. Examples are still limited to paper studies.



Technology readiness level: 3. Analytical and experimental critical 

function and/or characteristic proof of concept.; Description: Active 

research and development is initiated. This includes analytical studies 

and laboratory studies to physically validate analytical predictions of 

separate elements of the technology. Examples include components that 

are not yet integrated or representative.



Technology readiness level: 4. Component and/or breadboard validation 

in laboratory environment.; Description: Basic technological 

components are integrated to establish that the pieces will work 

together. This is relatively “low fidelity” compared to the eventual 

system. Examples include integration of “ad hoc” hardware in a 

laboratory.



Technology readiness level: 5. Component and/or breadboard validation 

in relevant environment.; Description: Fidelity of breadboard 

technology increases significantly. The basic technological components 

are integrated with reasonably realistic supporting elements so that 

the technology can be tested in a simulated environment. Examples 

include “high fidelity” laboratory integration of components.



Technology readiness level: 6. System/subsystem model or prototype 

demonstration in a relevant environment.; Description: Representative 

model or prototype system, which is well beyond the breadboard tested 

for technology readiness level (TRL) 5, is tested in a relevant 

environment. Represents a major step up in a technology’s demonstrated 

readiness. Examples include testing a prototype in a high fidelity 

laboratory environment or in a simulated operational environment.



Technology readiness level: 7. System prototype demonstration in an 

operational environment.; Description: Prototype near or at planned 

operational system. Represents a major step up from TRL 6, requiring 

the demonstration of an actual system prototype in an operational 

environment, such as in an aircraft, vehicle or space. Examples include 

testing the prototype in a test bed aircraft.



Technology readiness level: 8. Actual system completed and “flight 

qualified” through test and demonstration.; Description: Technology has 

been proven to work in its final form and under expected conditions. In 

almost all cases, this TRL represents the end of true system 

development. Examples include developmental test and evaluation of the 

system in its intended weapon system to determine if it meets design 

specifications.



Technology readiness level: 9. Actual system “flight proven” through 

successful mission operations.; Description: Actual application of the 

technology in its final form and under mission conditions, such as 

those encountered in operational test and evaluation. In almost all 

cases, this is the end of the last “bug fixing” aspects of true system 

development. Examples include using the system under operational 

mission conditions.



[End of table]



[End of section]



Appendix II: Comments from the Department of Defense:



ACQUISITION, TECHNOLOGY AND LOGISTICS:



OFFICE OF THE UNDER SECRETARY OF DEFENSE:



3000 DEFENSE PENTAGON WASHINGTON, DC 20301-3000:



November 27, 2002:



Ms. Katherine V. Schinasi:



Director, Acquisition and Sourcing Management 

U.S. General Accounting Office:



441 G. Street, N.W. Washington, DC 20548:



Dear Ms. Schinasi:



This letter provides a modification to the enclosure of my letter dated 

November 14, 2002 which was the Department of Defense (DoD) response to 

the GAO Draft Report “DEFENSE ACQUISITIONS: Factors Affecting Outcomes 

of Advanced Concept Technology Demonstrations,” dated October 29, 2002.



Based on our coordination and your modifications to the subject Draft 

Report, I have modified my response to Recommendation 1 from “partially 

concur” to “concur.” My action officer for this effort is Mr. Ben 

Riley, (703) 602-0683, ben.riley@osd.mil:



Sincerely,



SUE C. PAYTON 

Deputy Under Secretary of Defense (Advanced Systems & 

Concepts):

Signed by Sue C. Payton:



ACQUISITION, TECHNOLOGY AND LOGISTICS:



OFFICE OF THE UNDER SECRETARY OF DEFENSE:



3000 DEFENSE PENTAGON WASHINGTON, DC 20301-3000:



November 14, 2002:



Ms. Katherine V. Schinasi:



Director, Acquisition and Sourcing Management 

U.S. General Accounting Office:



441 G. Street, N. W. Washington, DC 20548:



Dear Ms. Schinasi:



This is the Department of Defense (DoD) response to the GAO Draft 

Report “DEFENSE ACQUISITIONS: Factors Affecting Outcomes of Advanced 

Concept Technology Demonstrations,” dated October 29, 2002 (GAO Code 

120105).



The DoD has reviewed the draft report and partially concurs with 

Recommendations 1 and 3 and concurs with Recommendation 2. Specific 

comments for each recommendation are enclosed. My action officer for 

this effort is Mr. Ben Riley, (703) 602-0683, ben.riley@osd.mil



SUE C. PAYTON 

Deputy Under Secretary of Defense (Advanced Systems & 

Concepts):

Signed by Sue C. Payton:



Enclosure:



GAO DRAFT REPORT DATED OCTOBER 29, 2002 GAO CODE 120105/GAO-03-52:



“DEFENSE ACQUISITIONS: FACTORS AFFECTING OUTCOMES OF ADVANCED CONCEPT 

TECHNOLOGY DEMONSTRATIONS”:



DEPARTMENT OF DEFENSE COMMENTS TO THE RECOMMENDATIONS:



RECOMMENDATION 1: The GAO recommended that the Secretary of Defense 

develop and require the use of criteria for assessing the military 

utility of the technologies and concepts that are to be demonstrated 

within each Advanced Concept Technology Demonstration (ACTD). The 

criteria should at a minimum identify measurement standards for 

performance effectiveness and address how results should be reported in 

terms of scope, format, and desired level of detail. (p. 17/GAO Draft 

Report):



DOD Response: Partially Concur. The Deputy Under Secretary of Defense 

for Advanced Systems and Concepts (AS&C) is working with, and will 

continue to work with, the participants of each ACTD to define, prior 

to the program initiation a clear set of measurement standards for 

performance effectiveness. This effort will also identify the 

appropriate reporting formats including scope and level of detail for 

these programs. These standards will be vetted with the appropriate 

Lead Service, User Sponsor and Transition Manager for each individual 

ACTD. Recognizing the range and variability of ACTDs and the topics 

they address, however, will make it difficult to develop a single 

measurement standard. The performance standards for each ACTD will be 

unique. However, it is critical that these standards be identified up 

front and that all ACTD participants be aware of them and their 

function as a key metric in defining the military utility of the 

components of each ACTD. Additionally, AS&C expanded partnerships with 

Service testing and evaluation centers during the demonstration 

process. These centers bring recognized expertise with military utility 

assessment processes and reports. Drawing on this experience, AS&C will 

develop assessment templates to methodically capture and catalog 

results of demonstrations.



RECOMMENDATION 2: The GAO recommended that the Secretary of Defense 

explore the option of requiring the Services and defense agencies to 

develop a category within their budgets specifically for ACTD 

transition activities, including procurement and follow-on support. (p. 

17/GAO Draft Report):



DOD RESPONSE: Concur. The Deputy Under Secretary of Defense (Advanced 

Systems and Concepts) will, in coordination with the Under Secretary of 

Defense (Acquisition, Technology and Logistics) continue to coordinate 

with Services and defense agencies to develop funding strategies to 

support follow on transition and support of ACTD products which have 

demonstrated military utility during field exercises and actual 

operations. As an initial effort, the Deputy Under Secretary of Defense 

(Advanced Systems and Concepts) has adjusted the OSD funding 

contribution for execution of an ACTD to focus on the first two years 

of the program. This, in theory, 

allows Services and defense agencies to adjust their out year Program 

Objective Memoranda and budgets to more adequately fund the latter 

portion of an individual ACTD.



RECOMMENDATION 3: The GAO recommended the Secretary of Defense require 

the lead service or defense agency obtain the concurrence of the 

Secretary’s designated representative on any decision not to transition 

an ACTD that is based on joint requirements and determined to be 

militarily useful. (p. 17/GAO Draft Report):



DOD RESPONSE: Partially concur. The Deputy Under Secretary of Defense 

for Advanced Systems and Concept will enhance coordination with the 

Joint Staff in order to provide more comprehensive feedback on the 

performance of specific ACTDs and the merits of the ACTD components to 

enhance specific joint warfighting issues and requirements. 

Additionally the Under Secretary of Defense for Acquisition, Technology 

and Logistics currently conducts quarterly meetings with Service 

Acquisition Executives to review specific aspects of the ACTD program. 

This forum will be evaluated as a venue to bring forward issues 

regarding both the performance of and transition of specific ACTD 

programs and products.



The Department of Defense provided written comments on a draft of our 

report. In a November 27, 2002, letter, DOD modified its comments from 

“partially 

concur” to “concur” with our recommendation 1.



GAO’s Mission:



The General Accounting Office, the investigative arm of Congress, 

exists to support Congress in meeting its constitutional 

responsibilities and to help improve the performance and accountability 

of the federal government for the American people. GAO examines the use 

of public funds; evaluates federal programs and policies; and provides 

analyses, recommendations, and other assistance to help Congress make 

informed oversight, policy, and funding decisions. GAO’s commitment to 

good government is reflected in its core values of accountability, 

integrity, and reliability.




Obtaining Copies of GAO Reports and Testimony:



The fastest and easiest way to obtain copies of GAO documents at no 

cost is through the Internet. GAO’s Web site (www.gao.gov) contains 

abstracts and full-text files of current reports and testimony and an 

expanding archive of older products. The Web site features a search 

engine to help you locate documents using key words and phrases. You 

can print these documents in their entirety, including charts and other 

graphics.



Each day, GAO issues a list of newly released reports, testimony, and 

correspondence. GAO posts this list, known as “Today’s Reports,” on its 

Web site daily. The list contains links to the full-text document 

files. To have GAO e-mail this list to you every afternoon, go to 

www.gao.gov and select “Subscribe to daily E-mail alert for newly 

released products” under the GAO Reports heading.




Order by Mail or Phone:



The first copy of each printed report is free. Additional copies are $2 

each. A check or money order should be made out to the Superintendent 

of Documents. GAO also accepts VISA and MasterCard. Orders for 100 or 

more copies mailed to a single address are discounted 25 percent. 

Orders should be sent to:



U.S. General Accounting Office

441 G Street NW, Room LM

Washington, D.C. 20548:



To order by Phone: Voice: (202) 512-6000 

TDD: (202) 512-2537

Fax: (202) 512-6061




To Report Fraud, Waste, and Abuse in Federal Programs:



Contact:



Web site: www.gao.gov/fraudnet/fraudnet.htm

E-mail: fraudnet@gao.gov

Automated answering system: (800) 424-5454 or (202) 512-7470




Public Affairs:



Jeff Nelligan, managing director, NelliganJ@gao.gov (202) 512-4800

U.S. General Accounting Office, 441 G Street NW, Room 7149 

Washington, D.C. 20548:



FOOTNOTES



[1] These systems typically recognize or detect targets, initiate 

detonation, and determine the direction of detonation.



[2] However, this ACTD did produce a published concept of operations 

for both units involved in the demonstrations, the Technical Escort 

Unit and the Chemical-Biological Incident Response Force. In addition, 

this ACTD provided the first opportunity for these units to work 

together and demonstrated the ability of DOD units to integrate with 

other federal, state, and local agencies.



[3] See U.S. General Accounting Office, Best Practices: Better 

Management of Technology Development Can Improve Weapon System 

Outcomes, GAO/NSIAD-99-162 (Washington, D.C.: July 30, 1999).



[4] See U.S. General Accounting Office, Best Practices: DOD Teaming 

Practices Not Achieving Potential Results, GAO-01-510 (Washington, 

D.C.: Apr. 10, 2001).



[5] See appendix I for a description of TRLs. A single ACTD candidate 

could comprise multiple technologies assessed at different 

readiness levels. We have found that a TRL of 7 at the start of product 

development indicates a low risk for cost and schedule increases.


