This is the accessible text file for GAO report number GAO-02-392T 
entitled 'DOD's Standard Procurement System: Continued Investment Has 
Yet to Be Justified' which was released on February 7, 2002. 

This text file was formatted by the U.S. General Accounting Office 
(GAO) to be accessible to users with visual impairments, as part of a 
longer term project to improve GAO products' accessibility. Every 
attempt has been made to maintain the structural and data integrity of 
the original printed product. Accessibility features, such as text 
descriptions of tables, consecutively numbered footnotes placed at the 
end of the file, and the text of agency comment letters, are provided 
but may not exactly duplicate the presentation or format of the 
printed version. The portable document format (PDF) file is an exact 
electronic replica of the printed version. We welcome your feedback. 
Please E-mail your comments regarding the contents or accessibility 
features of this document to Webmaster@gao.gov. 

This is a work of the U.S. government and is not subject to copyright 
protection in the United States. It may be reproduced and distributed 
in its entirety without further permission from GAO. Because this work 
may contain copyrighted images or other material, permission from the 
copyright holder may be necessary if you wish to reproduce this 
material separately. 

United States General Accounting Office: 
GAO: 

Testimony: 

Before the Subcommittee on National Security, Veterans Affairs, and 
International Relations, Committee on Government Reform, House of 
Representatives: 

For Release on Delivery: 
Expected at 9:30 a.m. 
Thursday, February 7, 2002: 

DOD's Standard Procurement System: 

Continued Investment Has Yet to Be Justified: 

Statement of Joel C. Willemssen: 
Managing Director, Information Technology Issues: 

GAO-02-392T: 

Mr. Chairman and Members of the Subcommittee: 

I am pleased to be here today to discuss the Department of Defense's 
(DOD) management of its investment in the Standard Procurement System 
or SPS program. The department launched this program a little more 
than 7 years ago with the laudable goal of replacing 76 existing 
procurement systems with a single departmentwide system to more 
effectively support divergent contracting processes and procedures 
across its component organizations. Through SPS, the department 
expected to improve efficiency and effectiveness in how it awarded and 
managed contracts, and, at that time, estimated life-cycle costs to be 
approximately $3 billion over a 10-year period. 

The department's goals for SPS are reinforced by the president's 
recent management agenda, which emphasizes investing in information 
technology to achieve results. The agenda also noted that the federal 
government has not produced measurable gains in productivity 
commensurate with its investment in information technology,[Footnote 
1] which is now estimated to be more than $50 billion for fiscal year 
2003. The agenda reiterates that program performance and results are 
what matters most, and that actual program accomplishments, as well as 
needs, should be the prerequisite to continued funding. This emphasis 
is consistent with information-technology investment management 
provisions of federal law and guidance[Footnote 2] and information-
technology management practices of leading public- and private-sector 
companies. 

For the SPS program, we reported in July 2001 that the department had 
not met these investment management criteria.[Footnote 3] Specifically: 

* The department had not economically justified its investment in the 
program because its latest (January 2000) analysis of costs and 
benefits was not credible. Further, this flawed analysis showed that 
the system, as defined, was not a cost-beneficial investment. 

* It had not effectively addressed the inherent risks associated with 
investing in a program as large and lengthy as SPS because it had not 
divided the program into incremental investment decisions that 
coincided with incremental releases of system capabilities. 

* The department had not met key program commitments that were used to 
justify the program. For example, the department committed to 
implementing a commercially available contract management system; 
however, because it had modified so much of the foundational 
commercial product, SPS evolved into a customized DOD system. Also, 
although the department committed to fully implementing the system by 
March 31, 2000, this target date had slipped by 3 1/2 years to 
September 30, 2003, and program officials have recently stated that 
this date will also not be met. 

* It did not know if it was meeting other key program commitments. For 
example, the department had not measured whether promised system 
benefits were being realized, and the information that was available 
about system performance showed that users were not satisfied with the 
system. Also, because DOD was not accumulating actual program costs, 
it did not know the total amount spent on the program to date, yet 
life-cycle cost projections had grown from about $3 billion to $3.7 
billion. 

Collectively, this meant that the question of whether further 
investment in SPS was justified could not be answered with any 
certainty. Accordingly, we recommended that investment in future 
releases or major enhancements to the system be made conditional on 
the department first demonstrating that the system was producing 
benefits that exceed costs, and that future investment decisions be 
based on complete and reliable economic justifications. We also 
recommended that program officials clarify organizational 
accountability and responsibility for the program, determine the 
program's current status, and identify lessons learned from the SPS 
investment management experience. 

In commenting on a draft of our report, the Deputy Chief Information 
Officer (CIO) generally disagreed with our recommendations, noting 
that they would delay development and deployment of SPS. Since that 
time, however, the department has either initiated or stated its 
intention to initiate steps that are consistent with our 
recommendations. It has also taken steps to address the findings of 
several department-sponsored studies initiated at the time of our 
report. For example, it has (1) clarified organizational 
accountability and responsibility for the program, (2) established 
missing controls over key acquisition processes such as requirements 
management and testing, and (3) begun addressing users' concerns. In 
addition, department officials have stated that the department will 
prepare an economic analysis before investing beyond already executed 
contractual commitments and that it will conduct a productivity study 
to assess the extent to which the department is deriving benefits from 
SPS. These are positive steps that have advanced the program beyond 
where it was at the time of our report. 

Nevertheless, much remains to be done before the department will be in 
a position to make an informed, data-driven decision about whether 
further investment in the system is justified. Namely, although 
program officials have stated their intentions to address our 
recommendations, they have not yet committed to specific tasks for 
doing so, nor have they established milestone dates for completing 
these tasks. Further, the department may expand the functionality of 
the current software release to include requirements previously slated 
for later releases, which could compound existing problems and 
increase costs. Finally, although SPS is intended to be a standard 
system for the entire department, not all defense components have 
agreed to adopt it. 

SPS: A Brief Description and History: 

In November 1994, the Office of the Director of Defense Procurement 
initiated the SPS program to acquire and deploy a single automated 
system to perform all contract-management-related functions for all 
DOD organizations. At that time, life-cycle costs were estimated to be 
about $3 billion over a 10-year period. 

From 1994 to 1996, the department defined SPS requirements and 
solicited commercially available vendor products for satisfying these 
requirements. Subsequently, in April 1997, the department awarded a 
contract to American Management Systems (AMS), Incorporated, to (1) 
use AMS's commercially available contract management system as the 
foundation for SPS, (2) modify this commercial product as necessary to 
meet DOD requirements, and (3) perform related services.[Footnote 4] 
The department also directed the contractor to deliver functionality 
for the system in four incremental releases. The department later 
increased the number of releases across which this functionality would 
be delivered to seven, reduced the size of the increments, and allowed 
certain more critical functionality to be delivered sooner (see table 
1 for proposed SPS functionality by increment). 

Table 1: Summary of SPS Functionality by Increment: 

Increment: 1; 
Software release (subreleases): 3.1; 
Functionality: Provide base-level contracting capabilities enabling 
DOD procurement personnel to prepare simple contracts, which are 
generally fixed-price, 1-year contracts that will not be modified. 

Increments: 2 through 5; 
Software releases (subreleases): 3.5; 4.0; 4.1 (a-e); 4.2; 
Functionality: Provide enhanced base-level contracting functionality 
for DOD procurement personnel, such as reporting and contract 
administration capabilities, automatic edits, security features, and 
electronic interfaces for legacy systems being replaced. 

Increment: 6; 
Software release (subreleases): 5.0; 
Functionality: Provide more complex contracting capabilities, enabling 
DOD procurement personnel to purchase weapons systems. These contracts 
are generally fewer in number, but are more complicated, consisting of 
numerous provisions and contract line-item numbers, and usually 
undergo extensive modifications. 

Increment: 7; 
Software release (subreleases): 5.1; 
Functionality: Provide functionality for inventory control points 
(ICPs), which are responsible for the support and acquisition of spare 
parts and supplies, enabling workload management to better manage 
inventories. 

Source: DOD. 

[End of table] 

Since our report of July 2001,[Footnote 5] DOD has revised its plans. 
According to the SPS program manager, current plans no longer include 
increments 6 and 7 or releases 5.0 and 5.1. Instead, release 4.2 
(increment 5) will include at least three, but not more than seven, 
subreleases. At this time, only the first of the potentially seven 4.2 
subreleases is under contract. This subrelease is scheduled for 
delivery in April 2002, with deployment to the Army and the Defense 
Logistics Agency scheduled for June 2002. Relative to the original 
delivery date, release 4.2 is about 1 year overdue. 

The department reports that it has yet to define the requirements to 
be included within the remaining 4.2 subreleases, and has not executed 
any contract task orders for these subreleases. According to SPS 
officials, they will decide later this year whether to invest in these 
additional releases. 

As of December 2001, the department reported that it had deployed four 
SPS releases to over 777 locations.[Footnote 6] The Director of 
Defense Procurement (DDP) has responsibility for the SPS program, 
[Footnote 7] and the CIO is the milestone decision authority for SPS 
because the program is classified as a major Defense acquisition. 
[Footnote 8] 

Numerous SPS Concerns Have Been Raised by Us and Others: 

Our July 2001 report detailed program problems and investment 
management weaknesses.[Footnote 9] To address these weaknesses, we 
recommended, among other things, that the department report on the 
lessons to be learned from its SPS experience for the benefit of 
future system acquisitions. Other reviews of the program, commissioned 
by the department in the wake of our review, raised similar concerns 
and identified additional problems and management 
weaknesses. The findings from our report are summarized below in two 
major categories: lack of economic justification for the program and 
inability to meet program commitments. We also summarize the findings 
of the other studies. 

DOD Had Not Economically Justified Its Investment in SPS: 

The Clinger-Cohen Act of 1996, OMB guidance, DOD policy, and practices 
of leading organizations provide an effective framework for managing 
information technology investments, not just when a program is 
initiated, but continuously throughout the life of the program. 
Together, they provide for: 

(1) economically justifying proposed projects on the basis of reliable 
analyses of expected life-cycle costs, benefits, and risks; and 

(2) using these analyses throughout a project's life-cycle as the 
basis for investment selection, control, and evaluation 
decisionmaking, and doing so for large projects (to the maximum extent 
practical) by dividing them into a series of smaller, incremental 
subprojects or releases and individually justifying investment in each 
separate increment on the basis of costs, benefits, and risks. 

The department had not met these investment management tenets for SPS. 
First, the latest economic analysis for the program, dated January 
2000, was not based on reliable estimates because most of its cost 
estimates were simply carried forward from the April 1997 analysis 
(adjusted for inflation). Only the estimates for costs funded and 
managed by the SPS program office, which accounted for 13 percent of 
the total estimated life-cycle cost in the analysis, were updated in 
2000 to reflect more current contract estimates and actual 
expenditures and obligations for fiscal years 1995 through 1999. 
Moreover, the military services, which share funding responsibility 
with the SPS program office for implementing the program, questioned 
the reliability of these cost estimates. However, the economic 
analysis did not reflect this uncertainty through any type of 
sensitivity analysis.[Footnote 10] A sensitivity analysis would have 
disclosed for decisionmakers the investment risk being assumed by 
relying on the estimates presented in the economic analysis. 
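
To illustrate what such a sensitivity analysis discloses, the sketch 
below varies a cost estimate across a band of estimating error and 
reports how the fraction of costs recovered by benefits responds. It 
is a minimal illustration in Python: the dollar figures echo the 
rounded SPS estimates discussed below, but the plus-or-minus 30 
percent band is an assumption for illustration, not a figure from the 
economic analysis. 

    def benefit_cost_ratio(benefits, costs): 
        """Fraction of estimated costs recovered by estimated benefits.""" 
        return benefits / costs 

    # Point estimates in billions of dollars (the rounded figures from 
    # the January 2000 analysis: $1.4 billion in benefits against $3.7 
    # billion in costs). 
    BENEFITS = 1.4 
    BASE_COST = 3.7 

    # Hypothetical band of plus or minus 30 percent around the cost 
    # estimate, to expose how sensitive the investment case is to 
    # estimating error. 
    for factor in (0.7, 0.85, 1.0, 1.15, 1.3): 
        cost = BASE_COST * factor 
        ratio = benefit_cost_ratio(BENEFITS, cost) 
        print(f"cost = ${cost:.2f}B -> benefits recover {ratio:.0%} of costs") 

Even this simple form makes the disclosure a decisionmaker needs: the 
program fails to recover its costs everywhere in the band, and the 
shortfall deepens as the cost estimate grows. 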

Moreover, the latest economic analysis (January 2000) was outdated 
because it did not reflect the program's current status and known 
problems and risks. For instance, this analysis was based on a program 
scope and associated costs and benefits that anticipated four software 
releases. However, as mentioned previously, the program now consists 
of five releases, and subreleases within releases, in order to 
accommodate changes in SPS requirements. Estimates of the full costs, 
benefits, and risks relating to this additional release and its 
subreleases were not part of the 2000 economic analysis. Also, this 
analysis did not fully recognize actual and expected delays in meeting 
SPS's full operational capability milestone, which had slipped by 
3 1/2 years, and DOD officials say that further delays are currently 
expected. Such delays not only increase system acquisition costs but 
also postpone, and thus reduce, the accrual of system benefits. 
Further, several DOD components are now questioning whether they will 
even deploy the software, which would further undermine the cost-
effectiveness calculations in the 2000 economic analysis. 

Second, the department had not used these analyses as the basis for 
deciding whether to continue to invest in the program. The latest 
economic analysis showed that SPS was not a cost-beneficial investment 
because the estimated benefits to be realized did not exceed estimated 
program costs. In fact, the 2000 analysis showed estimated costs of 
$3.7 billion and estimated benefits of $1.4 billion, which was a 
recovery of only 37 percent of costs. According to the former SPS 
program manager, this analysis was not used to manage the program and 
there was no DOD requirement for updating an economic analysis when 
changes to the program occurred. 
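
As a worked check of that figure: 

    \text{recovery fraction} 
      = \frac{\text{estimated benefits}}{\text{estimated costs}} 
      = \frac{\$1.4\ \text{billion}}{\$3.7\ \text{billion}} 
      \approx 0.38 

that is, roughly 37 to 38 cents of estimated benefit per dollar of 
estimated cost (the 37 percent cited presumably reflects the unrounded 
underlying estimates). 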

Third, DOD had not made its investment decisions incrementally as 
required by the Clinger-Cohen Act and OMB guidance. That is, although 
the department is planning to acquire and implement SPS as a series of 
five increments, it has not based decisions about whether to invest in 
each release on that release's expected return on investment or on 
whether prior releases actually achieved their return-on-investment 
expectations. In fact, for the four increments 
that have been deployed, the department had not validated whether the 
increments were providing promised benefits and was not accounting for 
the costs associated with each increment so that it could even 
determine actual return on investment. 

Instead, the department had treated investment in this program as one, 
monolithic investment decision, justified by a single, "all-or-
nothing" economic analysis. Our work has shown that it is difficult to 
estimate, with any degree of accuracy, cost and schedule estimates for 
many increments to be delivered over many years because later 
increments are not well understood or defined. Also, these estimates 
are subject to change based on actual program experiences and changing 
requirements. This "all-or-nothing" approach to investing in large 
system acquisitions, like SPS, has repeatedly proven to be ineffective 
across the federal government, resulting in huge sums being invested 
in systems that do not provide commensurate benefits. 
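
The incremental discipline these criteria call for can be sketched in 
a few lines: each release must clear its own business case, and the 
prior release must show measured net benefits before the next is 
funded. The sketch below is a minimal Python illustration; all names 
and figures in it are hypothetical, not drawn from the SPS program. 

    from dataclasses import dataclass 
    from typing import Optional 

    @dataclass 
    class Increment: 
        name: str 
        projected_benefits: float  # present-value benefits, $ millions 
        projected_costs: float     # present-value costs, $ millions 
        measured_benefits: Optional[float] = None  # known only after deployment 
        measured_costs: Optional[float] = None 

    def approve_next(prior: Increment, proposed: Increment) -> bool: 
        """Fund the next increment only if (1) its own business case 
        closes and (2) the prior increment demonstrably paid off.""" 
        case_closes = proposed.projected_benefits > proposed.projected_costs 
        prior_paid_off = ( 
            prior.measured_benefits is not None 
            and prior.measured_costs is not None 
            and prior.measured_benefits > prior.measured_costs 
        ) 
        return case_closes and prior_paid_off 

    # Release B looks attractive on projections alone, but approval 
    # still hinges on what release A actually delivered. 
    a = Increment("release A", projected_benefits=40.0, projected_costs=30.0, 
                  measured_benefits=28.0, measured_costs=32.0) 
    b = Increment("release B", projected_benefits=60.0, projected_costs=45.0) 
    print(approve_next(a, b))  # False: A's measured benefits trail its costs 

An all-or-nothing justification, by contrast, answers the funding 
question once, years before most of these measurements exist. 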

DOD Had Not Met or Did Not Know if It Had Met SPS Commitments: 

Measuring progress against program commitments is closely 
aligned with economically justifying information-technology 
investments, and is equally important to ensuring effective investment 
management. The Clinger-Cohen Act, OMB guidance, DOD policy,[Footnote 
11] and practices of leading organizations provide for making and 
using such measurements as part of informed investment decisionmaking. 

DOD had not met key commitments and was uncertain whether it was 
meeting other commitments because it was not measuring them. (See 
table 2 for a summary of the department's progress against 
commitments.) 

Table 2: Progress Against SPS Program Commitments: 

Key commitments: System fully operational by March 31, 2000; 
Commitment met? No; 
Explanation(s): Problems were encountered in modifying and testing the 
commercial product and in adequately defining requirements. For 
example, there were no system performance requirements in the SPS 
contract.[A] The target date had slipped 3 1/2 years.[B] 

Key commitments: Contracting community’s needs met; 
Commitment met? No; 
Explanation(s): Approximately 60 percent of the user population 
recently surveyed by DOD's Office of Inspector General were 
dissatisfied with the system's functionality and performance.[C] 

Key commitments: Acquire a commercially available software product; 
Commitment met? No; 
Explanation(s): The commercial product had been extensively modified, 
resulting in a DOD-unique system. 

Other commitments: Replace 76 legacy procurement systems and manual 
processes, thereby reducing procurement system operations and 
maintenance costs; 
Commitment met? DOD was unaware of the extent to which the commitment 
had been met; 
Explanation(s): Only 2 legacy systems had been fully retired and 2 
partially retired, and DOD did not know what, if any, associated cost 
savings had resulted. Also, DOD now plans to retire only 14 legacy 
systems as a result of SPS’s implementation. 

Other commitments: Increase user productivity; 
Commitment met? DOD was unaware of the extent to which the commitment 
had been met; 
Explanation(s): DOD is unaware of the extent to which productivity may 
have increased because it did not implement needed performance metrics. 

Other commitments: Standardize policies, processes, and procedures; 
Commitment met? DOD was unaware of the extent to which the commitment 
had been met; 
Explanation(s): Each military service had or was planning to develop 
its own unique program documentation. 

Other commitments: Reduce problem disbursements; 
Commitment met? DOD was unaware of the extent to which the commitment 
had been met; 
Explanation(s): DOD was unable to provide any evidence that 
implementing SPS had reduced problem disbursements, nor had it 
included this benefit in its latest economic analysis. 

Other commitments: Life-cycle costs of $3.7 billion over a 10-year 
period; 
Commitment met? DOD was unaware of the extent to which the commitment 
had been met; 
Explanation(s): DOD was unaware of the amount spent on the program to 
date because cost information was being tracked and officially 
reported only for the SPS program office. Costs incurred by all DOD 
component organizations were not accumulated and reported.[D] 

[A] While the former program manager attributed the delay to an 
increase in requirements, the SPS Joint Requirements Board chairperson 
stated that no additional requirements had been approved. Rather, the 
board's chairperson stated that the original requirements had not been 
well-defined and clarification was needed to better ensure that user 
needs would be met. 

[B] According to the current program manager, the most recent target 
date of September 30, 2003, will not be met. In addition, another 
target date has not yet been established for completing the program. 

[C] A user satisfaction manager was recently designated for this 
program. 

[D] Based on DOD documents we obtained during our current review, at a 
minimum, $511.6 million had been spent as of September 30, 2001. 

[End of table] 

To partially fill this gap in its knowledge of progress against SPS 
commitments, the program office initiated a study in June 2000 to 
validate the extent to which benefits from version 4.1 would be 
realized. However, the study was not well planned or executed; while 
some useful information was obtained, the study did not allow 
DOD to validate whether expected benefits were actually being 
realized. For example, 

* the sample selected was not statistically valid, meaning that the 
results were not projectable to the population as a whole, 

* the study was based on the 1997 economic analysis instead of the more
current 2000 economic analysis, despite key differences between the 
two analyses, such as the number and dollar value of estimated 
benefits, and 

* the information gathered did not map to the 22 benefit types listed 
in the 1997 economic analysis. Instead, the study collected subjective 
judgments (perceptions) that were not based on predefined performance 
metrics for SPS capabilities and impacts. Thus, the department was not 
measuring SPS against its promised benefits. 

The former program manager told us that knowing whether SPS was 
producing value and meeting commitments was not the program office's 
objective because there was no departmental requirement to do so. 
Rather, the objective was simply to acquire and deploy the system. 
Similarly, CIO officials told us that the department was not 
validating whether deployed releases of SPS were producing benefits 
because there was no DOD requirement to do so and no metrics had been 
defined for such validation.[Footnote 12] However, the Clinger-Cohen 
Act of 1996 and OMB guidance[Footnote 13] emphasize the need to have 
investment management processes and information to help ensure that 
information-technology projects are being implemented at acceptable 
costs and within reasonable and expected time frames and that they are 
contributing to tangible, observable improvements in mission 
performance (i.e., that projects are meeting the cost, schedule, and 
performance commitments upon which their approval was justified). For 
programs such as SPS, DOD required this cost, schedule, and 
performance information to be reported quarterly to ensure that 
programs did not deviate significantly from expectations.[Footnote 14] 
In effect, these requirements and guidance recognize that one cannot 
manage what one cannot measure. 

Other Studies Reported Similar Findings and Identified Other Concerns: 

Shortly after receiving our draft report for comment, 
the department initiated several studies to determine the program's 
current status, assess program risks, and identify actions to improve 
the program.[Footnote 15] These studies focused on such areas as 
program costs and benefits, planned commitments, requirements 
management, program office structure, and systems acceptance testing. 
Consistent with our findings and recommendations, these studies 
identified the need to: 

* establish performance metrics that will enable the department to 
measure the program's performance and tie these metrics to benefits 
and customer satisfaction; 

* clearly define organizational accountability for the program; 

* provide training for all new software releases; 

* standardize the underlying business processes and rules that the 
system is to support; 

* acquire the software source code; and 

* address open customer concerns to ensure user satisfaction. 

In addition, the department found other program management concerns 
not directly within the scope of our review, such as the need to: 

* appropriately staff the program management office with sufficient 
resources and address the current lack of technical expertise in areas 
such as contracting, software engineering, testing, and configuration 
management; 

* modify the existing contract to recognize that the system does not 
employ a commercial-off-the-shelf software product, but rather is 
based on a customized software product; 

* establish DOD-controlled requirements management and acceptance 
testing processes and practices that are rigorous and disciplined; and 

* assess the continued viability of the existing contractor. 

DOD Has Begun Addressing Problems, But SPS's Future Remains Uncertain: 

To address the many weaknesses in the SPS program, we made several 
recommendations in our July 2001 report.[Footnote 16] Specifically, we 
recommended that (1) investment in future releases or major 
enhancements to the system be made conditional on the department first 
demonstrating that the system is producing benefits that exceed costs; 
(2) future investment decisions, including those regarding operations 
and maintenance, be based on complete and reliable economic 
justifications; (3) any analysis produced to justify further 
investment in the program be validated by the Director, Program 
Analysis and Evaluation; (4) the Assistant Secretary of Defense for 
Command, Control, Communications, and Intelligence (C3I) clarify 
organizational accountability and responsibility for measuring SPS 
program against commitments and to ensure that these responsibilities 
are met; (5) program officials take the necessary actions to determine 
the current state of progress against program commitments; and (6) the 
Assistant Secretary of Defense for C3I report by October 31, 2001, to 
the Secretary of Defense and to relevant congressional 
committees on lessons learned from the SPS investment management 
experience, including what actions will be taken to prevent a 
recurrence of this experience on other system acquisition programs. 

DOD's reaction to our report was mixed. In official comments on a 
draft of our report, the Deputy CIO generally disagreed with our 
recommendations, noting that they would delay development and 
deployment of SPS. Since that time, however, the department has 
acknowledged its SPS problems and begun taking steps to address some 
of them. In particular, it has done the following. 

* The department has established and communicated to applicable DOD 
organizations the program's chain-of-command and defined each 
participating organization's responsibilities. For example, the Joint 
Requirements Board was delegated the responsibility for working with 
the program users to define and reach agreement on the needed 
functionality for each software release. 

* The department has restructured the program office and assigned 
additional staff, including individuals with expertise in the areas of 
contracting, software engineering, configuration management, and 
testing. However, according to the current program manager, additional 
critical resources are needed, such as two computer information 
technology specialists and three contracting experts. 

* It has renegotiated certain contract provisions to assume greater 
responsibility and accountability for the requirements management and 
testing activities. For example, DOD, rather than the contractor, is 
now responsible for writing the test plans. However, additional 
contract changes remain to be addressed, such as training, help-desk 
structure, facilities support, and system operations and maintenance. 

* The department has designated a user-satisfaction manager for the 
program and defined forums and approaches intended to better engage 
users. 

* It has established a new testing process, whereby program officials 
now develop the test plans and maintain control over all software 
testing performed. 

In addition, SPS officials have stated their intention to: 

* prepare analyses for future program activities beyond those already 
under contract, such as the acquisition of additional system releases, 
and use these analyses in deciding whether to continue to deploy SPS 
or pursue another alternative; 

* define system performance metrics and use these metrics to assess 
the extent to which benefits have been realized from already deployed 
system releases; and 

* report on lessons learned from its SPS experience to the Secretary 
of Defense and relevant congressional committees. 

The department's actions and intentions are positive steps and 
consistent with our recommendations. However, much remains to be 
accomplished. In particular, the department has yet to implement our 
recommendations aimed at ensuring that (1) future releases or major 
enhancements to the system be made conditional on first demonstrating 
that the system is producing benefits that exceed costs and (2) future 
investment decisions, including those regarding operations and 
maintenance, be based on a complete and reliable economic 
justification. 

We also remain concerned about the future of SPS for several 
additional reasons. First, definitive plans for how and when to 
justify future system releases or major enhancements to existing 
releases do not yet exist. Second, SPS officials told us that release 
4.2, which is currently under contract, may be expanded to include 
functionality that was envisioned for releases 5.0 and 5.1. Including 
such additional functionality could compound existing problems and 
increase program costs. Third, not all defense components have agreed 
to adopt SPS. For example, the Air Force has not committed to 
deploying the software; the National Imagery and Mapping Agency, the 
Defense Advanced Research Projects Agency, and the Defense 
Intelligence Agency have not yet decided to use SPS; and the DOD 
Education Activity has already adopted another system because it deemed 
SPS too expensive. 

In summary, effective investment in information technology depends on 
organizations (1) justifying programs via incremental business cases 
that are based on reliable data and sound analysis, (2) making 
decisions on investments in programs on an incremental basis, and (3) 
monitoring actual return on investment (benefits achieved and costs 
incurred) for each increment and using this information to facilitate 
decisionmaking about future increments. In the case of SPS, this has 
not occurred. While DOD has begun taking steps to strengthen its 
management of certain aspects of the program and committed to 
strengthening its investment management practices, questions still 
remain as to what will be done and when. To increase the chances of 
program success, the department must expeditiously follow through on 
its stated commitments and address each of our recommendations. If it 
does not, it risks acquiring and deploying a procurement system that 
will not produce business value commensurate with costs. 

This concludes my statement. I would be pleased to answer any 
questions you or Members of the Subcommittee may have at this time. 

For further information regarding this testimony, please contact 
Randolph C. Hite, Director, Information Technology Systems Issues, at 
(202) 512-3439, or Cynthia Jackson, Assistant Director, Information 
Technology Systems Issues, at (202) 512-5086. You may also contact 
them by e-mail at hiter@gao.gov or jacksonc@gao.gov, respectively. 

[End of section] 

Footnotes: 

[1] The President's Management Agenda: Fiscal Year 2002, Executive 
Office of the President, Office of Management and Budget. 

[2] Clinger-Cohen Act of 1996, Public Law 104-106; Office of 
Management and Budget Circular A-130, Management of Federal 
Information Resources (November 30, 2000). 

[3] U.S. General Accounting Office, DOD Systems Modernization: 
Continued Investment in the Standard Procurement System Has Not Been 
Justified, [hyperlink, http://www.gao.gov/products/GAO-01-682] 
(Washington, D.C.: July 31, 2001). 

[4] DOD is not acquiring the source code for SPS and, unless an 
expanded license is obtained, is required to obtain sole-source 
support over the life of this system from AMS. 

[5] [hyperlink, http://www.gao.gov/products/GAO-01-682] (July 31, 
2001). 

[6] All DOD components except the Air Force have deployed subrelease 
4.1e; the Air Force has only deployed through subrelease 4.1b. The Air 
Force is scheduled to begin deployment of release 4.1e in March 2002. 

[7] DDP is organizationally located within the Office of the Under 
Secretary of Defense for Acquisition, Technology and Logistics. 

[8] DOD Regulation 5000.2-R, Mandatory Procedures for Major Defense 
Acquisition Programs and Major Automated Information System 
Acquisition Programs, specifies mandatory policies and procedures for 
major acquisitions. The policy also specifies that the DOD CIO is the 
milestone decision authority, responsible for program approval, for 
all major automated information systems, such as SPS. 

[9] [hyperlink, http://www.gao.gov/products/GAO-01-682] (July 31, 
2001). 

[10] That is, an analysis to explicitly present the return-on-
investment implications associated with using estimates whose inherent 
imprecision could produce a range of outcomes. 

[11] DOD Interim Regulation 5000.2-R, Mandatory Procedures for Major 
Defense Acquisition Programs and Major Automated Information System 
Acquisition Programs (January 4, 2001). 

[12] In January 2001, DOD issued a change to its major system 
acquisition policy requiring incremental investment management. 
Specifically, the policy notes that a program's milestone decision 
authority must verify that each increment meets part of the mission 
need and delivers a measurable benefit, independent of future 
increments. 

[13] Clinger-Cohen Act of 1996, Public Law 104-106, and OMB Circular A-
130 (November 30, 2000). 

[14] DOD Interim Regulation 5000.2-R, Mandatory Procedures for Major 
Defense Acquisition Programs and Major Automated Information System 
Acquisition Programs (January 4, 2001). 

[15] See, for example, SPS Contract Review: Preliminary Report and 
Status, August 1, 2001; The Present State of the SPS Program, Software 
Engineering Institute, October 19, 2001; and Independent Review of the 
Standard Procurement System Program, Gartner Consulting, November 29, 
2001. 

[16] [hyperlink, http://www.gao.gov/products/GAO-01-682] (July 31, 
2001). 

[End of testimony]