This is the accessible text file for GAO report number GAO-07-51 
entitled 'Information Technology: DOD Needs to Ensure That Navy Marine 
Corps Intranet Program Is Meeting Goals and Satisfying Customers' which 
was released on December 8, 2006. 

This text file was formatted by the U.S. Government Accountability 
Office (GAO) to be accessible to users with visual impairments, as part 
of a longer term project to improve GAO products' accessibility. Every 
attempt has been made to maintain the structural and data integrity of 
the original printed product. Accessibility features, such as text 
descriptions of tables, consecutively numbered footnotes placed at the 
end of the file, and the text of agency comment letters, are provided 
but may not exactly duplicate the presentation or format of the printed 
version. The portable document format (PDF) file is an exact electronic 
replica of the printed version. We welcome your feedback. Please E-mail 
your comments regarding the contents or accessibility features of this 
document to Webmaster@gao.gov. 

This is a work of the U.S. government and is not subject to copyright 
protection in the United States. It may be reproduced and distributed 
in its entirety without further permission from GAO. Because this work 
may contain copyrighted images or other material, permission from the 
copyright holder may be necessary if you wish to reproduce this 
material separately. 

Report to Congressional Addressees: 

December 2006: 

Information Technology: 

DOD Needs to Ensure That Navy Marine Corps Intranet Program Is Meeting 
Goals and Satisfying Customers: 

GAO-07-51: 

GAO Highlights: 

Highlights of GAO-07-51, a report to congressional addressees 

Why GAO Did This Study: 

The Navy Marine Corps Intranet (NMCI) is a 10-year, $9.3 billion 
information technology services program. Through a performance-based 
contract, the Navy is buying network (intranet), application, and other 
hardware and software services at a fixed price per unit (or “seat”) to 
support about 550 sites. GAO prepared this report under the Comptroller 
General’s authority as part of a continued effort to assist Congress 
and reviewed (1) whether the program is meeting its strategic goals, 
(2) the extent to which the contractor is meeting service level 
agreements, (3) whether customers are satisfied with the program, and 
(4) what is being done to improve customer satisfaction. To accomplish 
this, GAO reviewed key program and contract performance management 
related plans, measures, and data and interviewed NMCI program and 
contractor officials, as well as NMCI customers at shipyards and air 
depots. 

What GAO Found: 

NMCI has not met its two strategic goals—to provide information 
superiority and to foster innovation via interoperability and shared 
services. Navy developed a performance plan in 2000 to measure and 
report progress towards these goals, but did not implement it because 
the program was more focused on deploying seats and measuring 
contractor performance against contractually specified incentives than 
determining whether the strategic mission outcomes used to justify the 
program were met. GAO’s analysis of available performance data, 
however, showed that the Navy had met only 3 of 20 performance targets 
(15 percent) associated with the program’s goals and nine related 
performance categories. By not implementing its performance plan, the 
Navy has invested, and risks continuing to invest heavily, in a program 
that is not subject to effective performance management and has yet to 
produce expected results. 

GAO’s analysis also showed that the contractor’s satisfaction of NMCI 
service level agreements (contractually specified performance 
expectations) has been mixed. Since September 2004, while a significant 
percentage of agreements have been met for all types of seats, others 
have not consistently been met, and still others have generally not 
been met. Navy measurement of agreement satisfaction shows that 
performance needed to receive contractual incentive payments for the 
most recent 5-month period was attained for about 55 to 59 percent of 
all eligible seats, which represents a significant drop from the 
previous 9-month period. GAO’s analysis and the Navy’s measurement of 
agreement satisfaction illustrate the need for effective performance 
management, to include examining agreement satisfaction from multiple 
perspectives to target needed corrective actions and program changes. 

GAO analysis further showed that NMCI’s three customer groups (end 
users, commanders, and network operators) vary in their satisfaction 
with the program. More specifically, end user satisfaction surveys 
indicated that the percentage of end users who met the Navy's definition 
of a satisfied user has remained consistently below the target of 85 
percent (latest survey results categorize 74 percent as satisfied). 
Given that the Navy’s definition of the term “satisfied” includes many 
marginally satisfied and arguably somewhat dissatisfied users, this 
percentage represents the best case depiction of end user satisfaction. 
Survey responses from the other two customer groups show that neither 
group was satisfied. GAO interviews with customers at shipyards and air 
depots also revealed dissatisfaction with NMCI. Without satisfied 
customers, the Navy will be challenged in meeting program goals. 

To improve customer satisfaction, the Navy identified various 
initiatives that it described as completed, under way, or planned. 
However, the initiatives are not being guided by a documented plan(s), 
thus limiting their potential effectiveness. This means that after 
investing about 6 years and $3.7 billion, NMCI has yet to meet 
expectations, and whether it will is still unclear. 

What GAO Recommends: 

GAO is making recommendations to the Secretary of Defense aimed at 
implementing effective program performance management, expanding 
measurement and understanding of service level agreement performance, 
effectively managing customer satisfaction improvement efforts, and 
deciding whether overall performance to date warrants program changes. 
In commenting on a draft of this report, DOD agreed with GAO’s 
recommendations. 

[Hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-07-51]. 

To view the full product, including the scope and methodology, click on 
the link above. For more information, contact Randolph C. Hite at 202-
512-3439 or hiter@gao.gov. 

[End of Section] 

Contents: 

Letter: 

Results in Brief: 

Background: 

Navy Has Not Met NMCI Strategic Goals and Has Not Focused on Measuring 
Strategic Program Outcomes: 

Contractor Has Largely Met Many but Has Not Met Other SLAs: 

NMCI Customer Groups' Satisfaction Levels Vary, but Overall Customer 
Satisfaction Is Low: 

Customer Satisfaction Improvement Efforts Are Not Being Guided by 
Effective Planning: 

Conclusions: 

Recommendations for Executive Action: 

Agency Comments and Our Evaluation: 

Appendixes: 

Appendix I: Objectives, Scope, and Methodology: 

Appendix II: Customer Satisfaction Survey Questions: 

Appendix III: SLA Descriptions and Performance: 

Appendix IV: Comments from the Department of Defense: 

Appendix V: GAO Contact and Staff Acknowledgments: 

Tables: 

Table 1: List of SLAs Organized by Tier and Category: 

Table 2: NMCI End User Customer Satisfaction Survey Questions: 

Table 3: NMCI Satisfaction of Performance Targets for Each Performance 
Category for Fiscal Year 2005: 

Table 4: NMCI End User Customer Satisfaction Survey Questions and 
Results for the Quarterly Period Ending on March 31, 2006: 

Table 5: Percentages of Satisfied End Users by Navy Budget Submitting 
Office, as of March 31, 2006: 

Table 6: Percentages of Satisfied End Users by U.S. Marine Corps Major 
Command, as of March 31, 2006: 

Table 7: Results of the 6-Month Periods Ending on September 30, 2005 
and March 31, 2006, Commander Surveys: 

Table 8: Results for the 6-Month Periods Ending on September 30, 2005, 
and March 31, 2006, Network Operations Leaders Survey: 

Table 9: NMCI End User Customer Satisfaction Survey Questions: 

Table 10: Navy Echelon II and Marine Corps Major Command Commander's 
Customer Satisfaction Incentive Survey's Questions: 

Table 11: Navy and Marine Corps Network Operations Leader's Customer 
Satisfaction Incentive Survey's Questions: 

Figures: 

Figure 1: Simplified Department of the Navy Organization Chart: 

Figure 2: Organizations Responsible for NMCI Management and Oversight: 

Figure 3: The Value of the NMCI Contract: 

Figure 4: Site Level Performance for SLA 225: 

Figure 5: Site Level Performance for SLA 328: 

Figure 6: Site Level Performance for SLA 102: 

Figure 7: Site Level Performance for SLA 107: 

Figure 8: Percentage of Seats Meeting SLA 101: 

Figure 9: Percentage of Seats Meeting SLA 103: 

Figure 10: Months in Which the Enterprisewide SLAs Were Met and Not Met 
between October 2004 and March 2006: 

Figure 11: Trend in the Percentage of Operational Seats Meeting Either 
the Full Payment or Full Performance Levels: 

Figure 12: Number of Seats Achieving Either the Full Payment or Full 
Performance Levels Versus the Number of Seats Eligible: 

Figure 13: Trends in End User Satisfaction Levels Related to Program 
Contractor Target Levels: 

Figure 14: Site Level Performance SLA 101: 

Figure 15: Site Level Performance for SLA 102: 

Figure 16: Site Level Performance for SLA 103: 

Figure 17: Enterprisewide Performance for SLA 103: 

Figure 18: Enterprisewide Performance for SLA 104: 

Figure 19: Site Level Performance for SLA 105: 

Figure 20: Enterprisewide Performance for SLA 106: 

Figure 21: Site Level Performance for SLA 107: 

Figure 22: Enterprisewide Performance for SLA 203: 

Figure 23: Site Level Performance for SLA 204: 

Figure 24: Enterprisewide Performance for SLA 204: 

Figure 25: Site Level Performance for SLA 206: 

Figure 26: Enterprisewide Performance for SLA 206: 

Figure 27: Site Level Performance for SLA 211: 

Figure 28: Enterprisewide Performance for SLA 211: 

Figure 29: Site Level Performance for SLA 225: 

Figure 30: Enterprisewide Performance for SLA 226: 

Figure 31: Site Level Performance for SLA 231: 

Figure 32: Enterprisewide Performance for SLA 231: 

Figure 33: Site Level Performance for SLA 324: 

Figure 34: Site Level Performance for SLA 325: 

Figure 35: Site Level Performance for SLA 328: 

Figure 36: Enterprisewide Performance for SLA 329:  

Figure 37: Site Level Performance for SLA 332: 

Figure 38: Enterprisewide Performance for SLA 333: 

Figure 39: Enterprisewide Performance for SLA 334: 

Figure 40: Enterprisewide Performance for SLA 336: 

Abbreviations: 

BAN/LAN: Base Area Network/Local Area Network: 

BSO: Budget Submitting Office: 

CIO: Chief Information Officer: 

CNO: Chief of Naval Operations: 

DOD: Department of Defense: 

EDS: Electronic Data Systems: 

IAVA: Information Assurance Vulnerability Alert: 

INFOCON: Information Operations Condition: 

IT: information technology: 

LISI: Levels of Information Systems Interoperability: 

MAC: Move Add Change: 

MCNOSC: Marine Corps Network Operations and Security Command: 

MTDN: Marine Corps Tactical Data Network: 

NETWARCOM: Network Warfare Command: 

NIPRNET: Unclassified but Sensitive Internet Protocol Router Network: 

NMCI: Navy Marine Corps Intranet: 

PEO-EIS: Program Executive Officer for Enterprise Information Systems: 

PKI: Public Key Infrastructure: 

SIPRNET: Secret Internet Protocol Router Network: 

SLA: service level agreement: 

December 8, 2006: 

Congressional Addressees: 

The Navy Marine Corps Intranet (NMCI) program is a multiyear 
information technology (IT) services program; its goals are to provide 
information superiority and to foster innovation via interoperability 
and shared services. The Navy awarded the NMCI services contract-- 
currently valued at $9.3 billion--to Electronic Data Systems (EDS) in 
October 2000. The contract calls for EDS to replace thousands of 
independent networks, applications, and other hardware and 
software[Footnote 1] with a single, internal communications network 
(intranet), and associated desktop, server, and infrastructure assets 
and services for Navy and Marine Corps customers (end users, network 
operators, and commanders). 

Because of the size and importance of NMCI, as well as continuing 
widespread congressional interest, we prepared this report under the 
Comptroller General's authority as part of a continued effort to assist 
Congress and reviewed (1) whether the program is meeting its strategic 
goals, (2) the extent to which the contractor is meeting its service 
level agreements (SLA),[Footnote 2] (3) whether customers are satisfied 
with the program, and (4) what is being done to improve customer 
satisfaction. 

To accomplish these objectives, we reviewed program documentation, 
analyzed performance data (including those related to SLAs and customer 
satisfaction surveys), reviewed collection processes and results, met 
with customers at several large NMCI sites (Navy shipyards and air 
depots) to discuss their level of satisfaction, and interviewed 
officials from the program office, the Navy's Chief Information 
Officer's (CIO) office, and EDS. We performed our work from April 2005 
to August 2006, in accordance with generally accepted government 
auditing standards. Details on our objectives, scope, and methodology 
are in appendix I. 

Results in Brief: 

After investing about 6 years and $3.7 billion in NMCI, the Navy has 
yet to meet the program's two strategic goals--to provide information 
superiority and to foster innovation. A plan that the Navy developed in 
2000 to measure various aspects of the program, and thereby gauge 
program goal attainment, has not been implemented, and associated 
performance reports have not been issued. According to Navy officials, 
implementing this plan has not been as high a priority as, for example, 
deploying NMCI and measuring contractor performance. While program 
officials told us that NMCI has achieved much, they were unable to 
provide performance data to demonstrate these achievements relative to 
either the program's strategic goals or the nine performance categories 
that its 2000 performance measurement plan and other initiatives 
defined for these goals. Given this, we mapped contractor performance 
targets and data to the nine performance categories and strategic 
goals, which prompted the Navy to do the same. The Navy's mapping shows 
that NMCI has met only 3 of 20 performance targets (15 percent). This 
means that the mission-critical information superiority and operational 
innovation outcomes used to justify NMCI have yet to be attained. 

NMCI contractor performance in meeting SLAs depends on how satisfaction 
of the agreements is measured and presented. When we analyzed 
performance relative to operational "seats" since September 2004, 
without regard to the operational status of any site,[Footnote 3] we 
determined that while EDS had largely met many of the agreements, it 
had not consistently met others, and still other agreements were 
generally not being met. For example, during March 2006, EDS met its 
agreement to resolve customer problems reported to the help desk for 91 
percent of the basic seats, but did not meet this agreement for 52 
percent of the mission-critical seats.[Footnote 4] According to the 
Navy, it does not measure SLA performance in this manner. Instead, it 
measures agreement performance as defined in the contract for purposes 
of determining contract incentive payments. Using this approach, the 
Navy reports that, as of March 2006, the contractor achieved "full 
payment" or "full performance," which are levels of performance that 
qualify for increased payments, for approximately 55 percent of the 
"eligible" seats. In contrast, the Navy reports that these performance 
levels were met for about 94 percent of eligible seats in June 2005. 
These views on agreement performance illustrate that, by having robust 
performance management efforts and considering a range of perspectives 
and metrics, important performance insights can be identified and used. 

NMCI customers, which the Navy divides into three groups--end users, 
organizational commanders, and network operators--vary in the extent to 
which they are satisfied with the program's performance. With respect 
to end users, the Navy reports that the percentage satisfied with NMCI 
rose from about 54 percent in December 2002 to about 80 percent in 
September 2005. However, the rate of improvement dropped off after June 
2004, and the percentage of end users that the Navy considers to be 
satisfied is below the Navy-wide target of 85 percent. Moreover, the 
percentage of end users considered to be satisfied includes many survey 
responses at the lower end of the range of scores that the Navy defines 
as "satisfied." With respect to 
commander and network operator satisfaction, the latest Navy data show 
that these two customer groups are not satisfied. For example, on a 
scale of 0 to 3, with 0 being dissatisfied and 1 being slightly 
satisfied, commanders' responses averaged 0.8 and operators' responses 
averaged 0.3. In addition, officials representing customer groups at 
five shipyard or air depot installations that we visited expressed a 
number of concerns and areas of dissatisfaction with NMCI. For example, 
they told us that they have had to continue using their existing IT 
systems to support daily operations because NMCI does not adequately 
meet their needs. Without satisfied customers, the Navy runs the risk 
that NMCI will not attain the widespread acceptance necessary to ever 
achieve strategic program goals. 

NMCI program officials told us that improving customer satisfaction is 
a program priority and thus they have invested and continue to invest 
time and resources in a variety of improvement activities. For example, 
they said that they have expanded NMCI capabilities in a number of 
ways, such as the implementation of broadband remote access. However, 
these improvement efforts are not being guided by a documented plan or 
plans with prioritized initiatives that are defined in terms of 
activities to be performed, resources to be committed, schedules to be 
met, and measurable results to be achieved. Instead, officials told us 
that because they have limited resources, they undertake improvement 
activities that have not been prioritized whenever resources become 
available. Given the importance of NMCI customer satisfaction, it is 
important to take a structured and disciplined approach to managing 
improvement activities. Without it, the program office cannot 
adequately ensure that improvement activities are cost effectively 
managed. 

To assist the Navy in managing and making informed investment decisions 
about the NMCI program, we are making recommendations to the Secretary 
of Defense aimed at implementing effective program performance 
management, expanding measurement and understanding of SLA performance, 
effectively managing customer satisfaction improvement efforts, and 
deciding whether performance to date warrants changes to the program. 

In written comments on a draft of this report, the Department of 
Defense (DOD) stated that it agreed with our recommendations. 
Nevertheless, the department also said that the Navy believes that the 
draft report contained factual errors, data misinterpretations, and 
unsupported conclusions. In this regard, the Navy generally made five 
points. 

* It said that our review focused on Navy shipyards and air depots and 
excluded Marine Corps sites. We disagree. Our scope, as stated 
throughout the report, extended to both Navy and Marine Corps sites and 
customers. 

* The Navy said that NMCI is a strategic success and is meeting its 
goals of providing information superiority and fostering innovation. We 
disagree. As we show in the report, the Navy's own performance targets, 
along with SLA and other performance data, show that NMCI has met only 
3 of 20 performance targets associated with its two goals. Meeting 
program strategic goals, in our view, should be the measure of a 
program's strategic success. 

* The Navy said that we misinterpreted SLA data as they relate to the 
contractually specified performance categories of full payment and full 
performance. We disagree. Our use of SLA data relative to the full 
payment and full performance categories presents the Navy's own 
analysis and includes no GAO interpretations. The analysis of SLA data 
that we performed and included in the report decouples these data from 
these two performance categories and offers more visibility into and 
coverage of contractor performance relative to each individual SLA. 

* The Navy said that our conclusion that certain customers were 
marginally satisfied is not supported by the survey responses, which 
the Navy contends can only be viewed as indicating either satisfied or 
dissatisfied customers. While we acknowledge that the Navy treats 
responses of 5.5 or higher on a 1-10 point scale as indicating 
satisfied customers, our point is that this view is too simplistic 
because it does not differentiate between degrees of satisfaction. 
Therefore, our characterization of responses of 5.5 to 7 as marginally 
satisfied provides additional insight and perspective into customers' 
true level of satisfaction. 

* The Navy said the program office adequately reports to key program 
decision makers. We disagree, as evidenced by the fact that this 
reporting has not conveyed the range and magnitude of performance and 
customer satisfaction issues that our report contains. 

Beyond these major points, the Navy also provided various technical 
comments, which we have incorporated as appropriate in this report. 

Background: 

The Department of the Navy is a large and complex organization with a 
wide range of mission operations and supporting business functions. For 
example, the Navy has about 350,000 active duty officers and enlisted 
personnel, 130,000 ready reserve, and 175,000 civilian employees. 
The Navy's fleet operations involve approximately 280 ships and 4,000 
aircraft operating throughout the world. Further, the Navy's annual 
operating budget is about $120 billion and is used to fund such things 
as ship and aircraft operations, air depot maintenance, and Marine 
Corps operations. 

The department's primary organizational components are the Secretary of 
the Navy, the Chief of Naval Operations, and the Commandant of the 
Marine Corps. The structural relationships among these components are 
summarized later and in figure 1. 

Figure 1: Simplified Department of the Navy Organization Chart: 

[See PDF for image] - graphic text: 

Source: GAO based on Navy data. 

[End of figure] - graphic text: 

* Secretary of the Navy: Department of the Navy headquarters recruits, 
organizes, supplies, equips, trains, and mobilizes naval forces. Among 
other things, this includes construction, outfitting, and repair of 
Navy and Marine Corps ships, equipment, and facilities. It also 
includes formulating and implementing policies and programs. 

* Naval and Marine Corps Operating Forces: The operating forces 
commanders and fleet commanders have two chains of command. 
Administratively, they report to the Chief of Naval Operations, and are 
responsible for providing, training, and equipping naval forces. 
Operationally, they provide naval forces and report to the appropriate 
Unified Combatant Commanders. The operating forces include a variety of 
organizations with diverse missions, such as the Atlantic and Pacific 
Fleets, Naval Network Warfare Command, and Naval Reserve Forces. 

* Naval shore establishment: The Navy shore establishment includes 
facilities and activities for repairing machinery, electronics, ships, 
and aircraft; providing communications capabilities; providing 
training; providing intelligence and meteorological support; storing 
repair parts, fuel, and munitions; and providing medical support. It 
consists of organizations such as the Naval Sea Systems Command (which 
includes shipyards), Naval Air Systems Command (which includes aviation 
depots), Space and Naval Warfare Systems Command, Navy Personnel 
Command, Naval Education and Training Command, and the Office of Naval 
Intelligence. 

The Navy's many and dispersed organizational components rely heavily on 
IT to help them perform their respective mission operations and 
business functions. For fiscal year 2006, the Navy's IT budget was 
about $5.8 billion, which included funding for the development, 
operation, and maintenance of Navy-owned IT systems, as well as funding 
for contractor-provided IT services and programs, such as NMCI. 

The Assistant Secretary of the Navy for Research, Development and 
Acquisition is responsible for Navy acquisition programs. Reporting to 
the Assistant Secretary are numerous entities that have authority, 
responsibility, and accountability for life-cycle management of 
acquisition programs within their cognizance. These entities include 
certain program managers, system commands, and program executive 
officers. 

The Navy Chief Information Officer (CIO) is responsible for developing 
and issuing IT management policies and standards in coordination with 
the above Assistant Secretary, the system commands, and others. The 
Navy CIO is also responsible for ensuring that major programs comply 
with the Clinger-Cohen Act (1996)[Footnote 5] and for recommending to 
the Secretary of the Navy whether to continue, modify, or terminate IT 
programs, such as NMCI. 

NMCI Purpose, Scope, and Status: 

NMCI is a major, Navy-wide IT services program. Its goals are to 
provide information superiority--an uninterrupted information flow and 
the ability to exploit or deny an adversary's ability to do the same-- 
and to foster innovative ways of operating through interoperable and 
shared network services. The program is being implemented through a 
multiyear IT services contract that is to provide desktop, server, 
infrastructure, and communications-related services at Navy and Marine 
Corps sites located in the United States and Japan. Through this 
contract, the Navy is replacing independent local and wide area 
networks with a single network and related desktop hardware and 
software that are owned by the contractor. Among other things, the 
contractor is to provide voice, video, and data services; 
infrastructure improvements; and customer service. This type of 
contract is commonly referred to as "seat management." Generally 
speaking, under seat management, contractor-owned desktop and other 
computing hardware, software, and related services are bundled and 
provided on the basis of a fixed price per unit (or seat). 

In October 2000, the Navy's goal was to have between 412,000 and 
416,000 seats operational by fiscal year 2004. As of June 2006, the 
Navy reported that about 303,000 seats were operational at about 550 
sites. According to the Navy, initial delays in meeting deployment 
schedules were due to underestimating the number of legacy applications 
in its existing inventory that needed to be migrated to NMCI. Subsequent 
delays were attributed to developing and implementing a certification 
and accreditation process[Footnote 6] for all applications, as well as 
legislation[Footnote 7] requiring certain analyses to be completed 
before seat deployment could exceed specific levels. 

The number of seats at each site ranges from a single seat to about 
10,000. These sites include small sites, such as office facilities 
located throughout the United States, and large sites, such as 
shipyards and air depots, which use unique software to assist in repair 
work.[Footnote 8] 

NMCI Program Management Structure: 

Various organizations in the Navy are responsible for NMCI management 
and oversight (see fig. 2). The Program Executive Officer for 
Enterprise Information Systems (PEO-EIS) along with the NMCI Program 
Manager are responsible for NMCI acquisition and contract management. 
The program is also overseen and supported by several groups. One is 
the Navy's Information Executive Committee, which provides guidance 
for, and oversight of, NMCI and other information issues. The committee 
is made up of CIOs from a range of Navy commands, activities, offices, 
and other entities within the Navy. Another is the NMCI Executive 
Committee, which includes representatives of the heads of a broad cross 
section of organizations throughout the Navy, and the contractor. Its 
mission is to help in the review, oversight, and management of the 
Navy's implementation of NMCI, as well as to assist in identifying and 
resolving process and policy impediments within the Navy that hinder an 
efficient and effective implementation process. Additionally, the 
Network Warfare Command (NETWARCOM)[Footnote 9] and the Marine Corps 
Network Operations and Security Command (MCNOSC),[Footnote 10] are the 
two entities primarily responsible for network operations management in 
the Navy and Marine Corps, respectively. The Navy CIO is responsible 
for overall IT policy. 

Figure 2: Organizations Responsible for NMCI Management and Oversight: 

[See PDF for image] - graphic text: 

Source: GAO based on Navy data. 

[End of figure] - graphic text: 

NMCI Contract Description: 

On October 6, 2000, the Navy awarded a 5-year contract for NMCI 
services to a single service provider--EDS--for an estimated 412,000 to 
416,000 seats and minimum value of $4.1 billion. The original contract 
also included a 3-year option for an additional $2.8 billion in 
services, bringing the potential total contract value to $6.9 billion. 
The department and EDS subsequently restructured the contract to be a 7-
year, $6 billion contract with a 3-year option for an additional $2.8 
billion beginning in fiscal year 2008. Following further contract 
restructuring and the Navy's decision to exercise the 3-year option, 
the total contract period and minimum value is now 10 years and about 
$9.3 billion. Figure 3 illustrates the value of the NMCI contract. 

Figure 3: The Value of the NMCI Contract: 

[See PDF for image] - graphic text: 

Source: GAO analysis of Navy data. 

[End of figure] - graphic text: 

The NMCI contract type is commonly referred to as seat management 
because pricing for the desktop services is based on a fixed price per 
"seat." Seats include desktop computers, as well as other devices, such 
as cellular phones. Pricing for these seats varies depending on the 
services provided. For example, having classified connectivity, mission-
critical service, additional user accounts, or additional software 
installation increases the amount paid per seat. 

The NMCI contract is performance-based, which means that it contains 
monetary incentives to provide services at specified levels of quality 
and timeliness. The contract includes several types of incentives, 
including incentives tied to SLA performance and to customer 
satisfaction surveys. 

SLAs: 

The contract currently specifies 23 SLAs divided into three tiers: 100 
SLAs, 200 SLAs, and 300 SLAs. The 100 tier is referred to as base 
agreements, the 200 as transitional agreements, and the 300 as 
additional agreements. Examples of agreements for each tier are 
provided below. 

* 100--End user services (SLA 103): 

* 200--Web access services (SLA 206): 

* 300--Network management services (SLA 328): 

SLAs are further categorized as enterprisewide, site-specific, or both. 
Unlike site-specific SLAs, enterprisewide SLAs are not analyzed on a 
site-by-site basis. See table 1 for a list of agreements organized by 
tier and category. 

Table 1: List of SLAs Organized by Tier and Category: 

Base agreements: 

SLA number and name: 101-End user problem resolution; 
Site-specific: X; 
Enterprisewide: [Empty]. 

SLA number and name: 102-Network problem resolution; 
Site-specific: X; 
Enterprisewide: [Empty]. 

SLA number and name: 103-End user services; 
Site-specific: X; 
Enterprisewide: X. 

SLA number and name: 104-Help desk; 
Site-specific: [Empty]; 
Enterprisewide: X. 

SLA number and name: 105-Move, add, change; 
Site-specific: X; 
Enterprisewide: [Empty]. 

SLA number and name: 106-Information assurance incentives; 
Site-specific: [Empty]; 
Enterprisewide: X. 

SLA number and name: 107-NMCI intranet; 
Site-specific: X; 
Enterprisewide: [Empty]. 

Transitional agreements:  

SLA number and name: 203-E-mail services; 
Site-specific: [Empty]; 
Enterprisewide: X. 

SLA number and name: 204-Directory services; 
Site-specific: X; 
Enterprisewide: X. 

SLA number and name: 206-Web access services; 
Site-specific: X; 
Enterprisewide: X. 

SLA number and name: 211-Unclassified but Sensitive Internet Protocol 
Router Network (NIPRNET) access; 
Site-specific: X; 
Enterprisewide: X. 

SLA number and name: 225-Base area network/local area network 
communications services; 
Site-specific: X; 
Enterprisewide: [Empty]. 

SLA number and name: 226-Proxy and caching services; 
Site-specific: [Empty]; 
Enterprisewide: X. 

SLA number and name: 231-System service - Domain name server; 
Site-specific: X; 
Enterprisewide: X. 

Additional agreements:  

SLA number and name: 324-Wide area network connectivity; 
Site-specific: X; 
Enterprisewide: [Empty]. 

SLA number and name: 325-Base area network/local area network 
communications services; 
Site-specific: X; 
Enterprisewide: [Empty]. 

SLA number and name: 328-Network management service; 
Site-specific: X; 
Enterprisewide: [Empty]. 

SLA number and name: 329-Operational support services; 
Site-specific: [Empty]; 
Enterprisewide: X. 

SLA number and name: 332-Application server connectivity; 
Site-specific: X; 
Enterprisewide: [Empty]. 

SLA number and name: 333-Security operational services; 
Site-specific: [Empty]; 
Enterprisewide: X. 

SLA number and name: 334-Information assurance operational service-PKI; 
Site-specific: [Empty]; 
Enterprisewide: X. 

SLA number and name: 336-Information assurance planning services; 
Site-specific: [Empty]; 
Enterprisewide: X. 

Source: GAO analysis of NMCI SLA data. 

[End of table] 

Each agreement has one or more performance categories. For example, SLA 
102 has 1 performance category (Network Problem Resolution), while SLA 
107 has 3 performance categories (NMCI Intranet Availability, Latency/ 
Packet Loss, and Voice and Video Quality of Service). Collectively, 
there are 51 performance categories. 

Each performance category has specific performance targets that the 
contractor must reach in order for the category to be met. An example 
of a target is providing e-mail server services to users 99.7 percent 
of the time that they are supposed to be available. 
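
To illustrate how such a target might be evaluated, the following 
sketch (a hypothetical Python example; the function name and the 
assumption that availability is computed as uptime divided by scheduled 
service time are ours, not the contract's) checks a measured month of 
service against the 99.7 percent availability target cited above. 

    def availability_met(uptime_minutes, scheduled_minutes, target=0.997): 
        """Return True if measured availability meets or exceeds the target. 
        Assumes availability = uptime / scheduled service time; the NMCI 
        contract may define the measurement differently.""" 
        if scheduled_minutes == 0: 
            return False 
        return (uptime_minutes / scheduled_minutes) >= target 

    # Example: a 30-day month with 90 minutes of unscheduled downtime. 
    scheduled = 30 * 24 * 60 
    print(availability_met(scheduled - 90, scheduled))  # True (about 99.8 percent) 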

The contract currently specifies two levels of performance to be used 
in determining, on a site-by-site basis, what performance-based payment 
incentives, if any, EDS will earn in a given quarter (3-month 
period).[Footnote 11] If neither of these levels of performance is 
met, the contractor is to be paid 85 percent of the amount allowed 
under the contract for each seat that has been cut over (i.e., is 
operational). 

1. Full payment. To achieve this level for a given seat, the contractor 
must meet 100 percent of the applicable SLAs for that seat, and 50 to 
90 percent of the planned seats at the site must be cut over. Meeting a 
quarterly agreement is defined as performance at or above the 
applicable target(s) for either (1) 2 out of the 3 months preceding an 
invoice or (2) the current month of the invoice. If these conditions 
are met, the contractor is paid 100 percent of the amount allowed per 
seat. If, in subsequent months, the contractor fails to achieve 100 
percent of the agreements, the amount paid is 85 percent of the amount 
allowed per seat. 

2. Full performance. To achieve this level for a given seat, the 
contractor must meet 100 percent of the applicable SLAs for that seat, 
and over 90 percent of the planned seats at the site must be cut over. 
Meeting an agreement is defined as performance at or above the 
target(s) for either (1) 2 out of the 3 months preceding a quarterly 
invoice or (2) the current month of the invoice. If these conditions 
are met, the contractor is paid 100 percent of the amount allowed per 
seat. Once a site has achieved full performance, it remains eligible 
for full payments, regardless of changes to the numbers of seat orders. 
However, the contractor is required to provide "financial credits" to 
the Navy in the event that the agreements are not met at some future 
time. 
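
A minimal sketch of the payment logic described above follows; it is an 
illustrative Python rendering, not the contract's actual invoicing 
algorithm, and it omits details such as a site's continued eligibility 
for full payments after reaching full performance and the financial 
credits owed if agreements are later missed. All function and variable 
names are ours. 

    def sla_met_for_quarter(preceding_three_months, current_month): 
        """An SLA is met for invoicing purposes if performance was at or above 
        the applicable target(s) in at least 2 of the 3 months preceding the 
        invoice, or in the current month of the invoice (boolean inputs).""" 
        return sum(preceding_three_months) >= 2 or current_month 

    def seat_payment_fraction(sla_results, cutover_pct): 
        """Return the fraction of the allowed per-seat amount to be paid. 
        sla_results: one (preceding_three_months, current_month) pair per 
        applicable SLA for the seat; cutover_pct: percentage of planned 
        seats at the site that have been cut over.""" 
        all_slas_met = all(sla_met_for_quarter(prev, cur) for prev, cur in sla_results) 
        if all_slas_met and cutover_pct > 90: 
            return 1.00   # full performance 
        if all_slas_met and 50 <= cutover_pct <= 90: 
            return 1.00   # full payment 
        return 0.85       # neither level achieved 

    # Example: both applicable SLAs met and 95 percent of planned seats cut over. 
    results = [([True, True, False], False), ([False, True, False], True)] 
    print(seat_payment_fraction(results, 95))  # 1.0 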

Customer Satisfaction Surveys: 

The contract also provides for administration of three customer 
satisfaction surveys: End User, Echelon II/ Major Command,[Footnote 12] 
and Network Operations Leaders. These surveys and their related 
financial incentives are discussed below. 

End User Satisfaction Survey: 

The contractor began conducting quarterly satisfaction surveys of Navy 
end users in June 2002 and Marine Corps end users in March 2005. 
These surveys are administered to a different mix of 25 percent of 
eligible users[Footnote 13] each quarter, with nearly all users being 
surveyed each year. 

Since March 2004, the survey has consisted of 14 questions, with 4 
relating to satisfaction with the NMCI program[Footnote 14] and 10 
focusing on satisfaction with EDS.[Footnote 15] For each question, 
users are asked to indicate their level of dissatisfaction/satisfaction 
according to a 10-point scale, with 1-5 denoting levels of 
dissatisfaction, and 6-10 denoting levels of satisfaction. The Navy 
considers end users to be satisfied in general, with the program, or 
with the contractor, if the average response across the 14, 4, or 10 
questions, respectively, is 5.5 or higher. The survey instrument also 
includes space for additional comments and asks the end users to 
identify and rank reasons for dissatisfaction or suggestions for 
improvements. See table 2 for a list of the 14 questions. 

Table 2: NMCI End User Customer Satisfaction Survey Questions: 

What is your satisfaction. 

* With having access to the computer hardware you need to accomplish 
your job? 

With the dependability of the computer you use? 

* With having access to the software you need to accomplish your job? 

With network reliability? 

With the professionalism of EDS personnel? 

With finding and using information about NMCI services? 

With the accuracy of information describing how to use NMCI services? 

* With training on how to use NMCI effectively? 

With technical support services provided by the help desk? 

With technical support services provided by on-site personnel? 

With the timeliness of problem resolution? 

With the solution implemented to correct any problem you experienced? 

* With the process to make changes to your IT environment? 

What is your overall satisfaction with services provided by EDS? 

Source: March 2006 Quarterly Customer Satisfaction Survey Report. 

Note: Questions marked with an asterisk are not used for incentive 
purposes: 

[End of table] 

Based on the quarterly survey results, the contractor is eligible for 
an incentive payment of $12.50 per seat if 85 to 90 percent of the 
average responses are 5.5 or higher, and $25 per seat if greater than 90 
percent respond in this way. No incentive is to be paid if fewer than 
85 percent respond as being satisfied. 
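
As a hypothetical illustration of the scoring and incentive rules just 
described (the function names and example data are ours), the sketch 
below classifies a respondent as satisfied when the average of the 10 
incentive-question responses is 5.5 or higher and then maps the 
resulting satisfaction percentage to the per-seat incentive. 

    def is_satisfied(responses, threshold=5.5): 
        """A respondent counts as satisfied if the average of the responses 
        to the 10 incentive questions (1-10 scale) is 5.5 or higher.""" 
        return sum(responses) / len(responses) >= threshold 

    def end_user_incentive_per_seat(all_respondents): 
        """Map the percentage of satisfied respondents to the quarterly 
        per-seat incentive: under 85 percent earns nothing, 85 to 90 
        percent earns $12.50, and more than 90 percent earns $25.""" 
        satisfied_pct = 100 * sum(is_satisfied(r) for r in all_respondents) / len(all_respondents) 
        if satisfied_pct > 90: 
            return 25.00 
        if satisfied_pct >= 85: 
            return 12.50 
        return 0.00 

    # Example: of 100 respondents, 88 average 6.0 and 12 average 4.0. 
    respondents = [[6.0] * 10] * 88 + [[4.0] * 10] * 12 
    print(end_user_incentive_per_seat(respondents))  # 12.5 (88 percent satisfied) 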

Echelon II (Navy) and Major Command (Marine Corps) Commander Survey and 
Network Operations Leader Survey: 

In October 2004, the Navy designated two additional categories of 
customers--commanders and network operations leaders--and developed 
separate satisfaction surveys for each. In general, the commander 
survey focuses on whether NMCI is adequately supporting a command's 
mission needs and strategic goals; the network operations leader survey 
focuses on whether the contractor is meeting certain operational 
network requirements. The surveys are administered every 6 months. 

The latest commander survey was distributed to the heads of 23 Navy and 
Marine Corps command units. The network operations leader survey was 
distributed to NETWARCOM and MCNOSC. 

Both surveys are organized by major topic and subtopic. For the 
commander survey, the major topics and subtopics are as follows: 

* Warfighter support--including classified network support, deployable 
support, and emergent requirement support. 

* Cutover services--including planning, preparation, and execution. 

* Technical solutions--including the new service order and delivery 
process, and technical performance. 

* Service delivery--including organizational understanding, customer 
service, and issue management. 

For the network operations leader surveys, the major topics and 
subtopics are as follows: 

* Mission support and planning--including interoperability support, 
continuity of operations, future readiness, and public key 
infrastructure. 

* Network management--including network status information, information 
assurance, urgent software patch implementation, and data management. 

* Service delivery--including organizational understanding, 
communications, issue management, and flexibility and responsiveness. 

Appendix II provides a complete listing of the questions included in 
the commander survey and the network operations leader survey. 

Responses to the questions in both surveys are solicited on a scale of 
0-3, with 0 being dissatisfied and 3 being extremely satisfied. To 
aggregate the respective surveys' results, the Navy averages the 
responses by command unit and by network operations unit. 

Based on the 6-month survey results, the contractor is eligible for an 
incentive payment of up to $50 per seat, with average scores of less 
than 0.5 receiving no incentive, 0.5 to less than 1.5 receiving 25 
percent of the incentive, 1.5 to less than 2.25 receiving 50 
percent of the incentive, and at least 2.25 receiving 100 percent of 
the incentive. 
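
The tiered incentive just described can be summarized in the following 
sketch, an illustrative Python rendering of the thresholds in the 
preceding paragraph rather than the contract's actual calculation; the 
function and variable names are ours. 

    def leadership_incentive_per_seat(average_score, max_incentive=50.00): 
        """Map an average survey score on the 0-3 scale to the per-seat 
        incentive: below 0.5 earns nothing, 0.5 to less than 1.5 earns 25 
        percent, 1.5 to less than 2.25 earns 50 percent, and 2.25 or 
        higher earns 100 percent of the incentive.""" 
        if average_score >= 2.25: 
            fraction = 1.00 
        elif average_score >= 1.5: 
            fraction = 0.50 
        elif average_score >= 0.5: 
            fraction = 0.25 
        else: 
            fraction = 0.00 
        return fraction * max_incentive 

    # Example: the commander (0.8) and network operator (0.3) averages 
    # cited earlier in this report. 
    print(leadership_incentive_per_seat(0.8))  # 12.5 (25 percent of $50) 
    print(leadership_incentive_per_seat(0.3))  # 0.0 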

Previous GAO Work on NMCI: 

We have reported on a number of NMCI issues since the program's 
inception. For example, in March 2000, we reported that the Navy's 
acquisition approach and implementation plan had a number of 
weaknesses, and thus introduced unnecessary program risk. In 
particular, we said that the Navy lacked a plan for addressing many 
program requirements and information on NMCI's potential impacts on 
Navy personnel.[Footnote 16] 

In October 2002, we reported that NMCI's transition costs for shipyards 
and air depots were unclear, which in turn limited the ability of such 
industrially funded entities to set the future rates that they would 
charge their customers.[Footnote 17] Accordingly, we recommended that 
the program, in collaboration with the Naval Sea Systems Command and 
the Naval Air Systems Command, systematically and expeditiously resolve 
implementation issues that affect the ability of shipyards and depots 
to plan and budget. In response to these recommendations, the Navy took 
a number of actions, including establishing an Executive Customer Forum 
to, among other things, adjudicate issues requiring collaborative 
decision making among Navy component CIOs, including those from the 
Naval Sea Systems Command and the Naval Air Systems Command, which 
represent Navy shipyards and air depots, respectively. 

In April 2003, we reported on the extent to which five DOD IT services 
projects, including NMCI, had followed leading commercial outsourcing 
practices.[Footnote 18] For NMCI, we found that while the Navy had 
employed most of these practices, it did not follow the key practice 
related to establishing an accurate baseline of the existing IT 
environment, choosing instead to rely on a preexisting and dated 
inventory of its legacy applications. Because of this, we concluded 
that the Navy substantially underestimated the number of legacy 
applications that needed to transition to NMCI, in turn causing the 
program's time frame for transitioning to slip considerably. We 
recommended that DOD take steps to learn from such lessons, so that 
such mistakes are not repeated on future IT outsourcing projects. 

Navy Has Not Met NMCI Strategic Goals and Has Not Focused on Measuring 
Strategic Program Outcomes: 

Consistent with relevant laws and guidance, the Navy defined strategic 
goals for its NMCI program and developed a plan for measuring and 
reporting on achievement of these goals. However, the Navy did not 
implement this plan, choosing instead to focus on defining and 
measuring contractually specified SLAs. According to Navy officials, 
implementing the goal-oriented plan was not a priority, compared with 
swiftly deploying NMCI seats and measuring satisfaction of contract 
provisions. While program officials told us that NMCI has produced 
considerable mission value and achieved much, they did not have 
performance data to demonstrate progress in relation to either the 
program's strategic goals or nine performance categories that its plan 
and related efforts defined relative to these goals. Given this, we 
mapped SLAs to the nine performance categories and two strategic goals, 
which prompted the Navy to do the same. The Navy's mapping shows that 
NMCI has met few of the categories' performance targets, and thus has 
yet to meet either of the strategic goals. This means that the mission- 
critical information superiority and operational innovation outcomes 
that were used to justify investment in NMCI have yet to be attained. 
Without effective performance management, the Navy is increasing the 
risk that the program will continue to fall short of its goals and 
expected results. 

Navy Developed a Performance Management Plan to Measure and Report NMCI 
Progress in Meeting Strategic Goals but Did Not Implement It: 

Various laws--such as the Government Performance and Results Act and 
the Clinger-Cohen Act--require federal agencies to identify and report on 
mission and strategic goals, associated performance measures, and 
actual performance. Federal IT guidance[Footnote 19] also recognizes 
the importance of defining program goals and related measures and 
performance targets, as well as determining the extent to which 
targets, measures, and goals are being met. 

In initiating NMCI, the Navy established two strategic goals for the 
program. According to the Navy, the program's primary goal is to 
support "information superiority," which it characterizes as "providing 
the capability to collect, process, and disseminate an uninterrupted 
flow of information while exploiting or denying an adversary's ability 
to do the same." In this regard, NMCI was to create an integrated 
network in which connectivity among all parts of the shore 
establishment, and with all deployed forces at sea and ashore, enables 
all members of the network to collaborate freely, share information, 
and interoperate with other services and nations. The second goal is to 
"foster innovation" by providing an interoperable and shared services 
"environment that supports innovative ways of integrating doctrine and 
tactics, training, and supporting activities into new operational 
capabilities and more productive ways of using resources." Related to 
these goals, the Navy also cited significant benefits that were to 
accrue from NMCI, including (1) an uninterrupted flow of information; 
(2) improvements to interoperability, security, information assurance, 
knowledge sharing, productivity, and operational performance; and (3) 
reduced costs. 

To determine its progress in meeting these program goals and producing 
expected benefits, the Navy included a performance measurement plan in 
its "2000 Report to Congress" on NMCI. According to the Navy, the 
purpose of this 2000 performance measurement plan was to document its 
approach to ensuring that key NMCI outcomes (i.e., results and 
benefits) and measures were identified and collected. In this regard, 
the plan identified eight strategic performance measurement categories, 
and related them to the NMCI strategic program goals. Subsequently, the 
Navy added a ninth performance category. According to program office 
and the Navy CIO officials, the nine performance categories are all 
relevant to determining program performance and strategic goal 
attainment. Moreover, the plan states that these categories provide for 
making NMCI an integrated portion of the Navy and Marine Corps 
strategic vision, support the principles of using IT to support people, 
and focus on the mission value of technology. 

These nine categories, including the Navy's definition of each, are as 
follows: 

* Interoperability: ability to allow Navy systems and applications to 
communicate and share information with, and for providing services to 
and accepting services from, other military services. 

* Security and information assurance: compliance with relevant DOD, 
Navy, and Marine Corps information assurance policies and procedures. 

* Workforce capabilities: ability to (1) increase people's access to 
information, (2) provide tools and develop people's skills for 
obtaining and sharing information, and (3) support a knowledge-centric 
and -sharing culture that is built on mutual trust and respect. 

* Process improvement: role as a strategic enabler for assessment and 
benchmarking of business and operational processes, and for sharing of 
data, information, applications, and knowledge. 

* Operational performance: ability to support improved mission 
(operational and business) performance. 

* Service efficiency: economic effectiveness (i.e., its cost versus 
services and benefits). 

* Customer satisfaction: key stakeholders' (e.g., end users) degree of 
satisfaction. 

* Program management: ability to (1) meet the seat implementation 
schedule and the NMCI budget, (2) achieve specified levels of network 
performance, and (3) proactively manage program risks. 

* Network operations and maintenance: includes such things as virus 
detection and repair, upgradeability, scalability, maintainability, 
asset management, and software distribution. 

The performance plan also included metrics, targets, and comparative 
baselines that were to be used for the first annual performance report, 
although it noted that progress in meeting some performance targets 
would not be measured until after contract award and that some of the 
cited measures could at some point cease to provide useful information 
for making decisions, while others may need to be collected 
continuously. The plan also stated that the Navy would fully develop 
performance measures for each of the categories and that it would 
produce an annual report on NMCI's performance in each of the 
categories. 

However, the Navy has not implemented its 2000 performance management 
plan. For example, the Navy did not develop performance measures for 
each of the performance categories and has not reported annually on 
progress against performance targets, categories, and goals. Instead, 
Navy officials told us that they focused on defining and measuring 
progress against contractually specified SLAs, deploying NMCI seats, 
and reducing the number of Navy applications that are to run on NMCI 
workstations. According to these officials, measuring progress against 
the program's strategic goals was not a priority. 

Because measurement of goal attainment has not been the Navy's focus to 
date, when we sought (from both the program office and the Navy CIO 
office) performance data demonstrating progress in meeting NMCI's 
strategic goals and performance categories, the Navy was unable to 
provide data in this context. Instead, these officials said that data 
were available relative to contract performance, to include SLA 
performance levels and customer satisfaction survey results. Given 
this, we mapped the available contract-related performance data to the 
nine performance categories and targets and provided our analysis to 
the program office and the Navy CIO office. The Navy provided 
additional performance data and revisions to our mappings. Our analysis 
of the Navy-provided mapping, including associated fiscal year 2005 
data, is discussed in the next section. 

NMCI Strategic Goals and Associated Performance Category Targets Have 
Not Been Met: 

The Navy has not fully met any of its performance categories associated 
with achieving NMCI strategic goals and realizing program benefits. For 
example, the performance category of "Program management" has four 
performance targets relative to cost, schedule, performance, and risk. 
For fiscal year 2005, the NMCI program met one of the performance 
targets. It did not meet the other three targets and thus did not meet 
this performance category. Overall, the Navy defined 20 targets for the 
9 performance categories. Of these 20, the Navy met 3, did not meet 13, 
and was unable to determine if it met 4. The specific performance 
targets for each performance category are described below, along with 
performance in fiscal year 2005 against each target. Table 3 summarizes 
the number of targets met and not met for each category. 

Table 3: NMCI Satisfaction of Performance Targets for Each Performance 
Category for Fiscal Year 2005: 

Performance area: Interoperability; 
Number of targets: 3; 
Targets met: 1; 
Targets not met: 1; 
Unable to determine: 1. 

Performance area: Security/information assurance; 
Number of targets: 2; 
Targets met: 0; 
Targets not met: 2; 
Unable to determine: 0. 

Performance area: Workforce capabilities; 
Number of targets: 3; 
Targets met: 1; 
Targets not met: 1; 
Unable to determine: 1. 

Performance area: Process improvement; 
Number of targets: 2; 
Targets met: 0; 
Targets not met: 1; 
Unable to determine: 1. 

Performance area: Operational performance; 
Number of targets: 1; 
Targets met: 0; 
Targets not met: 1; 
Unable to determine: 0. 

Performance area: Service efficiency; 
Number of targets: 2; 
Targets met: 0; 
Targets not met: 1; 
Unable to determine: 1. 

Performance area: Customer satisfaction; 
Number of targets: 1; 
Targets met: 0; 
Targets not met: 1; 
Unable to determine: 0. 

Performance area: Program management; 
Number of targets: 4; 
Targets met: 1; 
Targets not met: 3; 
Unable to determine: 0. 

Performance area: Network operations and maintenance; 
Number of targets: 2; 
Targets met: 0; 
Targets not met: 2; 
Unable to determine: 0. 

Total; 
Number of targets: 20; 
Targets met: 3; 
Targets not met: 13; 
Unable to determine: 4. 

Source: GAO analysis of Navy data. 

[End of table] 

Interoperability: The Navy defined information systems 
interoperability, critical joint applications interoperability, and 
operational testing targets as its measures of this category. For 
fiscal year 2005, it met the information systems interoperability 
target. However, it did not meet the critical joint applications 
interoperability target, and it could not determine whether it met the 
operational testing target because of insufficient data. 

* Information systems interoperability: The target was to be level 2 on 
the DOD Levels of Information Systems Interoperability (LISI) 
Scale.[Footnote 20] The Navy reports that NMCI was a level 2. 

* Critical joint applications interoperability: The target was for all 
critical joint applications to be interoperable with NMCI.[Footnote 21] 
In fiscal year 2005, the Navy did not transition all of its critical 
joint applications to NMCI. Moreover, of the 13 applications that were 
fully or partially transitioned, one was determined not to be 
interoperable. 

* Operational testing: The target was to be "Potentially Operationally 
Effective" and "Potentially Operationally Suitable." However, Navy 
reported that the Joint Interoperability Test Command operational 
testing did not produce sufficient data to determine this. 

Security and information assurance: The Navy identified SLAs and 
information assurance incentive targets as its measures of this 
category. For fiscal year 2005, it did not meet either target. 

* SLAs: The target was to meet 100 percent of all security-related 
agreements. The Navy reported that it met this target during 4 months 
of the fiscal year but did not meet it for 8 months, including the last 
6 months of the fiscal year. 

* Information assurance incentives: The target was to have the 
contractor earn 100 percent of the incentive each year. However, the 
contractor did not earn 100 percent of the incentive for the last 6 
months of this fiscal year. 

Workforce capabilities: The Navy defined the reduction of civilian IT 
workforce, percentage of workforce with access to NMCI, and the number 
of professional certifications as its measures of this category. For 
fiscal year 2005, it reported that it met the reduction of civilian IT 
workforce target but did not meet the percent of workforce with access 
target and could not determine whether it met the professional 
certifications target. 

* Reduction of civilian IT workforce: The target was to have a zero 
reduction in its civilian IT workforce. The Navy reported that it met 
this target. 

* Percent of workforce with access: The target was for 100 percent of 
its workforce to have access. As of September 30, 2005, 82 percent of 
the applicable workforce had a seat. 

* Number of professional certifications: While Navy officials stated 
that the target is professional certifications, they could not provide 
a measurable target. Therefore, it cannot be determined whether the 
target was met. 

Process improvement: The Navy defined certain customer survey and 
technology refreshment targets as its measures of this category. For 
fiscal year 2005, the Navy did not meet the leadership survey target 
and could not determine whether it met the technology refreshment 
target. 

* Information from customer surveys: The target was to have the 
contractor earn 100 percent of the Echelon II survey and the Network 
Operations Leaders' survey incentives. However, the contractor earned 
25 percent of the incentive for the Echelon II survey, and 0 percent of 
the incentive for the Network Operations Leaders' survey in fiscal year 
2005. 

* Technology refreshment: While Navy officials stated that the target 
is technology refreshment, they could not provide measurable targets. 
Therefore, it cannot be determined whether the target was met. 

Operational performance: The Navy identified information from the 
Network Operations Leaders' survey as its target for measuring this 
category. For fiscal year 2005, it did not meet this target. 

* Network Operations Leaders' survey: The target was for the contractor 
to earn 100 percent of the Network Operations Leaders' survey 
incentive. The contractor earned 0 percent of the incentive in fiscal 
year 2005. 

Service efficiency: The Navy defined SLA performance and cost/service 
ratio per seat targets as measures of this category. For fiscal year 
2005, the Navy did not meet the SLA performance target, and it could 
not determine if it met the cost/service ratio per seat target. 

* SLA performance: The target was to have 100 percent of seats at the 
full performance or full payment level. As of September 2005, the Navy 
reported that 82 percent of seats achieved full payment or full 
performance. This is down from March 2005, when the Navy reported that 
96 percent of seats achieved full payment or full performance. 

* Cost/service per seat: The target was for the cost/service ratio 
per seat not to exceed what it was prior to NMCI. According to the 
Navy, while the per seat cost for NMCI is higher, the service level is 
also higher. However, the Navy did not have sufficient information to 
determine if the target was met. 

Customer satisfaction: The Navy identified information from the end 
user satisfaction survey as a target for measuring this category. It 
did not meet this target in fiscal year 2005. 

* Customer satisfaction survey: The target was to have 85 percent of 
NMCI end users satisfied. However, the percentage of users reported as 
satisfied from December 2004 through September 2005 ranged from 75 to 
80 percent. 

Program management: The Navy defined cost, schedule, performance, and 
risk-related performance targets as measures of this category. For 
fiscal year 2005, it reports that it met the cost target because it did 
not obligate more than 100 percent of available NMCI funding but did 
not meet the schedule, performance, and risk targets. 

* Cost: The target was to obligate up to 100 percent of program funds 
on NMCI in fiscal year 2005. The Navy reports that it obligated 97 
percent of these funds in this fiscal year. Program officials stated 
that the other 3 percent was spent on legacy IT infrastructure. 

* Schedule: The target was to deploy all seats that were scheduled for 
deployment in fiscal year 2005. The Navy reports that it deployed 77 
percent of these scheduled seats. 

* Performance: The target was to have 100 percent of eligible seats at 
full payment or full performance. The Navy reports that, as of 
September 2005, 82 percent of the seats achieved full payment or full 
performance. 

* Risk: The target is to be "green" in all risk areas.[Footnote 22] The 
Navy reports that it was "yellow" in several risk areas, such as 
schedule and organizational change management. 

Network operations and maintenance: The Navy defined SLA performance, 
leadership survey results, and technology refreshment targets for 
measuring this category. For fiscal year 2005, it did not meet the SLA 
performance or the leadership survey results targets. Further, it could 
not determine if it met the technology refreshment target. 

* SLA performance: The target was to have 100 percent of eligible seats 
at either full payment or full performance. As of September 2005, the 
Navy reported that 82 percent of seats were achieving full payment or 
full performance. This is down from March 2005, when the Navy reported 
that 96 percent of seats achieved full payment or full performance. 

* Leadership survey results: The target was to have the contractor earn 
100 percent of both the Echelon II and Network Operations Leaders' 
survey incentives. Through September 30, 2005, the contractor earned 25 
percent of the Echelon II incentive, and 0 percent of the operator's 
incentive. 

Notwithstanding the above-described performance relative to performance 
category targets and strategic goals, Navy CIO and program officials 
described the program as a major success. CIO officials, for example, 
stated that NMCI has significantly improved the Navy's IT environment, 
and will increase productivity through greater knowledge sharing and 
improved interoperability. They also stated that a review and 
certification process for all applications deployed on the network has 
been implemented and thus compliance with security and interoperability 
requirements has been ensured. According to these officials, NMCI's 
value has been demonstrated repeatedly over the last few years. In this 
regard, they cited the following examples but did not provide 
verifiable data to support them. 

* Improved security through continuous security assessments, a 
centralized distribution of vulnerability information, configuration 
control of critical servers, and an improved response to new 
vulnerabilities/threats. 

* Improved continuity of operations (e.g., the Navy reports that it had 
no prolonged disruptions due to recent hurricanes and fires on the West 
Coast). 

* Increased personnel training and certification by increasing the 
number of offerings. 

* Identified opportunities for improving efficiency through the use of 
performance metrics. 

* Improved software and hardware asset management and implementation of 
standard and secure configurations. 

* Provided pier-side (waterfront) connectivity and Navy-wide public key 
infrastructure.[Footnote 23] 

The Navy's mapping of fiscal year 2005 data to performance categories 
and targets as summarized above shows that the NMCI program has not yet 
met either of its strategic goals. Specifically, the information 
superiority and innovation goals that were used to justify the program 
have yet to be attained. Further, although the Navy developed a plan to 
measure and report on NMCI progress in meeting the strategic goals, 
this plan was not implemented. As a result, the development and 
reporting of program performance relative to strategic goals has not 
occurred. 

Contractor Has Largely Met Many but Has Not Met Other SLAs: 

Our analysis of Navy contractor performance data since September 2004 
shows that the extent to which the site-specific agreements have been 
met for all operational seats (regardless of site) varies widely by 
individual agreement, with some always being met but others having 
varied performance over time and by seat type. Our analysis also showed 
that, although the contractor has met most of the enterprisewide 
agreements during this time period, it has not met a few. The Navy's 
analysis and reporting of contractor performance relative to the SLAs, 
using data for the same time period, showed that the percentage of 
operational seats meeting the agreements averaged about 89 percent from 
March 2005 to September 2005, then declined to 74 percent in October 
2005 and averaged about 56 percent between November 2005 and March 
2006. These differences illustrate how contractor performance against 
the agreements can be viewed differently depending on how available 
data are analyzed and presented. 
They also illustrate the importance of having a comprehensive, 
transparent, and consistent approach to program performance management 
that considers a range of perspectives and metrics. 

Contractor Satisfaction of SLAs Has Varied by Agreement and Seat Type, 
with Not All Agreements Being Met: 

For the period beginning October 2004 and ending March 2006, the 
contractor's performance relative to site-specific SLAs has varied, 
with certain agreements consistently being met regardless of seat type, 
other agreements being met to varying degrees over time, and still 
others largely not being met for certain seat types.[Footnote 24] 
Variability in performance has also occurred for enterprisewide 
agreements, although most have been met. 

Significant Percentage of All Applicable Seat Types Have Met Certain 
Site-Specific Agreements: 

Between October 2004 and March 2006, the contractor has met, or usually 
met, the agreement for each seat type for many SLAs. For example, the 
contractor met SLA 324, which covers wide area network connectivity, 
for all seat types all of the time. Also, SLA 325, covering network 
communication services, and SLA 332, measuring application server 
connectivity, were met for all seat types over the same time period. 
SLA 225, which measures base area network and local area network 
performance, was met for essentially all seat types (see fig. 4). 
Similarly, SLA 328, which measures the time to implement new seats and 
application servers, was met for 94 percent or more of deployed seat 
types from January 2005 through March 2006 (see fig. 5). (See app. III 
for descriptions of each SLA and figures illustrating levels of 
performance relative to each applicable seat type.) 

Figure 4: Site Level Performance for SLA 225: 

[See PDF for image] - graphic text: 

Source: GAO analysis of Navy data. 

[End of figure] - graphic text: 

Figure 5: Site Level Performance for SLA 328: 

[See PDF for image] - graphic text: 

Source: GAO analysis of Navy data. 

[End of figure] - graphic text: 

Certain Site-Specific Agreements Have Not Been Consistently Met Over 
Time: 

The contractor has not consistently met certain agreements between 
October 2004 and March 2006. For example, satisfaction of SLA 102, 
which covers response time for network problem resolution, has ranged 
from a high of 100 percent in March 2005 and June 2005 to a low of 79 
percent in February 2006. As of March 2006, this SLA was met by 97 
percent of all seat types (see fig. 6). Also, satisfaction of SLA 107, 
which is a measure of network performance in areas of availability, 
latency/packet loss,[Footnote 25] and quality of service in support of 
videoconferencing and voice-over-IP, has varied over time. 
Specifically, satisfaction has ranged from a high of 99 percent in 
January 2006 to a low of 71 percent in January 2005. As of March 2006, 
this agreement was met by 90 percent of all seat types (see fig. 7). 

Figure 6: Site Level Performance for SLA 102: 

[See PDF for image] - graphic text: 

Source: GAO analysis of Navy data. 

[End of figure] - graphic text: 

Figure 7: Site Level Performance for SLA 107: 

[See PDF for image] - graphic text: 

Source: GAO analysis of Navy data. 

[End of figure] - graphic text: 

Significant Percentage of All Seat Types Have Not Met Certain Site- 
Specific Agreements: 

Between October 2004 and March 2006, the contractor has not met certain 
agreements for all seat types. For example, for SLA 101, which is a 
measure of the time it takes to resolve NMCI user issues, the 
percentage of seats meeting the agreement has widely varied. 
Specifically, the percentage of mission-critical seats that met the 
agreement has been consistently and significantly lower than was the 
case for the basic or high end seats. In particular, as of March 2006, 
SLA 101 was met for about 90 percent of basic seats, 77 percent of high 
end seats, and 48 percent of mission-critical seats (see fig. 8). 
Similarly, for SLA 103, which is a measure of performance of end user 
services, the percentage of basic seats that met the agreement was 
consistently and significantly lower than that of high end or mission- 
critical seats. In March 2006, SLA 103 was met for about 63 percent of 
basic seats, 74 percent of high end seats, and 86 percent of mission- 
critical seats (see fig. 9). 

Figure 8: Percentage of Seats Meeting SLA 101: 

[See PDF for image] - graphic text: 

Source: GAO analysis of Navy data. 

[End of figure] - graphic text: 

Figure 9: Percentage of Seats Meeting SLA 103: 

[See PDF for image] - graphic text: 

Source: GAO analysis of Navy data. 

[End of figure] - graphic text: 

Most Enterprisewide Agreements Have Been Met, but a Few Have Not: 

The contractor generally met most of the SLAs that have enterprisewide 
applicability. In particular, of the 13 such SLAs, 8 were met each 
month between October 2004 and March 2006, and another was met all but 
1 month during this time period. Further, a tenth SLA was met for 14 
out of the 18 months during this period. 

However, the contractor has not consistently met 3 of the 13 
enterprisewide SLAs. Specifically, SLA 103, which covers end user 
services, was not met 12 of the 18 months. SLA 104, which covers the 
help desks, was not met 11 out of the 18 months, including 8 out of the 
last 9 months of this period. SLA 106, which covers information 
assurance services including identifying incidents, responding to 
incidents, and configuration of NMCI, was not met for 11 out of 18 
months, including the last 9 months of the period. (See fig. 10 for a 
summary of the months in which the contractor met and did not meet the 
enterprisewide SLAs.) 

Figure 10: Months in Which the Enterprisewide SLAs Were Met and Not Met 
between October 2004 and March 2006: 

[See PDF for image] - graphic text: 

Source: GAO analysis of Navy data. 

[End of figure] - graphic text: 

Contractor Satisfaction of SLAs Relative to Contractually Defined 
Performance Levels Has Varied: 

NMCI program officials told us that they measure the contractor's SLA- 
related performance in terms of the percentage of eligible seats that 
have met the contractual definitions of full payment and full 
performance. More specifically, they compare the number of seats on a 
site-by-site basis that have met these definitions with the number of 
seats that are eligible. As discussed earlier, full payment means that 
the contractor has met 100 percent of the applicable agreements at a 
given site, and 50 to 90 percent of the planned seats at that site have 
been cut over (i.e., are operational). Full performance means that the 
contractor has met 100 percent of the applicable agreements at a given 
site, and over 90 percent of the planned seats at that site have been 
cut over. In effect, this approach focuses on performance for only 
those seats that are at sites where at least 50 percent of the planned 
number of seats are actually operating. It excludes performance at 
sites where less than 50 percent of the ordered seats are operating. 
Moreover, it combines the results for all SLAs and, therefore, does not 
highlight differences in performance among service areas. 
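
To illustrate the classification rules just described, the following 
Python sketch shows one way the site-level determinations, and the 
resulting percentage of eligible seats, could be computed. It is only 
an illustrative sketch: the data structure (a list of site records 
with planned seats, cut-over seats, and per-SLA results), the field 
names, and the treatment of cut-over seats as the eligible seats are 
assumptions for this example and do not represent the Navy's or the 
contractor's actual systems or data. 

def classify_site(planned_seats, cutover_seats, slas_met):
    # slas_met is a list of booleans, one per applicable SLA at the site.
    cutover_pct = 100.0 * cutover_seats / planned_seats
    if cutover_pct < 50:
        return "excluded"          # sites below 50 percent cutover are not counted
    if not all(slas_met):
        return "not met"           # one or more applicable agreements missed
    if cutover_pct > 90:
        return "full performance"  # all agreements met, over 90 percent cut over
    return "full payment"          # all agreements met, 50 to 90 percent cut over

def percent_meeting_either_level(sites):
    # Percentage of eligible (operational) seats at either performance level.
    eligible = achieved = 0
    for site in sites:
        level = classify_site(site["planned"], site["cutover"], site["slas"])
        if level == "excluded":
            continue
        eligible += site["cutover"]
        if level in ("full payment", "full performance"):
            achieved += site["cutover"]
    return 100.0 * achieved / eligible if eligible else 0.0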

For the period beginning in October 2004 and ending in March 2005, the 
contractor's performance in meeting the agreements from a contractual 
standpoint increased, with the percentage of operational seats that met 
either performance level having jumped markedly between October and 
December 2004 (about 5 to 65 percent), then generally increasing to a 
high of about 96 percent in March 2005. Since then, the percentage of 
seats meeting either of the two performance levels fluctuated between 
82 and 94 percent through September 2005 and then decreased to 74 
percent in October 2005. From November 2005 through March 2006, the 
percentage of seats meeting either performance level decreased to 55 
percent. (See fig. 11 for the trend in the percentage of operational 
seats meeting either the full payment or full performance levels; see 
fig. 12 for the number of seats achieving either performance level 
versus the number eligible for doing so for the same time period.) 

Figure 11: Trend in the Percentage of Operational Seats Meeting Either 
the Full Payment or Full Performance Levels: 

[See PDF for image] - graphic text: 

Source: GAO analysis of Navy data. 

[End of figure] - graphic text: 

Figure 12: Number of Seats Achieving Either the Full Payment or Full 
Performance Levels Versus the Number of Seats Eligible: 

[See PDF for image] - graphic text: 

Source: GAO analysis of Navy data. 

[End of figure] - graphic text: 

The preceding descriptions of SLA performance illustrate that 
contractor performance against the agreements can be viewed differently 
depending on how relevant data are analyzed and presented. Further, 
they illustrate the importance of considering different perspectives 
and metrics in order to have a comprehensive, transparent, and 
consistent approach to program performance management. 

NMCI Customer Groups' Satisfaction Levels Vary, but Overall Customer 
Satisfaction Is Low: 

The Navy's three groups of NMCI customers--end users, organizational 
commanders, and network operators--vary in the extent to which they are 
satisfied with the program, but collectively these customers are 
generally not satisfied. With respect to end users, the Navy reports 
that overall satisfaction with NMCI improved between 2003 and 2005; 
however, reported satisfaction levels have dropped off since September 
2005. In addition, while the Navy reports that this overall level of 
end user satisfaction with contractor-provided services has averaged 
about 76 percent since April 2004,[Footnote 26] this is below the Navy-
wide target of 85 percent and includes many survey responses at the 
lower end of the range of scores that the Navy counts as "satisfied." 
With respect to commanders and network operations leaders, 
neither is satisfied with NMCI. In addition, officials representing 
each of the customer groups at five shipyard or air depot installations 
that we visited expressed a number of NMCI concerns and areas of 
dissatisfaction with the program. Without satisfied customers, the Navy 
runs the risk that NMCI will not attain the widespread acceptance 
necessary to achieve strategic program goals. 

End User Surveys Show Dissatisfaction with NMCI: 

Despite reported improvements in end user satisfaction levels since 
2002, end user responses to quarterly satisfaction surveys have been 
consistently at the low end of the range of scores that the Navy 
defines as "satisfied," and the percentage of end users that the Navy 
counts as satisfied has consistently been below the 
Navy's satisfaction target level. Specifically, although the Navy's 
satisfied users dropped from about 66 percent in June 2002 to around 54 
percent for the next two quarters (September and December 2002), 
satisfaction reportedly rose steadily from March 2003 through September 
2005, peaking at that time at about 80 percent. Since then, the 
percentage of end users that the Navy reports to be satisfied has 
declined, leveling off at around 76 percent over the next several 
months.[Footnote 27] This means that even with the Navy's forgiving 
definition of what constitutes a satisfied end user, at least 24 
percent of end users are dissatisfied with NMCI. (See fig. 13 for the 
trends in end user satisfaction with the program and the contractor.) 

Figure 13: Trends in End User Satisfaction Levels Related to Program 
and Contractor Target Levels: 

[See PDF for image] - graphic text: 

Source: GAO analysis of Navy data. 

Note: Survey participants varied over time. 

[End of figure] - graphic text: 

Exacerbating this long-standing shortfall in meeting end user 
satisfaction expectations is the fact that the Navy considers a 
"satisfied" end user to include users that are at best marginally 
satisfied and arguably somewhat dissatisfied. That is, the Navy uses an 
average score of 5.5 or greater (on its 10-point satisfaction scale, 
where 1 is dissatisfied, and 10 is satisfied) as the threshold for 
categorizing and counting end users as satisfied. This means that users 
counted as satisfied may include a large contingent that are at the low 
end of the satisfaction range (e.g., between 5.5 and 7). The results 
of the March 2006 survey, examined in this context, show that this is 
the case: 8 of the 14 questions received an average score below 7.0. 
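
To make the counting rule concrete, the short Python sketch below 
restates the 5.5-or-greater threshold described above; it uses 
hypothetical scores rather than actual survey data, and the function 
name and example values are illustrative assumptions only. 

def percent_satisfied(respondent_scores, threshold=5.5):
    # respondent_scores: per-respondent average scores on the 10-point scale.
    satisfied = sum(1 for score in respondent_scores if score >= threshold)
    return 100.0 * satisfied / len(respondent_scores)

# A respondent averaging 5.6 is counted the same as one averaging 9.0,
# which is why the reported percentages can mask many marginally
# satisfied users.
example_scores = [5.6, 6.0, 9.0, 4.2, 7.1]   # hypothetical values
print(percent_satisfied(example_scores))      # 80.0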

Additional insights into the degree and nature of end user satisfaction 
(and dissatisfaction) are apparent when the reported percentage of 
satisfied users are examined from different perspectives, such as by 
(1) individual survey questions and (2) organizational units. For 
example, Navy-reported end user satisfaction survey results for the 
quarter ending March 31, 2006, show that while the percentage of users 
deemed satisfied with the program averaged about 74 percent, the 
percentage reported as satisfied relative to each survey question 
ranged from a low of 52 to a high of 87 percent. These insights into end 
user sources of satisfaction and dissatisfaction are summarized as 
follows: 

* Variations in satisfaction levels by question. While the percentages 
of end users who are categorized as satisfied with the program and with 
the contractor do not significantly differ (74 versus 76 percent, 
respectively), variations do exist in the percentage satisfied with 
the 14 areas that the questions address. For example, far fewer (66 
percent) were satisfied with the reliability of the NMCI network than 
were satisfied with the professionalism of EDS personnel (87 percent). 
(See table 4 for the percentage of users satisfied and dissatisfied 
according to each of the 14 survey questions.) 

* Variations in satisfaction levels by organizational unit. The 
percentage of end users who were categorized as being satisfied with 
the NMCI program varied by organizational unit as much as 18 percentage 
points. For example, about 66 percent of users in the Naval Sea Systems 
Command were deemed satisfied with the program as compared with about 
84 percent in the Commander of Navy Installations. Similarly, the 
percentage of end users who were categorized as satisfied with the 
contractor also varied by 17 percentage points, with the Naval Sea 
Systems Command and Naval Air Systems Command having about 69 percent 
of their users viewed as satisfied and the Commander of Navy 
Installations having about 86 percent. (See tables 5 and 6 for 
percentages of satisfied end users by Navy and Marine Corps, 
respectively, organizations as of March 31, 2006.) 

Table 4: NMCI End User Customer Satisfaction Survey Questions and 
Results for the Quarterly Period Ending on March 31, 2006: 

Survey questions: With the process to make changes to your IT 
environment?[A]; 
Average score: 5.5; 
Percentage not satisfied: 48%; 
Percentage satisfied: 52%. 

Survey questions: With training on how to use NMCI effectively?[A]; 
Average score: 6.5; 
Percentage not satisfied: 32; 
Percentage satisfied: 68. 

Survey questions: With having access to the software you need to 
accomplish your job?[A]; 
Average score: 6.6; 
Percentage not satisfied: 33; 
Percentage satisfied: 67. 

Survey questions: With having access to the computer hardware you need 
to accomplish your job?[A]; 
Average score: 7.0; 
Percentage not satisfied: 26; 
Percentage satisfied: 74. 

Survey questions: With network reliability?; 
Average score: 6.4; 
Percentage not satisfied: 34; 
Percentage satisfied: 66. 

Survey questions: With the timeliness of problem resolution?; 
Average score: 6.6; 
Percentage not satisfied: 32; 
Percentage satisfied: 68. 

Survey questions: With the dependability of the computer you use?; 
Average score: 6.8; 
Percentage not satisfied: 29; 
Percentage satisfied: 71. 

Survey questions: What is your overall satisfaction with services 
provided by EDS?; 
Average score: 6.8; 
Percentage not satisfied: 27; 
Percentage satisfied: 73. 

Survey questions: With the solution implemented to correct any problem 
you experienced?; 
Average score: 7.0; 
Percentage not satisfied: 27; 
Percentage satisfied: 73. 

Survey questions: With finding and using information about NMCI 
services?; 
Average score: 7.0; 
Percentage not satisfied: 23; 
Percentage satisfied: 77. 

Survey questions: With technical support services provided by the help 
desk?; 
Average score: 7.2; 
Percentage not satisfied: 25; 
Percentage satisfied: 75. 

Survey questions: With the accuracy of information describing how to 
use NMCI services?; 
Average score: 7.1; 
Percentage not satisfied: 22; 
Percentage satisfied: 78. 

Survey questions: With technical support services provided by on-site 
personnel?; 
Average score: 7.1; 
Percentage not satisfied: 25; 
Percentage satisfied: 75. 

Survey questions: With the professionalism of EDS personnel?; 
Average score: 8.0; 
Percentage not satisfied: 13%; 
Percentage satisfied: 87%. 

Source: GAO based on Navy-provided data. 

[A] Responses to these questions were not used to determine levels of 
satisfaction with contractor-provided services. 

Note: Scores shown may reflect rounding decisions made by the 
Department of the Navy regarding the results of its calculations. 

[End of table] 

Table 5: Percentages of Satisfied End Users by Navy Budget Submitting 
Office, as of March 31, 2006: 

Navy budget submitting offices: Naval Sea Systems Command; 
Percentage satisfied with NMCI program: 66%; 
Percentage satisfied with contractor- provided services: 69%. 

Navy budget submitting offices: Naval Air Systems Command; 
Percentage satisfied with NMCI program: 67; 
Percentage satisfied with contractor- provided services: 69. 

Navy budget submitting offices: Naval Facilities Engineering Command; 
Percentage satisfied with NMCI program: 68; 
Percentage satisfied with contractor-provided services: 71. 

Navy budget submitting offices: Space and Naval Warfare Systems 
Command; 
Percentage satisfied with NMCI program: 69; 
Percentage satisfied with contractor-provided services: 71. 

Navy budget submitting offices: Chief of Naval Operations; 
Percentage satisfied with NMCI program: 72; 
Percentage satisfied with contractor- provided services: 75. 

Navy budget submitting offices: Administrative Assistant to the Under 
Secretary of the Navy; 
Percentage satisfied with NMCI program: 75; 
Percentage satisfied with contractor-provided services: 76. 

Navy budget submitting offices: Commander, U.S. Pacific Fleet; 
Percentage satisfied with NMCI program: 77; 
Percentage satisfied with contractor-provided services: 78. 

Navy budget submitting offices: Reserve Forces; 
Percentage satisfied with NMCI program: 78; 
Percentage satisfied with contractor-provided services: 81. 

Navy budget submitting offices: Manpower, Personnel, Training and 
Education; 
Percentage satisfied with NMCI program: 79; 
Percentage satisfied with contractor-provided services: 81. 

Navy budget submitting offices: Commander, U.S. Atlantic Fleet; 
Percentage satisfied with NMCI program: 79; 
Percentage satisfied with contractor-provided services: 80. 

Navy budget submitting offices: Aggregated Navy Budget Submitting 
Offices[A]; 
Percentage satisfied with NMCI program: 80; 
Percentage satisfied with contractor-provided services: 81. 

Navy budget submitting offices: Naval Supply Systems Command; 
Percentage satisfied with NMCI program: 80; 
Percentage satisfied with contractor-provided services: 82. 

Navy budget submitting offices: Commander, Navy Installations; 
Percentage satisfied with NMCI program: 84%; 
Percentage satisfied with contractor-provided services: 86%. 

Source: GAO based on Navy provided data. 

[A] Includes the Bureau of Medicine, Military Sealift Command, Navy 
Engineering Logistics Office, Naval Meteorology and Oceanography 
Command, Office of Naval Intelligence, Office of Naval Research, and 
the Naval Security Group. 

Note: Scores shown may reflect rounding decisions made by the 
Department of the Navy regarding the results of its calculations. 

[End of table] 

Table 6: Percentages of Satisfied End Users by U.S. Marine Corps Major 
Command, as of March 31, 2006: 

Marine Corps Major Commands: Aggregated Marines[A]; 
Percentage satisfied with NMCI program: 69%; 
Percentage satisfied with contractor- provided services: 72%. 

Marine Corps Major Commands: Training and Education Command; 
Percentage satisfied with NMCI program: 69; 
Percentage satisfied with contractor- provided services: 72. 

Marine Corps Major Commands: U.S. Marine Forces, Atlantic; 
Percentage satisfied with NMCI program: 70; 
Percentage satisfied with contractor- provided services: 72. 

Marine Corps Major Commands: Logistics Command; 
Percentage satisfied with NMCI program: 71; 
Percentage satisfied with contractor-provided services: 73. 

Marine Corps Major Commands: U.S. Marine Forces, Pacific; 
Percentage satisfied with NMCI program: 71; 
Percentage satisfied with contractor- provided services: 73. 

Marine Corps Major Commands: U.S. Marine Forces, Reserve; 
Percentage satisfied with NMCI program: 77%; 
Percentage satisfied with contractor- provided services: 81%. 

Source: GAO based on Navy-provided data. 

[A] Includes Enterprise USMC, Headquarters Marine Corps, Marine Corps 
Combat Development Center, Marine Corps Recruiting Command and Marine 
Corps Systems Command. Surveys were distributed to 1,671 of a total 
population of 6,685 end users in these Commands. 

Note: Scores shown may reflect rounding decisions made by the 
Department of the Navy regarding the results of its calculations. 

[End of table] 

Commander and Network Operator Surveys Show That Both Customer Groups 
Are Dissatisfied: 

The Navy conducted surveys of commander and network operations leader 
units in September 2005 and in March 2006. Overall, survey results show 
that neither commanders nor operators are satisfied with NMCI. 

Commander Survey Results: 

The results from the two commander satisfaction surveys conducted to 
date show that these customers are not satisfied with NMCI. 
Specifically, on a scale of 0-3 with 0 being not satisfied, and 1 being 
slightly satisfied with the contractor's support in meeting the mission 
needs and strategic goals of these organizations, the average response 
from all organizations was 0.65 and 0.76 in September 2005 and March 
2006, respectively. The latest survey results show minor differences in 
the degree of dissatisfaction with the four types of contractor 
services addressed (cutover services, technical solutions, service 
delivery, and warfighter support). (See table 7 for results of the 
September 2005, and March 2006, commander satisfaction surveys.) 
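
To aid in reading table 7, the following Python sketch shows how a 
per-organization average on the 0-3 scale can be computed when some 
questions receive no response; the helper function is a hypothetical 
illustration, not the Navy's scoring tool. Consistent with the table's 
legend, questions marked with "*" (no response) are excluded from the 
average. 

def organization_average(scores):
    # scores: integers 0-3, or "*" where no response was provided.
    answered = [s for s in scores if s != "*"]
    return round(sum(answered) / len(answered), 2) if answered else None

# Example consistent with one March 2006 row in table 7: responses of
# *, 2, 1, 2 average to 1.67 across the three answered questions.
print(organization_average(["*", 2, 1, 2]))   # 1.67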

Table 7: Results of the 6-Month Periods Ending on September 30, 2005 
and March 31, 2006, Commander Surveys: 

Reporting organization: Assistant for Administration to the Under 
Secretary of the Navy; 
September 2005: War-fighter support: *; 
September 2005: Cutover services: *; 
September 2005: Technical solutions: 2; 
September 2005: Service delivery: 2; 
September 2005: Average organization score: 2.00; 
March 2006: War-fighter support: 2; 
March 2006: Cutover services: 1; 
March 2006: Technical solutions: 1; 
March 2006: Service delivery: 1; 
March 2006: Average organization score: 1.25. 

Reporting organization: Bureau of Personnel; 
September 2005: War-fighter support: *; 
September 2005: Cutover services: 0; 
September 2005: Technical solutions: 1; 
September 2005: Service delivery: 1; 
September 2005: Average organization score: 0.67; 
March 2006: War-fighter support: **; 
March 2006: Cutover services: **; 
March 2006: Technical solutions: **; 
March 2006: Service delivery: **; 
March 2006: Average organization score: n/a. 

Reporting organization: Commander of Navy Installations; 
September 2005: War-fighter support: 1; 
September 2005: Cutover services: 0; 
September 2005: Technical solutions: 0; 
September 2005: Service delivery: 1; 
September 2005: Average organization score: 0.50; 
March 2006: War-fighter support: 1; 
March 2006: Cutover services: 0; 
March 2006: Technical solutions: 0; 
March 2006: Service delivery: 0; 
March 2006: Average organization score: 0.25. 

Reporting organization: Chief of Naval Operations (CNO); 
September 2005: War-fighter support: 1; 
September 2005: Cutover services: 1; 
September 2005: Technical solutions: 1; 
September 2005: Service delivery: 1; 
September 2005: Average organization score: 1.00; 
March 2006: War-fighter support: 0; 
March 2006: Cutover services: 1; 
March 2006: Technical solutions: 0; 
March 2006: Service delivery: 2; 
March 2006: Average organization score: 0.75. 

Reporting organization: CNO-Field Support Activity, Pacific Command; 
September 2005: War-fighter support: **; 
September 2005: Cutover services: **; 
September 2005: Technical solutions: **; 
September 2005: Service delivery: **; 
September 2005: Average organization score: n/a; 
March 2006: War-fighter support: 1; 
March 2006: Cutover services: 2; 
March 2006: Technical solutions: 1; 
March 2006: Service delivery: 1; 
March 2006: Average organization score: 1.25. 

Reporting organization: Commander, Atlantic Fleet; 
September 2005: War-fighter support: 0; 
September 2005: Cutover services: 0; 
September 2005: Technical solutions: 0; 
September 2005: Service delivery: 0; 
September 2005: Average organization score: 0.00; 
March 2006: War-fighter support: 0; 
March 2006: Cutover services: 0; 
March 2006: Technical solutions: 1; 
March 2006: Service delivery: 0; 
March 2006: Average organization score: 0.25. 

Reporting organization: Commander, Pacific Fleet; 
September 2005: War-fighter support: 1; 
September 2005: Cutover services: 0; 
September 2005: Technical solutions: 0; 
September 2005: Service delivery: 0; 
September 2005: Average organization score: 0.25; 
March 2006: War-fighter support: 0; 
March 2006: Cutover services: 0; 
March 2006: Technical solutions: 0; 
March 2006: Service delivery: 0; 
March 2006: Average organization score: 0.00. 

Reporting organization: Naval Air Systems Command; 
September 2005: War-fighter support: 0; 
September 2005: Cutover services: 0; 
September 2005: Technical solutions: 0; 
September 2005: Service delivery: 0; 
September 2005: Average organization score: 0.00; 
March 2006: War-fighter support: 0; 
March 2006: Cutover services: 0; 
March 2006: Technical solutions: 0; 
March 2006: Service delivery: 0; 
March 2006: Average organization score: 0.00. 

Reporting organization: Naval Facilities Engineering Command; 
September 2005: War-fighter support: 0; 
September 2005: Cutover services: 1; 
September 2005: Technical solutions: 1; 
September 2005: Service delivery: 0; 
September 2005: Average organization score: 0.50; 
March 2006: War-fighter support: 0; 
March 2006: Cutover services: 1; 
March 2006: Technical solutions: 1; 
March 2006: Service delivery: 0; 
March 2006: Average organization score: 0.50. 

Reporting organization: Naval Sea Systems Command; 
September 2005: War-fighter support: 0; 
September 2005: Cutover services: 0; 
September 2005: Technical solutions: 0; 
September 2005: Service delivery: 0; 
September 2005: Average organization score: 0.00; 
March 2006: War-fighter support: 0; 
March 2006: Cutover services: 0; 
March 2006: Technical solutions: 0; 
March 2006: Service delivery: 0; 
March 2006: Average organization score: 0.00. 

Reporting organization: Naval Supply Systems Command; 
September 2005: War-fighter support: 1; 
September 2005: Cutover services: 0; 
September 2005: Technical solutions: 0; 
September 2005: Service delivery: 0; 
September 2005: Average organization score: 0.25; 
March 2006: War-fighter support: 0; 
March 2006: Cutover services: 0; 
March 2006: Technical solutions: 0; 
March 2006: Service delivery: 0; 
March 2006: Average organization score: 0.00. 

Reporting organization: Office of Naval Research; 
September 2005: War-fighter support: **; 
September 2005: Cutover services: **; 
September 2005: Technical solutions: **; 
September 2005: Service delivery: **; 
September 2005: Average organization score: n/a; 
March 2006: War-fighter support: *; 
March 2006: Cutover services: 2; 
March 2006: Technical solutions: 1; 
March 2006: Service delivery: 2; 
March 2006: Average organization score: 1.67. 

Reporting organization: Reserve Forces; 
September 2005: War- fighter support: 1; 
September 2005: Cutover services: 2; 
September 2005: Technical solutions: 2; 
September 2005: Service delivery: 1; 
September 2005: Average organization score: 1.50; 
March 2006: War-fighter support: 1; 
March 2006: Cutover services: 2; 
March 2006: Technical solutions: 2; 
March 2006: Service delivery: 2; 
March 2006: Average organization score: 1.75. 

Reporting organization: Space and Naval Warfare Systems Command; 
September 2005: War-fighter support: 1; 
September 2005: Cutover services: *; 
September 2005: Technical solutions: 1; 
September 2005: Service delivery: 1; 
September 2005: Average organization score: 1.00; 
March 2006: War-fighter support: 2; 
March 2006: Cutover services: *; 
March 2006: Technical solutions: 1; 
March 2006: Service delivery: 2; 
March 2006: Average organization score: 1.67. 

Reporting organization: Logistics Command; 
September 2005: War-fighter support: 0; 
September 2005: Cutover services: 0; 
September 2005: Technical solutions: 0; 
September 2005: Service delivery: 0; 
September 2005: Average organization score: 0.00; 
March 2006: War-fighter support: 0; 
March 2006: Cutover services: 0; 
March 2006: Technical solutions: 0; 
March 2006: Service delivery: 0; 
March 2006: Average organization score: 0.00. 

Reporting organization: Manpower, Personnel, Education and Training; 
September 2005: War-fighter support: **; 
September 2005: Cutover services: **; 
September 2005: Technical solutions: **; 
September 2005: Service delivery: **; 
September 2005: Average organization score: n/a; 
March 2006: War-fighter support: 2; 
March 2006: Cutover services: 1; 
March 2006: Technical solutions: 2; 
March 2006: Service delivery: 2; 
March 2006: Average organization score: 1.75. 

Reporting organization: Headquarters Marine Corps; 
September 2005: War-fighter support: 0; 
September 2005: Cutover services: 0; 
September 2005: Technical solutions: 0; 
September 2005: Service delivery: 0; 
September 2005: Average organization score: 0.00; 
March 2006: War-fighter support: *; 
March 2006: Cutover services: 1; 
March 2006: Technical solutions: 1; 
March 2006: Service delivery: 0; 
March 2006: Average organization score: 0.67. 

Reporting organization: Marine Corps Combat Development Center; 
September 2005: War-fighter support: 1; 
September 2005: Cutover services: 2; 
September 2005: Technical solutions: 2; 
September 2005: Service delivery: 2; 
September 2005: Average organization score: 1.75; 
March 2006: War-fighter support: 0; 
March 2006: Cutover services: 1; 
March 2006: Technical solutions: 2; 
March 2006: Service delivery: 3; 
March 2006: Average organization score: 1.50. 

Reporting organization: Marine Corps Systems Command; 
September 2005: War-fighter support: 0; 
September 2005: Cutover services: 1; 
September 2005: Technical solutions: 0; 
September 2005: Service delivery: 0; 
September 2005: Average organization score: 0.25; 
March 2006: War-fighter support: 0; 
March 2006: Cutover services: 1; 
March 2006: Technical solutions: 1; 
March 2006: Service delivery: 0; 
March 2006: Average organization score: 0.50. 

Reporting organization: Marine Corps Recruiting Command; 
September 2005: War-fighter support: **; 
September 2005: Cutover services: **; 
September 2005: Technical solutions: **; 
September 2005: Service delivery: **; 
September 2005: Average organization score: n/a; 
March 2006: War-fighter support: *; 
March 2006: Cutover services: 0; 
March 2006: Technical solutions: 0; 
March 2006: Service delivery: 0; 
March 2006: Average organization score: 0.00. 

Reporting organization: Commander, Marine Forces; 
September 2005: War-fighter support: **; 
September 2005: Cutover services: **; 
September 2005: Technical solutions: **; 
September 2005: Service delivery: **; 
September 2005: Average organization score: n/a; 
March 2006: War-fighter support: 1; 
March 2006: Cutover services: 1; 
March 2006: Technical solutions: 1; 
March 2006: Service delivery: 1; 
March 2006: Average organization score: 1.00. 

Reporting organization: Marine Forces, Atlantic; 
September 2005: War-fighter support: 0; 
September 2005: Cutover services: 0; 
September 2005: Technical solutions: 0; 
September 2005: Service delivery: 0; 
September 2005: Average organization score: 0.00; 
March 2006: War-fighter support: **; 
March 2006: Cutover services: **; 
March 2006: Technical solutions: **; 
March 2006: Service delivery: **; 
March 2006: Average organization score: n/a. 

Reporting organization: Marine Forces, Pacific; 
September 2005: War-fighter support: 0; 
September 2005: Cutover services: 1; 
September 2005: Technical solutions: 1; 
September 2005: Service delivery: 1; 
September 2005: Average organization score: 0.75; 
March 2006: War-fighter support: 0; 
March 2006: Cutover services: 0; 
March 2006: Technical solutions: 0; 
March 2006: Service delivery: 0; 
March 2006: Average organization score: 0.00. 

Reporting organization: Marine Forces, Reserves; 
September 2005: War-fighter support: **; 
September 2005: Cutover services: **; 
September 2005: Technical solutions: **; 
September 2005: Service delivery: **; 
September 2005: Average organization score: n/a; 
March 2006: War-fighter support: 0; 
March 2006: Cutover services: 2; 
March 2006: Technical solutions: 0; 
March 2006: Service delivery: 0; 
March 2006: Average organization score: 0.50. 

Reporting organization: Military Sealift Command; 
September 2005: War-fighter support: 0; 
September 2005: Cutover services: 0; 
September 2005: Technical solutions: 0; 
September 2005: Service delivery: 0; 
September 2005: Average organization score: 0.00; 
March 2006: War-fighter support: **; 
March 2006: Cutover services: **; 
March 2006: Technical solutions: **; 
March 2006: Service delivery: **; 
March 2006: Average organization score: n/a. 

Reporting organization: Naval Education and Training Command; 
September 2005: War-fighter support: *; 
September 2005: Cutover services: 2; 
September 2005: Technical solutions: 2; 
September 2005: Service delivery: 2; 
September 2005: Average organization score: 2.00; 
March 2006: War-fighter support: **; 
March 2006: Cutover services: **; 
March 2006: Technical solutions: **; 
March 2006: Service delivery: **; 
March 2006: Average organization score: n/a. 

Reporting organization: Training and Education Command; 
September 2005: War-fighter support: 2; 
September 2005: Cutover services: 2; 
September 2005: Technical solutions: 0; 
September 2005: Service delivery: 1; 
September 2005: Average organization score: 1.25; 
March 2006: War-fighter support: *; 
March 2006: Cutover services: 3; 
March 2006: Technical solutions: 2; 
March 2006: Service delivery: 2; 
March 2006: Average organization score: 2.33. 

Overall satisfaction average; 
September 2005: Average organization score: 0.65; 
March 2006: Average organization score: 0.76. 

Legend: 

"*" no response was provided: 

"**" the organization was not included in survey report: 

Source: GAO based on Navy-provided data. 

[End of table] 

Network Operations Leaders Survey Results: 

The Navy-reported results of the two network operations leader 
satisfaction surveys conducted to date show that these customers are 
also not satisfied with NMCI. Specifically, on a scale of 0-3 with 0 
being not satisfied and 1 being slightly satisfied with the 
contractor's support in meeting the mission needs and strategic goals 
of these two organizations, the average of the responses from NETWARCOM 
in September 2005 was 0.33, rising to 0.67 in March 2006. For MCNOSC, 
the average of the responses to both surveys was 0.00. (See table 8 for 
these results.) Of the three types of contractor services addressed in 
the survey (mission support and planning, network management, and 
service delivery), network management services, which includes 
information assurance and urgent software patching, received a score of 
0 from both organizations on both surveys. 

Table 8: Results for the 6-Month Periods Ending on September 30, 2005, 
and March 31, 2006, Network Operations Leaders Survey: 

Reporting organization: NETWARCOM; 
September 2005: Mission support & planning: 0.00; 
September 2005: Network management: 0.00; 
September 2005: Service delivery: 1.00; 
September 2005: Average score: 0.33; 
March 2006: Mission support & planning: 1.00; 
March 2006: Network management: 0.00; 
March 2006: Service delivery: 1.00; 
March 2006: Average score: 0.67. 

Reporting organization: MCNOSC; 
September 2005: Mission support & planning: 0.00; 
September 2005: Network management: 0.00; 
September 2005: Service delivery: 0.00; 
September 2005: Average score: 0.00; 
March 2006: Mission support & planning: 0.00; 
March 2006: Network management: 0.00; 
March 2006: Service delivery: 0.00; 
March 2006: Average score: 0.00. 

Overall satisfaction average; 
September 2005: Average score: 0.17; 
March 2006: Average score: 0.33. 

Source: GAO based on Navy-provided data. 

[End of table] 

Shipyard and Air Depot Customers Consistently Identified a Range of 
Concerns and Areas of Dissatisfaction with NMCI: 

Consistent with the results of the Navy's customer satisfaction 
surveys, officials representing end users, commanders, and network 
operations personnel at five shipyards or air depots[Footnote 28] that 
we interviewed cited a number of concerns or sources of dissatisfaction 
with NMCI. The anecdotal information that they provided to illustrate 
their concerns is described below. 

Continued Reliance on Legacy Systems: 

Shipyard and air depot officials for all five sites told us that they 
have continued to rely on their legacy systems rather than NMCI for 
various reasons. For example, officials at one air depot stated that 
NMCI provided less functionality than their legacy systems and thus 
they have continued to use these legacy systems to support mission 
operations. Also, officials at one shipyard told us that site personnel 
lack confidence in NMCI and thus they continue to use legacy systems. 
Officials at the other two shipyards voiced even greater concerns, with 
officials at one saying that only NMCI seats (i.e., workstations) are 
running on the NMCI intranet (their servers are still running on their 
legacy network), and officials at the other saying that NMCI does not 
support their applications and thus they primarily use it for e-mail. 
Similarly, officials at an air depot stated that NMCI workstations are 
not capable of supporting certain applications, such as high- 
performance modeling, and thus they operate about 233 other 
workstations to support their needs. 

Loss in Workforce Productivity: 

According to a memo from the Commander of one shipyard to the Naval Sea 
Systems Command dated December 2005, NMCI software updates adversely 
affect the operation of network applications. Consistent with this, 
officials at two of the sites stated that NMCI is hurting workforce 
productivity, with officials at one shipyard saying that system 
downtime, particularly as it relates to major applications, has 
deteriorated and is unacceptable, and officials at another shipyard 
saying that NMCI response time is slow both on- and off-site. To 
illustrate, officials at one air depot said that personnel cannot 
download more than one file at a time, while officials at shipyards 
stated that "reach back" to legacy systems through NMCI is slow, 
sometimes taking 45 minutes to open a document. Further, officials at 
shipyards complained that users' profiles do not follow the user from 
one workstation to another, causing users to recreate them, while 
officials at one air depot stated that NMCI does not provide them the 
capability to monitor employees' inappropriate use of the Internet 
(e.g., excessive use or accessing unauthorized sites). 

Lack of Support of Dynamic Work Environments: 

Both air depot and shipyard officials described their respective work 
environments as dynamic, meaning that they are frequently changing, and 
thus require flexibility in moving and configuring workstations. 
Further, shipyards operate at the waterfront, which we were told is an 
environment that requires quick responses to changing needs. For 
example, ships come in, barges are created to service them, and these 
barges must be outfitted with computers. Decisions occur in a short 
amount of time regarding new barge setups and equipment movements. 
According to shipyard officials, NMCI has not been able to support 
these barge-related requirements because it is not flexible enough to 
quickly react to shifting work priorities. As a result, officials with 
one shipyard stated that they have had to provide their own waterfront 
support using legacy systems. Similarly, officials with the air depots 
stated that the NMCI contractor has a difficult time moving seats fast 
enough to keep up with changing needs. 

Limitations in Help Desk Support: 

Officials from each of the shipyards and air depots voiced concerns and 
dissatisfaction with help desk assistance. According to officials with 
the air depots, the quality of help desk support is inconsistent, and 
thus they have had to assume more of the burden in dealing with IT 
system problems since they transitioned to NMCI. Shipyard officials 
were even more critical of help desk support. According to officials at 
one shipyard, help desk support is not working, as it is almost 
impossible to get a help desk call resolved within 1 hour. Similarly, 
officials 
at another shipyard told us that help desk responsiveness has been poor 
because it takes hours, if not days, to get problems fixed. The 
previously cited memo from the Commander of one shipyard to the Naval 
Sea Systems Command cited an average time of 2.4 days to respond to 
customer inquiries. 

Problems with NMCI Site Preparation and Transition: 

Officials from all five sites expressed concerns with the manner in 
which they were prepared for transitioning to NMCI. According to 
officials at one air depot, certain seat management requirements were 
overlooked, and NMCI users have struggled with understanding the 
contract processes that govern, for example, how to order new software 
and hardware, or how to relocate machines, because the contractual 
terms are difficult to follow, and training was not adequate. In 
particular, they said users do not understand with whom they should 
talk to address a given need, and officials with one air depot noted 
that NMCI has no solution for their electronic classroom needs. 
Officials at one shipyard attributed the lack of NMCI site preparation 
to insufficient planning prior to deploying NMCI and a lack of 
transparency in how NMCI was being managed, including how deployment 
issues were to be resolved. As stated by officials at another shipyard, 
the transition to NMCI was difficult and very disruptive to operations 
because they had no control over the contractor transition team. 

NMCI program officials told us that they are aware of the concerns and 
sources of dissatisfaction of shipyard and air depot customers; 
however, they added that many of these concerns are either not 
supported by data 
or reflect customers' lack of familiarity with the services available 
under the contract. In particular, they said that they have not been 
provided any data showing a drop in workforce productivity caused by 
NMCI. They also said that continued reliance on legacy systems 
illustrates a lack of familiarity with the contract because provisions 
exist for moving legacy servers onto NMCI and supporting certain 
applications, such as high-performance modeling. Further, they said 
that the contract supports monitoring Internet usage, provides 
waterfront support to shipyards, and provides help desk service 24 
hours a day, 7 days a week. Nevertheless, they acknowledged that both a 
lack of customer understanding, and customer perceptions about the 
program are real issues affecting customer satisfaction that need to be 
addressed. 

Customer Satisfaction Improvement Efforts Are Not Being Guided by 
Effective Planning: 

The NMCI program office reports that improving customer satisfaction is 
a program priority. Accordingly, it has invested and continues to 
invest time and resources in a variety of activities that it associated 
with customer satisfaction improvement, such as holding user 
conferences and focus groups. However, these efforts are not being 
guided by a documented plan that defines prioritized improvement 
projects and associated resource requirements, schedules, and 
measurable goals and outcomes. Given the importance of improved 
customer satisfaction to achieving NMCI program goals and benefits, it 
is important for the Navy to take a structured and disciplined approach 
to planning its improvement activities. Without it, the program office 
cannot adequately ensure that it is effectively investing scarce 
program resources. 

As we have previously reported,[Footnote 29] effectively managing 
program improvement activities requires planning and executing such 
activities in a structured and disciplined fashion. Among other things, 
this includes developing an action plan that defines improvement 
projects and initiatives, assigns roles and responsibilities, sets 
priorities, identifies resource needs, establishes time lines with 
milestones, and describes expected results in measurable terms. The 
Software Engineering Institute's IDEALSM model, for example, is one 
recognized approach for managing process improvement efforts.[Footnote 
30] According to this model, improvement efforts should include a 
written plan that serves as the foundation and basis for guiding 
improvement activities, including obtaining management commitment to 
and funding for the activities, establishing a baseline of commitments 
and expectations against which to measure progress, prioritizing and 
executing activities and initiatives, determining success, and 
identifying and applying lessons learned. Through such a structured and 
disciplined approach, improvement resources can be invested in a manner 
that produces optimal results. Without such an approach, improvement 
efforts can be reduced to trial and error. 

The NMCI program office identified seven initiatives that are intended 
to increase customer satisfaction with the program. According to 
program officials, the initiatives are (1) holding user conferences, 
(2) conducting focus groups, (3) administering diagnostic surveys, (4) 
strengthening help desk capabilities, (5) expanding network services 
(e.g., adding broadband remote access), (6) assessing infrastructure 
performance, and (7) initiating a Lean Six Sigma effort.[Footnote 31] 
Following are descriptions of each initiative: 

User conferences. The program office has conducted semiannual NMCI user 
conferences since 2000. According to program officials, these 
conferences provide a forum for users to directly voice to program 
leaders their sources of dissatisfaction with NMCI. During the 
conferences, users ask questions, participate in issue-focused breakout 
sessions, and engage in informal discussions. We attended the June 2005 
user conference and observed that Navy and contractor program officials 
provided information, such as updates on current and planned activities 
and capabilities, while users had opportunities to provide comments and 
ask questions. According to program officials, the conferences are 
useful in making program officials aware of customer issues and are 
used to help diagnose NMCI problems. 

Focus groups. According to program officials, they conduct user focus 
groups to, among other things, solicit reasons for customer 
dissatisfaction, explore solutions, and test newly proposed end user 
satisfaction survey questions. The focus group sessions include 
invited participants and are guided by prepared scripts. The results of 
the sessions are summarized for purposes of identifying improvements 
such as revisions to user satisfaction survey questions. 

Diagnostic surveys. The program office performs periodic surveys to 
diagnose the source of user dissatisfaction with specific services, 
such as e-mail, printing, and technical support. According to program 
officials, these surveys help identify the root causes of user 
dissatisfaction and support analysis of areas needing improvement. 
However, they could not identify specific examples in which such causes 
had been identified and addressed and measurable improvements had 
resulted. 

Help desk improvement team. The program office established a team to 
identify the reasons for declining end user satisfaction survey scores 
relative to the technical support services provided by the help desk. 
According to program officials, the team traced declining satisfaction 
levels to such causes as help desk agents' knowledge, training, and 
network privilege shortfalls. To address these limitations, the program 
office reports that it has redesigned and restructured help desk 
operations to organize help desk agents according to skills and 
experience, route calls according to the skill level needed to address 
the call, target needed agent training, hold daily meetings with agents 
to apprise them of recent issues, and monitor help desk feedback. 
However, program officials could not link these efforts to measurable 
improvements in help desk performance, and NMCI customers that we 
interviewed during our visits to shipyards and air depots voiced 
concerns with help desk support. 

Expanded network services. NMCI program officials stated that a key 
improvement initiative has been expanding the scope of network-related 
services that are available under the contract. In particular, they 
point to such new services as broadband remote access for all laptop 
users, antispam services for all e-mail accounts, and antispyware 
services for all accounts as having improved customer satisfaction. 
Further, they said that the planned addition of wireless broadband 
access will increase customer satisfaction. However, they could not 
provide data showing how these added services affected customer 
satisfaction, or how future services are expected to affect 
satisfaction. 

Infrastructure performance assessment. Working with EDS, the program 
office undertook an NMCI network infrastructure assessment that was 
intended to identify and mitigate performance issues. This assessment 
included establishing metrics and targets for common user functions 
such as opening a Web site, then determining actual network performance 
at the Washington Navy Yard and Marine Corps installations in Quantico, 
Virginia. According to program officials, assessment results included 
finding that network performance could be improved by balancing traffic 
among firewalls and upgrading wide area network circuits. As a result 
of this initial assessment, the program has begun adjusting network 
settings and upgrading hardware at additional NMCI sites. Further, 
program officials said they are expanding their use of network 
infrastructure metrics to all sites. However, they neither provided us 
with a plan for doing so nor demonstrated that these efforts have 
affected customer satisfaction. 

Lean six sigma. Program officials said they are applying lean six sigma 
techniques to improve customer satisfaction. In particular, they have 
established a customer satisfaction workgroup, which is to define a 
process for identifying customer problems and prioritizing improvement 
projects. They said that, for each project, they will perform concept 
testing using pilot projects and focus groups. They also said that they 
plan to establish a steering committee that includes representatives 
from the Navy and the contractor. The officials told us that they have 
initiated seven projects using lean six sigma techniques, although they 
did not provide us with any information about the results of these 
projects or their impact on customer satisfaction. 

While any or all of these initiatives could result in improvements to 
customer satisfaction, the program office could not demonstrate that 
they have produced or will produce measurable improvements. Moreover, 
the latest customer satisfaction data provided to us show that 
satisfaction levels are not improving. Further, it is unclear how these 
initiatives relate to one another, and aspects of them appear 
redundant, such as the multiple teams and venues used to identify root 
causes and propose solutions. 

One reason for this lack of demonstrable improvements, and for the 
redundancy, is the way in which the program office has pursued its 
improvement 
initiatives. In particular, they have not been pursued as an integrated 
portfolio of projects that were justified and prioritized on the basis 
of relative costs and benefits. Further, they have not been guided by a 
well-defined action plan that includes explicit resource, schedule, and 
results-oriented baselines, as well as related steps for knowing 
whether expected outcomes and benefits have actually accrued. Rather, 
program officials stated that customer satisfaction improvement 
activities have been pursued as resources become available and in 
reaction to immediate issues and concerns. 

Without a proactive, integrated, and disciplined approach to improving 
customer satisfaction, the Navy does not have adequate assurance that 
it is optimally investing its limited resources. While the lean six 
sigma techniques that program officials told us they are now applying 
to customer satisfaction improvement advocate such an approach, program 
officials did not provide us with documentation demonstrating that they 
are effectively planning and executing these projects. 

Conclusions: 

IT service programs, like NMCI, are intended to deliver effective and 
efficient mission support and to satisfy customer needs. If they do 
not, or if they are not managed in a way that shows whether they do, 
then they are at risk. Therefore, it is important for 
such programs to be grounded in outcome-based strategic goals that are 
linked to performance measures and targets, and it is important for 
progress against these goals, measures, and targets to be tracked and 
reported to agency and congressional decision makers. If such 
measurement does not occur, then deviations from program expectations 
will not become known in time for decision makers to take timely 
corrective action. The inevitable consequence is that program results 
will fall short of those that were promised and used to justify 
investment in the program. The larger the program, the more significant 
these deviations and their consequences can be. 

NMCI is an enormous IT services program and thus requires highly 
effective performance management practices. However, such management, 
including measuring progress against strategic program goals, reporting 
performance against those goals and other important program aspects to 
key decision makers, examining service level agreement satisfaction 
from multiple vantage points, and ensuring customer satisfaction, has 
not been adequate. One reason is that measuring progress against 
strategic program goals has not been a priority for the Navy, which has 
instead focused on deploying NMCI seats to more sites despite a long- 
standing pattern of low customer satisfaction with the program and 
known performance shortfalls with certain types of seats. Moreover, 
despite investing in 
a range of activities intended to improve customer satisfaction, plans 
to effectively guide these improvement efforts, including plans for 
measuring the success of these activities, have not been developed. 
Given that the Navy reports that it has already invested about 6 years 
and $3.7 billion in NMCI, the time to develop a comprehensive 
understanding of the program's performance to date, and its prospects 
for the future, is long overdue. 

To its credit, the Navy recognizes the importance of measuring program 
performance, as evidenced by its use of service level agreements, its 
extensive efforts to survey customers, and its various customer 
satisfaction improvement efforts. However, these steps need to be given 
the priority that they deserve and be expanded to obtain a full and 
accurate picture of program performance. Doing less increases the risk 
of inadequately informing ongoing NMCI investment management decisions 
that involve huge sums of money and carry important mission 
consequences. 

Recommendations for Executive Action: 

To improve NMCI performance management and better inform investment 
decision making, we recommend that the Secretary of Defense direct the 
Secretary of the Navy to ensure that the NMCI program adopts robust 
performance management practices that, at a minimum, include (1) 
evaluating and appropriately adjusting the original plan for measuring 
achievement of strategic program goals and providing for its 
implementation in a manner that treats such measurement as a program 
priority; (2) expanding its range of activities to measure and 
understand service level agreement performance to provide increased 
visibility into performance relative to each agreement; (3) sharing the 
NMCI performance results with DOD, Office of Management and Budget, and 
congressional decision makers as part of the program's annual budget 
submissions; and (4) reexamining the focus, scope, and transparency of 
its customer satisfaction activities to ensure that areas of 
dissatisfaction described in this report are regularly disclosed to the 
aforementioned decision makers and that customer satisfaction 
improvement efforts are effectively planned and managed. In addition, 
we recommend that the Secretary of Defense direct the Secretary of the 
Navy, in collaboration with the various Navy entities involved in 
overseeing, managing, and employing NMCI, to take appropriate steps to 
ensure that the findings in this report and the outcomes from 
implementing the above recommendations are used in considering and 
implementing warranted changes to the NMCI's scope and approach. 

Agency Comments and Our Evaluation: 

In its comments on a draft of this report, signed by the Deputy 
Assistant Secretary of Defense (Command, Control, Communications, 
Intelligence, Surveillance, Reconnaissance & Information Technology 
Acquisition Programs) and reproduced in appendix IV, DOD agreed with 
our recommendations and stated that it has implemented, is 
implementing, or will implement each of them. In this regard, the 
department stated that the report accurately highlights the need to 
adjust the NMCI strategic goals and associated measures, and it 
committed to, among other things, sharing additional NMCI performance 
data with decision makers as part of the annual budget process. 
Notwithstanding this agreement, DOD also commented that the Navy 
believes that our draft report contained factual errors, 
misinterpretations, and unsupported conclusions. We do not agree with 
the Navy's position. The Navy's points are summarized below along with 
our response. 

* The Navy stated that our review focused on Navy shipyards and air 
depots to the exclusion of Marine Corps sites. We disagree. As the 
Objectives, Scope, and Methodology section of our report points out, the 
scope of our review covered the entire NMCI program and extended to 
Navy and Marine Corps sites based on data we obtained from program 
officials. For example, our work on the extent to which NMCI had met 
its two strategic goals was programwide, and our work on SLA 
performance and customer satisfaction surveys included Navy and Marine 
Corps sites at which NMCI was operating and Navy and Marine Corps 
customers that responded to the program's satisfaction surveys. 

* The Navy stated that NMCI is a strategic success, noting that the 
program is meeting its goals of providing information superiority (as 
well as information security) and fostering innovation. As part of 
these statements, the Navy cited such things as the number of users 
supported and seats deployed, the types of capabilities fielded, and 
contracting actions taken. In addition, the Navy stated that NMCI has 
thwarted intrusion attacks that have penetrated other DOD systems, and 
it concluded that NMCI represents a major improvement in information 
superiority over the Navy's legacy network environment in such areas as 
virus protection and firewall architecture. It also noted that more 
Naval commands now have access to state-of-the-art workstations and 
network services, which it concluded means that NMCI is fostering 
innovation. While we do not question these various statements about 
capabilities, improvements, and access, we would note they are not 
results-oriented, outcome-based measures of success. Moreover, we do 
not agree with the statements about NMCI meeting its two strategic 
goals and being a strategic success. As we show in our report using the 
Navy's own performance categories, performance targets, and actual SLA 
and other performance data, NMCI met only 3 of the 20 performance 
targets spanning nine performance categories that the Navy established 
for determining goal attainment. Concerning these results, the Navy 
stated that our report's use of SLA performance data constitutes a 
recommendation on our part for using such data in determining program 
goal attainment, which the Navy said is "awkward" because SLAs "do not 
translate well into broad goals." We do not agree that our report 
recommends the use of any particular performance data and targets for 
determining program goal attainment. We used these data and targets 
solely because the NMCI program office provided them to us in response 
to our inquiry about NMCI performance relative to the nine Navy- 
established performance categories. We are not recommending any 
particular performance targets or data. Rather, we are recommending 
that the approach for measuring achievement of strategic goals be 
reevaluated and adjusted. Accordingly, we support DOD's comment that 
the Navy needs to adjust the original NMCI strategic goals and 
associated measures. 

* The Navy stated that we misinterpreted SLA data as they relate to the 
contractor performance categories of full payment and full performance. 
We disagree. The report presents a Navy-performed analysis of SLA data 
relative to the full payment and full performance categories and offers 
no interpretation of these data. However, because the Navy's analysis 
of SLA data is an aggregation, we performed a different analysis to 
provide greater visibility into individual SLA performance that the 
Navy's full payment and full performance analyses tend to hide. Our 
analysis also avoids the bundling and averaging concerns that 
the Navy raised. 

* The Navy stated that some of our customer satisfaction conclusions 
were unsupported. Specifically, the Navy said that, given the way it 
collects end user satisfaction responses, a score of 5.5 or higher on a 
10-point scale indicates a satisfied user and that such a scale is in 
line with industry practice. Therefore, the Navy said that user 
satisfaction survey responses do not "break out" in a way that supports 
our conclusion that scores of 5.5 through 7 indicate marginally 
satisfied users. We do not 
agree. While we recognize that the Navy's 1-10 scale does not 
differentiate between degrees of satisfaction, we believe that doing so 
would provide insight and perspective that is lacking from merely 
counting a user as satisfied or not satisfied. When we analyzed the 
responses to individual questions in terms of degrees of satisfaction, 
we found that average responses to 10 of 14 survey questions were 5.5 
to 7, which is clearly close to the lower limit of the satisfaction 
range. Also, with regard to customer satisfaction, the Navy stated that 
our inclusion in the report of subjective statements from shipyard and 
air depot officials did not include any data to support the officials' 
statements and thus did not support our conclusions. We recognize that 
the officials' statements are subjective and anecdotal, and our report 
clearly identified them as such. Nevertheless, we included them in the 
report because they are fully consistent with the customer satisfaction 
survey results and thus help illustrate the nature of NMCI user 
concerns and areas of dissatisfaction that the survey results show 
exist. 

* The Navy stated that NMCI provides adequate reports to key decision 
makers. However, we disagree because the reporting that the Navy has 
done has yet to disclose the range of performance and customer 
satisfaction issues that our report contains. Our message is that fully 
and accurately disclosing program and contractor performance and 
customer satisfaction to the various entities responsible for 
overseeing, managing, and employing NMCI will serve to strengthen 
program performance and accountability. 

The Navy also provided various technical comments, which we have 
incorporated as appropriate. 

We are sending copies of this report to interested congressional 
committees; the Secretary of Defense; the Secretary of the Navy; the 
Commandant of the Marine Corps; and the Director, Office of Management 
and Budget. We also will make copies available to others upon request. 
In addition, the report will be available at no charge on the GAO Web 
site at [Hyperlink, http://www.gao.gov]. 

If you have any questions concerning this information, please contact 
me at (202) 512-6256 or by e-mail at hiter@gao.gov. Contact points for 
our Offices of Congressional Relations and Public Affairs may be found 
on the last page of this report. Key contributors to this report are 
listed in appendix V. 

Signed by: 

Randolph C. Hite: 
Director, Information Technology Architecture and Systems Issues: 

List of Congressional Addressees: 

The Honorable John Warner: 
Chairman: 
The Honorable Carl Levin: 
Ranking Minority Member: 
Committee on Armed Services: 
United States Senate: 

The Honorable Susan M. Collins: 
Chairman: 
The Honorable Joseph I. Lieberman: 
Ranking Minority Member: 
Committee on Homeland Security and Governmental Affairs: 
United States Senate: 

The Honorable Judd Gregg: 
United States Senate: 

The Honorable Olympia J. Snowe: 
United States Senate: 

The Honorable John E. Sununu: 
United States Senate: 

The Honorable Duncan L. Hunter: 
Chairman: 
The Honorable Ike Skelton: 
Ranking Minority Member: 
Committee on Armed Services: 
House of Representatives: 

The Honorable Tom Davis: 
Chairman: 
The Honorable Henry A. Waxman: 
Ranking Minority Member: 
Committee on Government Reform: 
House of Representatives: 

The Honorable Thomas H. Allen: 
House of Representatives: 

The Honorable Michael H. Michaud: 
House of Representatives: 

[End of section] 

Appendix I: Objectives, Scope, and Methodology: 

Our objectives were to review (1) whether the Navy Marine Corps 
Intranet (NMCI) is meeting its strategic goals, (2) the extent to which 
the contractor is meeting its service level agreements (SLA), (3) 
whether customers are satisfied with the program, and (4) what is being 
done to improve customer satisfaction. 

To determine whether NMCI is meeting its strategic goals, we: 

* reviewed documents provided by the Department of the Navy describing the 
mission need for NMCI, strategic goals, performance measures, and data 
gathered on actual performance, 

* conducted interviews with officials from the offices of the 
Department of Defense Chief Information Officer (CIO), Department of 
the Navy CIO, and Assistant Secretary of the Navy for Research, 
Development, and Acquisition, including officials in the NMCI program 
office, 

* identified the NMCI strategic goals, related performance categories, 
associated performance targets, and actual performance data through 
document reviews and interviews, 

* developed an analysis showing NMCI's performance relative to the 
strategic goals, performance categories, and targets based upon 
available actual performance data, and: 

* shared our analysis with program officials and adjusted the analysis 
based on comments and additional data they provided. 

To determine the extent to which performance expectations defined in 
NMCI SLAs have been met, we: 

* conducted interviews with NMCI program office and contractor 
officials to gain an understanding of available SLA performance data 
and potential analysis methods, 

* obtained data on actual SLA performance that the Navy uses as the 
basis for making performance-based payments to the contractor; for each 
SLA, these data indicated whether one or more measurements were taken 
and, if so, whether each measure was met, by seat type (i.e., basic, 
high end, and mission-critical), at every site for each month from 
October 2004 through March 2006,[Footnote 32] 

* analyzed data for site-specific SLAs by calculating the number of 
seats that met each agreement at each site for each month; when 
measurement data were available by seat type, we calculated the number 
of seats that met each agreement for each seat type, and otherwise we 
calculated the total number of seats that met each agreement. We 
counted an agreement as met at a site if all of the agreement's 
measured targets were met at the site for a given month. To calculate 
the percentage of seats for which an agreement was met, we divided the 
total number of seats at all sites for which the agreement was met by 
the total number of seats at all sites for which measurements were made 
(this calculation is illustrated in the sketch following this list), 

* analyzed data for enterprisewide SLAs by determining whether an 
agreement was met at all Navy (excluding the Marine Corps) and all 
Marine Corps sites for each month; we counted an agreement as met if 
all of the agreement's measured targets were met for a given month, 

* compared our site-specific and enterprisewide SLA analyses across 
months to identify patterns and trends in overall SLA performance. In 
situations where an SLA is composed of site-specific and enterprisewide 
measures, we did not aggregate our site-specific and enterprisewide 
results; thus, an SLA could have been met at the site level but not at 
the enterprisewide level, and vice versa, and: 

* described our analysis method and shared our results with program 
office and contractor officials and made adjustments based on their 
comments. 
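
To make the seat-percentage calculation described in the bullets above 
concrete, the following minimal Python sketch aggregates hypothetical 
site-level counts for a single SLA and month; the site names, seat 
counts, and results are illustrative assumptions, not NMCI data. 

# Hypothetical counts for one SLA in one month: for each site, the number of
# seats for which measurements were taken and the number of seats for which
# every measured target was met (illustrative values only).
site_results = {
    "Site A": {"seats_measured": 1200, "seats_met": 1100},
    "Site B": {"seats_measured": 800,  "seats_met": 640},
    "Site C": {"seats_measured": 500,  "seats_met": 475},
}

# Percentage of seats for which the agreement was met = total seats meeting
# the agreement at all sites divided by total seats measured at all sites.
total_met = sum(s["seats_met"] for s in site_results.values())
total_measured = sum(s["seats_measured"] for s in site_results.values())
percent_met = 100.0 * total_met / total_measured

print(f"{percent_met:.1f} percent of measured seats met the SLA")  # 88.6 percent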

To determine whether NMCI customers are satisfied, we: 

* obtained and analyzed results of end user surveys conducted from June 
2002 through March 2006 and of commanders' and network operations 
leaders' surveys conducted from September 2005 through March 2006, 

* conducted interviews with NMCI program office and contractor 
officials to gain an understanding of how the surveys were developed 
and administered and their procedures for validating and auditing 
reported results, 

* analyzed data in the survey reports by comparing actual with desired 
results, and we also analyzed the data to identify trends in 
satisfaction levels over time and variation in satisfaction by 
question, organization, and type of service, and: 

* conducted interviews with a broad range of NMCI users at Navy sites: 
Portsmouth Naval Shipyard, Norfolk Naval Shipyard, Puget Sound Naval 
Shipyard, Jacksonville Naval Air Depot, and North Island Naval Air 
Depot. We selected these sites because they are among the largest, 
include diverse user communities, and represent different stages of 
program implementation. Participants in the interviews included 
officials from the Offices of the Commander, CIO, Information 
Technology and Communications Services, end users relying on NMCI 
desktop services in day-to-day operations, and the contractor. 

To determine what has been done to improve customer satisfaction, we: 

* interviewed program office and contractor officials to identify and 
develop an understanding of customer satisfaction improvement efforts; 
to determine the results and impact of each effort, we interviewed 
program officials and obtained and analyzed relevant documentation, 

* researched best practices for effectively managing improvement 
activities and compared the program office's approach with the 
practices we identified to evaluate the overall effectiveness of the 
customer satisfaction improvement activities, and: 

* attended the June 2005 NMCI enterprise conference to observe the 
proceedings. 

We performed our work from April 2005 to August 2006 in accordance with 
generally accepted government auditing standards. 

[End of section] 

Appendix II: Customer Satisfaction Survey Questions: 

This appendix includes the questions used in the three customer 
satisfaction surveys: End User Customer Satisfaction Survey, Navy 
Echelon II and Marine Corps Major Command Commander's Incentive Survey, 
and Navy and Marine Corps Network Operations Leader's Survey. 

End User Customer Satisfaction Survey Questions: 

The end user customer satisfaction survey consists of 14 questions, 10 
of which are tied to incentives. Users are asked to think only of the 
experiences they have had with the services during the prior 3 months. 
If a question is not relevant to their experience, they are asked to 
indicate that it is not applicable. Otherwise, they are asked to score 
it on a 1-10 scale with 1-5 being levels of dissatisfaction, and 6-10 
being levels of satisfaction. Users are also currently asked to provide 
demographic information in the survey, as well as suggestions for 
improvement and sources of dissatisfaction. Table 9 lists the end user 
customer satisfaction survey questions.[Footnote 33] A brief sketch of 
how responses on this scale can be summarized follows the table. 

Table 9: NMCI End User Customer Satisfaction Survey Questions: 

What is your satisfaction. 

* With having access to the computer hardware you need to accomplish 
your job? 

With the dependability of the computer you use? 

* With having access to the software you need to accomplish your job? 

With network reliability? 

With the professionalism of EDS personnel? 

With finding and using information about NMCI services? 

With the accuracy of information describing how to use NMCI services? 

* With training on how to use NMCI effectively? 

With technical support services provided by the help desk? 

With technical support services provided by on-site personnel? 

With the timeliness of problem resolution? 

With the solution implemented to correct any problem you experienced? 

* With the process to make changes to your IT environment? 

What is your overall satisfaction with services provided by EDS? 

Source: March 2006 Quarterly Customer Satisfaction Survey Report. 

Note: Questions marked with an asterisk are not used for incentive 
purposes. 

[End of table] 
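
The following minimal Python sketch illustrates how average per-question 
scores on the survey's 1-10 scale might be grouped using the 5.5 
satisfaction threshold the Navy applies and the 5.5-to-7 band that this 
report characterizes as marginal satisfaction; the question labels and 
scores below are hypothetical and shown only for illustration. 

# Hypothetical average scores (on the 1-10 scale) for a few survey questions;
# labels and values are illustrative only, not actual NMCI survey results.
average_scores = {
    "Computer dependability": 6.2,
    "Network reliability":    5.8,
    "Help desk support":      5.1,
    "On-site support":        7.4,
}

for question, score in average_scores.items():
    if score < 5.5:
        band = "not satisfied"          # below the Navy's 5.5 threshold
    elif score <= 7:
        band = "marginally satisfied"   # the 5.5-7 band discussed in this report
    else:
        band = "satisfied"
    print(f"{question}: {score} ({band})")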

Navy Echelon II Commanders and Marine Corps Major Command Commander's 
Customer Satisfaction Incentive Survey: 

The commander's customer satisfaction incentive survey consists of four 
topics (warfighter support services, cutover services, technology 
solutions, and service delivery) corresponding to key mission and/or 
business objective-related services or capabilities. Each topic is 
broken down into a number of subtopics. Under each subtopic, the survey 
asks commanders to indicate whether they agree, disagree, or have no 
basis to respond to a series of statements about EDS's performance. The 
survey also asks commanders to rate their overall satisfaction with 
each topic as "extremely satisfied," "mostly satisfied," "slightly 
satisfied," "not satisfied," or "no basis to respond." The last section 
of each topic contains two open-ended questions soliciting feedback on 
satisfaction with NMCI services. 

Table 10 is a condensed version of the commander's customer 
satisfaction survey that includes each of the subtopics, statements 
about EDS's performance, the overall topic satisfaction question, and 
the two open-ended questions.[Footnote 34] 

Table 10: Navy Echelon II and Marine Corps Major Command Commander's 
Customer Satisfaction Incentive Survey's Questions: 

Warfighter support services. 

1. Please evaluate EDS support for the following warfighter support 
service areas: 

Classified network support. 

* EDS understands the requirements unique to SIPRNet Systems; 
* EDS adequately supports SIPRNet operations; 
* EDS provides timely SIPRNet technical support; 
* EDS provides adequate remote access to the SIPRNet from NMCI seats. 

Deployable support. 

* EDS provides adequate and effective predeployment training; 
* EDS provides deployment process documentation that is readily 
available, clear, and accurate; 
* EDS provides effective NMCI help desk support to deployed assets; 
* EDS effectively supports the movement of resources out of the NMCI 
environment for deployment into IT21 and MTDN environments; 
* EDS effectively supports the reintegration of deployed resources into 
the NMCI environment; 
* EDS provides "Pack up Kits" with appropriate content for supporting 
resources while deployed. 

Emergent requirement support (support for unplanned events). 

* EDS effectively responds to emergent requirements; 
* EDS provides flexible and responsive support; 
* EDS is innovative in developing solutions to support emergent 
requirements. 

2. Please rate your overall satisfaction with warfighter support 
services; 

3. Comments and feedback; 
* What improvement would most increase your satisfaction?; 
* If your satisfaction with this service has changed during the past 3 
months, what is the primary reason for the change? 

Cutover services. 

1. Please evaluate EDS performance for the following cutover services: 

Cutover planning. 

* EDS incorporates lessons learned into its cutover planning processes; 
* EDS cutover planning considers requirements unique to specific sites 
and organizations; 
* EDS accurately identifies infrastructure "build out" requirements. 

Cutover preparation. 

* EDS correctly captures site/organization data in support of NMCI 
asset; 
* EDS coordinates with the designated points of contact prior to asset 
cutover; 
* EDS infrastructure "build outs" are completed correctly and in 
coordination with the government designated points of contact. 

Cutover execution. 

* EDS delivers complete and accurate services as ordered; 
* EDS delivers according to agreed upon schedules; 
* EDS fulfills its "Execution Discipline" obligations; 
* EDS effectively deploys specialized assets (classified, deployable, 
very small site design, etc.). 

2. Please rate your overall satisfaction with cutover services; 

3. Comments and feedback; 

* What improvement would most increase your satisfaction?; 
* If your satisfaction with this service has changed during the past 3 
months, what is the primary reason for the change? 

Technology solutions. 

1. Please evaluate EDS performance for the following technology 
services: 

New service order and delivery process. 

* EDS makes available new services in a timely manner once they have 
been added to the contract and approved for operation; 
* EDS delivers ordered services in a timely fashion; 
* EDS delivers accurately against submitted task orders. 

Technical performance. 

* EDS NMCI services (hardware/software, help desk, on-site support, and 
connectivity) are available when and where needed; 
* EDS provides accurate and dependable technical services; 
* EDS provides quality technical services; 
* EDS technical services are flexible enough to support dynamic 
organizational needs. 

2. Please rate your overall satisfaction with technical solutions; 

3. Comments and feedback; 

* What improvement would most increase your satisfaction?; 
* If your satisfaction with this service has changed during the past 3 
months, what is the primary reason for the change? 

Service delivery. 

1. Please evaluate EDS performance in the following service delivery 
areas: 

Organizational understanding. 

* EDS understands my command's mission requirements; 
* EDS understands my command's operational processes; 
* EDS understands my organizational structure and hierarchy. 

Customer service. 

* EDS NMCI Help Desk Support (1.866.THE.NMCI) is consistent and 
effective; 
* EDS NMCI on-site technical support is consistent and effective; 
* EDS communicates relevant information to command personnel in a 
timely fashion; 
* EDS supports individual command requirements. 

Issue management. 

* EDS coordinates with the appropriate government personnel and 
representatives; 
* EDS responds to issues in a timely manner; 
* EDS resolves issues timely and effectively; 
* EDS develops solutions that are transferable throughout the 
enterprise; 
* EDS appropriately considers command and Department of Navy needs as 
part of issue prioritization; 
* EDS accurately tracks and provides insight into identified issues. 

2. Please rate your overall satisfaction with service delivery. 

3. Comments and feedback; 

* What improvement would most increase your satisfaction?; 
* If your satisfaction with this service has changed during the past 3 
months, what is the primary reason for the change? 

Source: Navy Echelon II Commands and Marine Corps Command Commanders 
Customer Satisfaction Incentive Survey: Period of Performance April 1, 
2005, through September 30, 2005. 

[End of table] 

Navy and Marine Corps Network Operations Leaders' Customer Satisfaction 
Incentive Survey: 

The network operations leaders' customer satisfaction incentive survey 
consists of three topics (mission support and planning, network 
management, and service delivery) corresponding to key mission and/or 
business objective-related services or capabilities. Each topic is 
broken down into a number of subtopics. Under each subtopic, the survey 
asks the leaders to indicate whether they agree, disagree, or have no 
basis to respond to a series of statements about EDS's performance. The 
survey also asks the leaders to rate their overall satisfaction with 
each topic as "extremely satisfied," "mostly satisfied," "slightly 
satisfied," "not satisfied," or "have no basis to respond." The last 
section of each topic contains two open-ended questions soliciting 
feedback on satisfaction with NMCI services. 

Table 11 is an abbreviated version of the network operations leader's 
surveys that includes each of the subtopics, statements about EDS's 
performance, the overall topic satisfaction question, and the two open- 
ended questions.[Footnote 35] 

Table 11: Navy and Marine Corps Network Operations Leader's Customer 
Satisfaction Incentive Survey's Questions: 

Mission support and planning: 

1. Please evaluate EDS's performance for the following mission support 
and planning services: 

Interoperability support. 

* EDS adequately supports internal (Navy/ Marine Corps) 
interoperability; 
* EDS adequately supports external (.mil, .com, Joint, coalition) 
interoperability; 
* EDS provides adequate reach back capabilities to legacy 
systems/applications; 
* EDS correctly identifies and is able to resolve interoperability 
issues. 

Continuity of operations. 

* EDS is knowledgeable concerning continuity of operations plans; 
* EDS demonstrates the effectiveness of its continuity of operations 
plans; 
* EDS can effectively recover NMCI systems and data in the event of a 
disaster; 
* EDS effectively utilizes and supports the NMCI Military Detachment 
Training Program. 

Future readiness. 

* EDS solutions are scalable; 
* EDS is flexible in planning for future scenarios; 
* EDS is innovative in developing solutions to combat emerging IT 
threats to NMCI operations. 

Public Key Infrastructure (PKI) services. 

* EDS has developed an efficient and effective PKI solution; 
* EDS provides PKI services that are readily available and reliable; 
* EDS provides PKI services that are easy to understand and use; 
* EDS provides adequate PKI related training and other instructional 
documentation. 

2. Please rate your overall satisfaction with mission support and 
planning services. 

3. Comments and feedback. 

* What improvement would most increase your satisfaction?; 
* If your satisfaction with this service has changed during the past 3 
months, what is the primary reason for the change? 

Network Management: 

1. Please evaluate EDS performance for the following network management 
services: 

Network status information. 

* EDS provides sufficient availability of network performance data; 
* EDS provides sufficiently detailed visibility into network 
performance issues; 
* EDS provides network performance data that adequately represent live 
network operations. 

Information Operations Condition (INFOCON)/Information Assurance 
Vulnerability Alert (IAVA) Awareness and Compliance. 

* EDS implements IAVAs in a timely manner; 
* EDS understands the requirements associated with each INFOCON level; 
* EDS adjusts to INFOCON changes in a timely manner. 

Urgent software patch implementation. 

* EDS efficiently and effectively supports the processes required to 
get urgent software patches approved so that they can be deployed onto 
the network; 
* EDS maintains a current knowledge of software patch availability and 
deployment processes; 
* EDS maintains accurate configuration management of software patch 
deployment throughout the enterprise; 
* EDS provides timely responses to urgent software patch releases. 

Data management. 

* EDS effectively manages user account data; 
* EDS effectively manages systems log data; 
* EDS effectively manages system permissions and trust relationships; 
* EDS effectively manages system and user backup data; 
* EDS effectively manages network architecture diagrams. 

2. Please rate your overall satisfaction with network management 
services. 

3. Comments and feedback. 

* What improvement would most increase your satisfaction?; 
* If your satisfaction with this service has changed during the past 3 
months, what is the primary reason for the change? 

Service delivery: 

1. Please evaluate EDS performance for the following service delivery 
areas: 

Organizational understanding. 

* EDS understands organizational mission requirements; 
* EDS understands organizational policies and operational procedures; 
* EDS understands your organizational structure. 

Communications. 

* EDS effectively communicates with the right people; 
* EDS effectively communicates planned maintenance and network outages; 
* EDS effectively communicates changes in NMCI configurations; 
* EDS coordinates with the appropriate parties when planning network 
events that significantly impact the network. 

Issue management. 

* EDS provides sufficient visibility into the status of open issues; 
* EDS appropriately coordinates issue resolution efforts with network 
operators; 
* EDS independently identifies and reports to the government network 
related issues; 
* EDS appropriately considers command and Navy needs in issue 
prioritization; 
* EDS applies lessons learned in order to resolve related issues across 
the NMCI enterprise. 

2. Please rate your overall satisfaction with EDS service delivery. 

3. Comments and feedback. 

* What improvement would most increase your satisfaction?; 
* If your satisfaction with this service has changed during the past 3 
months, what is the primary reason for the change? 

Source: Navy and Marine Corps Network Operations Leaders Customer 
Satisfaction Incentive Survey: Period of Performance April 1, 2005, 
through September 30, 2005. 

[End of table] 

[End of section] 

Appendix III: SLA Descriptions and Performance: 

This appendix contains descriptions and performance trends for NMCI's 
service level agreements. SLAs are measured at the site level, 
enterprisewide, or at both levels. Site level SLA 
performance is based on the percentage of operational seats that met 
the SLA, meaning that all performance targets for a given SLA were met 
for a particular month. Where applicable, the percentage of seats 
meeting an SLA was analyzed by seat type (i.e., basic, high end, and 
mission-critical). 

Enterprisewide SLA performance is based on whether the SLA was met for 
a given month, meaning that all performance targets for a given SLA 
were met for a particular month. 
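
As a minimal illustration of these rules, the following Python sketch 
determines, for hypothetical per-seat measurements of one SLA at one 
site in one month, how many measured seats of each type met the 
agreement, and applies the all-targets-met rule to a hypothetical 
enterprisewide SLA; the seat identifiers, seat types, and results are 
assumptions for illustration only, not NMCI data. 

# Hypothetical per-seat target results (True = target met) for one SLA at one
# site for one month; identifiers, types, and values are illustrative only.
seats = [
    {"seat": "B-001", "type": "basic",            "targets": [True, True, True]},
    {"seat": "B-002", "type": "basic",            "targets": [True, False, True]},
    {"seat": "H-001", "type": "high end",         "targets": [True, True, True]},
    {"seat": "M-001", "type": "mission-critical", "targets": [True, True, True]},
]

# A seat meets the SLA for the month only if all of its measured targets were met.
measured_by_type, met_by_type = {}, {}
for s in seats:
    measured_by_type[s["type"]] = measured_by_type.get(s["type"], 0) + 1
    if all(s["targets"]):
        met_by_type[s["type"]] = met_by_type.get(s["type"], 0) + 1

for seat_type, measured in measured_by_type.items():
    met = met_by_type.get(seat_type, 0)
    print(f"{seat_type}: {met} of {measured} measured seats met the SLA")

# An enterprisewide SLA is counted as met for a month only if all of its
# measured targets were met (illustrative target results).
enterprise_targets = [True, True, True, True]
print("Enterprisewide SLA", "met" if all(enterprise_targets) else "not met")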

SLA 101-End user problem resolution: This SLA measures the percentage 
of all resolved NMCI problems against identified performance target 
values. Figure 14 portrays the contractor's historical site level 
performance with SLA 101. 

Figure 14: Site Level Performance for SLA 101: 

[See PDF for image] - graphic text: 

Source: GAO analysis of Navy data. 

[End of figure] - graphic text: 

SLA 102-Network problem resolution: This SLA measures the resolution of 
problems associated with the contractor provided network devices and 
connections. Figure 15 portrays the contractor's historical site level 
performance with SLA 102. 

Figure 15: Site Level Performance for SLA 102: 

[See PDF for image] - graphic text: 

Source: GAO analysis of Navy data. 

[End of figure] - graphic text: 

SLA 103-End user services: This SLA measures performance with end user 
services, including E-mail, Web and Portal, File Share, Print, Network 
Logon, Access to Government Applications, and RAS services. Figure 16 
portrays the contractor's historical site level performance with SLA 
103. Figure 17 portrays the contractor's historical enterprisewide 
performance with SLA 103. 

Figure 16: Site Level Performance for SLA 103: 

[See PDF for image] - graphic text: 

Source: GAO analysis of Navy data. 

[End of figure] - graphic text: 

Figure 17: Enterprisewide Performance for SLA 103: 

[See PDF for image] - graphic text: 

Source: GAO analysis of Navy data. 

[End of figure] - graphic text: 

SLA 104-Help desk: This SLA measures help desk services, including 
average speed of answer, average speed of response, call abandonment 
rate, and first call resolution. Figure 18 portrays the contractor's 
historical enterprisewide performance with SLA 104. 

Figure 18: Enterprisewide Performance for SLA 104: 

[See PDF for image] - graphic text: 

Source: GAO analysis of Navy data. 

[End of figure] - graphic text: 

SLA 105-Move, add, change (MAC): This SLA measures the time to complete 
MAC activity, from the receipt of the MAC request from an authorized 
government submitter to the completion of the MAC activity. MACs 
include activities such as moving a seat from one location to another 
and adding seats at a location. Figure 19 portrays the contractor's 
historical site level performance with SLA 105. 

Figure 19: Site Level Performance for SLA 105: 

[See PDF for image] - graphic text: 

Source: GAO analysis of Navy data. 

[End of figure] - graphic text: 

SLA 106-Information assurance (IA) services: This SLA measures the 
contractor's IA services, including security event detection, security 
event reporting, security event response, and IA configuration 
management. Figure 20 portrays the contractor's historical 
enterprisewide performance with SLA 106. 

Figure 20: Enterprisewide Performance for SLA 106: 

[See PDF for image] - graphic text: 

Source: GAO analysis of Navy data. 

[End of figure] - graphic text: 

SLA 107-NMCI intranet: This SLA measures performance of the NMCI 
Intranet in areas of availability, latency/packet loss,[Footnote 36] 
and quality of service in support of videoteleconferencing and voice- 
over-IP. Figure 21 portrays the contractor's historical site level 
performance with SLA 107. 

Figure 21: Site Level Performance for SLA 107: 

[See PDF for image] - graphic text: 

Source: GAO analysis of Navy data. 

[End of figure] - graphic text: 

SLA 203-E-mail services: This SLA measures the performance of e-mail 
transfers. Figure 22 portrays the contractor's historical 
enterprisewide performance with SLA 203. 

Figure 22: Enterprisewide Performance for SLA 203: 

[See PDF for image] - graphic text: 

Source: GAO analysis of Navy data. 

[End of figure] - graphic text: 

SLA 204-Directory services: This SLA measures the availability and 
responsiveness of directory services. Directory services include 
supporting the management and use of file services, security services, 
messaging, and directory information (e.g., e-mail addresses) for 
users. Figure 23 portrays the contractor's historical site level 
performance with SLA 204. Figure 24 portrays the contractor's 
enterprisewide performance with SLA 204. 

Figure 23: Site Level Performance for SLA 204: 

[See PDF for image] - graphic text: 

Source: GAO analysis of Navy data. 

[End of figure] - graphic text: 

Figure 24: Enterprisewide Performance for SLA 204: 

[See PDF for image] - graphic text: 

Source: GAO analysis of Navy data. 

[End of figure] - graphic text: 

SLA 206-Web access services: This SLA measures the performance of user 
access to internal and external Web content. Figure 25 portrays the 
contractor's historical site level performance with SLA 206. Figure 26 
portrays the contractor's historical enterprisewide performance with 
SLA 206. 

Figure 25: Site Level Performance for SLA 206: 

[See PDF for image] - graphic text: 

Source: GAO analysis of Navy data. 

[End of figure] - graphic text: 

Figure 26: Enterprisewide Performance for SLA 206: 

[See PDF for image] - graphic text: 

Source: GAO analysis of Navy data. 

[End of figure] - graphic text: 

SLA 211-Unclassified but Sensitive Internet Protocol Router Network 
(NIPRNET) access: This SLA measures the performance of NIPRNET access, 
including latency and packet loss. Figure 27 portrays the contractor's 
historical site level performance with SLA 211. Figure 28 portrays the 
contractor's historical enterprisewide performance with SLA 211. 

Figure 27: Site Level Performance for SLA 211: 

[See PDF for image] - graphic text: 

Source: GAO analysis of Navy data. 

[End of figure] - graphic text: 

Figure 28: Enterprisewide Performance for SLA 211: 

[See PDF for image] - graphic text: 

Source: GAO analysis of Navy data. 

[End of figure] - graphic text: 

SLA 225-Base area network/local area network (BAN/LAN) communications 
services: This SLA measures BAN/LAN performance, including availability 
and latency. Figure 29 portrays the contractor's historical site level 
performance with SLA 225. 

Figure 29: Site Level Performance for SLA 225: 

[See PDF for image] - graphic text: 

Source: GAO analysis of Navy data. 

[End of figure] - graphic text: 

SLA 226-Proxy and caching service: This SLA measures the availability 
of the proxy and caching services. Proxy servers are located between a 
client and a network server and are intended to improve network 
performance by fulfilling frequently repeated or small requests on the 
server's behalf. Figure 30 portrays the 
contractor's historical enterprisewide performance with SLA 226. 

Figure 30: Enterprisewide Performance for SLA 226: 

[See PDF for image] - graphic text: 

Source: GAO analysis of Navy data. 

[End of figure] - graphic text: 

SLA 231-System service-domain name server: This SLA measures the 
availability and latency of Domain Name Server services. The Domain 
Name Server translates domain names to IP addresses and vice versa. 
Figure 31 portrays the contractor's historical site level performance 
with SLA 231. Figure 32 portrays the contractor's historical 
enterprisewide performance with SLA 231. 

Figure 31: Site Level Performance for SLA 231: 

[See PDF for image] - graphic text: 

Source: GAO analysis of Navy data. 

[End of figure] - graphic text: 

Figure 32: Enterprisewide Performance for SLA 231: 

[See PDF for image] - graphic text: 

Source: GAO analysis of Navy data. 

[End of figure] - graphic text: 

SLA 324-Wide area network connectivity: This SLA measures the percent 
of bandwidth used to provide connection to external networks. Figure 33 
portrays the contractor's historical site level performance with SLA 
324. 

Figure 33: Site Level Performance for SLA 324: 

[See PDF for image] - graphic text: 

Source: GAO analysis of Navy data. 

[End of figure] - graphic text: 

SLA 325-BAN/LAN communication services: This SLA measures the percent 
of bandwidth utilized on shared network segments. Figure 34 portrays 
the contractor's historical site level performance for SLA 325. 

Figure 34: Site Level Performance for SLA 325: 

[See PDF for image] - graphic text: 

Source: GAO analysis of Navy data. 

[End of figure] - graphic text: 

SLA 328-Network management service-asset management: This SLA measures 
the time it takes to implement new assets, such as seats and 
application servers. Figure 35 portrays the contractor's historical 
site level performance with SLA 328. 

Figure 35: Site Level Performance for SLA 328: 

[See PDF for image] - graphic text: 

Source: GAO analysis of Navy data. 

[End of figure] - graphic text: 

SLA 329-Operational support services: This SLA measures the 
effectiveness of NMCI's disaster recovery plan. Figure 36 portrays the 
contractor's historical enterprisewide performance with SLA 329. 

Figure 36: Enterprisewide Performance for SLA 329: 

[See PDF for image] - graphic text: 

Source: GAO analysis of Navy data. 

[End of figure] - graphic text: 

SLA 332-Application server connectivity: This SLA measures both the 
time it takes for the contractor to implement the connectivity between 
the network backbone and an application server and the percentage of 
available bandwidth from an application server to the local supporting 
backbone. Figure 37 portrays the contractor's historical site level 
performance with SLA 332. 

Figure 37: Site Level Performance for SLA 332: 

[See PDF for image] - graphic text: 

Source: GAO analysis of Navy data. 

[End of figure] - graphic text: 

SLA 333-NMCI security operational services-general: This SLA measures 
the percentage of successful accreditations on the first attempt, based 
on compliance with DOD certification and accreditation policies and 
procedures. Figure 38 portrays the contractor's historical 
enterprisewide performance with SLA 333. 

Figure 38: Enterprisewide Performance for SLA 333: 

[See PDF for image] - graphic text: 

Source: GAO analysis of Navy data. 

[End of figure] - graphic text: 

SLA 334-Information assurance operational service-PKI: This SLA 
measures the timeliness of revoking a PKI certificate when required, 
the ability of an NMCI user to obtain the DOD PKI certificate of another 
NMCI user, and the time it takes for user registration of DOD PKI 
within NMCI. Figure 39 portrays the contractor's historical 
enterprisewide performance with SLA 334. 

Figure 39: Enterprisewide Performance for SLA 334: 

[See PDF for image] - graphic text: 

Source: GAO analysis of Navy data. 

[End of figure] - graphic text: 

SLA 336-Information assurance planning services: This SLA measures the 
time it takes to distribute new or revised security products (hardware 
and software). Figure 40 portrays the contractor's historical 
enterprisewide performance with SLA 336. 

Figure 40: Enterprisewide Performance for SLA 336: 

[See PDF for image] - graphic text: 

Source: GAO analysis of Navy data. 

[End of figure] - graphic text: 

[End of section] 

Appendix IV: Comments from the Department of Defense: 

Assistant Secretary Of Defense: 
6000 Defense Pentagon: 
Washington, DC 20301-6000: 
Networks And Information Integration: 

Nov 21 2006: 

Mr. Randolph C. Hite: 
Director, Information Technology Architecture and Systems Issues: 
U.S. Government Accountability Office: 
441 G Street, N. W. 
Washington, D.C. 20548: 

Dear Mr. Hite: 

This is the Department of Defense (DoD) response to the GAO draft 
report (GAO-07-51), "Information Technology: DoD Needs To Ensure That 
Navy Marine Corps Intranet Program Is Meeting Goals and Satisfying 
Customers", October 10, 2006. (GAO Code 310602). 

We appreciate the opportunity to comment on the draft report and the 
time your staff afforded us during their preparation of the report. 

The Department's response contains two parts. In the first enclosure, 
the Department's reply to each recommendation is provided. In general, 
we concur with your recommendations. In fact, the Department of the 
Navy (DoN) has previously implemented, is currently implementing, or 
plans to implement the report's recommendations. 

The second enclosure provides DoN comments on the draft report. The DoN 
is best suited to respond to the report's details because they have day-
to-day management and oversight authority for Navy Marine Corps 
Intranet. The DoN's comments seek to correct errors of fact, clarify 
misinterpretations, and address unsupported conclusions. 

Our point of contact is Leo Milanowski at 703-602-2720, ext 142. Please 
contact him with any questions or if you need clarification. 

Sincerely, 

Signed by: 

John R. Landon: 
Deputy Assistant Secretary of Defense: 
(C3ISR & IT Acquisition): 

Enclosures: 
As stated: 

GAO Draft Report Dated October 10, 2006 GAO-07-51 (GAO Codes 310602): 

"Information Technology: DOD Needs To Ensure That Navy Marine Corps 
Intranet Program Is Meeting Goals And Satisfying Customers" 

Department Of Defense Comments To The GAO Recommendations: 

Recommendation 1: The GAO recommends that the Secretary of Defense 
direct the Secretary of the Navy to ensure that the Navy Marine Corps 
Intranet (NMCI) Program adopt robust performance management practices 
that evaluate and appropriately adjust the original plan for measuring 
achievement of strategic program goals and provide for NMCI 
implementation in a manner that treats such measurement as a program 
priority: 

DoD Response: Concur; the report accurately highlights the need for DoN 
to adjust its NMCI strategic goals and associated measures. Given that 
more than six years have elapsed since the original plan was 
formulated, NMCI performance measures should have evolved to reflect 
changed priorities, incorporated lessons learned, and integrated 
updated guidance. 

Recommendation 2: The GAO recommends that the Secretary of Defense 
direct the Secretary of the Navy to ensure that the NMCI Program adopt 
robust performance management practices that expand its range of 
activities to measure and understand service level agreements (SLA) to 
provide increased visibility into performance relative to each 
agreement: 

DoD Response: Concur; NMCI currently undertakes extensive analysis of 
all SLA data to understand root causes of problems and to develop plans 
for corrective action. Areas that indicate special need - e.g. sites 
and seats not meeting SLA, or SLA which indicate a recurring problem - 
are afforded special coverage. The entire range of NMCI activities is 
under constant review, and if warranted, SLA coverage is expanded. For 
example, a current area under consideration is measurement metrics 
across the entire DoN enterprise. 

Recommendation 3: The GAO recommends that the Secretary of Defense 
direct the Secretary of the Navy to ensure that the NMCI Program adopt 
robust performance management practices that share the NMCI performance 
results with DoD, OMB, and congressional decision makers as a part of 
the program's annual budget submissions. 

DoD Response: Concur; the NMCI Program Office currently provides 
extensive reports to decision makers in DoN. In addition, the Program 
Office has complied with other requests for program information. As a 
part of its annual budget submission to the DoD and OMB, DoN submits an 
Exhibit 300 for NMCI. The Exhibit 300 will be reviewed for areas where 
additional performance data can be added in future submissions. 

Recommendation 4: The GAO recommends that the Secretary of Defense 
direct the Secretary of the Navy to ensure that the NMCI Program adopt 
robust performance management practices that re-examine the focus, 
scope, and transparency of its customer satisfaction activities to 
ensure that areas of dissatisfaction described in this report are 
regularly disclosed to the decision makers and that customer 
satisfaction improvement efforts are effectively planned and managed. 

DoD Response: Concur; the DoN has continually reexamined the focus, 
scope, and transparency of its customer satisfaction efforts. It 
continues to revise survey questions and has added surveys of the 
Commanders and Network Operations Leaders. The NMCI Program Office 
regularly briefs customer satisfaction results to DoN leadership via 
reports, briefings, meetings, and CIO Forum updates. Through these 
vehicles, leading causes of dissatisfaction are identified and actions 
prescribed to address them, such as contract modifications, service 
additions, performance improvement efforts, or other initiatives. 

Recommendation 5: The GAO recommends that the Secretary of Defense 
direct the Secretary of the Navy, in collaboration with the various 
Navy entities involved in overseeing, managing, and employing NMCI, to 
take appropriate steps to ensure that the findings in this report and 
the outcomes from implementing the above recommendations are used to 
consider and implement warranted changes to NMCI's scope and 
approach. 

DoD Response: Concur; the Office of the Secretary of Defense will 
engage with DoN to ensure that appropriate steps are taken to implement 
the report's recommendations, where warranted. 

General DoN Comments On Draft GAO Report GAO-07-51: 

1. The NMCI Program encompasses both the U.S. Navy and U.S. Marine 
Corps (USMC). The GAO did not visit any Marine Corps sites during this 
audit; the audit focused on Navy shipyards and depots. Consequently, 
the following comments do not reflect the USMC's views on the subject 
audit. Future audits that address the DoN information technology (IT) 
enterprise or its components should include the USMC through the USMC 
Inspector General's office. 

2. NMCI can best be characterized as a strategic success that has 
endured some tactical difficulties along the way. NMCI has put in 
place very stringent performance requirements, provided unprecedented, 
detailed visibility into DoN IT expenditures, and developed a detailed 
reporting regimen for network performance and customer satisfaction. 
The initial contract has taught the DoN a great deal about our 
environment and the challenges of operating an enterprise network. The 
DoN has already begun the requirements generation process for the Next 
Generation NMCI and will be better prepared to address these 
shortcomings in future contracts. 

3. While NMCI's performance has been far from perfect, it has steadily 
improved over time. DoN recognizes that more improvement must be shown 
before NMCI can claim to be the preeminent network that DoN envisions. 
Since 2001, DoN has overcome a myriad of challenges in transitioning to 
NMCI from our legacy IT environment. With over 650,000 users now 
supported, NMCI has become the largest corporate intranet in the world. 
NMCI has consolidated over 1,000 separate legacy DoN networks into a 
single, secure common computing and communications backbone with a 
standardized set of hardware and software across the enterprise. NMCI 
has deployed over 320,000 seats at hundreds of locations, all served by 
a world-class IT infrastructure. We continue to build upon NMCI's 
capabilities. For example, in the past year, secure Broadband Remote 
Access (BuRAS), Cellular Wireless Broadband Access, 802.11 Wireless 
LAN capabilities, and anti-spam and anti-spyware capabilities have 
been fielded in NMCI. NMCI was recently awarded the 2006 QUALCOMM 3G 
A-List Award for its use of a secure wireless solution in support of 
post-Katrina recovery efforts. We executed a contract extension in 
March 2006 that clarified many contract provisions, reduced seat 
prices by 15% in the last three years of the contract, and positioned 
the DoN for a re-compete of the contract in 2010. 

4. One significant challenge that has been successfully met is 
information security. NMCI has never suffered a root-level intrusion 
and has thwarted attacks that penetrated other DoD systems on several 
occasions. NMCI is the only network that completely implemented and 
enforced the DoD's Public Key Infrastructure Cryptographic Log-On 
mandate. Every day, NMCI strips more than 4,000 potentially harmful 
attachments from e-mails. In 2005, NMCI detected and countered over 2 
million unauthorized intrusion attempts. NMCI provides a previously 
unavailable "fail-over" capability, the ability to re-route during 
contingency operations and a centralized Network Operations Center to 
maintain 24/7 surveillance of network availability. Compared with the 
DoN's legacy network environment, NMCI represents a quantum improvement 
in information superiority, as recognized by the Commander, Naval 
Network Warfare Command, in directing the shutdown of legacy networks 
and migration to NMCI as part of its Operation Cyber Condition Zebra. 
Improved information security has been the greatest benefit of NMCI. 
The security of our network will continue to be the primary focus of 
our efforts. 

5. The report recommends measuring NMCI progress against performance 
measurements defined in the June 2000 NMCI Report to Congress by 
mapping NMCI SLAs directly to strategic performance measurements. (It 
should be noted that the performance metrics in the Report to Congress 
were drafted before the contract was negotiated and awarded.) In the 
NMCI contract, performance requirements designed to determine the 
level of payment to the vendor are in place, and they are intended to 
measure very specific elements that do not translate well into broad 
goals. Mapping NMCI SLAs directly to strategic performance 
measurements was attempted, but it proved awkward in that an SLA 
result of less than 100% attainment was reported as failure to achieve 
a goal. Many of the performance measurements in the Report to Congress 
(Interoperability, Information Assurance, Service Efficiency, Customer 
Satisfaction, Workforce Capabilities, and Process Improvement) 
required comparisons to the DoN's legacy environment, in addition to 
NMCI performance data, to accurately gauge the overall effectiveness 
of NMCI. The NMCI Performance Business Case Analysis of September 
2004, along with other data, documents NMCI's qualitative and 
quantitative improvements in performance over the DoN's legacy 
environment in all of these areas. NMCI has made, and continues to 
make, regular reports to DoN and DoD leadership on NMCI performance 
with regard to current goals as defined by the DoN IM/IT Strategic 
Plan, Secretary of the Navy Objectives, Chief of Naval Operations 
Guidance, and DoD Enterprise Transition Plans. 
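
A minimal sketch of the mapping problem described above, using 
invented goal and SLA names and attainment figures (not NMCI data): 
under a rule that equates anything below 100% SLA attainment with 
failure, even near-perfect results are reported as unmet goals. 

# Hypothetical sketch; the goal name, SLA names, and figures are invented.
GOAL_TO_SLA_ATTAINMENT = {
    "Information superiority": {"network availability": 0.998,
                                "help desk response": 0.97},
}

for goal, slas in GOAL_TO_SLA_ATTAINMENT.items():
    # Direct mapping: the goal counts as met only if every SLA hits 100%.
    achieved = all(result >= 1.0 for result in slas.values())
    print(goal, "achieved under a 100%-attainment mapping:", achieved)  # False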

6. Overall, the DoN believes that NMCI is meeting its strategic goals 
of information superiority and fostering innovation. Compared with the 
DoN's legacy network environment, NMCI has been a vast improvement in 
supporting information superiority. The NMCI Operations Evaluation 
(OPEVAL) concluded that NMCI had the "best external security of any 
Navy network"; improved network security against outsider threats; 
improved virus protection; a more robust firewall architecture; and 
improved ability to isolate and recover/restore from a hostile attack. 
The OPEVAL further concluded that NMCI's approach to defense-in-depth 
was a strength. NMCI is meeting its strategic goal of fostering 
innovation by providing interoperability and a high level of common 
services within the DoN. The NMCI OPEVAL found that, as a result of 
NMCI, many DoN commands that previously did not have state-of-the-art 
workstations and network services now do, thus eliminating the "have 
and have not" issue within DoN. 

7. The DoN believes the report's analysis of NMCI SLA data contains 
some understandable misinterpretations. The report tends to 
misinterpret the NMCI SLA categories of Full Payment and Full 
Performance, and it often bundles and averages SLA data. Full Payment 
and Full Performance are separate calculations. The specific SLAs that 
apply to payment calculations depend on the Contract Line Items 
(CLINs) associated with a site and accordingly can vary from site to 
site, depending on the CLINs each site has ordered. Misapplying these 
concepts can skew data and produce incorrect conclusions. NMCI 
regularly undertakes extensive analysis of all SLA data to understand 
root causes of problems and to develop plans for corrective action. 
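
A minimal sketch of the distinction described above, using invented 
site names, CLIN-to-SLA mappings, and results (not NMCI contract 
data): the payment view depends only on the SLAs tied to each site's 
ordered CLINs, so a pooled average of SLA data can blur that 
site-by-site picture. 

# Hypothetical data; sites, mappings, and results are invented.
PAYMENT_SLAS = {                     # SLAs that count toward payment at each
    "Site A": {"SLA-1", "SLA-2"},    # site, as set by the CLINs it ordered
    "Site B": {"SLA-1", "SLA-2", "SLA-3"},
}
MET = {                              # SLAs actually met in one period
    "Site A": {"SLA-1", "SLA-2"},
    "Site B": {"SLA-1", "SLA-2"},
}
ALL_MEASURED = {"SLA-1", "SLA-2", "SLA-3", "SLA-4"}  # broader performance view

for site in PAYMENT_SLAS:
    payment_ok = PAYMENT_SLAS[site] <= MET[site]  # payment: CLIN-driven SLAs only
    share_met = len(MET[site] & ALL_MEASURED) / len(ALL_MEASURED)
    print(f"{site}: payment SLAs met = {payment_ok}, share met = {share_met:.0%}")

# Both sites show the same 50% share of SLAs met, yet only Site A satisfies
# its payment-relevant SLAs; averaging across sites hides that difference.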

8. We realize that customer satisfaction is not what it should be. 
However, the DoN believes that the report's analysis of NMCI customer 
satisfaction data reaches some unsupported conclusions. The report 
implies that NMCI's customer satisfaction standards are too low (5.5 
on a scale of 10). NMCI's customer satisfaction standards are based on 
industry practice and are in line with those of the Gartner Group (3 
on a scale of 5) for "satisfied" users. The DoN is still not satisfied 
with the current level of customer satisfaction and continues to work 
to improve customer services. The report characterizes some users as 
"marginally" satisfied. NMCI customer satisfaction data does not break 
out this way. The customer satisfaction survey clearly indicates to 
the user whether a given score indicates satisfaction or 
dissatisfaction. The report includes subjective statements from naval 
shipyard and depot officials that were not accompanied by any 
objective data to support the conclusions implied, and it does not 
include information provided by the NMCI Program Manager that 
addressed many of these concerns. 

9. We believe that NMCI has provided and continues to provide adequate 
reports to key decision makers. NMCI is the most tested, overseen, and 
reported-upon network in history. No other DoN or DoD network has been 
tested as rigorously as NMCI; and no other DoN or DoD network has even 
been the subject of an operational test evaluation. This testing, 
combined with the continuous automated performance monitoring built 
into the NMCI network, has given the DoN detailed knowledge of every 
aspect of the capabilities, performance and daily health of NMCI. NMCI 
has been the subject of extensive oversight by Congress, the DoD and 
the DoN. NMCI makes regular and detailed reports to DoN and DoD 
leadership on progress, performance and program issues. Several reports 
on performance have been provided to Congress as a requirement for 
moving forward with the program. Numerous briefings were and continue 
to be provided to members of Congress and their staffs. 

[End of section] 

Appendix V: GAO Contact and Staff Acknowledgments: 

GAO Contact: 

Randolph C. Hite, 202-512-3439, hiter@gao.gov: 

Staff Acknowledgments: 

In addition to the individual named above, Mark Bird, Assistant 
Director; Scott Borre; Timothy Case; Barbara Collier; Vijay D'Souza; 
Neil Doherty; Jim Fields; Mike Gilmore; Peggy Hegg; Wilfred Holloway; 
George Kovachick; Frank Maguire; Charles Roney; Sidney Schwartz; Karl 
Seifert; Glenn Spiegel; Dr. Rona Stillman; Amos Tevelow; and Eric 
Winter made key contributions to this report. 

(310602): 

FOOTNOTES 

[1] Such IT outsourcing arrangements are commonly referred to as seat 
management contracts because they involve contractor-owned hardware and 
software assets and services that are bundled together and provided to 
a client at a fixed price per unit (or seat). 

[2] SLAs are contractually specified performance level expectations. 

[3] The Navy has about 550 NMCI sites. The number of seats at these 
sites ranges from 1 to about 14,000. 

[4] A basic seat involves the standard service that the Navy can order 
from the contractor; a high-end seat includes enhanced performance, 
such as increased processing power; and a mission-critical seat 
involves enhanced services, such as greater maintenance 
responsiveness. 

[5] Among other things, this act defines the roles and responsibilities 
of CIOs relative to managing IT investments. 

[6] Certification is a comprehensive evaluation of security controls 
that provides the necessary information for a designated approving 
authority to formally declare that a system is approved to operate at 
an acceptable level of risk. Accreditation is the authorization of an 
information system to process, store, or transmit information that 
provides a form of quality control. The accreditation decision is to be 
based on the implementation of an agreed-upon set of management, 
operational, and technical controls for a system and is supported by a 
comprehensive evaluation or certification of these security controls 
that provides the necessary information for a designated approving 
authority to formally declare that a system is approved to operate. 

[7] National Defense Authorization Act for Fiscal Year 2002, Pub. L. 
No. 107-107, Dec. 28, 2001. 

[8] The Navy categorizes the sites as very small, small, or large based 
on the number of seats at the site. 

[9] NETWARCOM acts as the Navy's central operational authority for 
space, information technology requirements, and network and information 
operations in support of naval forces afloat and ashore. Among other 
things, it is responsible for operating a secure and interoperable 
naval network and for coordinating and assessing Navy operational 
requirements for, and use of, networks; command and control; 
information technology; and information operations and space. 

[10] MCNOSC is the Corps' enterprise network operations center and 
serves as the Marine component to U.S. Strategic Command's Joint Task 
Force for Computer Network Operations. Its mission is to provide global 
network operations and computer network defense in order to facilitate 
seamless information exchange in support of Marine and joint forces 
operating worldwide. MCNOSC is the Corps' nucleus for enterprise data 
network services, network support to deploying forces, and technical 
development of network-enabled IT solutions. 

[11] Prior to September 2005, the contract specified a third 
performance level--payment for improved performance. To achieve this 
level, the contractor would have to meet all of the applicable SLAs, 
and at least 50 percent of the seats would have to be operational. If 
these two conditions were met at a given site, it resulted in 90 
percent payment to the contractor for that site. 
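
A minimal sketch of the rule this footnote describes; the 50 percent 
and 90 percent thresholds come from the footnote, while the function 
name and example inputs are invented for illustration. 

# Thresholds from footnote 11; the inputs below are invented examples.
def improved_performance_payment(all_slas_met, seats_operational, seats_total):
    """Payment fraction for a site under the pre-September 2005 third level."""
    if all_slas_met and seats_operational / seats_total >= 0.5:
        return 0.90      # 90 percent payment to the contractor for that site
    return None          # the footnote does not say what applies otherwise

print(improved_performance_payment(True, 600, 1000))   # 0.9
print(improved_performance_payment(True, 400, 1000))   # None (below 50 percent)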

[12] Echelon IIs, otherwise known as Budget Submitting Offices (BSOs), 
are the Naval entities that report directly to the Chief of Naval 
Operations, including the Naval Air Systems Command, the Office of 
Naval Intelligence, and the Space and Naval Warfare Systems Command. 
Major Commands are entities that report directly to the Marine Corps, 
including Marine Forces Atlantic, Marine Forces Pacific, and Marine 
Forces Reserve. 

[13] To be eligible to be included in a quarterly survey, end users 
must have had at least 45 days of experience with services provided by 
the NMCI contractor, have an e-mail account and user identification 
code, and not have been included in any other recent surveys. Because 
additional end users are continually being transitioned to 
contractor-provided services, the number of individuals who have had 
45 days or more of direct experience with these services has increased 
since 2002, when the first survey was conducted. 

[14] The program is responsible for identifying end user requirements 
for computer hardware and software, managing the process of making 
changes to the IT environment, and conducting training to prepare end 
users for the transition to NMCI. 

[15] The contractor is responsible for, among other things, providing 
information on its services, appropriate computer hardware and software 
that meets requirements identified by the Navy, access to the NMCI 
Intranet, and customer services (e.g., help desk and other kinds of 
technical support). 

[16] GAO, Defense Acquisitions: Observations on the Procurement of the 
Navy/Marine Corps Intranet, GAO/T-NSIAD/AIMD-00-116 (Washington, D.C.: 
Mar. 8, 2000). 

[17] GAO, Information Technology: Issues Affecting Cost Impact of Navy/ 
Marine Corps Intranet Need to be Resolved, GAO-03-33 (Washington, D.C.: 
Oct. 31, 2002). 

[18] GAO, Information Technology: DOD Needs to Leverage Lessons Learned 
from Its Outsourcing Projects, GAO-03-371 (Washington, D.C.: Apr. 25, 
2003). 

[19] See, for example, Executive Office of the President, Office of 
Management and Budget, Evaluating Information Technology Investments, A 
Practical Guide (November 1995). 

[20] LISI is a DOD method for measuring interoperability. According to 
DOD, it uses five levels (0 through 4, with 0 being the lowest). 
According to DOD, LISI typically has four categories: (1) Procedures, 
which focuses on the doctrine, policies and procedures, architecture, 
and technical standards that enable systems to exchange information; 
(2) Data, which covers the formats and protocols that enable data 
interchange, along with the shared semantics that enable information 
interchange; (3) Applications, which focuses on the applications that 
enable exchange, processing, and manipulation; and (4) Infrastructure, 
which addresses the technology environment (hardware, networks, systems 
services, etc.) that enables interaction. 

[21] According to NMCI program officials, responsibility for achieving 
this target should be viewed as shared among Navy organizations because 
bringing applications into compliance with security standards before 
they can be used on NMCI is outside the responsibility of the NMCI 
program. 

[22] According to a program official, "green," "yellow," and "red" mean 
low, medium, and high risk levels, respectively. 

[23] Public key infrastructure is a system of computers, software, and 
data that relies on certain cryptographic techniques for some aspects 
of security. For more information, see GAO, Information Security: 
Advances and Remaining Challenges to Adoption of Public Key 
Infrastructure Technology, GAO-01-277 (Washington, D.C.: Feb. 26, 
2001). 

[24] As discussed earlier in this report, the 23 SLAs are relevant to 
one or more types of seats. For each seat type, the performance 
measures being used can differ. 

[25] Latency is the time it takes for data to get from one designated 
point to another. Packet loss occurs when data traveling over a network 
fails to reach its destination. 

[26] April 1, 2004, is the beginning of the period covered by the 
quarterly customer satisfaction survey for the period ending on June 
30, 2004. 

[27] Beginning in March 2004, the end user satisfaction surveys, and 
the reported results, have differentiated between satisfaction with the 
NMCI program and the NMCI contractor. 

[28] The sites visited are Portsmouth Naval Shipyard, Norfolk Naval 
Shipyard, Puget Sound Naval Shipyard, Jacksonville Naval Air Depot, and 
North Island Naval Air Depot. 

[29] GAO, DOD Information Technology: Software and Systems Process 
Improvement Programs Vary in Use of Best Practices, GAO-01-116 
(Washington, D.C.: Mar. 30, 2001). 

[30] SEI is a federally funded research and development center 
established at Carnegie Mellon University to address software 
engineering practices. IDEAL(SM) is a service mark of Carnegie Mellon 
University and stands for initiating, diagnosing, establishing, 
acting, and leveraging. For more information on the model, see 
IDEAL(SM): A User's Guide for Software Process Improvement 
(CMU/SEI-96-HB-001). 

[31] Lean six sigma combines two process improvement methodologies: 
lean focuses on improving speed and efficiency through the elimination 
of non-value-added activities; six sigma focuses on increasing process 
precision and accuracy through the reduction of variation in the 
performance of activities. 

[32] These data reflect revisions to the SLAs that the Navy and EDS 
agreed to in September 2004. We did not include classified seats 
because data about them were not readily available. 

[33] The survey questions listed are from the March 2006 survey results 
report. Questions could change over time. 

[34] Survey questions are from the September 2005 survey. Questions 
could change over time. 

[35] Survey questions are from the September 2005 survey. Questions 
could change over time. 

[36] Latency is the time it takes for data to get from one designated 
point to another. Packet loss occurs when data traveling over a network 
fails to reach its destination. 

GAO's Mission: 

The Government Accountability Office, the investigative arm of 
Congress, exists to support Congress in meeting its constitutional 
responsibilities and to help improve the performance and accountability 
of the federal government for the American people. GAO examines the use 
of public funds; evaluates federal programs and policies; and provides 
analyses, recommendations, and other assistance to help Congress make 
informed oversight, policy, and funding decisions. GAO's commitment to 
good government is reflected in its core values of accountability, 
integrity, and reliability. 

Obtaining Copies of GAO Reports and Testimony: 

The fastest and easiest way to obtain copies of GAO documents at no 
cost is through the Internet. GAO's Web site (www.gao.gov) contains 
abstracts and full-text files of current reports and testimony and an 
expanding archive of older products. The Web site features a search 
engine to help you locate documents using key words and phrases. You 
can print these documents in their entirety, including charts and other 
graphics. 

Each day, GAO issues a list of newly released reports, testimony, and 
correspondence. GAO posts this list, known as "Today's Reports," on its 
Web site daily. The list contains links to the full-text document 
files. To have GAO e-mail this list to you every afternoon, go to 
www.gao.gov and select "Subscribe to e-mail alerts" under the "Order 
GAO Products" heading. 

Order by Mail or Phone: 

The first copy of each printed report is free. Additional copies are $2 
each. A check or money order should be made out to the Superintendent 
of Documents. GAO also accepts VISA and Mastercard. Orders for 100 or 
more copies mailed to a single address are discounted 25 percent. 
Orders should be sent to: 

U.S. Government Accountability Office 

441 G Street NW, Room LM 

Washington, D.C. 20548: 

To Order by Phone: 

Voice: (202) 512-6000: 

TDD: (202) 512-2537: 

Fax: (202) 512-6061: 

To Report Fraud, Waste, and Abuse in Federal Programs: 

Contact: 

Web site: www.gao.gov/fraudnet/fraudnet.htm 

E-mail: fraudnet@gao.gov 

Automated answering system: (800) 424-5454 or (202) 512-7470: 

Public Affairs: 

Jeff Nelligan, managing director, 

NelliganJ@gao.gov 

(202) 512-4800 

U.S. Government Accountability Office, 

441 G Street NW, Room 7149 

Washington, D.C. 20548: