This is the accessible text file for GAO report number GAO-08-79 
entitled 'Information Technology: Census Bureau Needs to Improve Its 
Risk Management of Decennial Systems' which was released on October 5, 
2007. 

This text file was formatted by the U.S. Government Accountability 
Office (GAO) to be accessible to users with visual impairments, as part 
of a longer term project to improve GAO products' accessibility. Every 
attempt has been made to maintain the structural and data integrity of 
the original printed product. Accessibility features, such as text 
descriptions of tables, consecutively numbered footnotes placed at the 
end of the file, and the text of agency comment letters, are provided 
but may not exactly duplicate the presentation or format of the printed 
version. The portable document format (PDF) file is an exact electronic 
replica of the printed version. We welcome your feedback. Please E-mail 
your comments regarding the contents or accessibility features of this 
document to Webmaster@gao.gov. 

This is a work of the U.S. government and is not subject to copyright 
protection in the United States. It may be reproduced and distributed 
in its entirety without further permission from GAO. Because this work 
may contain copyrighted images or other material, permission from the 
copyright holder may be necessary if you wish to reproduce this 
material separately. 

GAO Highlights: 

Highlights of GAO-08-79, a report to the Subcommittee on Federal 
Financial Management, Government Information, Federal Services, and 
International Security, Committee on Homeland Security and Governmental 
Affairs, U.S. Senate. 

Why GAO Did This Study: 

Automation and information technology (IT) are expected to play a 
critical role in the 2010 decennial census. The Census Bureau plans to 
spend about $3 billion on automation and technology that are to improve 
the accuracy and efficiency of census collection, processing, and 
dissemination. The Bureau is holding what it refers to as a Dress 
Rehearsal, during which it plans to conduct operational testing that 
includes the decennial systems. In view of the importance of IT 
acquisitions to the upcoming census, GAO was asked to (1) determine the 
status and plans for four key IT acquisitions, including schedule and 
cost, and (2) assess whether the Bureau is adequately managing 
associated risks. To achieve its objectives, GAO analyzed acquisition 
documents and the projects’ risk management activities and compared 
these activities to industry standards. 

What GAO Found: 

Three key systems acquisitions for the 2010 Census are in process, and 
a fourth contract was recently awarded. The ongoing acquisitions show 
mixed progress in meeting schedule and cost estimates. Currently, two 
of the projects are not on schedule, and the Bureau plans to delay 
certain functionality. The fourth contract, originally scheduled for 
award in 2005, was not awarded until September 2007. In addition, one 
project has incurred cost overruns and increases to its projected life-
cycle cost. As a result of the schedule changes, the full complement of 
systems and functionality that was originally planned will not be 
available for the Dress Rehearsal operational testing. This limitation 
increases the importance of further system testing to ensure that the 
decennial systems work as intended. 

The Bureau’s project teams for each of the four IT acquisitions have 
performed many practices associated with establishing sound and capable 
risk management processes, but critical weaknesses remain. Three 
project teams had developed a risk management strategy that identified 
the scope of the risk management effort. However, not all project teams 
had identified risks, established mitigation plans, or reported risks 
to executive-level officials. For example, one project team did not 
adequately identify risks associated with performance issues 
experienced by mobile computing devices. In addition, three project 
teams developed mitigation plans that were often untimely or included 
incomplete activities and milestones for addressing the risks. Until 
the project teams implement key risk management activities, they face 
an increased probability that decennial systems will not be delivered 
on schedule and within budget or perform as expected. 

Table: Performance of Risk Management Activities by Key Census 
Acquisition Projects: 

Specific practices: Preparing for risk management: Determine risk 
sources and categories; 
Acquisition Project: 1: Practice Partially Implemented; 
Acquisition Project: 2: Practice Fully Implemented; 
Acquisition Project: 3: Practice Fully Implemented; 
Acquisition Project: 4: Practice Fully Implemented. 

Specific practices: Preparing for risk management: Define risk 
parameters; 
Acquisition Project: 1: Practice Fully Implemented; 
Acquisition Project: 2: Practice Fully Implemented; 
Acquisition Project: 3: Practice Fully Implemented; 
Acquisition Project: 4: Practice Fully Implemented. 

Specific practices: Preparing for risk management: Establish and 
maintain a risk management strategy; 
Acquisition Project: 1: Practice Partially Implemented; 
Acquisition Project: 2: Practice Fully Implemented; 
Acquisition Project: 3: Practice Fully Implemented; 
Acquisition Project: 4: Practice Fully Implemented. 

Specific practices: Preparing for risk management: Identify and involve 
the relevant stakeholders; 
Acquisition Project: 1: Practice Partially Implemented; 
Acquisition Project: 2: Practice Partially Implemented; 
Acquisition Project: 3: Practice Fully Implemented; 
Acquisition Project: 4: Practice Partially Implemented. 

Specific practices: Identify and analyze risks: Identify and document 
the risks; 
Acquisition Project: 1: Practice Fully Implemented; 
Acquisition Project: 2: Practice Partially Implemented; 
Acquisition Project: 3: Practice Fully Implemented; 
Acquisition Project: 4: Practice Partially Implemented. 

Specific practices: Identify and analyze risks: Evaluate, categorize, 
and prioritize risks; 
Acquisition Project: 1: Practice Partially Implemented; 
Acquisition Project: 2: Practice Fully Implemented; 
Acquisition Project: 3: Practice Fully Implemented; 
Acquisition Project: 4: Practice Fully Implemented. 

Specific practices: Mitigate risks: Develop risk mitigation plans; 
Acquisition Project: 1: Practice Partially Implemented; 
Acquisition Project: 2: Practice Partially Implemented; 
Acquisition Project: 3: Practice Fully Implemented; 
Acquisition Project: 4: Practice Not Implemented. 

Specific practices: Mitigate risks: Monitor status and implement risk 
mitigation plans; 
Acquisition Project: 1: Practice Partially Implemented; 
Acquisition Project: 2: Practice Partially Implemented; 
Acquisition Project: 3: Practice Fully Implemented; 
Acquisition Project: 4: Practice Partially Implemented. 

Specific practices: Executive oversight: Review status with executive-
level management; 
Acquisition Project: 1: Practice Not Implemented; 
Acquisition Project: 2: Practice Not Implemented; 
Acquisition Project: 3: Practice Fully Implemented; 
Acquisition Project: 4: Practice Not Implemented. 

Source: GAO analysis of Census project data against industry 
standards. 

[End of table] 

What GAO Recommends: 

GAO is recommending that the Bureau strengthen its systems testing and 
risk management activities, including risk identification and 
oversight. The Bureau agreed to examine additional ways to manage 
risks, but disagreed with the view that a full complement of systems 
would not be tested in a census-like environment, stating it planned to 
do so during the Dress Rehearsal or later; however, the test plans have 
not been finalized and it remains unclear whether this testing will be 
done. 

To view the full product, including the scope and methodology, click on 
GAO-08-79. For more information, contact David A. Powner at (202) 512-
9286 or pownerd@gao.gov. 

[End of section] 

Report to the Subcommittee on Federal Financial Management, Government 
Information, Federal Services, and International Security, Committee on 
Homeland Security and Governmental Affairs, U.S. Senate: 

United States Government Accountability Office: 

GAO: 

October 2007: 

Information Technology: 

Census Bureau Needs to Improve Its Risk Management of Decennial 
Systems: 

Information Technology: 

GAO-08-79: 

Contents: 

Letter: 

Results in Brief: 

Background: 

Decennial IT Acquisitions Are at Various Stages of Development and Show 
Mixed Progress against Schedule and Cost Baselines: 

The Bureau Is Making Progress in Risk Management Activities, but 
Critical Weaknesses Remain: 

Conclusions: 

Recommendations for Executive Action: 

Agency Comments and Our Evaluation: 

Appendix I: Objectives, Scope, and Methodology: 

Appendix II: Key 2010 Census Information Technology Acquisitions: 

Appendix III: Comments from the Department of Commerce: 

GAO Comments: 

Appendix IV: GAO Contacts and Staff Acknowledgments: 

Tables: 

Table 1: Four Key IT Acquisitions Supporting Census 2010: 

Table 2: Comparison of FDCA Original and Revised Schedules: 

Table 3: FDCA Life-Cycle Cost Estimates: 

Table 4: Comparison of DRIS Original and Current Schedules: 

Table 5: DRIS Cost Estimates for Phase I (as of March 2006): 

Table 6: Risk Management Preparation Activities Completed for the Key 
2010 Census Systems: 

Table 7: Risk Identification and Evaluation Activities Completed for 
the Key 2010 Census Systems: 

Table 8: Risk Mitigation Activities Completed for Key 2010 Census 
Systems: 

Table 9: Executive-Level Risk Oversight Activities Completed for the 
Key 2010 Decennial Systems: 

Figures: 

Figure 1: Key 2010 Census Systems and Interfaces: 

Figure 2: Description and Examples of Key Risk Practice Areas: 

Abbreviations: 

CMMI®: Capability Maturity Model® Integration: 

DADS II: Data Access and Dissemination System II: 

DRIS: Decennial Response Integration System: 

FDCA: Field Data Collection Automation: 

IT: information technology: 

MAF: Master Address File: 

MTAIP: MAF/TIGER Accuracy Improvement Project: 

SEI: Software Engineering Institute: 

TIGER: Topologically Integrated Geographic Encoding and Referencing: 

United States Government Accountability Office: 

Washington, DC 20548: 

October 5, 2007: 

The Honorable Thomas R. Carper: 
Chairman: 
The Honorable Tom Coburn: 
Ranking Member: 
Subcommittee on Federal Financial Management, Government Information, 
Federal Services, and International Security: 
Committee on Homeland Security and Governmental Affairs: 
United States Senate: 

As you know, the decennial census is mandated by the U.S. Constitution 
and provides data that are vital to the nation. These data are used to 
reapportion the seats of the U.S. House of Representatives; realign the 
boundaries of the legislative districts of each state; allocate 
billions of dollars in federal financial assistance; and provide a 
social, demographic, and economic profile of the nation's people to 
guide policy decisions at each level of government. 

Carrying out the census is the responsibility of the Department of 
Commerce's Census Bureau, which is now preparing for the 2010 Census. 
The Bureau is required to begin the population count on April 1, 2010, 
and the Secretary of Commerce is required to report to the President on 
the tabulation of total population by state within 9 months of that 
date.[Footnote 1] 

The Bureau plans to rely on automation and technology to improve the 
coverage, accuracy, and efficiency of the 2010 Census. Specifically, it 
has awarded four information technology (IT) contracts. It is also 
holding what it refers to as a Dress Rehearsal, a period centering 
around a mock Census Day on April 1, 2008. Planned Dress Rehearsal 
activities include operational testing of the 2010 Census systems in a 
census-like environment. The Bureau estimates that these IT 
acquisitions will account for about $3 billion of the total $11.5 
billion cost of the entire census. 

Given the importance of these IT acquisitions, you asked us to (1) 
determine the status and plans, including schedule and costs, for four 
key IT acquisitions, and (2) assess whether the Bureau is adequately 
managing the risks facing these key system acquisitions. 

To address the first objective, we analyzed system documentation, 
including project plans, deliverables, cost estimates, earned value 
management data,[Footnote 2] and other acquisition-related documents, 
and we interviewed Bureau officials and contractors. To address the 
second objective, we identified sound industry standards and compared 
them to the Bureau's practices for the key acquisitions. We performed 
our work from December 2006 through August 2007 in accordance with 
generally accepted government auditing standards. Appendix I contains 
details about our objectives, scope, and methodology. 

Results in Brief: 

Three key systems acquisitions for the 2010 Census are in process, and 
a fourth contract was recently awarded. The status of each acquisition 
and the Census Bureau's plans are as follows: 

* In one project, the Bureau is modernizing the database that provides 
address lists, maps, and other geographic support services for the 
census. Currently, this project is on schedule to complete improvements 
by the end of fiscal year 2008 and is meeting cost estimates. 

* In a second project, the Bureau is acquiring systems, equipment, and 
infrastructure for field staff to use in collecting census data. 
Deliverables provided to date include mobile computing devices and 
installation of key support infrastructure. However, the schedule for 
this acquisition has been revised, resulting in delays in system 
development and testing of interfaces. Also, the life-cycle cost 
estimates for this program have increased, and we project an $18 
million cost overrun by December 2008. According to the contractor, the 
overrun is occurring primarily because of an increase in the number of 
system requirements. 

* In a third project, the Bureau is acquiring a system for integrating 
paper, telephone responses, and field operations. The software 
development and testing are currently on schedule to provide, by 
December 2007, an initial system to process the major census forms 
during the Dress Rehearsal activities. However, the schedule was 
revised in October 2005, which is delaying some functionality. For 
example, a telephone-assistance system that was originally intended to 
be completed by fiscal year 2008 has been delayed. This acquisition is 
meeting current cost estimates. 

* Finally, the award of a contract to replace the current systems used 
to tabulate and disseminate census data was delayed by about a year 
from a previously deferred date; the Bureau awarded this contract in 
September 2007. As a result, the Dress Rehearsal will use the current 
tabulation and dissemination system rather than a modernized version. 

Delays in functionality mean that the Dress Rehearsal operational 
testing will take place without the full complement of systems and 
functionality that was originally planned. As a result, further system 
testing will be necessary to ensure that the decennial systems work as 
intended. However, Bureau officials have not finalized their plans for 
testing all the systems, and it is not clear whether these plans will 
include testing to address all interrelated systems and functionality, 
such as end-to-end testing.[Footnote 3] According to officials, these 
plans will not be finalized until February 2008. Without sufficient 
testing of all systems and their functionality, the Bureau increases 
the risk that costs will increase further, that decennial systems will 
not perform as expected, or both. 

The Bureau has taken action to manage the risks facing the four 
acquisitions; that is, the four project teams managing the acquisitions 
have performed many practices associated with establishing sound and 
capable risk management processes; however, critical weaknesses remain. 
Specifically, three of the four project teams had developed risk 
management strategies identifying the scope of their risk management 
efforts; however, three project teams had weaknesses in identifying 
risks, establishing mitigation plans that identified planned actions 
and milestones, and reporting risk status to executive-level officials. 
For example, one project team did not adequately identify risks 
associated with performance issues experienced by mobile computing 
devices. In addition, three project teams developed mitigation plans 
that were often untimely or included incomplete activities and 
milestones for addressing the risks. Also, two projects did not provide 
evidence of reporting risk status to executive-level officials. As we 
have previously reported, a root cause of weaknesses in completing key 
risk management activities is the lack of policies for managing major 
acquisitions at the Bureau.[Footnote 4] Until the project teams 
implement key risk management activities, they face an increased 
probability that decennial systems will not be delivered on schedule 
and within budget or perform as expected. 

Because the entire complement of systems will not be available for 
Dress Rehearsal activities as originally planned, we are recommending 
that the Census Bureau plan for and perform end-to-end testing so that 
all systems are tested in a census-like environment. To help ensure 
that the three key acquisitions for the 2010 Census operate as 
intended, we are also recommending that the project teams strengthen 
risk management activities, including those associated with risk 
identification, mitigation, and oversight. 

In response to a draft of this report, the Under Secretary for Economic 
Affairs of Commerce provided written comments from the department. 
These comments are reproduced in appendix III. Specifically, with 
regard to risk management, the department said it plans to examine 
additional ways to manage risks and will prepare a formal action plan 
in response to our final report. However, the department said it had a 
major disagreement with our findings with regard to operational 
testing, stating it plans to test all critical systems and interfaces 
during the Dress Rehearsal or later. Nonetheless, the Bureau's test 
plans have not been finalized, and it remains unclear whether testing 
will address all interrelated systems and functionality in a census- 
like environment, as would be provided by end-to-end testing. 
Consistent with our recommendation, following up with documented plans 
for end-to-end testing will help ensure that decennial systems work as 
intended. The department also provided technical comments 
that we incorporated where appropriate. 

Background: 

The Census Bureau's mission is to serve as the leading source of high- 
quality data about the nation's people and economy. The Bureau's core 
activities include conducting decennial, economic, and government 
censuses, conducting demographic and economic surveys, managing 
international demographic and socioeconomic databases, providing 
technical advisory services to foreign governments, and performing such 
other activities as producing official population estimates and 
projections. 

Conducting the decennial census is a major undertaking involving 
considerable preparation, which is currently under way. A decennial 
census involves: 

* identifying and correcting addresses for all known living quarters in 
the United States (known as "address canvassing"); 

* sending questionnaires to housing units; 

* following up with nonrespondents through personal interviews; 

* identifying people with nontraditional living arrangements; 

* managing a large workforce responsible for follow-up activities; 

* collecting census data by means of questionnaires, calls, and 
personal interviews; 

* tabulating and summarizing census data; and: 

* disseminating census analytical results to the public. 

Role of IT in the Decennial Census: 

The Bureau estimates that it will spend about $3 billion on automation 
and IT for the 2010 Census, including four major systems acquisitions 
that are expected to play a critical role in improving its coverage, 
accuracy, and efficiency. Figure 1 shows the key systems and interfaces 
supporting the 2010 Census; the four major IT systems involved in the 
acquisitions are highlighted. As the figure shows, these four systems 
are to play important roles with regard to different aspects of the 
process. 

Figure 1: Key 2010 Census Systems and Interfaces: 

[See PDF for image] 

Source: U.S. Census Bureau. 

Note: Shaded boxes indicate systems discussed in the report. 

[End of figure] 

To establish where to count (as shown in the top row of fig. 1), the 
Bureau will depend heavily on a database that provides address lists, 
maps, and other geographic support services. The Bureau's address list, 
known as the Master Address File (MAF), is associated with a geographic 
information system containing street maps; this system is called the 
Topologically Integrated Geographic Encoding and Referencing (TIGER®) 
database.[Footnote 5] The MAF/TIGER database, highlighted in fig. 1, is 
the object of the first major IT acquisition--the MAF/TIGER Accuracy 
Improvement Project (MTAIP). The project is to provide corrected 
coordinates on a county-by-county basis for all current features in the 
TIGER database. The vital role of this database in the census 
operations is the reason that MTAIP is a key acquisition, even though 
it is relatively small in scale (compared with the other three key IT 
acquisitions) and will not result in new systems. 

To collect respondent information (see the middle row of fig. 1), the 
Bureau is pursuing two initiatives. First, the Field Data Collection 
Automation (FDCA) program is expected to provide automation support for 
field data collection operations as well as reduce costs and improve 
data quality and operational efficiency. This acquisition includes the 
systems, equipment, and infrastructure that field staff will use to 
collect census data, such as mobile computing devices.[Footnote 6] 

Second, the Decennial Response Integration System (DRIS) is to provide 
a system for collecting and integrating census responses from all 
sources, including forms, telephone interviews, and mobile computing 
devices in the field. DRIS is expected to improve accuracy and 
timeliness by standardizing the response data and providing it to other 
Bureau systems for analysis and processing. 

To provide results, the Data Access and Dissemination System II (DADS 
II) acquisition (see the bottom row of fig. 1) is to replace legacy 
systems for tabulating and publicly disseminating data. The DADS II 
contractor is also expected to provide comprehensive support to the 
legacy DADS systems. 
Replacement of the legacy systems is expected to: 

* maximize the efficiency, timeliness, and accuracy of tabulation and 
dissemination products and services; 

* minimize the cost of tabulation and dissemination; and: 

* increase user satisfaction with related services. 

Table 1 provides a brief overview of the four acquisitions. 

Table 1: Four Key IT Acquisitions Supporting Census 2010: 

IT acquisition: MAF/TIGER Accuracy Improvement Project (MTAIP); 
Purpose: Modernize the system that provides the address list, maps, and 
other geographic support services for the Census and other Bureau 
surveys. 

IT acquisition: Field Data Collection Automation (FDCA); 
Purpose: Provide automated resources for supporting field data 
collection, including handheld mobile computing devices to collect 
data, such as address and map data, in the field. 

IT acquisition: Decennial Response Integration System (DRIS); 
Purpose: Provide a solution for data capture and respondent assistance. 

IT acquisition: Data Access and Dissemination System (DADS II); 
Purpose: Develop a replacement for the DADS legacy tabulation and 
dissemination systems. 

Source: GAO analysis of Census Bureau data. 

[End of table] 

Responsibility for these acquisitions lies with the Bureau's Decennial 
Management Division and the Geography Division. Each of the four 
acquisitions is managed by an individual project team staffed by Bureau 
personnel. Additional information on the contracts for these four 
systems is provided in appendix II. 

In preparation for the 2010 Census, the Bureau plans to conduct a 
series of tests of its operations and systems (new and existing) in 
different environments, as well as what it refers to as the Dress 
Rehearsal. During the Dress Rehearsal period, which runs from February 
2006 through June 2009, the Bureau plans to conduct development and 
testing of systems, run a mock Census Day, and prepare for Census 2010, 
which will include opening offices and hiring staff. 

As part of the Dress Rehearsal activities, the Bureau began address 
canvassing[Footnote 7] in April 2007 and plans to distribute 
questionnaires in February 2008 in preparation for the mock Census Day 
on April 1, 2008. It plans to begin performing nonresponse follow-up 
activities immediately afterwards. These Dress Rehearsal activities 
are to provide an operational test, in a census-like environment, of 
the available system functionality as well as of other operational and 
procedural activities. 

Prior IT Management Reviews of Census Activities: 

We have previously reported on weaknesses in the Bureau's IT 
acquisition management. In June 2005, we reported on the Bureau's 
progress in five IT areas--investment management, systems development 
and management, enterprise architecture management, information 
security, and human capital.[Footnote 8] These areas are important 
because they have substantial influence on the effectiveness of 
organizational operations and, if implemented effectively, can reduce 
the risk of cost and schedule overruns and performance shortfalls. We 
reported that while the Bureau had many practices in place, much 
remained to be done to fully implement effective IT management 
capabilities. To improve the Bureau's IT management, we made several 
recommendations. The Bureau agreed with the recommendations but is 
still in the process of implementing them. 

In March 2006, we presented testimony on the Bureau's progress in 
implementing acquisition and management capabilities for two key IT 
system acquisitions for the 2010 Census--FDCA and DRIS.[Footnote 9] We 
testified that although the project offices responsible for these two 
contracts had carried out initial acquisition management activities, 
neither office had the full set of capabilities needed to effectively 
manage the acquisitions, including a full risk management process. 
Effective management of major IT programs requires that organizations 
use sound acquisition and management processes, including project and 
acquisition planning, solicitation, requirements development and 
management, and risk management. We recommended that the Bureau 
implement key activities needed to effectively manage acquisitions. For 
example, we recommended that the Bureau establish and enforce a system 
acquisition management policy that incorporates best practices, 
including those for risk management. The Bureau agreed with our 
recommendations and is in the process of implementing them. 

Decennial IT Acquisitions Are at Various Stages of Development and Show 
Mixed Progress against Schedule and Cost Baselines: 

Three key systems acquisitions for the 2010 Census are in process, and 
a fourth contract was recently awarded. The ongoing acquisitions are 
showing mixed progress in providing deliverables while adhering to 
planned schedules and cost estimates. Currently, two of the three 
projects have experienced schedule delays, and the date for awarding 
the fourth contract was postponed several times. In addition, we 
estimate that one of the three ongoing projects (FDCA) will incur about 
$18 million in cost overruns. In response to schedule delays as well as 
other factors, including cost, the Bureau has made schedule adjustments 
and plans to delay certain system functionality. As a result, Dress 
Rehearsal operational testing will not address the full complement of 
systems and functionality that was originally planned, and the Bureau 
has not yet finalized its plans for further system tests. Delaying 
functionality increases the importance of system testing after the 
Dress Rehearsal operational testing to ensure that the decennial 
systems work as intended. 

MTAIP Is Completing Improvements on Schedule and at Estimated Cost: 

MTAIP is a project to improve the accuracy of the MAF/TIGER database, 
which contains information on street locations, housing units, rivers, 
railroads, and other geographic features. MTAIP is to provide corrected 
coordinates on a county-by-county basis for all current features in the 
TIGER database. Features not now in TIGER are to be added with accurate 
coordinates and required attributes. 

Currently, the acquisition is in the second and final phase of its life 
cycle. During Phase I, from June 2002 through December 2002, the 
contractor identified technical requirements and established the 
production approach for Phase II activities. In Phase II, which began 
in January 2003 and is ongoing, the contractor is developing improved 
maps for all 3,037 counties in the United States; to date, it has 
delivered more than 75 percent of these maps, which are due by 
September 2008. Maintenance under the contract is to begin in fiscal 
year 2008, and contract closeout activities are scheduled for fiscal 
year 2009. 

MTAIP is on schedule to complete improvements by the end of fiscal year 
2008 and is meeting cost estimates. The following is the status of 
MTAIP's schedule and cost estimates: 

* The MTAIP acquisition is on schedule for the deliverables for Phases 
I and II. According to Bureau documents, as of September 2006, the 
contractor (Harris Corporation) had delivered (as required) 2,000 
improved county maps out of the 3,037. As of March 2007, Bureau 
documents showed that the contractor had completed 338 of the 694 
counties expected to be complete by the end of fiscal year 2007. The 
contractor is scheduled to complete the remaining 356 counties by the 
end of fiscal year 2007. 

* Cost estimates for Phase I and Phase II are $4.8 million and $205.2 
million, respectively, for a total contract value of $210 million. The 
contract met cost estimates for Phase I, and based on cost performance 
reports, we project no cost overruns by September 2008. As of June 
2007, the Bureau had obligated $178 million through September 2010. 

FDCA Has Provided Deliverables, but It Has Delayed Functionality and Is 
Experiencing Cost Increases: 

FDCA is to provide the systems, equipment, and infrastructure that 
field staff will use to collect census data. It is to establish office 
automation for the 12 regional census centers, the Puerto Rico area 
office, and approximately 450 temporary local census offices. It is to 
provide the telecommunications infrastructure for headquarters, 
regional and local offices, and mobile computing devices for field 
workers. FDCA also is to facilitate integration with other 2010 Census 
systems and to provide development, deployment, technical support, de- 
installation, and disposal services. At the peak of the 2010 Census, 
about 4,000 field operations supervisors, 40,000 crew leaders, 500,000 
enumerators and address listers, and several thousand office employees 
are expected to use or access FDCA components. 

The FDCA acquisition is currently in the first phase of execution, 
having completed its baseline planning period in June 2006. The 
contractor is currently in the process of developing and testing FDCA 
software for the Dress Rehearsal Census Day. In future phases, the 
project will continue development, deploy systems and hardware, support 
census operations, and perform operational and contract closeout 
activities. 

However, as shown in table 2, the Bureau revised its original schedule 
and delayed or eliminated some key functionality that was expected to 
be ready during Execution Period 1. The Bureau said it revised the 
schedule because it realized it had underestimated the costs for the 
early stages of the contract and could not meet the level of first-year 
funding because the fiscal year 2006 budget was already in place. 
According to the Bureau, this initial underestimation led to schedule 
changes and overall cost increases. 

Table 2: Comparison of FDCA Original and Revised Schedules: 

Phase: Baseline Planning Period; 
Dates: March 31-June 30, 2006; 
Original schedule (March 2006): 
* Develop project oversight documentation; 
Revised schedule (July 2006): No change. 

Phase: Execution Period 1; 
Dates: July 1, 2006-December 31, 2008; 
Original schedule (March 2006): 
* Deliver consolidated approach to software development; 
Revised schedule (July 2006): Divided software development activities 
into an incremental approach. 

Phase: Execution Period 1; 
Dates: July 1, 2006-December 31, 2008; 
Original schedule (March 2006): 
* Develop a space tracking system; 
Revised schedule (July 2006): Eliminated. 

Phase: Execution Period 1; 
Dates: July 1, 2006-December 31, 2008; 
Original schedule (March 2006): 
* Develop an automated software distribution system; 
Revised schedule (July 2006): Delayed to Execution Period 2. 

Phase: Execution Period 1; 
Dates: July 1, 2006-December 31, 2008; 
Original schedule (March 2006): 
* Provide mobile computing devices; 
Revised schedule (July 2006): Delivered in March 2007 for Dress 
Rehearsal address canvassing. 

Phase: Execution Period 2; 
Dates: January 1, 2009-September 30, 2011; 
Original schedule (March 2006): 
* Deploy the 2010 FDCA solution; 
Revised schedule (July 2006): No change. 

Phase: Execution Period 2; 
Dates: January 1, 2009-September 30, 2011; 
Original schedule (March 2006): 
* Complete operational testing; 
Revised schedule (July 2006): No change. 

Phase: Execution Period 2; 
Dates: January 1, 2009-September 30, 2011; 
Original schedule (March 2006): 
* Conduct 2010 Census operations; 
Revised schedule (July 2006): No change. 

Phase: Execution Period 2; 
Dates: January 1, 2009-September 30, 2011; 
Original schedule (March 2006): [Empty]; 
Revised schedule (July 2006): Added delayed activities. 

Phase: Execution Period 3; 
Dates: August 1, 2010-end of contract; 
Original schedule (March 2006): 
* Perform operational and contract closeout activities; 
Revised schedule (July 2006): No change. 

Source: GAO analysis of Census Bureau data. 

[End of table] 

In the revised schedule, the Bureau delayed or eliminated some key 
functionality from the Dress Rehearsal, including the automated 
software distribution system. Further, the revised schedule stretches 
software development from two increments to seven over a longer period 
of time. Delivery of these increments ranges from December 2006 through 
December 2008. As of May 2007, the contractor reported that the 
increment development schedule continues to be aggressive. 

The project is meeting all planned milestones on the revised schedule. 
The contractor has delivered 1,388 mobile computing devices to be used 
in address canvassing for the Dress Rehearsal. Also, key FDCA support 
infrastructure has been installed, including the Network Operations 
Center, Security Operations Center, and the Data Processing Centers. 
According to the department, all Regional Census Centers and Puerto 
Rico area offices have been identified and are on schedule to open in 
January 2008. 

The project life-cycle costs have already increased. At contract award 
in March 2006, the total cost of FDCA was estimated not to exceed $596 
million. However, in September 2006, the project life-cycle cost was 
increased to about $624 million. In May 2007, the life-cycle cost rose 
by a further $23 million because of increasing system requirements, 
which resulted in an estimated life-cycle cost of about $647 million. 
Table 3 shows the current life-cycle cost estimates for FDCA. 

Table 3: FDCA Life-Cycle Cost Estimates: 

Dollars in millions. 

Baseline planning period; 
Start date: March 31, 2006; 
End date: June 30, 2006; 
Cost estimates: September 2006: $11; 
Cost estimates: May 2007: $11. 

Execution Period 1; 
Start date: July 1, 2006; 
End date: December 31, 2008; 
Cost estimates: September 2006: 200; 
Cost estimates: May 2007: 225. 

Execution Period 2; 
Start date: January 1, 2009; 
End date: September 30, 2011; 
Cost estimates: September 2006: 319; 
Cost estimates: May 2007: 318. 

Execution Period 3; 
Start date: August 1, 2010; 
End date: End of contract; 
Cost estimates: September 2006: 10; 
Cost estimates: May 2007: 10. 

Leased equipment; 
Start date: N/A; 
End date: N/A; 
Cost estimates: September 2006: 12; 
Cost estimates: May 2007: 12. 

Management reserve; 
Start date: N/A; 
End date: N/A; 
Cost estimates: September 2006: 7; 
Cost estimates: May 2007: 5. 

Award fee; 
Start date: N/A; 
End date: N/A; 
Cost estimates: September 2006: 65; 
Cost estimates: May 2007: 65. 

Total; 
Start date: [Empty]; 
End date: [Empty]; 
Cost estimates: September 2006: $624; 
Cost estimates: May 2007: $647. 

Source: GAO analysis of Census Bureau data. 

Note: Total may not add due to rounding. 

[End of table] 

In addition, the FDCA project has already experienced $6 million in 
cost overruns, and both our analysis and the contractor's analysis 
indicate that more are expected. Based on our analysis of cost 
performance reports (from July 2006 to May 2007), we project that the 
FDCA project will experience further cost overruns by December 2008, 
estimated at between $15 million and $19 million, with the most likely 
overrun being about $18 million. Harris, in contrast, estimates about a 
$6 million overrun by December 2008. 
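
The method behind such projections is not detailed in this report, but 
overrun estimates of this kind are typically derived from earned value 
data in the contractor's cost performance reports. The following is a 
minimal sketch, in Python, of one standard earned value calculation; 
the function and all figures are invented for illustration and are not 
Bureau or contractor data: 

def estimate_at_completion(bac, bcwp, acwp):
    """Project total cost from cumulative cost performance report data.

    bac  -- budget at completion (total budgeted cost of the work)
    bcwp -- budgeted cost of work performed (earned value to date)
    acwp -- actual cost of work performed to date
    """
    cpi = bcwp / acwp                 # cost performance index
    eac = acwp + (bac - bcwp) / cpi   # remaining work at current efficiency
    return eac, eac - bac             # projected total cost and overrun

# Invented figures, in millions of dollars, for illustration only:
eac, overrun = estimate_at_completion(bac=225.0, bcwp=90.0, acwp=97.5)
print(f"projected cost: ${eac:.0f} million, overrun: ${overrun:.0f} million")

A cost performance index below 1.0 means completed work is costing more 
than budgeted, so the projected total exceeds the budget at completion. 

[End of example] 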

According to Harris, the major cause of projected cost overruns is the 
system requirements definition process. For example, in December 2006, 
Harris indicated that the requirements for the Dress Rehearsal Paper 
Based Operations in Execution Period 1 had increased significantly. 
According to the cost performance reports, this increase has meant that 
more work must be conducted and more staffing assigned to meet the 
Dress Rehearsal schedule. 

The schedule changes to FDCA have increased the likelihood that the 
systems testing at the Dress Rehearsal will not be as comprehensive as 
planned. The inability to perform comprehensive operational testing of 
all interrelated systems increases the risk that further cost overruns 
will occur and that decennial systems will experience performance 
shortfalls. 

After a Schedule Revision, DRIS Is Delivering Reduced Functionality at 
Projected Cost: 

DRIS is to provide a system for collecting and integrating census 
responses, standardizing the response data, and providing it to other 
systems for analysis and processing. The DRIS functionality is critical 
for providing assistance to the public via telephone and for monitoring 
the quality and status of data capture operations. 

The DRIS acquisition is currently in the first of three overlapping 
project phases. In Phase I, which extends from March 2006 to September 
2008, the project is performing software development and testing of 
DRIS. By December 2007, it is to provide an initial system to be used 
for the Dress Rehearsal Census Day, during which DRIS will process 14 
census forms (out of 84 possible forms). In October 2007, the project 
is to begin Phase II, in which it is to deploy the completed system and 
perform other activities to support census operations. The final phase 
is to be devoted to data archiving and equipment disposal. 

Although DRIS is currently on schedule to meet its December 2007 
milestone, the Bureau revised the original DRIS schedule after the 
contract was awarded in October 2005. Under the revised schedule (see 
table 4), the Bureau delayed or eliminated some functionality that was 
expected to be ready for the Dress Rehearsal Census Day. 

Table 4: Comparison of DRIS Original and Current Schedules: 

Phase and dates: Phase 1: March 2006-September 2008; 
Original schedule: Deliver solution design and documentation; 
Revised schedule: Reduced scope. 

Phase and dates: Phase 1: March 2006-September 2008; 
Original schedule: 
* Requirements definition; 
Revised schedule: Eliminated. 

Phase and dates: Phase 1: March 2006-September 2008; 
Original schedule:  
* Workflow segment cross-program testing; 
Revised schedule: Eliminated. 

Phase and dates: Phase 1: March 2006-September 2008; 
Original schedule: Develop, test, and deploy the DRIS Dress Rehearsal 
solution; 
Revised schedule: Reduced scope. 

Phase and dates: Phase 1: March 2006-September 2008; 
Original schedule: 
* Telephone Questionnaire Assistance System; 
Revised schedule: Delayed to Phase 2. 

Phase and dates: Phase 1: March 2006-September 2008; 
Original schedule: Capture all questionnaire forms; 
Revised schedule: Delayed to Phase 2. 

Phase and dates: Phase 1: March 2006-September 2008; 
Original schedule: Conduct Dress Rehearsal; 
Revised schedule: Reduced scope. 

Phase and dates: Phase 1: March 2006-September 2008; 
Original schedule: Site selection, design, build-out,[A] and fit-up[B] 
of data centers for the 2010 Census; 
Revised schedule: Delayed to Phase 2. 

Phase and dates: Phase 2: October 2007-January 2011; 
Original schedule: Deploy the 2010 DRIS solution; 
Revised schedule: No change. 

Phase and dates: Phase 2: October 2007-January 2011; 
Original schedule: Complete operational testing; 
Revised schedule: No change. 

Phase and dates: Phase 2: October 2007-January 2011; 
Original schedule: Conduct 2010 Census operations; 
Revised schedule: No change. 

Phase and dates: Phase 2: October 2007-January 2011; 
Original schedule: Shut down the data centers; 
Revised schedule: No change. 

Phase and dates: Phase 2: October 2007-January 2011; 
Original schedule: [Empty]; 
Revised schedule: Added delayed activities. 

Phase and dates: Phase 3: July 2010-end of contract: 
Original schedule: Archive DRIS data and images per NARA guidelines; 
Revised schedule: No change. 

Phase and dates: Phase 3: July 2010-end of contract: 
Original schedule: Dispose of all DRIS equipment; 
Revised schedule: No change. 

Source: GAO analysis of Census Bureau data. 

[A] Build-out is the upgrading of facilities in order to prepare them 
for the installation of equipment, telecommunications, etc. 

[B] Fit-up is the process of setting up facilities with computer 
equipment, furniture, water, power, heating, ventilation, air 
conditioning, etc., for the 2010 Census operations. 

[End of table] 

According to Bureau officials, they delayed the schedule and eliminated 
functionality for DRIS when they realized they had underestimated the 
fiscal year 2006 through 2008 costs for development. As shown in table 
5, the government's funding estimates for DRIS Phase I were 
significantly lower than the contractor's. 

Table 5: DRIS Cost Estimates for Phase 1 (as of March 2006): 

Dollars in millions. 

Fiscal year: 2006; 
Cost estimates: Contractor: $18.6; 
Cost estimates: Government: $11.2. 

Fiscal year: 2007; 
Cost estimates: Contractor: 53.3; 
Cost estimates: Government: 23.8. 

Fiscal year: 2008; 
Cost estimates: Contractor: 48.7; 
Cost estimates: Government: 31.5. 

Total; 
Cost estimates: Contractor: $120.6; 
Cost estimates: Government: $66.5. 

Source: GAO analysis of Census Bureau data. 

[End of table] 

Originally, the DRIS solution was to include paper, telephone, 
Internet, and field data collection processing; selection of data 
capture sites; and preparation and processing of 2010 Census forms. 
However, the Bureau reduced the scope of the solution by eliminating 
the Internet functionality. In addition, the Bureau has stated that it 
will not have a robust telephone questionnaire assistance system in 
place for the Dress Rehearsal. The Bureau is also delaying selecting 
sites for data capture centers, preparing data capture facilities, and 
recruiting and hiring data capture staff. 

Although Bureau officials told us that the revisions to the schedule 
should not affect meeting milestones for the 2010 Census, the delays 
mean that more systems development and testing will need to be 
accomplished later. Given the immovable deadline of the decennial 
census, the Bureau is at risk of reducing functionality or increasing 
costs to meet its schedule. 

The government's estimate for the DRIS project was $553 million through 
the end of fiscal year 2010. In October 2005, at contract award, the 
Phase I and Phase II value was $484 million. 

The DRIS project is not experiencing cost overruns, and our analysis of 
cost performance reports from April 2006 to May 2007 projects no cost 
overruns by December 2008. As of May 2007, the Bureau had obligated $37 
million, and the project was 44 percent completed. As of May 2007, the 
DRIS contract value had not increased. 

DADS II Contract Was Recently Awarded after a Delay: 

The DADS II acquisition is to replace the legacy DADS systems, which 
tabulate and publicly disseminate data from the decennial census and 
other Bureau surveys.[Footnote 10] The DADS II contractor is also 
expected to provide comprehensive support to the Census 2000 legacy 
DADS systems. 

In January 2007, the Bureau released the DADS II request for proposal. 
The contract was awarded in September 2007. 

However, the Bureau had delayed the DADS II contract award date 
multiple times. The award date was originally planned for the fourth 
quarter of 2005, but the date was changed to August 2006. On March 8, 
2006, the Bureau stated that it would delay the award of the DADS II 
contract from August to October 2006 to gain a clearer sense of budget 
priorities before initiating the request for proposal process. The 
Bureau then delayed the contract award again by about another year. 
Because of these delays, DADS II will not be developed in time for the 
Dress Rehearsal. Instead, the Bureau will use the legacy DADS system 
for tabulation during the Dress Rehearsal. However, the Bureau's plan 
is to have the DADS II system available for the 2010 Census. 

No cost information on the DADS II contract was available because it 
was recently awarded. 

Delayed Functionality Increases the Importance of Further Operational 
Testing: 

Operational testing helps verify that systems function as intended in 
an operational environment. For system testing to be comprehensive, 
system functionality must be completed. Further, for multiple 
interrelated systems, end-to-end testing is performed to verify that 
all interrelated systems, including any external systems with which 
they interface, work together as intended in an operational 
environment. 
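
To make the distinction concrete, the following is a minimal sketch of 
the shape of an end-to-end check, which passes data across a whole 
chain of systems through their interfaces rather than testing each 
system in isolation. All function and system names below are invented 
stand-ins and do not represent actual decennial systems or interfaces: 

def collect_response(form_id):
    # Stand-in for field data collection (e.g., a handheld device).
    return {"form": form_id, "answers": {"pop_count": 3}}

def integrate_response(raw):
    # Stand-in for response integration: standardize the captured data.
    return {"form": raw["form"], "population": raw["answers"]["pop_count"]}

def tabulate(records):
    # Stand-in for tabulation and dissemination.
    return sum(r["population"] for r in records)

def test_end_to_end():
    # Pass data across every interface in the chain and check the result.
    records = [integrate_response(collect_response(f)) for f in ("D-1", "D-2")]
    assert tabulate(records) == 6

test_end_to_end()

[End of example] 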

However, as described above, two of the projects have delayed planned 
functionality to later phases, and one project contract was recently 
awarded (September 2007). As a result, the operational testing that is 
to occur during the Dress Rehearsal period around April 1, 2008, will 
not include tests of the full complement of decennial census systems 
and their functionality. According to Bureau officials, they have not 
yet finalized their plans for system tests. If further delays occur, 
the importance of these system tests will increase. Delaying 
functionality and not testing the full complement of systems increase 
the risk that costs will rise further, that decennial systems will not 
perform as expected, or both. 

The Bureau Is Making Progress in Risk Management Activities, but 
Critical Weaknesses Remain: 

The project teams varied in the extent to which they followed 
disciplined risk management practices. For example, three of the four 
project teams had developed strategies to identify the scope of the 
risk management effort. However, three project teams had weaknesses in 
identifying risks, establishing adequate mitigation plans, and 
reporting risk status to executive-level officials. These weaknesses in 
completing key risk management activities can be attributed in part to 
the absence of Bureau policies for managing major acquisitions, as we 
described in our earlier report.[Footnote 11] Without effective risk 
management practices, the likelihood of project success is decreased. 

According to the Software Engineering Institute (SEI), the purpose of 
risk management is to identify potential problems before they occur. 
When problems are identified, risk-handling activities can be planned 
and invoked as needed across the life of a project in order to mitigate 
adverse impacts on objectives. Effective risk management involves early 
and aggressive risk identification through the collaboration and 
involvement of relevant stakeholders. Based on SEI's Capability 
Maturity Model® Integration (CMMI®), risk management activities can be 
divided into four key areas (see fig. 2): 

* preparing for risk management, 

* identifying and analyzing risks, 

* mitigating risks, and: 

* executive oversight. 

Figure 2: Description and Examples of Key Risk Practice Areas: 

[See PDF for image] 

Source: GAO analysis of CMMI criteria. 

[End of figure] 

The discipline of risk management is important to help ensure that 
projects are delivered on time, within budget, and with the promised 
functionality. It is especially important for the 2010 Census, given 
the immovable deadline. 

Project Teams Usually Established Risk Preparation Activities, but 
Improvements Are Possible: 

Risk preparation involves establishing and maintaining a strategy for 
identifying, analyzing, and mitigating risks. The risk management 
strategy addresses the specific actions and management approach used to 
perform and control the risk management program. It also includes 
identifying and involving relevant stakeholders in the risk management 
process. Table 6 shows the status of the four project teams' 
implementation of key risk preparation activities.[Footnote 12] 

Table 6: Risk Management Preparation Activities Completed for the Key 
2010 Census Systems: 

Specific practices: Determine risk sources and categories; 
MTAIP: Practice Partially Implemented; 
FDCA: Practice Fully Implemented; 
DRIS: Practice Fully Implemented; 
DADS: Practice Fully Implemented. 

Specific practices: Define parameters used to analyze and categorize 
risks and parameters used to control risk management efforts; 
MTAIP: Practice Fully Implemented; 
FDCA: Practice Fully Implemented; 
DRIS: Practice Fully Implemented; 
DADS: Practice Fully Implemented. 

Specific practices: Establish and maintain the strategy to be used for 
risk management; 
MTAIP: Practice Partially Implemented; 
FDCA: Practice Fully Implemented; 
DRIS: Practice Fully Implemented; 
DADS: Practice Fully Implemented. 

Specific practices: Identify and involve the relevant stakeholders of 
the risk management process as planned; 
MTAIP: Practice Partially Implemented; 
FDCA: Practice Partially Implemented; 
DRIS: Practice Fully Implemented; 
DADS: Practice Partially Implemented. 

Source: GAO analysis of project data. 

[End of table] 

As the table shows, three project teams have established most of the 
risk management preparation activities. However, the MTAIP project team 
implemented the fewest practices. The team did not adequately determine 
risk sources and categories, or adequately develop a strategy for risk 
management. As a result, the project's risk management strategy is not 
comprehensive and does not fully address the scope of the risk 
management effort, including discussing techniques for risk mitigation 
and defining adequate risk sources and categories. 

In addition, three project teams (MTAIP, FDCA, and DADS II) had 
weaknesses regarding stakeholder involvement. The three teams did not 
provide sufficient evidence that the relevant stakeholders were 
involved in risk identification, analysis, and mitigation activities; 
reviewing the risk management strategy and risk mitigation plans; or 
communicating and reporting risk management status. In addition, the 
FDCA project team had not identified relevant stakeholders. These 
weaknesses can be attributed in part to the absence of Bureau policies 
for managing major acquisitions, as we described in our earlier 
reports.[Footnote 13] Without adequate preparation for risk management, 
including establishing an effective risk management strategy and 
identifying and involving relevant stakeholders, project teams cannot 
properly control the risk management process. 

The Project Teams Identified and Analyzed Risks, but Not All Key Risks 
Were Identified: 

Risks must be identified and described in an understandable way before 
they can be analyzed and managed properly. This includes identifying 
risks from both internal and external sources and evaluating each risk 
to determine its likelihood and consequences. Analyzing risks includes 
risk evaluation, categorization, and prioritization; this analysis is 
used to determine when appropriate management attention is required. 
Table 7 shows the status of the four project teams' implementation of 
key risk identification and evaluation activities. 

Table 7: Risk Identification and Evaluation Activities Completed for 
the Key 2010 Census Systems: 

Specific practices: Identify and document the risks; 
MTAIP: Practice Fully Implemented; 
FDCA: Practice Partially Implemented; 
DRIS: Practice Fully Implemented; 
DADS: Practice Partially Implemented. 

Specific practices: Evaluate and categorize each identified risk using 
the defined risk categories and parameters, and determine its relative 
priority; 
MTAIP: Practice Partially Implemented; 
FDCA: Practice Fully Implemented; 
DRIS: Practice Fully Implemented; 
DADS: Practice Fully Implemented. 

Source: GAO analysis of project data. 

[End of table] 
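
CMMI does not prescribe a single scoring scheme for the evaluate, 
categorize, and prioritize practice shown in table 7. As a minimal 
sketch, a project team might score each risk on likelihood and 
consequence scales and rank risks by the product; the scales, 
threshold, and entries below are invented for illustration (they 
loosely echo risks named in this report) and are not drawn from Bureau 
risk registers: 

RISKS = [
    # (description, likelihood 1-5, consequence 1-5)
    ("Contractor staff turnover as program nears maturity", 3, 4),
    ("New system interfaces added after Dress Rehearsal",   4, 5),
    ("Changes in program funding",                          2, 3),
]

def prioritize(risks):
    # Exposure = likelihood x consequence; rank highest exposure first.
    return sorted(((l * c, desc) for desc, l, c in risks), reverse=True)

for exposure, desc in prioritize(RISKS):
    # Risks at or above the (invented) threshold warrant mitigation plans.
    action = "mitigate" if exposure >= 12 else "watch"
    print(f"{exposure:>2}  {action:<8}  {desc}")

[End of example] 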

As of July 2007, the MTAIP and DRIS project teams were adequately 
identifying and documenting risks, including system interface risks. 
For example, these teams were able to identify the following: 

* The MTAIP project identified significant risks regarding potential 
changes in funding and the turnover of contractor personnel as the 
program nears maturity. 

* The DRIS project identified significant risks regarding new system 
security regulations, changes or increases to Phase II baseline 
requirements, and new interfaces after Dress Rehearsal. 

However, the FDCA and DADS II project teams did not identify all risks, 
including specific system interface risks. For example: 

* The FDCA project had not identified any significant risks related to 
the handheld mobile computing devices for the project office to monitor 
and track, despite problems arising during the recent address 
canvassing component of the Dress Rehearsal.[Footnote 14] However, it 
did identify significant risks for the contractor to manage; these 
risks were associated with using the handheld mobile computing devices, 
including usability and failure rates. Responsibility for mitigating 
these risks was transferred to the contractor. 

* The FDCA and DADS II projects did not provide evidence that specific 
system interface risks are being adequately identified to ensure that 
risk-handling activities will be invoked should the systems fail during 
the 2010 Census. For example, although the DADS II system will not be 
available for the Dress Rehearsal, the project team did not identify 
any significant interface risks associated with this system. 

One reason for these weaknesses, as mentioned earlier, is the absence 
of Bureau policies for managing major acquisitions. Failure to 
adequately identify and analyze risks could prevent management from 
taking the appropriate actions to mitigate those risks; this increases 
the probability that the risks will materialize and magnifies the 
extent of damage incurred in such an event. 

Three of Four Project Teams' Risk Mitigation Plans and Monitoring 
Activities Were Incomplete: 

Risk mitigation involves developing alternative courses of action, 
workarounds, and fallback positions, with a recommended course of 
action for the most important risks to the project. Mitigation includes 
techniques and methods used to avoid, reduce, and control the 
probability of occurrence of the risk; the extent of damage incurred 
should the risk occur; or both. Examples of activities for mitigating 
risks include documented handling options for each identified risk; 
risk mitigation plans; contingency plans; a list of persons responsible 
for tracking and addressing each risk; and updated assessments of risk 
likelihood, consequence, and thresholds. Table 8 shows the status of 
the four project teams' implementation of key risk mitigation 
activities. 

Table 8: Risk Mitigation Activities Completed for Key 2010 Census 
Systems: 

Specific practices: 
Develop a risk mitigation plan for the most important risks to the 
project, as defined by the risk management strategy; 
MTAIP: Practice Partially Implemented; 
FDCA: Practice Partially Implemented; 
DRIS: Practice Fully Implemented; 
DADS: Practice Not Implemented.

Specific practices: 
Monitor the status of each risk periodically and implement the risk 
mitigation plan as appropriate; 
MTAIP: Practice Partially Implemented; 
FDCA: Practice Partially Implemented; 
DRIS: Practice Fully Implemented; 
DADS: Practice Partially Implemented.

Source: GAO analysis of project data. 

[End of table] 
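
To make the mitigation-plan elements described above concrete, the 
following minimal sketch, again in Python for illustration, represents 
a single mitigation plan with a responsible owner, dated milestones, a 
contingency, and a threshold that triggers execution of the plan. The 
field names, dates, and trigger rule are assumptions made for this 
example; they do not depict any project's actual plan format. 

# Illustrative sketch of a risk mitigation plan record containing the 
# elements discussed above: a responsible owner, dated milestones, a 
# contingency (fallback), and a threshold that triggers execution. All 
# values are invented for the example. 
from dataclasses import dataclass, field
from datetime import date

@dataclass
class MitigationPlan:
    risk_name: str
    owner: str             # person responsible for tracking the risk
    threshold: int         # exposure at which the risk becomes unacceptable
    milestones: list[tuple[str, date]] = field(default_factory=list)
    contingency: str = ""  # fallback position should mitigation fail

    def should_execute(self, current_exposure: int) -> bool:
        # The plan is invoked once exposure crosses the defined threshold.
        return current_exposure >= self.threshold

if __name__ == "__main__":
    plan = MitigationPlan(
        risk_name="Handheld device failure rate",
        owner="Project risk officer",
        threshold=12,
        milestones=[
            ("Complete device stress testing", date(2008, 1, 31)),
            ("Stage spare devices at field offices", date(2008, 3, 15)),
        ],
        contingency="Fall back to paper-based collection",
    )
    if plan.should_execute(current_exposure=16):
        for action, due in plan.milestones:
            print(f"Execute: {action} (due {due.isoformat()})")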

Three project teams (MTAIP, FDCA, and DADS II) developed mitigation 
plans that were often untimely or that included incomplete activities 
and milestones for addressing the risks. Examples include the 
following: 

* Although the MTAIP project team developed mitigation plans, the plans 
were not comprehensive and did not include thresholds defining when a 
risk becomes unacceptable and should trigger the execution of the 
mitigation plan. 

* The FDCA project team had developed mitigation plans for the most 
significant risks, but the plans did not always identify milestones for 
implementing mitigation activities. Moreover, the plans did not 
identify any commitment of resources, several did not establish a 
period of performance, and the team did not always update the plans 
with the latest information on the status of the risk. In addition, the 
FDCA project team did not provide evidence of developing mitigation 
plans to handle the other significant risks described in its risk 
mitigation strategy. (These risks included a lack of consistency in 
requirements definition and insufficient FDCA project office staffing 
levels.) 

* The mitigation plans for DADS II were incomplete, with no associated 
future milestones and no evidence of continual progress in working 
towards mitigating a risk. In several instances, DADS II mitigation 
plans were listed as "To Be Determined." 

With regard to the second practice in the table (periodically 
monitoring risk status and implementing mitigation plans), the MTAIP, 
FDCA, and DADS II project teams were not always implementing the 
mitigation plans as appropriate. For example, although the MTAIP 
project team has periodically monitored the status of risks, its 
mitigation plans do not include detailed action items with start dates 
and anticipated completion dates; thus, the plans do not ensure that 
mitigation activities are implemented appropriately and tracked to 
closure. The FDCA and DADS II project teams did not identify system 
interface risks or prepare adequate mitigation plans to ensure that 
systems will operate as intended. In addition, the DADS II risk reviews 
showed no evidence of developing risk-handling action items, tracking 
any existing open risk-handling action items, or regularly discussing 
mitigation steps with other risk review team members. 

Because they did not develop complete mitigation plans, the MTAIP, 
FDCA, and DADS II project teams cannot ensure that for a given risk, 
techniques and methods will be invoked to avoid, reduce, and control 
the probability of occurrence. 

Project Teams Are Inconsistent in Reporting Risk Status to Executive- 
Level Management: 

Reviews of the project teams' risk management activities, status, and 
results should be held on a periodic and event-driven basis. The 
reviews should include appropriate levels of management, such as key 
Bureau executives, who can provide visibility into the potential for 
project risk exposure and appropriate corrective actions. Table 9 shows 
the status of the four project teams' implementation of activities for 
senior-level risk oversight. 

Table 9: Executive-Level Risk Oversight Activities Completed for the 
Key 2010 Decennial Systems: 

Specific practices: 
Review the activities, status, and results of the risk management 
process with executive-level management, and resolve issues; 
MTAIP: Practice Not Implemented; 
FDCA: Practice Not Implemented; 
DRIS: Practice Fully Implemented; 
DADS: Practice Fully Implemented. 

Source: GAO analysis of project data. 

[End of table] 

The project teams were inconsistent in reporting the status of risks to 
executive-level officials. DRIS and DADS II did regularly report risks; 
however, the FDCA and MTAIP projects did not provide sufficient 
evidence to document that these discussions occurred or what they 
covered. Although presentations were made on the status of the FDCA and 
MTAIP projects to executive-level officials, presentation documents did 
not include evidence of discussions of risks and mitigation plans. 
Failure to report a project's risks to executive-level officials 
reduces the visibility of risks to executives who should be playing a 
role in mitigating them. 

Conclusions: 

The IT acquisitions planned for the 2010 Census will require continued 
oversight to ensure that they are completed on schedule and at planned 
cost levels. Although the MTAIP and DRIS acquisitions are currently 
meeting cost estimates, FDCA is not. In addition, while the Bureau is 
making progress developing systems for the Dress Rehearsal, it is 
deferring certain functionality, with the result that the Dress 
Rehearsal operational testing will address less than a full complement 
of systems. Delaying functionality increases the importance of later 
development and testing activities, which will have to occur closer to 
the census date. It also raises the risk of cost increases, given the 
immovable deadline for conducting the 2010 Census. 

The Bureau's project teams for each of the four acquisitions have 
implemented many practices associated with establishing sound and 
capable risk management processes, but they are not always consistent: 
the teams have not always identified risks, developed complete risk 
mitigation plans, or briefed senior-level officials on risks and 
mitigation plans. Among risks that were not identified are those 
associated with the FDCA mobile computing devices and systems testing. 
Also, mitigation plans were often untimely or incomplete. Further, for 
two projects, no evidence was available of senior-level briefings to 
discuss risks and mitigation plans. One reason for these weaknesses is 
the absence of 
Bureau policies for managing major acquisitions, as we pointed out in 
earlier work. Until the project teams and the Decennial Management 
Division implement appropriate risk management activities, they face an 
increased probability that decennial systems will not be delivered on 
schedule and within budget or perform as expected. 

Recommendations for Executive Action: 

To ensure that the Bureau's four key acquisitions for the 2010 Census 
operate as intended, we are making four recommendations. First, to 
ensure that the Bureau's decennial systems are fully tested, we 
recommend that the Secretary of Commerce require the Director of the 
Census Bureau to direct the Decennial Management Division and Geography 
Division to plan for and perform end-to-end testing so that the full 
complement of systems is tested in a census-like environment. 

To strengthen risk management activities for the decennial census 
acquisitions, the Secretary should also direct the Director of the 
Census Bureau to ensure that project teams: 

* identify and develop a comprehensive list of risks for the 
acquisitions, particularly those for system interfaces and mobile 
computing devices, and analyze them to determine probability of 
occurrence and appropriate mitigating actions; 

* develop risk mitigation plans for the significant risks, including 
defining the mitigating actions, milestones, thresholds, and resources; 
and: 

* provide regular briefings on significant risks to senior executives, 
so that they can play a role in mitigating these risks. 

We are not making recommendations at this time regarding the Bureau's 
policies for managing major acquisitions, as we have already done so in 
previous reports.[Footnote 15] 

Agency Comments and Our Evaluation: 

In response to a draft of this report, the Under Secretary for Economic 
Affairs of Commerce provided written comments from the department. 
These comments are reproduced in appendix III. 

The department disagreed with our conclusion about operational testing 
during the 2008 Dress Rehearsal. According to the department, although 
some minimal functionalities are not a part of the Dress Rehearsal, all 
critical systems and interfaces would be tested during the 2008 Dress 
Rehearsal. It planned to conduct additional fully integrated testing of 
all systems and interfaces after the Dress Rehearsal, including the 
functionalities not included in the Dress Rehearsal itself. It also 
planned to incorporate lessons learned from the Dress Rehearsal in this 
later testing. Nonetheless, the Bureau's test plans have not been 
finalized. Further, the Dress Rehearsal will not include two critical 
systems (the DRIS telephone system and the DADS II tabulation system). 
Thus, it remains unclear whether testing will in fact address all 
interrelated systems and functionality in a census-like environment. 
Consistent with our recommendation, following up with documented test 
plans to do end-to-end testing would help ensure that decennial systems 
will work as intended. 

With regard to risk management, the department said it plans to examine 
additional ways to manage risks and will prepare a formal action plan 
in response to our final report. However, it disagreed with our 
assessment of risk identification, pointing out that one 
project identified risks associated with handheld mobile computing 
devices and assigned responsibility for these to the contractor. In 
addition, the project identified systems interfaces as a risk. However, 
the project did not identify significant risks for the project office 
to monitor and track related to problems arising during the address 
canvassing component of the Dress Rehearsal. Also, although this 
project identified a general risk related to system interfaces, it did 
not identify specific risks related to particular interfaces. 

The department also provided technical comments that we incorporated 
where appropriate. 

We are sending copies of this report to the Chairman and Ranking Member 
of the Committee on Homeland Security and Governmental Affairs. We are 
also sending copies to the Secretary of Commerce, the Director of the 
U.S. Census Bureau, and other appropriate congressional committees. We 
will make copies available to others on request. In addition, this 
report will be available at no charge on the GAO Web site at 
[hyperlink, http://www.gao.gov]. 

If you have any questions about this report, please contact David A. 
Powner at (202) 512-9286 or pownerd@gao.gov or Madhav S. Panwar at 
(202) 512-6228 or panwarm@gao.gov. Contact points for our Offices of 
Congressional Relations and Public Affairs may be found on the last 
page of this report. GAO staff who made major contributions to this 
report are listed in appendix IV. 

Signed by: 

David A. Powner: 
Director, Information Technology Management Issues: 

and: 

Madhav S. Panwar: 
Senior Level Technologist, Center for Technology and Engineering: 

[End of section] 

Appendix I: Objectives, Scope, and Methodology: 

Our objectives were to (1) determine the status and plans, including 
schedule and costs, for four key information technology (IT) 
acquisitions, and (2) assess whether the Census Bureau is adequately 
managing the risks facing these key system acquisitions. 

To determine the status and plans, we reviewed documents related to the 
major 2010 Census acquisitions, including requests for proposals, 
acquisition contracts, project plans, schedules, cost estimates, 
program review reports, earned value management data, test plans, and 
other acquisition-related documents. We analyzed earned value 
management data obtained from the contractors to assess their cost and 
schedule performance. We also interviewed program 
officials to determine the current status of the acquisitions' 
schedules and cost estimates. 
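
As an illustration of the earned value analysis referred to above, the 
following minimal sketch, in Python, computes the standard cost and 
schedule variances and performance indexes from planned value, earned 
value, and actual cost. The dollar figures are invented for the 
example; they are not drawn from the contractors' data. 

# Illustrative sketch of standard earned value calculations. The 
# figures below are invented for the example, not contractor data. 

def earned_value_summary(pv: float, ev: float, ac: float) -> dict:
    """pv: planned value (budgeted cost of work scheduled);
    ev: earned value (budgeted cost of work performed);
    ac: actual cost of work performed."""
    return {
        "cost_variance": ev - ac,      # negative: spending exceeds value earned
        "schedule_variance": ev - pv,  # negative: work is behind schedule
        "cost_performance_index": ev / ac,
        "schedule_performance_index": ev / pv,
    }

if __name__ == "__main__":
    summary = earned_value_summary(pv=100.0, ev=90.0, ac=108.0)
    for metric, value in summary.items():
        print(f"{metric}: {value:.2f}")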

To assess the status of risk management, we evaluated the practices for 
key areas (establishing a risk strategy, risk identification, 
mitigation, and reporting) and compared these to industry standards-- 
specifically, the Capability Maturity Model® Integration (CMMI®). The 
CMMI model was developed by Carnegie Mellon University's Software 
Engineering Institute (SEI) and includes criteria to evaluate risk 
management for development and maintenance activities. We adapted these 
CMMI criteria and performed a Class B Standard CMMI Appraisal Method 
for Process Improvement[Footnote 16] to evaluate the risk management of 
program teams and contractors involved in the decennial system 
acquisitions and development initiatives. In doing so, we selected 
leading practices within the areas of preparing for risk management, 
identifying and analyzing risks, mitigating risks, and executive 
oversight. We evaluated the practices as fully implemented, partially 
implemented, or not implemented. Specifically, a blank circle indicates 
that practices are not performed at all or are performed on a 
predominantly ad hoc basis; a half circle indicates that while selected 
key practices have been performed, others remain to be implemented; and 
a solid circle indicates that practices adhere to industry standards. 

To evaluate the extent to which the Bureau and contractors followed 
these leading practices, we reviewed relevant documents such as risk 
management plans, risk reports, mitigation plans, and minutes from risk 
review meetings; we also interviewed knowledgeable officials about 
their risk management activities. Specifically, we met with project 
team officials for the four key decennial system acquisitions and their 
primary contractors (Harris Corporation and Lockheed Martin), as 
applicable. We also reviewed the lists of risks identified by each of 
the project teams and their primary contractors and assessed their 
accuracy and completeness, including whether the risks were associated 
with the acquisition's development plans. 

We conducted our work from December 2006 through August 2007 in the 
Washington, D.C., metropolitan area in accordance with generally 
accepted government auditing standards. 

[End of section] 

Appendix II: Key 2010 Census Information Technology Acquisitions: 

Table 10: 

IT acquisition: MAF/TIGER Accuracy Improvement Project (MTAIP); 
Contractor: Harris Corporation; 
Purpose: Modernize the system that provides the address list, maps, and 
other geographic support services for the Census and other Bureau 
surveys; 
Contract type: Cost plus award fee; 
Contract award: June 2002. 

IT acquisition: Field Data Collection Automation (FDCA); 
Contractor: Harris Corporation; 
Purpose: Provide automated resources for supporting field data 
collection, including handheld mobile computing devices to collect data 
in the field, such as address and map data; 
Contract type: Cost plus award fee with some firm fixed price elements; 
Contract award: March 2006. 

IT acquisition: Decennial Response Integration System (DRIS); 
Contractor: Lockheed Martin Corporation; 
Purpose: Provide a solution for data capture and respondent assistance; 
Contract type: Cost plus award fee with some firm fixed price elements; 
Contract award: October 2005. 

IT acquisition: Data Access and Dissemination System (DADS II); 
Contractor: IBM; 
Purpose: Develop a replacement for the DADS legacy tabulation and 
dissemination systems; 
Contract type: To be determined; 
Contract award: September 2007. 

Source: GAO analysis of Census Bureau data. 

[End of table] 

[End of section] 

Appendix III: Comments from the Department of Commerce: 

United States Department Of Commerce: 
The Under Secretary for Economic Affairs: 
Washington, D.C. 20230: 
Mr. David Powner: 
Director: 
IT Management Issues: 

United States Government Accountability Office: 
Washington, DC 20548: 

Dear Mr. Powner: 

The U.S. Department of Commerce appreciates the opportunity to comment 
on the United States Government Accountability Office's Draft Report 
Entitled Information Technology: Census Bureau Needs to Improve Its 
Risk Management of Decennial Systems — (GAO-08-79). The Department's 
comments on this report are enclosed. 

Sincerely, 

Signed by: 

Cynthia A. Glassman: 

U.S. Department of Commerce: 
Comments on the: 
United States Government Accountability Office: 
Draft Report Entitled Information Technology: Census Bureau Needs to 
Improve Its: 
Risk Management of Decennial Systems -- (GAO-08-79) September 2007: 

The U.S. Census Bureau appreciates the United States Government 
Accountability Office's (GAO) efforts to review our contract management 
processes for key information technology systems planned for the 2010 
Census and also appreciates this opportunity to review the draft 
report. 

We believe we have made significant efforts to date in successfully 
managing these major Information Technology (IT) contracts. Certainly, 
additional efforts to manage risks can improve the likelihood of 
success, and we will examine ways to do that in preparing our formal 
action plan in response to the final version of this report. 

We do have one major disagreement with the GAO's various statements (on 
pages 5 and 12, for example) and conclusion (on page 29) about limited 
operational testing. Although some minimal functionalities are not part 
of the 2008 Census Dress Rehearsal, all critical systems and interfaces 
are part of the Dress Rehearsal—our best Census-like environment to 
conduct such testing. In addition, after the Dress Rehearsal we will 
conduct additional, fully integrated, testing of all systems and 
interfaces, including the functionalities not included in the Dress 
Rehearsal, and incorporate any lessons learned from the Dress 
Rehearsal. 

Specific Comments on the Draft Report: 

Page 5: The draft report states that for the Field Data Collection 
Automation (FDCA) Program, "...the life-cycle cost estimates for this 
program have increased, and we project an $18 million cost overrun by 
December 2008. According to the contractor, the overrun is occurring 
primarily because of an increase in the number of system requirements." 

Census Bureau Comment: As written, this statement could be interpreted 
as an indication that the Census Bureau has conveyed a number of 
previously unstated requirements to the contractor, and that the 
contractor is overrunning its estimated costs. That is not entirely 
accurate. We have added some new requirements, mostly regarding IT 
systems security. Some cost growth has resulted from the process of 
decomposing high level functional requirements (those stated in Section 
C of the contract) into more detailed and specific requirements (system 
requirements and software requirements). 

Pages 5-6: The draft report states: "Delays in functionality mean that 
the Dress Rehearsal operational testing will take place without the 
full complement of systems and functionality that was originally 
planned. As a result, further system testing will be important to 
ensure that the decennial systems work as intended. However, Bureau 
officials have not finalized their plans for testing of all systems, 
and it is not clear whether these plans will include testing to address 
all interrelated systems and functionality, such as end-to-end 
testing." 

Census Bureau Comment: As stated in our major comment above, all 
critical systems and interfaces are part of the Dress Rehearsal, and we 
will conduct additional, fully integrated, testing of all systems and 
interfaces, including the minor functionalities not part of the Dress 
Rehearsal, and any lessons learned from the Dress Rehearsal. 

Page 5: The draft report states: "...schedule for this acquisition has 
been revised, resulting in delays...life-cycle cost estimates for this 
program have increased." 

Census Bureau Comment: None of the discussions on this page about 
schedule revisions and cost changes provide any context about the cause 
of the changes. For example, regarding the FDCA contract, after 
contract award, detailed discussions with the contractor revealed that 
our original life-cycle cost estimates for this effort had allocated 
too much money to later years, and not enough to the earlier years of 
the contract. Furthermore, because our FY 2006 budget was already in 
place at that point, we could not meet the level of first-year funding 
required under the solution the contractor had bid. Therefore, we had 
to develop a re-plan with the contractor. This resulted in some 
schedule changes and cost increase overall, because the contractor 
would have less time to develop its solutions and still meet our 
deadlines. This same comment applies to the statement in the third 
paragraph on page 14, that the Census Bureau "...revised the schedule 
because it had initially underestimated costs." Also, the draft 
report's discussion on page 6 follows the description of schedule and 
cost changes, and thus implies—incorrectly—that the changes to date 
resulted from poor or insufficient risk management. 

Page 9: The draft report states: "...the Field Data Collection 
Automation (FDCA) Program is to provide automation support for directly 
capturing information collected during personal interviews, as well as 
eliminating the need for paper maps and address lists for the major 
file data collection operations." 

Census Bureau Comment: To clarify, the FDCA contract only provides for 
automated data collection for three personal-visit operations (Address 
Canvassing, Nonresponse Follow-up [including Vacant/Delete Follow-up], 
and Coverage Measurement Personal Interviewing), and only eliminates 
the need for paper maps for Address Canvassing and Nonresponse Follow-
up. Also, in this same quote, the word "file" should be corrected to 
read "field." 

Page 12: The draft report states: "...As a result, Dress Rehearsal 
operational testing will not address the full complement of systems and 
functionality that was originally planned."

Census Bureau Comments: As stated in our major comment above, all 
critical systems and interfaces are part of the Dress Rehearsal, and we 
will conduct additional, fully integrated, testing of all systems and 
interfaces, including the minor functionalities not part of the Dress 
Rehearsal, and any lessons learned from the Dress Rehearsal. 

Page 14: For clarification, in the discussion at the top of the page 
about the number of staff expected to use or access FDCA components, we 
assume this refers to all FDCA equipment, infrastructure, and systems, 
not just the hand-held computers (HHC). Also, the draft report's 
reference to "National Operations Center" in the second paragraph, 
should be corrected to read Network Operations Center. 

Pages 14-15: In Table 2, for clarification, the transition of the 
Decennial Applicant Personnel and Payroll System (DAPPS) to the FDCA 
environment only was delayed until after the Dress Rehearsal, and we 
developed and are testing the FDCA/DAPPS Interface in the Dress 
Rehearsal. Also, the development of a space tracking system by the FDCA 
contractor was eliminated, not delayed to Period 2, and the development 
of an automated software distribution system was delayed to Period 2, 
not eliminated. Under Execution Period 2, the reference to "Perform 
delayed activities" is not clear, so perhaps that should be more 
specific (or footnoted). 

Page 15: The second paragraph of the draft report states: "...all sites 
for Regional Census Centers were to have been identified by April 2007, 
but this activity has not yet been fully completed. This delay may 
result in further delays, as described in recent FDCA performance 
reports. Because not all Regional Census Center sites have yet been 
identified, the risk is increased that the project will not meet the 
deployment date for these centers and the Puerto Rico Area Office by 
January 31, 2008." 

Census Bureau Comment: All sites for the Regional Census Centers and 
Puerto Rico Area Office have been identified, leases have been signed, 
and build-out is underway. We are on schedule to open all these offices 
in January 2008. Also, the FDCA contractor had no responsibility for 
identifying the sites or securing leases—those tasks were conducted 
by the U.S. General Services Administration. 

Page 15: The last paragraph of the draft report states: "...cost of the 
[FDCA] contract rose by a further $23 million, because of increasing 
system requirements." 

Census Bureau Comment: As stated above in our comment regarding page 5, 
we have added some new requirements, mostly regarding security of IT 
systems. Some cost growth has resulted from the process of decomposing 
high-level functional requirements (those stated in Section C of the 
contract) into more detailed and specific requirements, (system 
requirements and software requirements). 

Page 16: In both paragraphs of the draft report, reference is made to 
"cost overruns" of the FDCA contract.

Census Bureau Comment: We do not believe the term "cost overrun" is 
accurate in the context of what the GAO had described in this draft 
report. There has been some cost growth due to the new security 
requirements, and to the decomposition of high-level functional 
requirements, but "overrun" usually means a contractor originally 
underestimated its cost to perform a specific set of tasks. 

Page 19: The draft report states "At contract award in October 2005, 
the total cost of the DRIS project was not to exceed $553 million. In 
December 2005, the Bureau adjusted the life-cycle cost to $484 
million." 

Census Bureau Comment: The life-cycle cost of $484 million is 
incorrect. The $553 million figure represents the government estimate 
of the DRIS contract costs through the end of FY 2010. This has always 
been our estimate, and it was not reduced at contract award. The figure 
of $484 million just represents the initial Phase I and Phase II value 
at award. This $484 million figure was based on the contractor's 
original proposal, which, in turn, was based on simplified pricing 
instructions contained in the original request for proposal issued in 
2005. 

Page 20: Although this information was not available when this draft 
report was prepared, we note for the record that the DADS II contract 
was awarded on schedule the week of September 10 to IBM. 

Page 25: The draft report states: "...the FDCA project office did not 
identify any significant risks associated with using the handheld 
mobile computing devices. In addition, neither FDCA nor the DADS II 
project team provided evidence that system interface risks are being 
adequately identified..." 

and: 

Page 29 (Conclusions): The draft report states: "Among risks that 
were not identified are those associated with the FDCA mobile computing 
devices." 

Census Bureau Comment: We disagree with GAO's assertions that we did 
not identify risks associated with the HHCs or system interfaces. 

The FDCA risk management process identified the following HHC-related 
risks: HHC usability (risk ID #14), HHC failure rates (#18), HHC 
bandwidth (#20), and HHC supply chain (#32). The FDCA RRB transferred 
these technical (or solution) risks to the contractor as provided in 
Section 3.4.2.1 of the FDCA Risk Management Plan (providing a 
"Transfer" risk response strategy). "HHC Performance for Decennial 
Operations" is currently the highest-scored contractor risk; mitigation 
strategies have been identified and activated within the contractor's 
risk management process. 

The FDCA risk management process also identified Interface Management 
(Risk ID #15) as a project risk, a risk still carried on the FDCA Risk 
Register. The current response strategy for this risk is "Track," as 
Dress Rehearsal Address Canvassing interfaces were identified within 
the Decennial Census Architecture, documented in signed Interface 
Control Documents, and implemented as planned. 

Page 28: The draft report states "...FDCA and MTAIP did not provide 
evidence of regular [risk] reporting to higher level officials. 
Specifically...their reports did not include discussions of risks and 
mitigation plans." 

Census Bureau Comment: The FDCA and MTAIP project offices report 
through the following supervisory chain: Chief, Decennial Automation 
Contracts Management Office, or Chief, Decennial Systems and Contracts 
Management Office; Assistant Director for ACS and Decennial Census; 
Associate Director for Decennial Census; Deputy Director and Chief 
Operating Officer; and Director. Each of these higher-level officials, 
together with the Chiefs of Decennial Management, Field, and Geography 
Divisions, and other stakeholder divisions and offices, regularly 
receive information on FDCA and MTAIP project issues and risks, and 
generally via more than one communication channel. 

We have provided GAO with slide decks from presentations to the 
Commerce IT Review Board, Decennial Leadership Group ("internal Program 
Management Reviews"), and Census Integration Group. We have also 
provided examples of biweekly "briefing sheets" provided to inform 
upper management of issues relating to major decennial contracts. 
Additional discussion of FDCA risks occurs at weekly FDCA strategy 
sessions and at monthly contractor-conducted PMRs (we have also 
provided copies of chart decks from the latter events). Each of these 
regularly-scheduled vehicles includes at least two (and in most cases 
several) higher-level officials beyond the FDCA project office. While 
we understand from interactions with the GAO audit team that this 
finding is intended to point out that our artifacts do not (in GAO's 
view) clearly document "discussions" of risks and mitigation plans, we 
disagree emphatically with their characterization of this array of 
communication channels as "practice not implemented." 

Page 29 (Conclusions): The draft report states: "...the Dress 
Rehearsal operational testing will address less than a full complement 
of systems." 

Census Bureau Comment: As stated in our major comment above, although 
some minimal functionalities are not part of the 2008 Census Dress 
Rehearsal, all critical systems and interfaces are part of the Dress 
Rehearsal, and we will conduct additional, fully integrated testing of 
all systems and interfaces, including the minor functionalities not 
part of the Dress Rehearsal, and any lessons learned from the Dress 
Rehearsal. 

Page 33 (Appendix II): The draft report states the MTAIP contract type 
is "Cost plus award fee with some fixed priced elements." 

Census Bureau Comment: The MTAIP contract type is simply "Cost plus 
award fee" or CPAF. The original contract for Phase I was CPAF with the 
government anticipating Phase II being a hybrid with some fixed-price 
elements. However, fixed-price elements were never used, and the 
contract was modified to exclude them. Therefore, the MTAIP contract 
remained as it started—a CPAF. The report should be amended to reflect 
this. 

The following are GAO's comments on the department's letter dated 
September 25, 2007. 

GAO Comments: 

1. Although the department states that it plans to test all critical 
systems and interfaces either during or after the Dress Rehearsal, we 
are aware of two critical systems (the DRIS telephone system and the 
DADS II tabulation system) that are not to be included in the Dress 
Rehearsal, and the Bureau's plans are not yet finalized. As a result, 
we stand by our characterization that operational testing would take 
place during the Dress Rehearsal without the full complement of systems 
and functionality originally planned. Consistent with our 
recommendation, following up with documented test plans to do end-to- 
end testing would help ensure that systems work as intended. 

2. The department said that our statement could be interpreted to mean 
that cost increases resulted from an increase in the number of system 
requirements. It said this is not entirely accurate because although 
some requirements were added (generally related to security), other 
cost increases were due to the process of developing detailed 
requirements from high-level functional requirements. However, it is 
our view that the process of developing detailed requirements from 
high-level functional requirements does not inevitably lead to cost 
increases if the functional requirements were initially well defined. 

3. See comment 1. 

4. We have modified our report to reflect this additional information. 
However, although our discussion of schedule and cost changes preceded 
our discussion of risk management, we did not intend to imply that risk 
management weaknesses had contributed to these changes. We revised our 
report to help clarify this. 

5. We have revised our report to clarify the use of automation for data 
collection for all FDCA components. 

6. See comment 1. 

7. We agree that this statement is referring to all FDCA equipment, 
infrastructure, and systems. 

8. We have revised our report to update the status of the systems. 

9. We have revised our report to reflect the status of the office site 
selections. 

10. See comment 2. 

11. We disagree with the department's comment that "cost overrun" 
refers to a contractor originally underestimating costs. We use "cost 
overrun" to refer to any increase in costs from original estimates. 

12. We have revised our report to reflect this information. 

13. We have revised our report to add this information. 

14. We agree that the FDCA project identified certain risks, as the 
department describes. However, although it identified risks associated 
with handheld mobile computing devices and assigned responsibility for 
these to the contractor, it did not identify significant risks for the 
project office to monitor and track related to problems arising during 
the address canvassing component of the Dress Rehearsal. In addition, 
although this project identified interface management as a risk, it did 
not identify specific risks related to particular interfaces. Accordingly, 
although we modified our report to reflect this information, we did not 
change our overall evaluation. 

15. The department stated that for the FDCA and MTAIP projects, risk 
status is regularly discussed with executive-level officials at 
Commerce and the Bureau, and that it provided us with briefing slides 
to support this statement. It said that it also uses other 
communication channels to report project issues and risks. However, the 
evidence provided did not show that FDCA and MTAIP risks were regularly 
discussed with executive-level officials. For example, while the FDCA 
project provided two presentations in October 2006 and March 2007, 
these presentations did not include discussions of risks and mitigation 
plans. Similarly, our review of the MTAIP project team's presentations 
during quarterly reviews did not show that risk status was discussed. 
Therefore, we still conclude that these projects, unlike the other two, 
did not have sufficient evidence that executive-level officials were 
being regularly briefed on risk status. 

16. See comment 1. 

17. We have revised our report to reflect this information. 

[End of section] 

Appendix IV: GAO Contacts and Staff Acknowledgments: 

GAO Contacts: 

David A. Powner, (202) 512-9286 or pownerd@gao.gov: 

Madhav Panwar, (202) 512-6228 or panwarm@gao.gov: 

Staff Acknowledgments: 

In addition to the contacts named above, individuals making 
contributions to this report included Cynthia Scott (Assistant 
Director), Mathew Bader, Carol Cha, Barbara Collier, Neil Doherty, Karl 
Seifert, Niti Tandon, Amos Tevelow, and Jonathan Ticehurst. 

[End of section] 

Footnotes: 

[1] 13 U.S.C. 141 (a) and (b). 

[2] Earned value management integrates the investment scope of work 
with schedule and cost elements for investment planning and control. 
The method compares the value of work accomplished during a given 
period with that of work expected in the period. Differences in 
expectations are measured in both cost and schedule variances. The 
Office of Management and Budget requires agencies to use earned value 
management as part of their performance-based management system for any 
investment under development or with system improvements under way. 

[3] End-to-end testing is a form of operational testing that is 
performed to verify that a defined set of interrelated systems that 
collectively support an organizational core business function 
interoperate as intended in an operational environment. The 
interrelated systems include not only those owned and managed by the 
organization, but also the external systems with which they interface. 

[4] GAO, Census Bureau: Important Activities for Improving Management 
of Key 2010 Decennial Acquisitions Remain to be Done, GAO-06-444T 
(Washington, D.C.: Mar. 1, 2006). 

[5] TIGER is a registered trademark of the U.S. Census Bureau. 

[6] Mobile computing devices will be used to update the Bureau's 
address list, to perform follow-up at addresses for which no 
questionnaire was returned, and to perform activities to measure census 
coverage. 

[7] Address canvassing is a field operation to build a complete and 
accurate address list. In this operation, census field workers go door 
to door verifying and correcting addresses for all households and 
street features contained on decennial maps. 

[8] GAO, Information Technology Management: Census Bureau Has 
Implemented Many Key Practices, but Additional Actions Are Needed, GAO-
05-661 (Washington, D.C.: June 16, 2005). 

[9] GAO-06-444T. 

[10] The DADS II contract was originally planned to establish a new Web-
based system that would serve as a single point for public access to 
all census data and integrate many dissemination functions currently 
spread across multiple Bureau organizations. 

[11] GAO-06-444T. 

[12] This analysis primarily addresses project teams' implementation of 
risk management processes. According to our analysis, the contractors 
for the three contracts awarded (MTAIP, FDCA, and DRIS) had implemented 
adequate risk management processes involving risk preparation, risk 
identification and analysis, and risk mitigation. 

[13] GAO-06-444T and GAO-05-661. 

[14] GAO, 2010 Census: Preparations for the 2010 Census Underway, but 
Continued Oversight and Risk Management Are Critical, GAO-07-1106T 
(Washington, D.C.: July 17, 2007). 

[15] GAO-06-444T and GAO-05-661. 

[16] CMMI® is registered in the U.S. Patent and Trademark Office by 
Carnegie Mellon University. Class B appraisals are recommended for 
initial assessments in organizations that do not have mature process 
improvement activities. 

GAO's Mission: 

The Government Accountability Office, the audit, evaluation, and 
investigative arm of Congress, exists to support Congress in meeting 
its constitutional responsibilities and to help improve the performance 
and accountability of the federal government for the American people. 
GAO examines the use of public funds; evaluates federal programs and 
policies; and provides analyses, recommendations, and other assistance 
to help Congress make informed oversight, policy, and funding 
decisions. GAO's commitment to good government is reflected in its core 
values of accountability, integrity, and reliability. 

Obtaining Copies of GAO Reports and Testimony: 

The fastest and easiest way to obtain copies of GAO documents at no 
cost is through GAO's Web site [hyperlink, http://www.gao.gov]. Each 
weekday, GAO posts newly released reports, testimony, and 
correspondence on its Web site. To have GAO e-mail you a list of newly 
posted products every afternoon, go to [hyperlink, http://www.gao.gov] 
and select "E-mail Updates." 

Order by Mail or Phone: 

The first copy of each printed report is free. Additional copies are $2 
each. A check or money order should be made out to the Superintendent 
of Documents. GAO also accepts VISA and Mastercard. Orders for 100 or 
more copies mailed to a single address are discounted 25 percent. 
Orders should be sent to: 

U.S. Government Accountability Office: 
441 G Street NW, Room LM: 
Washington, DC 20548: 

To order by Phone: 
Voice: (202) 512-6000: 
TDD: (202) 512-2537: 
Fax: (202) 512-6061: 

To Report Fraud, Waste, and Abuse in Federal Programs: 

Contact: 

Web site: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm]: 
E-mail: fraudnet@gao.gov: 
Automated answering system: (800) 424-5454 or (202) 512-7470: 

Congressional Relations: 

Gloria Jarmon, Managing Director, JarmonG@gao.gov: 
(202) 512-4400: 
U.S. Government Accountability Office: 
441 G Street NW, Room 7125: 
Washington, DC 20548: 

Public Affairs: 

Susan Becker, Acting Manager, BeckerS@gao.gov: 
(202) 512-4800: 
U.S. Government Accountability Office: 
441 G Street NW, Room 7149: 
Washington, DC 20548: