This is the accessible text file for GAO report number GAO-05-799R 
entitled 'Business Modernization: Some Progress Made toward 
Implementing GAO Recommendations Related to NASA's Integrated Financial 
Management Program' which was released on October 27, 2005. 

This text file was formatted by the U.S. Government Accountability 
Office (GAO) to be accessible to users with visual impairments, as part 
of a longer term project to improve GAO products' accessibility. Every 
attempt has been made to maintain the structural and data integrity of 
the original printed product. Accessibility features, such as text 
descriptions of tables, consecutively numbered footnotes placed at the 
end of the file, and the text of agency comment letters, are provided 
but may not exactly duplicate the presentation or format of the printed 
version. The portable document format (PDF) file is an exact electronic 
replica of the printed version. We welcome your feedback. Please E-mail 
your comments regarding the contents or accessibility features of this 
document to Webmaster@gao.gov. 

This is a work of the U.S. government and is not subject to copyright 
protection in the United States. It may be reproduced and distributed 
in its entirety without further permission from GAO. Because this work 
may contain copyrighted images or other material, permission from the 
copyright holder may be necessary if you wish to reproduce this 
material separately. 

September 9, 2005: 

The Honorable Sherwood L. Boehlert: 
Chairman: 
Committee on Science: 
House of Representatives: 

The Honorable Bart Gordon: 
Ranking Member: 
Committee on Science: 
House of Representatives: 

Subject: Business Modernization: Some Progress Made toward Implementing 
GAO Recommendations Related to NASA's Integrated Financial Management 
Program: 

As we and others have reported in the past, the National Aeronautics 
and Space Administration (NASA) has fundamental problems with its 
financial management operations that undermine its external financial 
reporting ability and thwart its efforts to effectively manage and 
oversee its major programs. In April 2000, NASA began addressing many 
of its financial and management challenges through its effort to 
implement a new integrated financial management system, known as the 
Integrated Financial Management Program (IFMP), which NASA expects to 
complete in fiscal year 2008. However, in April and November 2003--3 
years into the IFMP implementation effort and with significant 
investment already made in the program--we issued a series of four 
reports[Footnote 1] that detailed weaknesses in NASA's acquisition and 
implementation strategy for IFMP. Specifically, we reported that NASA 
had not followed key best practices for acquiring and implementing IFMP 
and, therefore, was at risk of making a substantial investment in a 
financial management system that would fall far short of its stated 
goal of providing meaningful, reliable, and timely information to 
support effective day-to-day program management and external financial 
reporting. 

As part of the four reports we issued on IFMP, we made 45 
recommendations in the following areas: system component integration, 
enterprise architecture development and use, risk mitigation, system 
requirements definition, requirements management and testing, external 
financial reporting, and program cost and schedule control. Due to your 
continued interest in ensuring that NASA is taking the necessary 
actions to successfully implement IFMP, you asked us to assess the 
extent to which NASA has adopted the recommendations we made in our 
April and November 2003 reports. To achieve this objective, we 
interviewed the appropriate NASA officials and obtained and analyzed 
documentation supporting NASA's progress toward implementing GAO's 
recommendations. Our work was performed from March 2005 through June 
2005 in accordance with U.S. generally accepted government auditing 
standards. We requested and received written comments on a draft of 
this report from NASA and have included NASA's comments as enclosure 
III. Details on our scope and methodology are included in enclosure I. 

Results in Brief: 

Since we last reported on NASA's systems modernization program, NASA's 
effort has been focused primarily on trying to stabilize the core 
financial module, the backbone of IFMP. However, more recently, NASA 
has begun taking steps to implement a number of our recommendations. 
Overall, progress has been slow--particularly with respect to 
establishing an enterprise architecture, which is critical for guiding 
and constraining NASA's investment in IFMP. However, in some other 
areas--such as NASA's initiative to enhance the core financial module 
to provide better project management information--NASA is beginning to 
make some progress. Of the 45 recommendations we made, NASA has closed 
3 and partially implemented 13; however, 29 recommendations remain 
open. Table 1 summarizes our assessment of the extent to which NASA has 
implemented our recommendations. 

Table 1: NASA's Progress toward Implementing GAO's Recommendations: 

Recommendations: Recommendations to improve NASA's acquisition 
management practices; GAO-03-507; 
Closed: 0; 
Partially implemented: 2; 
Open: 0; 
Comments: Key elements of dependency analysis methodology still 
lacking; Suitability of already-acquired components not evaluated 
before acquiring additional components. 

Recommendations: Recommendations regarding development and use of 
enterprise architecture; GAO-04-43; 
Closed: 1; 
Partially implemented: 4; 
Open: 17; 
Comments: Architecture still missing important content and key 
architecture management processes not yet established; 
Already-implemented system components not mapped to architecture. 

Recommendations: Recommendations to mitigate risk associated with 
relying on already-deployed components; GAO-03-507; 
Closed: 0; 
Partially implemented: 0; 
Open: 6; 
Comments: NASA did not develop a formal corrective action plan to 
mitigate risks. 

Recommendations: Recommendations regarding defining program management 
needs and reengineering business processes; GAO-03-507; 
Closed: 1; 
Partially implemented: 0; 
Open: 1; 
Comments: Stakeholders engaged to define program management needs; 
Plans to reengineer contractor cost-reporting processes still several 
years away. 

Recommendations: Recommendations to improve NASA's requirements 
management and testing processes; GAO-03-507; 
Closed: 0; 
Partially implemented: 3; 
Open: 0; 
Comments: New requirements management methodology and tools acquired 
for future modules, but core financial module requirements not yet 
fully defined. 

Recommendations: Recommendations to improve external financial 
reporting; GAO-04-151; 
Closed: 0; 
Partially implemented: 0; 
Open: 4; 
Comments: Little progress made in developing a detailed plan for 
delivering a financial system that substantially complies with federal 
standards. 

Recommendations: Recommendations regarding IFMP life-cycle cost 
estimates and funding reserves; GAO-04-118; 
Closed: 1; 
Partially implemented: 4; 
Open: 1; 
Comments: Significant progress made in preparing life-cycle cost 
estimate, but consistency and support for estimates still lacking. 

Total; 
Closed: 3; 
Partially implemented: 13; 
Open: 29. 

Source: GAO analysis of NASA information. 

[End of table] 

We considered a recommendation closed when NASA provided us with 
documentation that demonstrated it had fully addressed the concerns we 
raised in our prior reports. Recognizing that many of our 
recommendations may take considerable time and effort to fully 
implement, we considered the recommendation to be partially implemented 
if the documentation provided indicated that NASA had made significant 
progress addressing our concerns. We considered a recommendation open 
when NASA's documentation indicated that the agency was either in the 
very early planning stages or had not yet begun to implement the 
recommendation. Enclosure II provides our assessment of the status of 
each recommendation. 

We are recommending that NASA develop an integrated enterprise master 
schedule and milestones that include the improvement activities and 
plans already in place, dates for completion, how progress will be 
measured, and clear accountability for each action not completed in a 
timely and effective manner. 

In its written comments, which are reprinted in enclosure III, NASA 
concurred with our recommendation. However, NASA raised concerns that 
our characterization of certain recommendations as "open" did not 
appropriately recognize the full extent of the agency's effort and 
suggested that we instead use "partially implemented" or, whenever 
appropriate, "closed." We disagree with NASA's assessment and continue 
to believe that our characterization of NASA's progress using the 
criteria above is appropriate. 

Background: 

For more than a decade, we have identified weak contract management and 
the lack of reliable financial and performance information as posing 
significant challenges to NASA's ability to effectively run its largest 
and most costly programs. While NASA has made some progress in 
addressing its contract management weaknesses through improved 
management controls and evaluation of its procurement activities, NASA 
has struggled to implement a modern integrated financial management 
system. NASA made two efforts in the past to improve its financial 
management processes and develop a supporting system intended to 
produce the kind of accurate and reliable information needed to manage 
its projects and programs and produce timely, reliable financial 
information for external reporting purposes, but both of these efforts 
were eventually abandoned after a total of 12 years and a reported $180 
million in spending. In April 2000, NASA began its third attempt at 
modernizing its financial management processes and systems. This 
effort, known as IFMP, was expected to produce an integrated, NASA-wide 
financial management system through the acquisition and incremental 
implementation of commercial software packages and related hardware and 
software components. 

In April 2003, we issued our first report on IFMP. At that time, we 
reported that NASA was not following key best practices for acquiring 
and implementing the system, which could limit the agency's ability to 
fully benefit from the new system's capabilities. Specifically, we 
reported that NASA (1) did not analyze the relationships among selected 
and proposed IFMP components; (2) had deferred addressing the needs of 
key system stakeholders, including program managers and cost 
estimators; and (3) did not properly manage and test its system 
requirements prior to implementation of the core financial module. As a 
result, we reported that: 

* NASA has increased its risks of implementing a system that will not 
optimize mission performance and will cost more and take longer to 
implement than necessary, 

* the core financial module is not being designed to integrate the cost 
and schedule data that program managers need to oversee the work of 
NASA's contractors, and: 

* costly rework will likely be required to fix requirement defects not 
identified prior to implementation. 

In November 2003, we issued three separate reports on IFMP's (1) 
enterprise architecture, (2) financial reporting capabilities, and (3) 
cost and schedule controls. On IFMP's enterprise architecture, we found 
that NASA had not established an effective architecture to guide and 
constrain the program. Although NASA had established some important 
architecture management controls--such as establishing an enterprise 
architecture program office and designating a chief architect--it had 
not yet established others, which made its efforts to develop, 
implement, and maintain a well-defined architecture more challenging. 
On IFMP's financial reporting capabilities, we found that NASA deferred 
configuration and testing of many key capabilities of the core 
financial module, including the ability to report the full cost of its 
programs. Further, we reported that many of the financial events or 
transaction types needed by program managers to carry out day-to-day 
operations and produce useful financial reports had not been included. 
As a result, we concluded that IFMP, as implemented in June 2003, did 
not comply substantially with the requirements of the Federal Financial 
Management Improvement Act of 1996.[Footnote 2] Finally, on IFMP's cost 
and schedule control, we reported that questionable cost estimates, an 
optimistic schedule, and insufficient processes for ensuring adequate 
funding reserves put IFMP at further risk of not meeting its cost and 
schedule commitments. In preparing the current cost estimate for IFMP's 
10-year life cycle, NASA did not include the full cost likely to be 
incurred during the life of the program, including costs to retire the 
system and other direct and indirect costs. 

NASA Is Taking Steps to Assess Integration Risk for IFMP Commercial 
Components: 

We reported in April 2003[Footnote 3] that NASA had not established and 
implemented a methodology for analyzing and understanding the 
interdependencies of commercial components prior to acquiring IFMP 
components. For programs like IFMP, which involve building a system 
from multiple commercial components, it is important for an agency to 
understand the behavioral interaction and compatibility of the 
commercial-off-the-shelf (COTS) components in order to select 
components that can be integrated in a predictable and standard way. 
Without an effective methodology to gain and apply such knowledge, 
building a commercial component-based system can quickly lapse into 
trial and error, which is fraught with risks. For example, a trial and 
error approach can lead the agency to pursue expensive modifications 
and customized solutions or unnecessarily increase the number and 
complexity of interfaces in an ad hoc and unplanned way--all of which 
increase system acquisition and maintenance costs, delay the delivery 
of capabilities and the realization of benefits, and contribute to less-
than-optimum agency performance. 

To avoid problems with integrating commercial components, we 
recommended that NASA, in order to mitigate future risks, direct the 
Program Executive Officer for IFMP to complete the following actions 
before acquiring any additional components: 

* Establish and implement a methodology for commercial system component 
dependency analysis and decision making. 

* Evaluate the suitability of already-acquired, but not yet 
implemented, IFMP component products within the context of a component 
dependency analysis methodology. 

NASA has made progress toward addressing these recommendations; 
however, the methodology it has established is incomplete and thus does 
not support adequate evaluation of IFMP components' suitability. 
Specifically, NASA's methodology does not include a defined design 
process[Footnote 4] covering, among other things, the detailed steps 
for allocating requirements among the various commercial component 
options and for using iterative prototyping to assess the interactions 
among these components--steps that would enable NASA to mitigate the 
risks associated with integrating products before acquiring them. 
Further, the agency reports that, to date, it has applied the 
methodology to evaluate only one component (i.e., the contract 
management module), which has not yet been acquired. As a result, we 
focused our 
assessment of NASA's efforts to implement our recommendations on this 
component. 

According to relevant guidance,[Footnote 5] an analysis of component 
dependencies requires a system life-cycle methodology that effectively 
defines and integrates three system engineering processes--risk 
management,[Footnote 6] requirements development and 
management,[Footnote 7] and design--and the tools, techniques, methods, 
and practices for implementing these processes. NASA's methodology 
describes high-level steps for risk management and requirements 
development and management, and identifies supporting tools, 
techniques, methods, and practices for integrating multiple products to 
fulfill a set of user requirements. However, as stated above, the 
methodology does not define the activities that are to occur as part of 
the overall design process to effectively evaluate the suitability of 
the product for the integrated solution that is to be acquired. For 
example, it does not define the detailed activities that are to occur 
and the products to be developed when (1) performing the gap analysis 
between requirements and component capabilities as part of assessing 
product feasibility; (2) allocating requirements among the various 
commercial components that constitute a given system design option; (3) 
defining the interactions among the various commercial components to 
enable the processing of transactions, including those interactions 
that affect data cleanup and conversion activities; (4) documenting 
commitments and decisions; and (5) using iterative prototyping to 
assess the interactions among these components. 

Nevertheless, in executing the methodology for the one yet-to-be- 
acquired component (i.e., the contract management module), NASA used 
prototyping to assess the ability to successfully integrate this new 
commercial component with already-acquired IFMP commercial components. 
The application of prototyping to evaluate the interdependencies among 
the various components is consistent with best practices; however, in 
this case, the prototyping was not performed iteratively. Specifically, 
its scope was limited to 
basic integration scenarios (e.g., creating purchase requisitions), and 
did not incorporate complex scenarios for interactions among the 
commercial components (e.g., reconciling the obligations of funds to 
the actual disbursement of cash to determine if previously obligated 
funds should be deobligated). The IFMP Integration Program Manager 
stated that the agency intends to use iterative prototyping, although 
this process was not reflected in the documented methodology. 

In addition, the IFMP Integration Program Manager stated that the 
agency was able to mitigate these prototyping weaknesses by applying 
other risk reduction methods. These methods included (1) interviewing 
another agency that had already implemented the commercial components 
to ensure that integration was feasible and (2) establishing agreements 
with the users that the products will not be modified to fulfill user 
requirements, but rather that the requirements will be modified, 
deleted, or addressed through other means. 

Beyond the steps taken to assess component integration risk, it is 
important that NASA establish more mature and transparent design 
decision analysis processes. Until it does, the agency remains at risk 
of implementing a solution that does not optimize mission performance 
and that costs more than anticipated and takes longer to implement than 
necessary. 

Limited Progress Made in Establishing an Enterprise Architecture to 
Guide Modernization Efforts: 

We reported in November 2003[Footnote 8] that NASA had acquired and 
implemented significant components of IFMP without having and using an 
enterprise architecture[Footnote 9] to guide and constrain the program. 
Attempting major modernization programs, such as NASA's IFMP, without 
having and using a well-defined enterprise architecture often results 
in systems implementations that are duplicative, are not well 
integrated, require costly rework to interface, and do not effectively 
optimize mission performance. During the course of our fiscal year 2003 
review of IFMP, NASA recognized the need for an enterprise architecture 
and began efforts to develop one, including implementing key 
architecture program management structures and process controls (e.g., 
establishing an enterprise architecture program office and designating 
a chief architect). Over the last 18 months, NASA has made limited 
progress in adopting other key architecture management best practices 
that we recommended. In summary, the agency has implemented 1 of our 
recommendations and has partially implemented 4 others; however, 17 of 
our recommendations remain open. 

In implementing 1 of our 22 recommendations in this area, NASA has had 
each version of its enterprise architecture approved by the Chief 
Information Officer (CIO). In partially implementing 4 others, NASA 
has: 

* established an architecture board made up of senior agency executives 
that is responsible and accountable for developing and maintaining the 
architecture, 

* had the architecture board approve version 3.0 of the architecture, 

* had the NASA Administrator approve version 3.0 of the architecture, 
and: 

* established a verification and validation function to review the 
architecture and related management processes. 

These recommendations are categorized as partially implemented because 
although NASA has established an architecture board, it has yet to 
develop a policy, as we recommended, demonstrating institutional 
commitment to developing and using an architecture. In addition, we do 
not consider the processes for approving the architecture and 
performing verification and validation activities to be established until 
they have been repeated. Further, the current verification and 
validation function is not independent, in that it reports to the 
program office rather than to the architecture board. NASA's Deputy 
CIO/Chief Technology Officer (CTO) stated that the board and 
administrator would continue to review and approve subsequent versions 
of the architecture, and that verification and validation reviews would 
be performed on a recurring basis. 

NASA has yet to address our other recommendations. With regard to 
architectural content, the Deputy CIO/CTO stated that the agency has 
not determined the extent to which NASA's architecture includes the 
content that we identified as missing in our previous report. However, 
this official stated that the agency is currently developing a plan to 
address this recommendation. 

NASA has also not addressed our three recommendations aimed at ensuring 
that IFMP plans are aligned with the architecture and that acquisition 
and implementation activities are appropriately limited until this 
alignment is achieved. The Deputy CIO/CTO stated that the Office of the 
CIO, in conjunction with the agency's Chief Financial Officer (CFO), 
has conducted reviews of already-implemented IFMP modules (e.g., budget 
formulation) to determine whether they are aligned with the 
architecture. This official stated that the reviews conducted to date 
have not shown any instances of misalignment; however, the agency has 
yet to provide us with any evidence, such as documentation on the 
approach and results of these reviews. The Deputy CIO/CTO stated that 
the offices of the CIO and the CFO are currently reviewing soon-to-be- 
implemented modules (e.g., contract management) to assess the extent of 
alignment. 

Moreover, NASA has not implemented other architecture management 
capabilities that our November 2003 report[Footnote 10] cited as 
essential to having an effective enterprise architecture program and 
on which we made recommendations. In particular: 

* NASA has not established a written/approved policy guiding the 
development of the enterprise architecture. The Deputy CIO/CTO stated 
that the agency is currently drafting this policy and that it should be 
approved by July 2005. 

* NASA has not placed enterprise architecture products under 
configuration management to maintain integrity and traceability and to 
control modifications or changes to the architecture products 
throughout their life cycles. However, the Deputy CIO/CTO stated that 
the agency is currently developing its configuration management plan 
and associated procedures. Further, the Deputy CIO/CTO told us that the 
agency has assigned a configuration manager for the architecture 
program and that all architecture products are under configuration 
management. However, NASA has yet to provide documentation showing that 
changes to the architecture products are identified, tracked, 
monitored, documented, reported, and audited.[Footnote 11] 

* NASA has not ensured that progress against architecture plans is 
measured and reported, as evidenced by the Deputy CIO/CTO's statement 
that the agency is not measuring and reporting progress against 
approved architecture project management plans. 

* NASA has not established a written/approved policy for architecture 
maintenance. The Deputy CIO/CTO stated that the policy was submitted 
for approval in July 2005. 

* NASA has not ensured that (1) the architecture products describe the 
enterprise in terms of business, performance, data, application, and 
technology; (2) the products describe the "As Is" environment, the "To 
Be" environment, and a sequencing plan; or (3) the business, 
performance, data, application, and technology descriptions address 
security. The Deputy CIO/CTO stated that the agency is currently 
developing a plan to address these three recommendations. 

* NASA has not measured and reported on the quality of enterprise 
architecture products, as evidenced by the Deputy CIO/CTO's statement 
that the agency has yet to develop metrics for evaluating the quality 
of its architecture products. 

* NASA has not established a written/approved policy for architecture 
implementation. The Deputy CIO/CTO stated that the policy was submitted 
for approval in July 2005. 

* NASA has not completed efforts intended to ensure that the enterprise 
architecture is an integral component of information technology (IT) 
investment management processes and that IT investments comply with the 
architecture. The Deputy CIO/CTO stated that the agency recognizes that 
such a process needs to be institutionalized and stated that the 
architecture policy being developed is intended to accomplish this. 
However, at this time, the policy and associated procedures are being 
drafted, and the process for conducting investment alignment reviews is 
being revised. This official also told us that the agency has started 
reviewing proposed system investments for compliance with the 
architecture and that the results of these reviews are being used to 
draft the policy and procedures, as well as revise the review process. 

* NASA has not measured and reported enterprise architecture return on 
investment. The Deputy CIO/CTO stated that the agency is establishing 
metrics and collecting data, and that it intends to issue a report on 
its enterprise architecture return on investment by the end of the 
fiscal year. 

* NASA has not measured and reported on enterprise architecture 
compliance, as evidenced by the Deputy CIO/CTO's statement that NASA 
has yet to establish metrics to measure and report on enterprise 
architecture compliance. 

According to the Deputy CIO/CTO, NASA has yet to develop the program 
management plans that it needs to effectively manage the development, 
maintenance, and implementation of the architecture. This official told 
us that the agency is currently drafting such plans and that the plans 
will specify measurable goals and outcomes to be achieved, the tasks to 
be performed to achieve these goals and outcomes, the resources 
(funding, staffing, and training) needed to perform the tasks, and the 
time frames within which the tasks will be performed. The Deputy 
CIO/CTO also stated that these plans will include the actions that the 
agency will take to address both our recommendations and those 
identified during the verification and validation effort. Until NASA 
has addressed our prior recommendations, the agency's modernization 
efforts, including IFMP, will be at risk of being implemented in a way 
that does not adequately ensure system integration, interoperability, 
and optimized mission support. 

NASA Did Not Develop a Corrective Action Plan to Mitigate the Risk of 
Relying on Already-Deployed Components: 

NASA has not yet developed a corrective action plan to identify known 
and potential risks and, therefore, has not implemented any of the six 
recommendations related to developing a risk mitigation strategy. 
According to IFMP officials, the program has an overall risk mitigation 
strategy that they use for this purpose, and they did not believe it 
necessary to revise the strategy based on our recommendations. As 
discussed later, NASA has begun to implement our 
recommendations to improve its requirements management and cost- 
estimating processes; if implemented properly, these improvements could 
help to mitigate the risk associated with relying on already-deployed 
IFMP components. However, we continue to believe that a comprehensive 
corrective action plan would aid NASA in its effort to stabilize the 
system and improve the functionality of IFMP. 

Progress Made toward Identifying Program Management Needs, but Process 
Reengineering Still Needed: 

In April 2003, we reported that NASA had not adequately engaged key 
stakeholders in designing and implementing its core financial module or 
fundamentally changed the way it operates by reengineering its core 
business processes. As a result, the new system had not, as originally 
envisioned, addressed many of the agency's most significant management 
challenges--including improving contract management, producing credible 
cost estimates, and providing the Congress with appropriate visibility 
over its large, complex programs. 

In response to two recommendations we made in that report to (1) engage 
program managers in identifying program management needs and (2) 
reengineer core business processes, NASA has recently begun to 
transform how it manages its programs and projects and oversees its 
contractors. Through an initiative known as Project Management 
Information Improvement (PMI2), NASA plans to enhance the core 
financial module to provide better project management information for 
decision-making purposes. While much remains to be done before IFMP 
will satisfy the information needs of program managers and cost 
estimators, NASA has taken an important first step toward achieving its 
goal. Specifically, NASA has, as we recommended, engaged stakeholders 
to identify program management needs. However, to ensure that NASA 
reaps the benefit of implementing this recommendation, it is critical 
that the agency follow through with its stated plans to reengineer its 
acquisition management processes such that contractors provide, and the 
system can accommodate, the information needed by NASA managers to 
oversee contracts and prepare credible cost estimates. Moreover, as 
discussed later, to ensure that the system is designed and implemented 
to satisfy user requirements, NASA will need to implement an effective 
requirements management process--which includes defining and testing 
detailed design and coding requirements that are traceable to higher 
level requirements. 

NASA Has Made Significant Progress toward Identifying Program 
Management Needs: 

As we reported in April 2003, NASA did not engage program managers and 
other key stakeholders when defining information requirements for the 
IFMP core financial module and, as a result, did not design the system 
to accommodate the information needed to adequately oversee its 
contracts and programs--including preparing credible cost estimates. To 
adequately oversee NASA's largest and most complex programs and 
projects, managers need reliable contract cost data--both budget and 
actual--and the ability to integrate these data with contract schedule 
information to monitor progress on the contract. A well-recognized 
technique used to monitor progress on contracts, as well as a NASA 
program management requirement, is earned value management 
(EVM).[Footnote 12] However, because NASA did not adequately define its 
program management needs, NASA did not design the core financial module 
to accommodate the EVM data needed to perform EVM analysis. 
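
The basic EVM calculations referenced above can be illustrated with a 
brief sketch. The contract figures below are hypothetical, and the 
variable names follow standard EVM terminology rather than any NASA 
system:

```python
# Standard earned value management (EVM) metrics from three inputs:
# BCWS (planned value), BCWP (earned value), and ACWP (actual cost),
# plus BAC (budget at completion). All figures are illustrative.
def evm_metrics(bcws, bcwp, acwp, bac):
    cv = bcwp - acwp    # cost variance: negative means over cost
    sv = bcwp - bcws    # schedule variance: negative means behind schedule
    cpi = bcwp / acwp   # cost performance index
    spi = bcwp / bcws   # schedule performance index
    eac = bac / cpi     # estimate at completion (simple CPI method)
    return {"CV": cv, "SV": sv, "CPI": cpi, "SPI": spi, "EAC": eac}

# Hypothetical contract: $400M budget at completion, mid-program status.
m = evm_metrics(bcws=120.0, bcwp=100.0, acwp=125.0, bac=400.0)
print(m)  # CPI of 0.8 projects an overrun: EAC = 400 / 0.8 = 500
```

A CPI below 1.0, as in this example, is exactly the kind of early 
warning signal that requires both budgeted and actual cost data to be 
available in the system at a common level of detail.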

In response to our April 2003 report, NASA has, as we recommended, 
engaged stakeholders to identify program management needs. 
Specifically, NASA has inventoried its ongoing programs and projects-- 
categorized by product line, priority, and risk--and defined 
management and information requirements for each category. Through a 
series of data calls and budget analyses, over the course of a year the 
Office of the Chief Engineer (OCE) compiled a comprehensive listing of 
NASA programs and projects. From the listing of NASA programs and 
projects, OCE identified four product line divisions or investment 
areas, as follows: (1) Basic and Applied Research, (2) Advanced 
Technology Development, (3) Flight Development and Operations, and (4) 
Institutional Infrastructure. Programs and projects in each product 
line were then placed in a priority/risk category--category I, II, or 
III--based on priority and risk factors, such as the magnitude of the 
agency's financial investment, uncertainty surrounding the application 
of new or untested technology, and strategic importance of the program 
to the agency. OCE then defined project management requirements based 
on the product line and the program's priority and risk classification. 
For example, a category I--high-risk, high-priority--Advanced 
Technology Development program would be required to prepare a 
life-cycle cost estimate linked to the program's work breakdown 
structure (WBS) as well as obtain an independent cost estimate. In 
contrast, an Advanced Technology Development program in category 
III--low-risk, low-priority--is required only to obtain a sufficiency 
review of its life-cycle cost estimate. 

As part of PMI2, and based on the program management needs identified 
by OCE, NASA has established high-level functional requirements related 
to data structures, funds distribution, cost collection, and reporting 
structures, which, if implemented as intended, should provide the 
system with the functionality currently lacking and needed by program 
managers and cost estimators. Examples of the functional requirements 
added are (1) the system shall have the ability to collect the actual 
cost of work performed and the budgeted cost of work performed, (2) the 
system shall have the ability to collect estimate-to-complete costs at 
any level of the WBS hierarchy, and (3) the system shall have the 
ability to interface the actual cost of work performed and the budgeted 
cost of work performed with an EVM tool. While defining high-level 
functional requirements is an important first step toward providing 
NASA managers with the information they need, as discussed later, to 
ensure that the system is designed and implemented properly, NASA will 
also need to implement an effective requirements management process-- 
which includes defining and testing detailed design and coding 
requirements that are traceable to these higher level requirements. 

Future Plans to Reengineer Acquisition Management Processes Are Key: 

As part of PMI2, NASA plans to, as we recommended, reengineer its 
acquisition management process, which includes plans for (1) replacing 
its existing legacy system financial coding structure with a coding 
structure that will better accommodate the information requirements of 
program managers and cost estimators, (2) reevaluating its policies and 
processes for collecting contractor cost data to improve the quality of 
contractor-provided cost and performance data, and (3) integrating data 
contained in the core financial module with the tools needed for 
performing EVM analysis. However, NASA is in the very early planning 
stage of implementing our recommendation, and the details of how NASA 
will accomplish its objectives are still vague. Therefore, it was not 
possible to assess whether NASA's implementation of PMI2 will 
accomplish its stated goal of enhancing the core financial module to 
provide better project management information for decision-making 
purposes. Further, given the complexity of what NASA is attempting to 
accomplish, many of its PMI2 completion milestones appear to be 
unrealistic. Nonetheless, we are encouraged that NASA has acknowledged 
that its existing legacy coding structure and acquisition management 
processes do not always provide sufficiently detailed data to prepare 
credible cost estimates or effectively monitor contractor performance. 

As we reported in April 2003, because NASA did not fundamentally change 
the way it operates by involving key users in business process 
reengineering efforts, the core financial module as currently 
implemented does not capture cost information at the same level of 
detail that it is received from NASA's contractors. Instead of 
implementing a financial coding structure that met the information 
needs of program managers, NASA embedded the same financial coding 
structure that it used in its legacy reporting systems in the core 
financial module. As a result, the availability of detailed cost data 
depends on the adequacy of NASA's legacy coding structure. Therefore, 
in some cases, contractor-provided cost data must be aggregated to a 
higher, less detailed level before they are posted against the legacy 
financial coding structure. 

Using a two-phased approach, NASA now plans to design and implement a 
new financial and program coding structure, which is intended to better 
accommodate the information requirements of program managers and cost 
estimators. As shown in figure 1, NASA plans to organize its work by 
appropriation, mission, theme, program, and project. 

Figure 1: New Financial and Technical Work Breakdown Structure 
Excerpts: 

[See PDF for image] 

[End of figure] 

In phase one, which NASA expects to be complete by October 1, 2005, the 
agency plans to define and standardize the first two levels of the 
project management elements of NASA's work--referred to as the project 
technical work breakdown structure. In phase two, which is planned for 
completion by October 1, 2006, NASA plans to expand the project's 
technical work breakdown structure through level 7--as shown in figure 
1. NASA does not intend to standardize these lower level elements but 
instead will allow NASA project managers, within certain parameters, to 
define project-unique elements. However, these lower level elements may 
only include work that rolls up to the standard WBS elements. According 
to NASA PMI2 officials, it is imperative that phase one is completed 
and the new project WBS is defined and incorporated into the core 
financial module by October 1, 2005, to coincide with the beginning of 
the fiscal year. However, as of the end of our fieldwork in mid-June 
2005, NASA had defined level 2 WBS requirements for only one of its 
four product lines and had not yet validated these new requirements 
with the appropriate user groups. 
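
The constraint described above--that project-unique lower-level 
elements may only include work that rolls up to the standard WBS 
elements--can be sketched as a simple validation. The element codes and 
the set of standard levels shown here are invented for illustration:

```python
# A WBS element code such as "1.2.4.3" rolls up through its dotted
# prefixes. In this sketch, the top two levels are standardized; lower
# levels are project-defined but must attach beneath a standard element.
STANDARD_ELEMENTS = {"1", "1.1", "1.2", "2", "2.1"}  # levels 1-2 only

def rolls_up_to_standard(code: str) -> bool:
    parts = code.split(".")
    # Every ancestor within the standardized depth (levels 1-2)
    # must appear in the standard element set.
    for depth in range(1, min(len(parts), 2) + 1):
        if ".".join(parts[:depth]) not in STANDARD_ELEMENTS:
            return False
    return True

print(rolls_up_to_standard("1.2.4.3"))  # True: attaches under standard 1.2
print(rolls_up_to_standard("3.1.2"))    # False: level 1 "3" is not standard
```

Validating element codes this way is what lets project managers define 
unique lower-level detail while still guaranteeing that costs aggregate 
cleanly to the standardized structure.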

As part of PMI2, NASA plans to reevaluate its policies and processes 
for collecting contractor cost data. NASA obtains contractor cost data 
from two primary sources--monthly contractor financial management 
reports, NASA Form 533, and monthly contractor cost performance 
reports. Both reports contain budget and actual cost data, but only 
contractor cost performance reports contain the data needed to perform 
EVM analysis. However, as discussed in our April 2003 report, NASA did 
not evaluate the adequacy of its existing contractor cost reporting 
vehicles to determine whether the reports met the information needs of 
program managers and cost estimators. Instead, NASA chose to use NASA 
Form 533 data to populate the core financial module without considering 
the merits of the data contained in the contractor cost performance 
reports. Consequently, the cost data maintained in the core financial 
module are not adequate for monitoring contractor performance for 
NASA's largest, most complex contracts--those requiring EVM reporting 
and analysis. 

To respond to our recommendation to reengineer its acquisition 
management process, NASA plans to evaluate and potentially combine the 
two existing contractor cost reports in order to create contractor cost 
reporting requirements that satisfy its external financial reporting 
and program management needs. Although NASA plans to complete this 
process by October 1, 2006, many questions remain unanswered as to how 
NASA will implement new contractor cost reporting requirements. For 
example, it is unclear whether NASA will renegotiate existing contracts 
to include new contractor reporting requirements or implement these 
changes prospectively as new contracts are awarded. If NASA plans to 
implement new contractor reporting requirements prospectively, it is 
unclear how the core financial module would incorporate a new reporting 
format for new contracts while maintaining the old reporting format for 
existing contracts. On the other hand, renegotiating existing contracts 
to include new reporting requirements could prove to be extremely 
costly. Because NASA's plans for implementing new contractor reporting 
requirements are still in their infancy, with most elements of the 
plans still undefined, the planned October 1, 2006, completion date 
will be difficult, if not impossible, to meet. 

Finally, according to NASA, PMI2 will also address the integration of 
financial information with tools for planning, scheduling, and EVM 
analysis. However, NASA has not yet established a completion date for 
this phase of the project or any specific implementation details. 

Improvements Made to Requirements Management and Testing Processes: 

In April 2003 we reported that NASA had not effectively implemented the 
requirements management[Footnote 13] or disciplined testing processes 
necessary to support the implementation of the core financial module 
and, therefore, had increased the risk that it would not be able to 
effectively identify and manage the detailed system requirements that 
system developers and program managers need to acquire, implement, and 
test a system. Due in part to weaknesses in NASA's requirements 
management process, the core financial module NASA fielded in June 2003 
was not properly configured or designed to meet NASA's financial 
reporting and management needs. Although NASA has recently implemented 
new requirements management and testing processes, the agency has not 
implemented our recommendation to properly define and document system 
requirements for already-deployed IFMP modules, including the core 
financial module, and has only partially implemented our recommendation 
related to establishing an effective regression testing and metrics 
program. As a result, many of the system configuration problems caused 
by the agency's ineffective requirements management and testing 
processes continue to plague the core financial module. 

NASA Has Not Yet Fully Developed and Properly Documented Core Financial 
Module Requirements: 

Subsequent to our April 2003 report, NASA IFMP officials acknowledged 
that the requirements management and testing methodology and tools used 
by the contractor responsible for implementing the core financial 
module did not result in requirements that were consistent, verifiable, 
and traceable or that contained the necessary specificity to minimize 
requirement-related defects. While NASA has taken the critical first 
steps of implementing the necessary requirements management and testing 
processes to help manage IFMP, it has not yet fully implemented our 
recommendation to properly define and document system requirements for 
the already-deployed IFMP modules, including the core financial module. 
This is important not only because it affects the way the core 
financial module currently functions but also because it affects NASA's 
ability to implement future upgrades and other modules expected to 
interface with the core financial module. 

Requirements represent the blueprint that system developers and program 
managers use to design, develop, and acquire a system. Improperly 
defined or incomplete requirements have been commonly identified as a 
cause of system failure, resulting in systems not meeting their costs, 
schedules, or performance goals. Further, because requirements provide 
the foundation for system testing, requirement defects, such as those 
noted during our review relating to specificity and the ability to 
determine the relationship between requirements (commonly referred to 
as traceability), preclude an entity from implementing a disciplined 
testing process. That is, requirements must be complete, clear, and 
well documented to design and implement an effective testing program. 
Absent this, an organization is taking a significant risk that its 
testing efforts will not detect significant defects until after the 
system is placed into production. 

NASA officials stated that they understand the importance of 
implementing disciplined requirements management and testing processes 
and believe that they have developed the necessary procedures to govern 
their efforts. They also stated that, due to resources and priorities, 
they decided to use these improved procedures on new projects such as 
the e-payroll module and defer full implementation of NASA's improved 
requirements management processes until October 2006--when NASA plans 
to redefine the core financial module system requirements as part of 
the core financial module system upgrade. Our limited review of several 
payroll requirements, selected by NASA to illustrate the effectiveness 
of its new requirements management process, showed that NASA has made 
progress since our April 2003 report. For example, we were able to 
determine the relationship between a given test and a specific 
requirement, which was not always possible with the requirements 
management and testing methodology and tools used previously. Further, 
it was clear that the new tools and procedures would allow for and 
facilitate the type of specificity needed to reduce 
requirements-related defects to acceptable levels. However, some of 
the problems we 
identified previously relating to specificity were still present. For 
example, in reviewing the requirements documentation on employee 
deductions for such items as taxes, the documentation did not contain 
all the necessary business rules related to tax withholdings. This is 
important because, as discussed previously, inadequate or incomplete 
requirements preclude an entity from implementing a disciplined testing 
process. Consequently, when we then traced the tax withholding 
requirement to the testing documentation, we found that the tests NASA 
constructed did not adequately test key elements of the requirement-- 
including whether withholdings for Social Security taxes are suspended 
at the appropriate income threshold. 
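
A test case covering the income threshold discussed above might look 
like the following sketch. The wage base figure, rate, and function are 
hypothetical illustrations, not NASA's actual business rule or code:

```python
# Hypothetical withholding rule: a 6.2% Social Security tax applies
# only up to an annual wage base; withholding is suspended once
# year-to-date wages reach that base. Figures are illustrative only.
WAGE_BASE = 90_000.0   # illustrative threshold, not an official figure
RATE = 0.062

def social_security_withholding(ytd_wages, period_wages):
    # Only the portion of this period's wages below the base is taxable.
    taxable = max(0.0, min(period_wages, WAGE_BASE - ytd_wages))
    return round(RATE * taxable, 2)

# Tests must exercise the threshold, not just the common case below it.
assert social_security_withholding(0.0, 5_000.0) == 310.00       # normal
assert social_security_withholding(88_000.0, 5_000.0) == 124.00  # straddles base
assert social_security_withholding(95_000.0, 5_000.0) == 0.00    # suspended
```

The point of the second and third assertions is precisely the gap GAO 
identified: a suite that tests only the normal case will never detect a 
defect in the suspension logic at the threshold.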

In discussions with NASA officials, they agreed that our observations 
were correct and that they were not yet where they needed to be with 
respect to the specificity of NASA's requirements. They also stated 
that they will continue to monitor their process and look for 
opportunities for improvement, and as they learn more about performing 
this vital function, they expect the processes to improve. It will take 
time to effectively implement the disciplined processes needed to 
reduce the risks to acceptable levels. Therefore, it will be critical 
that NASA provide the management support and sustained leadership 
needed to ensure that this important initiative is successful. 

Based on our discussions with NASA officials, it is clear that they now 
have recognized that the best indicator of whether the project has 
reduced its risks to acceptable levels is the strength of the processes 
used to manage the project. For example, NASA officials stated that 
they are now utilizing an independent validation and verification 
contractor to help monitor NASA's project management processes and 
provide suggestions for improvement. It will be critical for NASA to 
continue its efforts to effectively monitor and evaluate its processes 
and make the necessary adjustments if it is to continue making 
progress. 

NASA Has Implemented a Regression Testing Program: 

As changes are made to IFMP, either because additional functionality is 
added or defects are corrected, it is important to test the revised 
application before it is released to ensure that modifications have not 
caused unintended effects and that the system still complies with its 
specified requirements. This practice is referred to as regression 
testing. At the time NASA fielded the core financial module in June 
2003, it did not have a regression testing program in place. According 
to NASA officials, as we recommended, the agency is now performing 
regression testing prior to all new system releases, which is clearly a 
step in the right direction. However, as discussed previously, 
complete, clear, and well-documented requirements are the foundation on 
which an effective testing program is established. Therefore, the 
weaknesses we identified previously in NASA's core financial module 
requirements impair the quality of NASA's regression testing program. 
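
Regression testing as described above amounts to re-running a fixed 
suite of requirement-based checks after every change. A minimal sketch 
follows; the function under test, its business rule, and the 
requirement identifiers are invented for illustration:

```python
def obligate_funds(balance, amount):
    # Hypothetical business rule: an obligation may not be negative
    # and may not exceed the available balance.
    if amount < 0 or amount > balance:
        raise ValueError("invalid obligation")
    return balance - amount

def raises_value_error(fn):
    try:
        fn()
    except ValueError:
        return True
    return False

# Requirement-based checks re-run in full after every system change.
REGRESSION_SUITE = {
    "REQ-101 normal obligation":
        lambda: obligate_funds(100.0, 40.0) == 60.0,
    "REQ-102 cannot overspend":
        lambda: raises_value_error(lambda: obligate_funds(10.0, 20.0)),
}

results = {name: check() for name, check in REGRESSION_SUITE.items()}
assert all(results.values())  # any regression blocks the release
```

Note that each test traces to a named requirement; without complete, 
documented requirements there is nothing authoritative to check the 
revised system against, which is why the requirements weaknesses 
discussed earlier undercut the value of the suite.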

In order to reduce the amount of effort involved in documenting the 
requirements that will support its upgrade, improve regression testing, 
and increase NASA's confidence in the regression 
testing program, we were told that in May 2003 NASA began documenting 
in its regression testing tool the specific business rules and 
requirements that are associated with its core financial module. While 
NASA recognizes that this does not fully accomplish all of the 
objectives called for in its improved processes discussed above, the 
agency believes that it does help mitigate the risks associated with 
the regression testing efforts. However, this approach does not provide 
reasonable assurance that (1) requirements have been properly validated 
and (2) the tests are designed with a complete set of requirements. 

NASA Now Tracks Metrics Related to System Defects: 

As we recommended in our April 2003 report, NASA has taken steps to 
develop metrics and implement a metrics measurement process that can be 
used to evaluate the effectiveness of its processes by identifying the 
causes of system defects. Understanding the cause of a defect is 
critical to evaluating the effectiveness of an organization's project 
management processes, such as requirements management and testing. For 
example, if a significant number of defects are caused by inadequate 
requirements definition, then the organization knows that the 
requirements management process is not effective, which helps the 
organization identify the corrective actions needed. While NASA has 
made progress in this important area by collecting information on the 
causes of system defects it identifies in its regression testing 
efforts, as discussed below, additional information and analysis could 
enhance the agency's efforts. 

According to IFMP officials, NASA is currently collecting data on the 
cause of core financial module system defects identified through its 
regression testing program. For example, NASA is tracking the number of 
system defects related to things such as configuration problems, 
inadequate requirements definition, inadequate testing, and programmer 
errors. However, NASA is not collecting the same data for system 
defects that are identified by users and, as a result, has limited 
visibility over the cause of these defects. In addition, while NASA is 
collecting data on defects identified through regression testing, the 
agency has not instituted a formal process for fully analyzing the data 
by identifying the trends associated with them. For example, by 
analyzing the trends, NASA could determine whether the requirements 
management approach it has adopted reduces the risk of the system not 
meeting cost, schedule, and functionality goals to acceptable levels. 
This analysis would also help to quantify the risks 
inherent in the testing process that NASA has selected for the core 
financial module. 
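
The cause tracking and trend analysis described above reduces to 
tallying defects by root cause across successive releases. The cause 
categories mirror those named in the report; the defect records and 
counts below are illustrative only:

```python
from collections import Counter

# Each defect record carries the release in which it surfaced and its
# root cause: configuration, requirements definition, testing, or
# programmer error. The records below are invented for illustration.
defects = [
    ("rel-1", "requirements"), ("rel-1", "configuration"),
    ("rel-1", "requirements"), ("rel-2", "requirements"),
    ("rel-2", "programmer"),   ("rel-3", "requirements"),
    ("rel-3", "requirements"), ("rel-3", "testing"),
]

by_cause = Counter(cause for _, cause in defects)
print(by_cause.most_common(1))
# A persistently dominant cause, such as "requirements" here, signals
# that the requirements management process itself needs corrective
# action, not just the individual defects.
```

Extending the same tally to user-reported defects, which NASA does not 
yet categorize, would close the visibility gap the report describes.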

NASA IFMP officials have acknowledged that this type of analysis would 
be beneficial and stated that they will determine what actions are 
needed to implement the necessary improvements. Some of these 
changes will be easy to implement while others will require more 
effort. For example, since NASA has already decided to capture the 
cause associated with the defects identified during regression testing, 
developing the necessary trending information should be fairly easy. On 
the other hand, developing similar data for other initiatives, such as 
data conversion and user-reported problems, will require more effort 
since a process has not yet been put in place to develop and capture 
such information. 

Detailed Plan for Compliance with the Federal Financial Management 
Improvement Act Is Still Needed: 

The Office of the CFO recently updated its Financial Management 
Improvement Strategy and developed a Financial Leadership Plan, which 
are intended to lay the groundwork for improving NASA's financial 
management operations. However, neither of these documents nor actions 
taken by NASA to date are evidence of the kind of corrective action 
plan needed to produce a financial management system that complies 
substantially with the requirements of the Federal Financial Management 
Improvement Act of 1996 (FFMIA).[Footnote 14] FFMIA requires that 
agencies implement and maintain financial management systems that 
comply substantially with federal financial management system 
requirements, applicable federal accounting standards, and the U.S. 
Government Standard General Ledger at the transaction level. FFMIA also 
requires the auditors of agencies' financial statements to report on 
such compliance. Further, FFMIA stresses the need for agencies to have 
systems that can generate timely, accurate, and useful financial 
information with which to make informed decisions, manage daily 
operations, and ensure accountability on an ongoing basis. 

As we reported in November 2003 and NASA's independent auditor reported 
again in November 2004 as part of its report disclaiming an opinion on 
NASA's fiscal year 2004 financial statements, NASA's new financial 
system does not comply substantially with the requirements of FFMIA. 
Key areas of concern include the core financial module's inability to 
(1) produce transaction-level detail in support of financial statement 
account balances, (2) identify adjustments or correcting entries, and 
(3) correctly and consistently post transactions to the right accounts. 
In addition, material weaknesses in NASA's internal controls over 
property, financial statement preparation, fund balance with Treasury, 
policies and procedures, and NASA's financial organization's structure 
also affect compliance with the requirements of FFMIA. Finally, as 
discussed previously, the core financial module currently lacks the 
capability to provide timely and reliable financial information to 
program managers and cost estimators. 

Although NASA has, as discussed previously, begun to implement a 
corrective action plan to engage key stakeholders in developing a 
complete and accurate set of user requirements and in reengineering its 
acquisition management processes, the agency has not prepared a 
detailed plan for its systems to meet the requirements of FFMIA. While 
NASA's Financial Management Improvement Strategy clearly expresses the 
need to properly record upward and downward adjustments, improve 
documentation and audit trails, and address noncompliant cost 
practices, little explanation of how the agency intends to accomplish 
these goals is provided. Similarly, while the Financial Leadership Plan 
provides the Office of the CFO's vision for the financial management 
organization and its roles and responsibilities, it provides little 
detail on what tasks are required to fulfill that vision. 

Improvements Made to NASA's IFMP Life-Cycle Cost Estimate and Processes 
for Calculating Funding Reserves: 

In November 2003, we reported that the reliability of NASA's cost 
estimate for IFMP was uncertain because disciplined cost-estimating 
processes required by NASA and recognized as best practices--preparing 
a full-cost life-cycle cost estimate, using breakdowns of work to be 
performed, and providing a clear audit trail--were not used in 
preparing the estimate. We also reported that reserve funding for IFMP 
contingencies may be insufficient because the program did not 
consistently perform in-depth analyses of the potential cost impact of 
risks and unknowns specific to IFMP, as required by NASA guidance, nor 
did the program quantify the cost impact of identified risks or link 
its risks to funding reserves. We recommended that the program use 
processes dictated by best practices and NASA guidance for preparing 
and updating the life-cycle cost estimate as well as establish 
additional disciplined processes to better ensure that the agency more 
accurately estimates program cost and predicts the impact of possible 
undesired events, such as schedule slippage. 

Since we issued our report, NASA has taken steps to prepare a full 
life-cycle cost estimate for IFMP and improve the quality and credibility of 
the program's cost estimates for the remaining IFMP modules by (1) 
establishing a new WBS for IFMP, which better describes the work 
performed under the program, and (2) improving the audit trail 
supporting the program's life-cycle cost estimate, as we recommended. 
Similarly, NASA has made progress toward implementing our 
recommendations for ensuring that adequate funding is available for 
IFMP contingencies by (1) establishing a comprehensive risk evaluation 
methodology, which is used to facilitate the estimation and allocation 
of financial reserves; (2) requiring that the cost impact of high 
severity risks be analyzed and quantified using probabilistic software 
tools; and (3) establishing a clear relationship between the program's 
risk database and its financial reserves. While NASA has made good 
progress toward implementing our recommendations, additional work 
remains in order to fully implement most of the recommendations. 

Full Life-Cycle Cost Estimate for IFMP Not Yet Complete: 

In November 2003, we reported that the reliability of NASA's life-cycle 
cost estimate for IFMP was uncertain because disciplined processes 
required by NASA and recognized as best practices were not used in 
preparing the estimate. One of these processes was the preparation of a 
life-cycle cost estimate on a full-cost basis--including all direct and 
indirect costs for planning, procurement, operations and maintenance, 
and disposal. Such an estimate is important for helping decision makers 
better assess all the costs associated with operating and implementing 
a program and for controlling program costs. However, NASA's life-cycle 
cost estimate for IFMP was incomplete and did not include the full cost 
likely to be incurred during the life of the program. For example, the 
life-cycle cost estimate did not include the cost to operate and 
maintain the system beyond 2010;[Footnote 15] the cost of retiring the 
system; enterprise travel costs, which would be provided monthly by the 
NASA centers; and the cost of nonleased NASA facilities for housing 
IFMP. 

Since our review, NASA has made significant progress in preparing a 
full life-cycle cost estimate. Based on industry best practices, NASA 
determined that IFMP's life cycle spans from program inception in 1999 
through 2026 and is preparing the estimate according to this life 
cycle. The estimate also includes the full-cost categories required by 
NASA full-cost guidance, as well as disposal costs. However, agency 
officials agree that the estimate is still a work in progress. Over a 
roughly 2-week period, NASA provided us with four versions of the 
draft estimate. Our review of the last version indicated that there 
were still numerous errors in transferring data from the sources 
provided and cases where sources were not provided to support portions 
of the estimate. At the time of our review, NASA was still working to 
resolve discrepancies before finalizing the estimate. 

Current WBS Structure Not Used to Estimate Costs for All Remaining 
Modules: 

We also reported in 2003 that NASA did not consistently use breakdowns 
of the work to be performed--or WBS[Footnote 16]--in preparing the cost 
estimates for the IFMP modules, as recommended by NASA guidance. 
Without using the WBS as a structured approach to prepare the cost 
estimate, NASA cannot ensure that all costs are accounted for. 

Although NASA recently updated its schedule management framework for 
IFMP and included a new WBS that better reflects the work to be 
performed, the agency has not prepared cost estimates for all remaining 
modules using the new WBS. Instead, only one of the three remaining 
modules--Integrated Asset Management--has been prepared using the new 
WBS. Further, this WBS estimate is incomplete, as it does not address 
the central implementation element. In addition, it only reflects 
procurement costs and does not include integration project costs, civil 
service salaries and travel, general and administrative costs, or 
service pool costs. Likewise, the WBS estimates for the Contract 
Management and Labor Distribution System modules prepared using the old 
WBS structure were either incomplete or incorrect. According to a 
program official, the WBS estimates for the three remaining modules 
will be updated and prepared in accordance with the new WBS as a part 
of the fiscal year 2007 budget cycle. 

NASA Is on the Right Track to Provide an Audit Trail to Support the 
Life-Cycle Cost Estimate: 

In 2003, we also reported that in cases where the WBS was used to 
prepare the cost estimates for IFMP modules, NASA did not always 
provide a clear audit trail between the WBS estimate and the life-cycle 
cost estimate. Having a clear audit trail is among the Software 
Engineering Institute's (SEI)[Footnote 17] requisites for producing 
reliable cost estimates. Without a clear audit trail, it is difficult 
to determine whether differences between the detailed WBS estimates and 
the official program cost estimate are appropriate. 

NASA has made progress in providing an audit trail to support the 
life-cycle cost estimate it is preparing. For example, NASA drafted a 
document to accompany its life-cycle cost estimate that explains the 
methodology, assumptions, and data sources for the estimate. Also, in 
preparing the detailed spreadsheets to accumulate costs, the program 
added a column listing the data sources used. However, the detailed WBS 
estimates provided do not yet track clearly to the program's life-cycle 
cost estimate. Although additional work remains, we believe NASA is on 
the right track to fully implement this recommendation. 

Progress Made in Establishing a Comprehensive Risk Evaluation 
Methodology: 

In addition to our concerns about the reliability of NASA's life-cycle 
cost estimate, we reported in 2003 that NASA did not consistently 
perform in-depth analyses of the potential cost impact of risks and 
unknowns specific to IFMP, as required by NASA guidance. Instead, the 
agency established funding reserves on the basis of reserve levels set 
by other high-risk NASA programs. NASA guidance requires cost 
contingencies to be tailored to the specific risks associated with a 
particular program or project and suggests that tools such as 
probabilistic risk assessment can help in analyzing risks. As we 
reported in 2003, without in-depth analyses of the potential cost 
impact of risks and unknowns specific to IFMP, NASA cannot ensure that 
the funding set aside for IFMP contingencies is sufficient. 

Since issuance of our 2003 report, NASA has made progress toward 
implementing our recommendation to utilize a systematic, logical, and 
comprehensive tool in establishing the level of financial reserves for 
the remaining module projects and tailoring the analysis to risks 
specific to IFMP. NASA has established a comprehensive risk evaluation 
methodology, which is used to facilitate the estimation and allocation 
of financial reserves. NASA incorporated the methodology in its Program 
Risk Management Framework, which it has also updated. A key part of the 
methodology employs a probabilistic risk tool--Crystal Ball--for 
setting IFMP-specific risk-based reserves. However, NASA has not yet 
used this 
probabilistic risk tool to estimate financial reserves for all 
remaining module projects. Crystal Ball is a commercial off-the-shelf 
(COTS) forecasting and 
simulation tool that allows the prediction of a range of possibilities 
based on assumptions. NASA relies on the tool's Monte Carlo simulation 
capabilities to add a level of rigor to the reserves calculation 
process. NASA has developed and approved a risk reserves template that 
the program and projects are required to follow and complete in 
calculating program- and project-level reserves funding. While the new 
methodology and probabilistic risk tool were used to set reserve levels 
for IFMP for the fiscal year 2006 budget cycle submission, the risk 
tool was not used in setting the reserves for the remaining module 
projects. IFMP program officials stated that this was because the 
projects submitted their fiscal year 2006 budgets before IFMP began 
using the risk tool. IFMP program officials said that NASA plans to use 
it in setting reserves for all of the projects for the fiscal year 2007 
budget cycle and use those reserves in the fiscal year 2007 budget 
submission. 
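The Monte Carlo approach described above can be illustrated with a 
short sketch. The following Python code is a simplified, hypothetical 
example of risk-based reserve setting, not NASA's actual Crystal Ball 
model: the risk list, the probabilities, the triangular cost-impact 
distributions, and the 70 percent confidence level are all assumed 
figures chosen for illustration.

```python
import random

# Hypothetical risk register (figures in $M, invented for illustration).
# Each risk has a probability of occurrence and a triangular
# (low, most likely, high) cost-impact distribution.
risks = [
    {"p": 0.6, "low": 1.0, "likely": 2.5, "high": 6.0},
    {"p": 0.3, "low": 0.5, "likely": 1.0, "high": 4.0},
    {"p": 0.8, "low": 2.0, "likely": 3.0, "high": 5.0},
]

def simulate_total_cost(risks, rng):
    """One Monte Carlo trial: sum the cost of each risk that 'occurs'."""
    total = 0.0
    for r in risks:
        if rng.random() < r["p"]:
            # random.triangular takes (low, high, mode).
            total += rng.triangular(r["low"], r["high"], r["likely"])
    return total

rng = random.Random(42)  # fixed seed so the sketch is repeatable
trials = sorted(simulate_total_cost(risks, rng) for _ in range(10_000))

# Set the reserve at a chosen confidence level, here the 70th percentile
# of the simulated cost distribution.
reserve_70 = trials[int(0.70 * len(trials))]
print(f"Risk-based reserve at 70% confidence: ${reserve_70:.1f}M")
```

In practice, a program would draw these inputs from its risk database 
and risk reserves templates, which is the linkage between identified 
risks and funding reserves discussed later in this report.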

Progress Made, but Cost Impact of Risk Not Quantified Consistently: 

We further reported in 2003 that NASA typically did not quantify the 
cost impact of identified risks. According to SEI guidance, estimating 
the potential cost impact for all identified risks is an element of 
good estimating practice. Quantifying the cost impact of identified 
risks helps programs develop realistic budget estimates. As we reported 
in 2003, without estimating the potential cost impact of its IFMP 
risks, NASA cannot determine whether it has sufficient reserves to 
cover such risks. 

NASA has taken positive steps to implement our recommendation to 
quantify the cost impact of risks with a high likelihood of occurrence. 
Specifically, NASA now requires that the cost impact of high severity 
risks be analyzed more consistently through the use of standardized 
risk reserves templates and be quantified through the use of a more 
rigorous methodology and probabilistic risk tool. However, as we noted 
previously, NASA used a risk tool in calculating risk-based reserves 
for IFMP for the fiscal year 2006 budget cycle but not for the 
individual projects. While NASA did quantify the cost impact of high 
severity risks for individual projects, it did not do so using the 
tool, nor did it provide us other documentation that evidenced the 
projects' use of the tool. IFMP officials stated that NASA plans to use 
the risk tool along with the new methodology to quantify the cost 
impact of high severity risks for the projects during the fiscal year 
2007 budget cycle and reflect the established reserve levels in the 
fiscal year 2007 IFMP budget submission. 

NASA Has Established a Clear Link between IFMP's Risk Database and 
Financial Reserves: 

In addition to not typically quantifying the cost impact of identified 
risks, we reported in 2003 that NASA did not consistently link 
identified risks to funding reserves. Linking risks to funding reserves 
helps to ensure that funds are available should the risk occur. 
Moreover, quantifying the cost impact of identified risks and clearly 
and consistently linking the risk database to funding reserves helps 
programs develop realistic budget estimates. 

As we recommended in our November 2003 report, NASA has successfully 
established the link between the program's risk database and financial 
reserves. Specifically, NASA's new risk/reserves methodology ensures 
that traceability is maintained through the use of the risk reserve 
templates that assist the program and projects in determining the high 
severity risks and the direct relationship of risk and reserve setting. 
We also observed this relationship in the IFMP office's reserve 
template, which included the estimated cost of high severity risks, and 
in NASA's reflection of that estimated cost in the program office's 
budget submission for the fiscal year 2006 budget cycle. 

Conclusion: 

NASA has begun to implement some of our recommendations related to its 
acquisition and implementation of IFMP. However, progress in 
implementing our recommendations has been limited and slow in coming. 
The longer NASA takes to implement our recommendations, the greater the 
risk that IFMP will require costly and time-consuming rework to perform 
as intended. Because NASA did not adopt disciplined acquisition and 
implementation practices from the outset and has yet to fully implement 
our recommendations, it has been forced to take actions that should 
have been accomplished prior to implementation--causing the agency to 
unnecessarily invest time and resources to rework already-deployed 
system components in order to produce a system that meets user needs. 
By expeditiously implementing each of our recommendations, NASA has the 
opportunity to minimize the impact of past mistakes and begin to reap 
the benefits of operating with an integrated financial management 
system sooner. The longer NASA waits to fully implement our 
recommendations, the greater the risk is that the agency will continue 
to operate a system that does less than promised and costs more than 
expected. 

Recommendation for Executive Action: 

Given the significance of the remaining problems with IFMP, we 
recommend that the NASA Administrator direct the IFMP Program Executive 
Officer to develop an integrated enterprise master schedule and 
milestones in coordination with the Office of the CFO, OCE, and the 
Program Analysis and Evaluation Office. The schedule, developed in the 
context of modernized business processes and improved operations, 
should include the improvement activities and plans already in place, 
dates for completion, how progress will be measured, and clear 
accountability for each action not completed in a timely and effective 
manner. 

Agency Comments and Our Evaluation: 

In commenting on a draft of this report, NASA agreed with the intent of 
our recommendation but expressed concern that the recommendation might 
be misunderstood. NASA suggested that instead of recommending that NASA 
develop an overall corrective action plan to address the weaknesses 
identified, we recommend that NASA develop an integrated enterprise 
master schedule and milestones. We agree with NASA that the reworded 
recommendation captures the intent of our original recommendation and 
have incorporated this change into our final report. 

In its written comments, which are reprinted in enclosure III, NASA 
also expressed its concern that the nomenclature we used to describe 
NASA's progress toward implementing our recommendations was potentially 
ambiguous. Specifically, NASA raised concerns that our characterization 
of certain recommendations as "open" did not appropriately recognize 
the full extent of the agency's effort and suggested that we use 
instead "partially implemented" or, whenever appropriate, "closed." We 
disagree with NASA's assessment and continue to believe that our 
characterization of NASA's progress is appropriate. As discussed 
previously, our criteria for assessing the extent of implementation of 
our recommendations are as follows. We considered a recommendation 
closed when NASA provided us with documentation that demonstrated that 
it had fully addressed the concerns we raised in our prior reports. 
Recognizing that many of our recommendations may take considerable time 
and effort to fully implement, we considered a recommendation to be 
partially implemented if the documentation provided indicated that NASA 
had made significant progress addressing our concerns. For 
recommendations we consider open, NASA's documentation indicated that 
the agency was either in the very early planning stages or had not yet 
begun to implement the recommendation. 

In its comments, NASA stated that it has defined and implemented a 
methodology for software component dependency analysis that largely 
mirrors the elements of SEI's approach for performing such 
analysis.[Footnote 18] These elements include a design process that 
allocates requirements among the various commercial components that 
constitute a given system design option and using iterative prototyping 
to assess the interactions among these components. NASA also stated 
that the prototyping that it performed on an IFMP module (i.e., 
contract management module) was not limited to basic integration 
scenarios as we reported, but rather included end-to-end processes and 
detailed key accounting validations. The agency concluded that--based 
on its methodology and proven track record in implementing it on one 
module--our two recommendations relating to component dependency 
analysis should be considered "closed." 

We disagree. Our review of the methodology and supporting documentation 
provided, observation of the prototyping demonstration, and interviews 
with IFMP officials showed that while the agency had performed 
integration activities and produced artifacts and system development 
products, the methodology is incomplete in that it does not include a 
defined design process relevant to component dependency analysis. As a 
result, the methodology limits visibility into the information flows, 
documentation requirements, product standards, activities, commitment 
status, events, assessment methods, and roles and responsibilities 
needed to systematically define the progressive discovery, analysis, 
tracking, and resolution of component dependencies and their life-cycle 
relationships to good engineering and management decisions. With 
respect to the complexity of the integration scenarios that were 
prototyped, we found that although the scenarios were becoming more 
complex, critical and complex scenarios had yet to be prototyped. 
Specifically, the scenarios being evaluated were primarily oriented to 
proving basic feasibility. 
Further, NASA's own documentation stated that complex but required 
integration scenarios (e.g., modifications) were not evaluated; rather, 
the evaluation covered basic integration that appeared achievable. 
regard to its proven track record in implementing these procedures, 
NASA officials stated that the methodology was being applied for the 
first time to evaluate the contract management module, and NASA's 
documentation characterizes the methodology as an overview. We agree 
that the level of specificity contained in the methodology description 
provides only an overview, and believe that one application of such a 
methodology does not constitute a proven track record. 

With respect to our assessment of NASA's enterprise architecture, NASA 
made two primary points. First, NASA stated that it respectfully 
challenges our 2003 position that the agency acquired and implemented 
significant components of IFMP without having and using an enterprise 
architecture to guide and constrain the program. Second, NASA stated 
that since 2003 it has made extensive progress in adopting key 
architecture management best practices recommended by GAO and OMB, and 
it has continued to refine and expand the content of its enterprise 
architecture. As such, it stated that--based on Version 3.1 of the NASA 
enterprise architecture--20 of our 22 prior architecture 
recommendations are "closed." According to NASA, its actions since 2003 
provide a solid foundation for the agency's modernization efforts, 
including IFMP, mitigating the risk of investments being implemented in 
a way that does not ensure system integration, interoperability, and 
optimized mission support. 

We disagree with both of NASA's points. First, in response to NASA's 
position challenging our finding that the agency had acquired and 
implemented significant components of IFMP without having and using an 
enterprise architecture, we note that NASA concurred with all of the 
recommendations in our 2003 report.[Footnote 19] In that report, we 
stated that NASA had either implemented or was in the process of 
implementing six of nine IFMP modules, and that the enterprise 
architecture, which NASA, at that time, had just recently begun to 
develop, lacked sufficient detail to guide and constrain investment 
decisions. Accordingly, we reported that significant IFMP components 
had been acquired and implemented outside the context of an enterprise 
architecture. At that time, NASA's CTO, who is currently the Deputy 
CIO/CTO, concurred with our position that the architecture products 
used to acquire and implement the six IFMP modules did not contain 
sufficient scope and content. Second, our assessment of NASA's efforts 
to address our prior recommendations is based on the totality of the 
evidentiary support that NASA has provided. The evidence provided to us 
during the course of our review and used as a basis for our analysis 
showed that NASA has implemented 1 and partially implemented 4 of our 
22 enterprise architecture recommendations. The evidence that NASA 
provided us related to Version 3.0 of its architecture and was in 
response to our request for the most current information. NASA neither 
cited the existence of Version 3.1 nor provided documentation 
associated with it. In addition, NASA has yet to respond to our request 
for the documents referenced in enclosure 3 of its comments (e.g., 
Master Work Plan and investment reviews) or Version 3.1 of the 
architecture. 

Finally, as part of NASA's written comments, NASA included four 
enclosures, as follows: (1) the agency's response to each section of 
our report, (2) NASA's position on each of our prior recommendations, 
(3) a synopsis of significant enterprise architecture accomplishments, 
and (4) a briefing slide synopsis of significant enterprise 
architecture accomplishments. NASA's briefing slide synopsis of 
significant enterprise architecture accomplishments restates the 
information contained in the third enclosure; therefore, we did not 
reprint this document. 

As agreed with your office, unless you publicly announce its contents 
earlier, we will not distribute this report further until 30 days from 
its date. At that time, we will send copies to interested congressional 
committees, the Administrator of NASA, and the Director of the Office 
of Management and Budget. We will make copies available to others upon 
request. In addition, the report will be available at no charge on the 
GAO Web site at [Hyperlink, http://www.gao.gov]. 

If you or your staff have any questions concerning this report, please 
contact Gregory D. Kutz at (202) 512-9095 or [Hyperlink, 
kutzg@gao.gov]. Contact points for our Offices of Congressional 
Relations and Public Affairs can be found on the last page of this 
report. Key contributors to this report are acknowledged in enclosure 
IV. 

Signed by: 

Gregory D. Kutz: 
Managing Director: 
Forensic Audit and Special Investigations: 

Signed by: 

Allen Li: 
Director: 
Acquisition and Sourcing Management: 

Signed by: 

Randolph C. Hite: 
Director: 
Information Technology Architecture and Systems Issues: 

Signed by: 

Keith A. Rhodes: 
Chief Technologist: 
Applied Research and Methods: 

[End of section] 

Enclosure I: Objectives, Scope, and Methodology: 

In fiscal years 2003 and 2004, we issued four reports on the National 
Aeronautics and Space Administration's (NASA) Integrated Financial 
Management Program (IFMP) and made a number of recommendations for 
improving the acquisition and implementation strategy for IFMP. The 
focus of this report was to determine the extent to which NASA adopted 
the recommendations made in each of the related reports. This 
engagement was carried out and managed jointly by GAO's Financial 
Management and Assurance, Information Technology Architecture and 
Systems Issues, Acquisition and Sourcing Management, and Applied 
Research and Methods teams. Each team interviewed the appropriate NASA 
officials and obtained documentation on NASA's progress toward 
implementing our recommendations. Based on our assessment of the 
documentation provided, we determined the extent to which NASA had 
implemented our recommendations. 

To determine whether NASA had established and implemented a commercial 
component dependency methodology and evaluated the suitability of 
already acquired but not yet implemented IFMP component products, we 
(1) obtained IFMP documentation for the requirements, design, and risk 
management processes with respect to managing commercial-off-the-shelf 
dependencies; (2) interviewed NASA IFMP officials to obtain an 
understanding of the methodology being employed by NASA to perform 
these analyses and the prototype being used; (3) observed NASA's 
demonstration of the prototype being used; and (4) reviewed 
documentation showing the preliminary results of NASA's evaluation of 
the contract management module and ongoing dependency analysis of the 
interactions between this module and already-implemented IFMP component 
products. 

To determine whether NASA had implemented our recommendations related 
to the development and use of an enterprise architecture, we (1) 
interviewed NASA's Deputy Chief Information Officer (CIO)/Chief 
Technology Officer (CTO) and (2) analyzed documents obtained from this 
official, including a verification and validation report and draft 
configuration management plan, to assess whether the agency had 
established effective architecture management processes and 
incorporated missing architecture content requirements. We used our 
Enterprise Architecture Management Maturity Framework[Footnote 20] to 
assess the agency's efforts. 

Because NASA officials acknowledged that they had not developed a 
corrective action plan to mitigate the risks associated with relying on 
already-deployed IFMP commercial components, we interviewed IFMP 
program officials to understand their rationale for not developing such 
a plan. 

To determine whether NASA had engaged program managers and cost 
estimators to identify and document program management needs and 
reengineered its acquisition management processes, we interviewed 
officials from the Office of the Chief Engineer, the Office of the 
Chief Financial Officer (CFO), and the IFMP program office. We also 
obtained and analyzed NASA's recently updated procedural requirements 
for program and project management (NPR: 7120.5C) and Project 
Management Information Improvement (PMI2) planning documents-- 
including proposed system requirements, coding structures, and 
timelines. Because NASA management acknowledged that they have not yet 
reengineered the agency's acquisition management process, we documented 
the agency's plans to implement this recommendation through PMI2. 

Because NASA management acknowledged that our recommendations related 
to the Federal Financial Management Improvement Act of 1996 (FFMIA) 
compliance remain open, we documented the agency's current plans to 
produce systems that meet the requirements of FFMIA by obtaining and 
analyzing the Office of the CFO's Financial Leadership Plan and the 
Financial Management Improvement Strategy. 

To determine whether NASA had improved its requirements management and 
testing processes as we recommended, we interviewed IFMP program 
officials and obtained and analyzed (1) documentation from NASA's 
requirement tracking system for selected requirements to determine that 
requirements are consistent, verifiable, and traceable, and contain the 
necessary specificity; (2) examples of NASA's regression testing 
results; and (3) the metrics used by NASA to monitor IFMP's stability. 

To determine whether NASA had implemented our recommendations regarding 
IFMP program life-cycle cost estimates, we obtained and analyzed NASA's 
current life-cycle cost estimate to (1) determine whether estimates for 
the remaining modules are prepared in accordance with the current Work 
Breakdown Structure (WBS); (2) verify the integrity of the life-cycle 
cost estimate by footing, cross-footing, and analyzing components; (3) 
evaluate the audit trail by tracing amounts in the life-cycle cost 
estimate to the module WBS estimates and other supporting sources; and 
(4) ensure that the estimate was prepared in accordance with NASA's 
life-cycle cost and full-cost guidance. In addition, we interviewed 
IFMP officials to determine whether NASA is using a systematic, 
logical, and comprehensive tool, such as probabilistic risk assessment, 
in establishing financial reserves for IFMP-specific risks for the 
remaining modules--which include the Integrated Asset Management, 
Contract Management, and Labor Distribution System. 
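The footing and cross-footing checks mentioned above can be sketched in 
a few lines of Python. The module names echo those in this report, but 
the cost figures, the row and column totals, and the table layout are 
invented solely to illustrate the verification technique; the actual 
IFMP spreadsheets are far more detailed.

```python
# Detailed estimates (in $M) by module (rows) and fiscal year (columns);
# all amounts below are hypothetical.
detail = {
    "Integrated Asset Management": [4.0, 3.5, 2.0],
    "Contract Management":         [6.0, 5.0, 3.0],
    "Labor Distribution System":   [2.5, 2.0, 1.5],
}
reported_row_totals = {"Integrated Asset Management": 9.5,
                       "Contract Management": 14.0,
                       "Labor Distribution System": 6.0}
reported_column_totals = [12.5, 10.5, 6.5]
reported_grand_total = 29.5

# Foot: each module's yearly amounts must sum to its reported total.
for module, years in detail.items():
    assert abs(sum(years) - reported_row_totals[module]) < 1e-9, module

# Cross-foot: column sums and the sum of row totals must both agree
# with the grand total, so an error cannot hide in one direction only.
column_sums = [sum(col) for col in zip(*detail.values())]
assert all(abs(a - b) < 1e-9
           for a, b in zip(column_sums, reported_column_totals))
assert abs(sum(reported_row_totals.values()) - reported_grand_total) < 1e-9
assert abs(sum(column_sums) - reported_grand_total) < 1e-9
print("Life-cycle cost estimate foots and cross-foots.")
```

A failed assertion here would flag exactly the kind of unexplained 
difference between detailed estimates and summary totals that an audit 
trail is meant to resolve.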

To determine whether NASA had implemented our recommendations related 
to funding reserves, we (1) analyzed NASA's program/projects reserve 
templates, related budget documentation, and status reports to verify 
the extent to which reserve levels are being established using any new 
or enhanced risk evaluation methodology and probabilistic risk tool and 
(2) observed a demonstration of the probabilistic risk tool that NASA 
is now using. In addition, we determined the extent to which NASA 
reflected reserves established under its new or enhanced methodology 
and risk tool in its fiscal year 2006 budget submissions, planned for 
doing so in the fiscal year 2007 IFMP budget submissions, or both. 
Using the reserve templates and related budget documentation, we 
determined the extent to which NASA has quantified the cost impact of 
high severity risks for IFMP and whether a direct relationship of risk 
and reserve setting was established in the templates or other 
databases. 

The audit work was conducted from March 2005 through June 2005 in 
accordance with U.S. generally accepted government auditing standards. 

[End of section] 

Enclosure II: Status of Recommendations: 

Table 2 summarizes the status of NASA's efforts to implement 
recommendations we made in a series of reports issued during fiscal 
years 2003 and 2004 on IFMP. We considered a recommendation closed when 
NASA provided us with documentation that demonstrated that it had fully 
addressed the concerns we raised in our prior reports. Recognizing that 
many of our recommendations may take considerable time and effort to 
fully implement, we considered a recommendation to be partially 
implemented if the documentation provided indicated that NASA had made 
significant progress addressing our concerns. For recommendations we 
consider open, NASA's documentation indicated that the agency was 
either in the very early planning stages or had not yet begun to 
implement the recommendation. 

Table 2: Status of Recommendations from GAO Reports on IFMP: 

Recommendations to improve NASA's acquisition management practices; GAO-
03-507. 

(1) Establish and implement a methodology for commercial system 
component dependency analysis and decision making. Partially 
implemented. 

(2) Evaluate the suitability of already-acquired, but not yet 
implemented, IFMP component products within the context of a component 
dependency analysis methodology. Partially implemented. 

Recommendations regarding NASA's enterprise architecture; GAO-04-43. 

(1) Establish a NASA enterprise architecture policy and designate a 
NASA architecture board, or comparable body, that is made up of agency 
executives who are responsible and accountable for developing and 
maintaining the architecture. Partially implemented. 

(2) Ensure that the architecture content requirements identified in 
this report are satisfied by first determining the extent to which 
NASA's initial release of an enterprise architecture satisfies these 
content requirements and then developing and approving a plan for 
incorporating any content that is missing. Open. 

(3) Ensure that the program's plans are aligned with the initial and 
subsequent versions of the enterprise architecture. Open. 

(4) Immediately map already-implemented IFMP components to the agency's 
enterprise architecture and report to the Program Executive Officer any 
instances of misalignment, the associated risks, and proposed 
corrective actions. Open. 

(5) Develop corrective action plans that include specific milestones, 
cost estimates, and detailed actions to be taken to align the program 
with the enterprise architecture. Open. 

(6) In developing the architecture, the board and the CIO should; (a) 
Establish a written and approved policy for architecture development. 
Open. 

(b) Place enterprise architecture products under configuration 
management. Open. 

(c) Ensure that progress against architecture plans is measured and 
reported. Open. 

(7) In completing the architecture, the board and the CIO should; (a) 
Establish a written and approved policy for architecture maintenance. 
Open. 

(b) Ensure that enterprise architecture products and management 
processes undergo independent verification and validation. Partially 
implemented. 

(c) Ensure that architecture products describe the enterprise's 
business and the data, application, and technology that support it. 
Open. 

(d) Ensure that enterprise architecture products describe the "As Is" 
environment, the "To Be" environment, and a sequencing plan. Open. 

(e) Ensure that business, performance, data, application, and 
technology descriptions address security. Open. 

(f) Ensure that the CIO approves the enterprise architecture. Closed. 

(g) Ensure that the steering committee and/or the investment review 
board has approved the current version of the enterprise architecture. 
Partially implemented. 

(h) Measure and report on the quality of enterprise architecture 
products. Open. 

(8) In implementing the architecture, the board and the CIO should; (a) 
Establish a written and approved policy for IT investment compliance 
with the enterprise architecture. Open. 

(b) Ensure that the enterprise architecture is an integral component of 
IT investment management processes. Open. 

(c) Ensure that IT investments comply with the enterprise architecture. 
Open. 

(d) Obtain Administrator approval of each enterprise architecture 
version. Partially implemented. 

(e) Measure and report enterprise architecture return on investment. 
Open. 

(f) Measure and report on enterprise architecture compliance. Open. 

Recommendations to mitigate risk associated with relying on already- 
deployed components; GAO-03-507.[A] 

(1) Identifying known and potential risks. Open. 

(2) Assessing the severity of the risks on the basis of probability and 
impact. Open. 

(3) Developing risk mitigation strategies. Open. 

(4) Assigning responsibility for implementing the strategies. Open. 

(5) Tracking progress in implementing these strategies. Open. 

(6) Reporting progress to relevant congressional committees. Open. 

Recommendations regarding identifying program management needs and 
reengineering business processes; GAO-03-507. 

(1) Engage stakeholders--including program managers, cost estimators 
and the Congress--in developing a complete and correct set of user 
requirements. Closed. 

(2) Reengineer acquisition management processes, particularly with 
respect to the consistency and detail of budget and actual cost and 
schedule data provided by contractors. Open. 

Recommendations to improve NASA requirements management and testing 
processes; GAO-03-507. 

(1) Develop and properly document requirements. Partially implemented. 

(2) Conduct thorough regression testing before placing modified 
components into production. Partially implemented. 

(3) Implement a metrics program that will identify and address the root 
causes of system defects. Partially implemented. 

Recommendations to improve external financial reporting; GAO-04-151. 

Implement a corrective action plan that will produce a financial 
management system that complies substantially with the requirements of 
FFMIA. This includes capabilities to produce timely, reliable, and 
useful financial information related to; (1) property, plant, 
equipment, and materials. Open. 

(2) budgetary information, including adjustments to prior year 
obligations. Open. 

(3) accounts payable and accrued costs. Open. 

(4) the full cost of programs for financial reporting purposes. Open. 

Recommendations regarding IFMP program life-cycle cost estimates and 
funding reserves; GAO-04-118. 

(1) Prepare a full life-cycle cost estimate for the entire IFMP that 
meets NASA's life-cycle cost and full cost guidance. Partially 
implemented. 

(2) Prepare cost estimates by the current Work Breakdown Structure for 
the remaining modules. Open. 

(3) Provide a clear audit trail between detailed WBS estimates and the 
program's cost estimate for the remaining modules. Partially 
implemented. 

(4) Utilize a systematic, logical, and comprehensive tool, such as 
Probabilistic Risk Assessment, in establishing the level of financial 
reserves for the remaining module projects and tailor the analysis to 
risks specific to IFMP. Partially implemented. 

(5) Quantify the cost impact of at least all risks with a high 
likelihood of occurrence and a high magnitude of impact to facilitate 
the continuing analysis necessary to maintain adequate reserve levels. 
Partially implemented. 

(6) Establish a clear link between the program's risk database and 
financial reserves. Closed. 

Source: GAO. 

[A] Although NASA did not develop a corrective action plan to identify 
known and potential risks, NASA has begun to take steps to implement 
our recommendations to improve its requirements management and cost- 
estimating processes, which if implemented properly, could help to 
mitigate the risk associated with relying on already-deployed IFMP 
components. 

[End of table] 

[End of section] 

Enclosure III: Comments from the National Aeronautics and Space 
Administration: 

National Aeronautics and Space Administration: 
Office of the Administrator: 
Washington, DC 20546-0001: 

July 27, 2005: 

Mr. Gregory Kutz: 
Managing Director: 
Forensic Audits and Special Investigations: 
United States Government Accountability Office: 
Washington, DC 20548: 

Dear Mr. Kutz: 

Thank you for the opportunity to review and comment on the draft report 
entitled, Some Progress Made Toward Implementing GAO Recommendations 
Related to NASA's Integrated Financial Management Program (IFMP) (GAO- 
05-799R), dated June 22, 2005. I appreciate the Congressional and 
General Accountability Office's (GAO) interest in this vital program as 
it continues to improve NASA's business environment, which is also 
fully consistent with my own personal priorities for enhancing the 
Agency's ability to better manage and report on its mission and goals. 

We truly appreciate GAO's recognition that progress has been made 
against its previous recommendations. Moreover, we believe that 
significant progress has now been made in several of the areas 
identified by GAO in the past few years. As seen in Enclosures 1, 3, 
and 4 of this letter, many actions and efforts have been successfully 
completed since the GAO issued its earlier reports in 2003 and 2004. We 
are concerned with the potentially ambiguous "Open" nomenclature used 
to classify most of the recommendations and respectfully suggest that 
the GAO, in producing its final report, use the possibly more accurate 
nomenclature of "Partially Implemented" or, whenever appropriate, 
"Closed." NASA's response to each section of the 
draft GAO report is provided within Enclosure 1, and our position on 
each of the recommendations is summarized in Enclosure 2. 

Finally, while NASA understands the intent of the draft report's new 
recommendation to "develop an overall corrective action plan," as noted 
above, we are concerned that the intent of this recommendation might be 
misunderstood. We would respectfully propose instead that NASA "develop 
an integrated enterprise master schedule and milestones, encompassing 
and tightly coordinating with the improvement activities and plans 
already in place from the Office of the Chief Financial Officer, the 
Office of the Chief Engineer, and the Program Analysis and Evaluation 
Office," which NASA believes meets the letter and intent of this new 
recommendation. 

My point of contact for this matter is Mr. Bobby German, Program 
Director for NASA's Integrated Financial Management Program. He may be 
contacted by e-mail at bobby.german@nasa.gov, or by telephone at (202) 
358-2498. 

Sincerely, 

Signed by: 

Frederick D. Gregory: 
Deputy Administrator: 

4 Enclosures: 

cc: 

Program Executive Officer/Mr. Ciganer; 
Chief Information Officer/Ms. Dunnington; 
Associate Administrator (Acting)/Mr. Geveden; 
Chief Financial Officer/Ms. Sykes; 
Office of Chief Financial Officer/Mr. Blair; 
Program Director/Mr. German; 
Office of the Chief Information Officer/Mr. McManus. 

Enclosure 1 - Detailed Responses to Draft GAO Report (GAO-05-799R): 

Response to GAO report section titled "NASA is Taking Steps to Assess 
Integration Risk for IFMP Commercial Components" 

The Integrated Financial Management Program (IFMP) has defined and 
implemented a methodology for software component dependency analysis. 
The elements of this approach largely mirror those of the Software 
Engineering Institute (SEI). The table below identifies the major areas 
that are covered by our methodology. We believe that our methodology 
provides a framework that reduces the risks introduced by 
organizational complexity and the variance in project priorities and 
agendas. 

IFMP Software Component Dependency Methodology: 

Recommended Methodology Activities: Gap analysis between requirements 
and component capabilities; 
Equivalent IFMP Integration Project Office Methodology Activity: 
Systematic Gap Analysis; 
Activity Deliverables: 
* Detailed Gap Analysis Assessment; 
* Communication concerning gaps in the Preliminary Project Review; 
Organization with Primary Responsibility: Agency System Implementation 
Project Office (ASIPO) Project Team. 

Recommended Methodology Activities: Risk Management; 
Equivalent IFMP Integration Project Office Methodology Activity: Risk 
Management Plan, On-going Risk Reviews; 
Activity Deliverables: 
* Prioritized list of risks; 
* Mitigation strategies and action plans; 
* Updated Risk Management database; 
Organization with Primary Responsibility: ASIPO Project Team. 

Recommended Methodology Activities: Allocating requirements among the 
various commercial components that comprise a given system design 
option; 
Equivalent IFMP Integration Project Office Methodology Activity: 
Requirements Management Plan; 
Activity Deliverables: 
* Project Scope Document; 
* Requirements in RequisitePro (Level I - Level IV, as well as Use Case 
requirements); 
Organization with Primary Responsibility: ASIPO Project Team. 

Recommended Methodology Activities: Defining the interactions among the 
various commercial components to enable the processing of transactions; 
Equivalent IFMP Integration Project Office Methodology Activity: 
Initial Integration Workshop; Detailed Design Workshops; 
Activity Deliverables: 
* High-level "To Be" Business Workflows; 
* List of proposed interfaces; 
* Detailed use cases and software requirement specifications; 
* Detailed list of requirements; 
* Maturity Level Designation; 
* Enterprise Application Integration Pattern Survey; 
Organization with Primary Responsibility: Integration Project Office. 

Recommended Methodology Activities: Interactions that affect data clean-
up and conversion activities; 
Equivalent IFMP Integration Project Office Methodology Activity: Data 
Conversion Workshops; 
Activity Deliverables: 
* Data Clean-up Plan; 
* List of proposed data conversions; 
* Functional designs; 
* Technical designs; 
Organization with Primary Responsibility: ASIPO Project Team. 

Recommended Methodology Activities: Documenting commitments and 
Decisions; 
Equivalent IFMP Integration Project Office Methodology Activity: Proof 
of Concept Review; Critical Design Review; 
Activity Deliverables: 
* Documented approval to proceed; 
Organization with Primary Responsibility: ASIPO Project Team. 

Recommended Methodology Activities: Using iterative prototyping to 
assess the interactions among these components; 
Equivalent IFMP Integration Project Office Methodology Activity: Proof 
of Concept Phase; 
Activity Deliverables: 
* Use cases and software requirement specifications; 
* Working integrations for selected interfaces; 
Organization with Primary Responsibility: Integration Project Office. 

[End of table] 

In addition, it should be noted that the prototyping executed for the 
Contract Management Module was not limited to basic integration 
scenarios as suggested in the GAO report. It included the following end-
to-end business processes and detailed key accounting validations: 

1. PR sent to PRISM (Commitment recorded in SAP); 
2. PR validated in PRISM; 
3. Award Created in CMM; 
4. PO Sent to SAP (Obligation recorded in SAP, Commitment Liquidated in 
SAP); 
5. PO Content Validated in SAP; 
6. Accounting Validated as Correct (Funds Management and Special 
General Ledger updated correctly in SAP); 
7. Goods Receipt Completed (Cost recorded in SAP); 
8. Accounting Validated as Correct (Funds Management and Special 
General Ledger updated correctly in SAP); 
9. Invoice Completed (Disbursement recorded in SAP). 

These steps were all executed against the following scenarios: 

* Simple fixed-price PO from a one-line, one-account PR, Project Funds; 
* Simple ten-line PO from a ten-line supply PR, Project Funds; 
* Fixed Price PO from a one-line, one-account service PR, Cost Center 
Funds; 
* Multi-Line Fixed Price Contract from a PR with two service lines. 
Each service line has one accounting line, Project Funds; 
* Cost-Plus Fixed Fee Contract from a one-line, two-account service PR, 
Material Group "A", Project Funds; 
* Hybrid Contract from a three-line service PR. Each line has at least 
two account lines, Project Funds; 
* PO from a two-line PR. One line is for supplies and one line is for 
services. Each line has only one accounting line, Project Funds; 
* FPDS-NG fixed-price contract from a one-line PR with one accounting 
line; 
* FedBizOpps fixed-price contract from a one-line PR with one 
accounting line. 
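The funds-control bookkeeping that the validation steps above exercise 
(commitment on the PR, obligation with commitment liquidation on the 
PO, cost on goods receipt, disbursement on invoice) can be sketched as 
a minimal model. This is an illustrative sketch only, not NASA's actual 
SAP configuration; the class and method names are invented:

```python
# Hypothetical sketch of the funds-control lifecycle the prototype
# scenarios validate: a purchase request commits funds, the purchase
# order liquidates the commitment and records an obligation, goods
# receipt records cost, and the invoice records a disbursement.

class FundsLedger:
    def __init__(self, budget):
        self.budget = budget
        self.commitment = 0.0    # step 1: PR recorded
        self.obligation = 0.0    # step 4: PO recorded
        self.cost = 0.0          # step 7: goods receipt
        self.disbursement = 0.0  # step 9: invoice paid

    def record_pr(self, amount):
        assert self.available() >= amount, "insufficient funds"
        self.commitment += amount

    def record_po(self, amount):
        # Obligation recorded; the matching commitment is liquidated.
        self.commitment -= amount
        self.obligation += amount

    def goods_receipt(self, amount):
        self.cost += amount

    def pay_invoice(self, amount):
        self.disbursement += amount

    def available(self):
        return self.budget - self.commitment - self.obligation

    def validate(self):
        # Checks analogous to steps 6 and 8: no stage may exceed the
        # stage before it, and funds may not go negative.
        return (0 <= self.disbursement <= self.cost <= self.obligation
                and self.commitment >= 0 and self.available() >= 0)

ledger = FundsLedger(budget=100_000.0)
ledger.record_pr(40_000.0)      # commitment recorded
ledger.record_po(40_000.0)      # obligation; commitment liquidated
ledger.goods_receipt(40_000.0)  # cost recorded
ledger.pay_invoice(40_000.0)    # disbursement recorded
assert ledger.validate() and ledger.commitment == 0.0
```

Each prototype scenario above is, in effect, a different path through 
this lifecycle (multiple lines, multiple accounting splits), with the 
same invariants checked at each accounting validation step.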

Based upon the scope of our methodology, and the fact that we have a 
proven track record in implementing these procedures, NASA's position 
is that the two recommendations related to component dependency 
analysis are well beyond being "partially implemented," and should be 
considered "closed." 

Response to GAO report section titled "Limited Progress Made in 
Establishing an Enterprise Architecture to Guide Modernization Efforts" 

NASA respectfully challenges the GAO's assessment that NASA acquired 
and implemented significant components of IFMP without having and using 
an enterprise architecture to guide and constrain the IFMP program. In 
response to the GAO fiscal year 2003 review of IFMP, NASA acknowledged 
the value of a more mature and robust enterprise architecture. NASA has 
continued efforts to refine the NASA Enterprise Architecture (EA), 
including implementing key architecture program management structures 
and process controls (e.g., establishing an enterprise architecture 
program office and designating a chief architect). NASA has now 
established the required Capital Planning and Enterprise Architecture 
processes to ensure that the NASA EA is current and NASA program and 
projects are measured in a proactive manner against current documents. 
All responses contained in this letter and supporting documentation are 
based on Version 3.1 of the NASA EA. 

Over the last 18 months, NASA has made extensive progress in adopting 
key architecture management best practices recommended by GAO and the 
Office of Management and Budget (OMB). Enclosures 2, 3 and 4 contain 
summaries of the actions taken over the past ten months. NASA's EA 
program continues to mature and provide measurable value to the Agency. 
Enclosure 2 contains an element-by-element response to the GAO-04-43 
audit recommendations. 

NASA has taken key steps in direct response to GAO's recommendations 
including: 

* Establishing an architecture board made up of senior agency 
executives that is responsible and accountable for developing and 
maintaining the architecture. 

* Having the architecture board approve Version 3.0 of the 
architecture. 

* Having the NASA Administrator approve Version 3.0 of the 
architecture. 

* Establishing an independent verification and validation function to 
review the architecture and related management processes. 

NASA continues to expand the content of the EA, developing elements in 
a priority order based on the strategic goals of the Agency. 

For the past two years, NASA has matured the IFMP related portion of 
the NASA EA, ensuring that IFMP plans are aligned with the architecture 
and that acquisition and implementation activities are appropriate. The 
reviews conducted to date have not shown any instances of misalignment, 
and the NASA OCIO and CFO are currently reviewing soon-to-be- 
implemented modules (e.g., contract management) to assess the extent of 
alignment. In addition, 

* NASA has drafted a written policy guiding the development of the 
enterprise architecture. The policy was submitted for final approval in 
July 2005. 

* NASA has placed all enterprise architecture products under 
configuration management to maintain integrity and traceability and to 
control modifications or changes to the architecture products 
throughout their life cycles. 

* NASA has developed metrics to ensure that progress against 
architecture plans is measured and reported. NASA is measuring and 
reporting progress against approved architecture project management 
plans. 

* NASA has drafted a written policy for architecture maintenance. The 
policy was submitted for approval in July 2005. 

* NASA has ensured that the architecture products describe the 
enterprise in terms of business, performance, data, application, and 
technology. The NASA EA products describe the "As Is" environment, the 
"To Be" environment, and a sequencing plan to transition from the "As 
Is" to the "To Be." The EA business, performance, data, application, 
and technology descriptions all address security. 

* NASA has measured and reported on the quality of enterprise 
architecture products. ROI for NASA's EA program is reported as part of 
the OMB business case submission and reporting process. 

* NASA has completed efforts intended to ensure that the enterprise 
architecture is an integral component of IT investment management 
processes and that IT investments comply with the architecture. NASA 
has a clearly documented Capital Planning and Investment Control 
Process and a set of Enterprise Architecture (EA) review processes for 
conducting investment alignment reviews. NASA reviews proposed system 
investments for compliance with the architecture, and the results of 
these reviews are used to revise the policy and procedures, as well as 
the review process itself. 

* NASA has established and documented a detailed review process for 
enterprise architecture compliance and has completed ten EA reviews of 
projects and steady state services. 

NASA has developed the EA program management plans required to 
effectively manage the development, maintenance, and implementation of 
the Enterprise Architecture. As documented in Enclosures 3 and 4, the 
plans specify measurable goals and outcomes, the tasks to be performed 
to achieve these goals and outcomes, the resources (funding, staffing, 
and training) needed to perform the tasks, and the time frames within 
which the tasks will be performed. 

NASA has taken considerable action and made significant progress in 
addressing prior GAO recommendations. These actions provide a solid 
foundation for the Agency's modernization efforts, including IFMP, 
mitigating the risk that investments will be implemented in a way that 
does not adequately ensure system integration, interoperability, and 
optimized mission support. 

Response to GAO report section titled "NASA Did Not Develop a 
Corrective Action Plan To Mitigate the Risk of Relying on Already 
Deployed Components" 

The GAO's comment in this area is somewhat ambiguous. In its original 
report, GAO-03-507, dated April 2003, the GAO recommended that NASA 
develop and implement (1) a short-term plan to identify and mitigate 
the risks currently associated with relying on already deployed IFMP 
commercial components and (2) a longer term strategy for acquiring 
additional IFMP components that includes implementing a methodology for 
commercial system component dependency analysis. At the time that GAO 
recommended that a "short-term plan" be developed, NASA was still in 
the process of implementing the Core Financial system, and the GAO was 
concerned about requirements and testing processes. 

Since the time that GAO issued its report (GAO-03-507), NASA has fully 
implemented and stabilized the financial system, and is continuing to 
seek ways to improve its use of the system (e.g., re-engineered 
financial structures). Also, as noted in the current GAO draft report, 
NASA has made significant progress with respect to (1) component 
dependency analysis, (2) requirements management processes, (3) testing 
processes, and (4) risk management and evaluation processes. In short, 
we believe that we have the right processes in place related to this 
area of concern, and respectfully disagree with the GAO's assessment 
that the recommendations in this area are "open." 

Response to GAO report section titled "Progress Made Toward Identifying 
Program Management Needs But Process Reengineering Still Needed" 

NASA Has Made Significant Progress Toward Identifying Program 
Management Needs: 

As noted in the GAO report, NASA has engaged stakeholders to identify 
program management needs. We agree with the GAO's findings and 
assessment. 

Future Plans to Reengineer Acquisition Management Processes Are Key: 

The GAO adequately captured the multitude of plans and activities 
currently underway which address weaknesses in acquisition management 
processes. As noted, NASA has embarked on an ambitious endeavor to re- 
engineer and implement a new financial structure aligned with the 
Agency's technical work breakdown structure. This will be completed by 
October 1, 2005. NASA has also formed a "Business Integration Team," 
consisting of both internal and external experts, to review NASA's 
contractor cost reporting and associated cost accrual processes, and to 
implement the re-engineered changes by October 1, 2006. These and other 
process improvement efforts will establish the critical framework 
needed for more effective acquisition management practices. 

Response to GAO report section titled "Improvements Made To 
Requirements Management and Testing Process" 

As stated in the GAO's report, beginning in May 2003, the IFMP 
Competency Center deployed a Test Management application that has since 
provided the basis for improved requirements management and regression 
testing of the Core Financial and subsequently implemented modules. In 
February 2004, the IFMP Quality Assurance team deployed an advanced 
computerized tool, "RequisitePro," used to create an additional level 
of control over the several thousand detailed requirements 
associated with the development and performance of our IFMP software 
applications. In addition to the deployment of these new tools, a 
separate Quality Assurance team was established as part of the 
Competency Center to focus on sound requirements collection and 
documentation for all IFMP software components. 

Now that the framework is established, our aim is to continually 
improve our requirements management procedures. As stated in the GAO's 
report, we are looking at addressing the remaining outstanding 
requirements documentation issues from the Core Financial module in 
time to 
leverage this new framework for the design, development, and testing 
associated with the SAP Version Upgrade activities scheduled for FY 
2006. As this will be a complex and challenging task, we plan to work 
with the GAO on these activities and leverage their recommendations. 

We disagree with the statement in the report that "many of the system 
configuration problems caused by the Agency's ineffective requirements 
management and testing processes continue to plague the Core Financial 
module." To suggest that the Core Financial system has significant 
configuration problems is inaccurate. At the present time, very few 
configuration issues remain in the Core Financial system. The most 
recent analysis of data shows that fiscal year 2005 is relatively 
clean. The continued challenges faced by NASA in receiving a clean 
audit opinion stem from problems not directly related to the Core 
Financial SAP system. For the GAO to continue to point to the Core 
Financial SAP system as the source of NASA's financial accounting 
problems misrepresents the nature of the issues, and could result in 
further weakening of support for IFMP's efforts, both internal and 
external to the Agency. 

Response to GAO report section titled "Detailed Plan for Compliance 
with the Federal Financial Management Improvement Act is Still Needed" 

The Office of the CFO's Financial Leadership Plan provides the goals of 
the organization and the framework for achieving those goals. The OCFO 
Strategic Initiatives (formerly known as the Financial Management 
Improvement Plan) established the near term priorities and objectives 
for improving financial management. Through these initiatives, progress 
has been made in addressing material weaknesses in the areas of fund 
balance with Treasury, policies and procedures, financial statement 
preparation, and system issues, including producing transaction level 
detail in support of financial statement account balances, compensating 
controls and procedures for identification of correcting entries, and 
accuracy of transactional postings. The extent of this progress will be 
assessed during the FY05 Financial Audit, which is currently underway. 
An FFMIA Remediation Plan is under development and will be completed by 
the end of calendar year 2005. 

Response to GAO report section titled "Improvements Made To NASA's IFMP 
Life-Cycle Cost Estimate and Processes for Calculating Funding 
Reserves" 

Full Life-Cycle Cost Estimate for IFMP Not Yet Complete: 

As the GAO stated in its report, IFMP has made significant progress in 
preparing a full Life-Cycle Cost Estimate (LCCE) based on industry best 
practices. As noted by the GAO, the IFMP LCCE was a "work in 
progress" at the time the GAO made its assessment, which is why this 
area is considered "partially implemented." We agree with this 
assessment. 

NASA is on the Right Track to Provide an Audit Trail to Support 
Life-Cycle Cost Estimate: 

We agree with the GAO's assessment. We are continuing to refine the 
mapping of IFMP data sources to the Work Breakdown Structure (WBS) in 
the LCCE. When completed, we will have a clear audit trail between 
detailed WBS estimates and Program costs for remaining modules. The 
next version of the LCCE will be finalized by late October 2005. 

Current Work Breakdown Structure (WBS) Not Used To Estimate Costs For 
All Remaining Modules: 

NASA respectfully disagrees with the GAO's findings in this area. As 
the GAO noted, the Program is employing a standard WBS, which 
was updated with its "Schedule Management Framework." That WBS has been 
used in recent cost estimates within new and updated Business Case 
Analyses (BCA). Specifically, 

* The new WBS was used in the two BCAs (Labor Distribution System and 
Contract Management Module) referenced in the draft GAO report. 

* The WBS estimate does include Center Implementation Costs, which are 
Section 2.0 of the WBS. An appendix of each BCA provides a Basis of 
Estimate (BOE) that identifies the cost build that supports the BCA 
cost estimate. 

* Lastly, the BOE for each BCA includes the following costs: 
integration project, civil service salaries and travel, G&A, and 
service pool costs. These cost builds are reflected on a line-item 
basis in the BOE, then rolled up to the respective cost category in 
the Program's WBS. 

The IFMP will continue to use this current WBS for all remaining 
modules for the current and future budget cycles. 
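The roll-up described above, from line-item basis-of-estimate entries 
into WBS cost categories, can be sketched as follows. This is a 
hypothetical illustration only; the WBS codes, descriptions, and 
amounts are invented and are not the Program's actual data:

```python
# Hypothetical sketch of a basis-of-estimate (BOE) roll-up: each line
# item carries a WBS element code and is summed into its top-level WBS
# cost category. All codes and amounts below are illustrative.
from collections import defaultdict

boe_lines = [
    # (WBS element, description, cost in dollars)
    ("2.0", "Center implementation labor", 1_200_000),
    ("2.0", "Center implementation travel", 150_000),
    ("3.1", "Integration project", 800_000),
    ("3.2", "Civil service salaries and travel", 450_000),
    ("3.3", "G&A and service pools", 300_000),
]

rollup = defaultdict(int)
for wbs_element, description, cost in boe_lines:
    # Roll each line item up to its top-level WBS section (e.g. "2").
    rollup[wbs_element.split(".")[0]] += cost

for section in sorted(rollup):
    print(f"WBS {section}: ${rollup[section]:,}")
```

The audit trail the GAO recommends is exactly this traceability: each 
rolled-up category total can be decomposed back into the line items 
that produced it.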

Progress Made in Establishing a Comprehensive Risk Evaluation 
Methodology: 

As the GAO noted in its report, "NASA has established a comprehensive 
risk evaluation methodology, which is used to facilitate the estimation 
and allocation of financial reserves" and "... employs a probabilistic 
risk tool." Since the time of the GAO's assessment, IFMP has fully 
implemented and applied the risk methodology and probabilistic tool as 
the basis for reserves for all elements of the Program. This was 
accomplished as part of this year's budget cycle. Though we agree with 
the GAO's findings (based on the timing of its assessment), our 
position is that the two recommendations related to this area should be 
considered "closed." 
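To make the methodology concrete: a probabilistic risk tool of the kind 
described assigns each risk a likelihood of occurrence and a cost-
impact range, simulates many possible outcomes, and sets reserves at a 
chosen confidence level. The sketch below illustrates the general 
technique with Monte Carlo simulation; the risks, probabilities, and 
dollar figures are entirely hypothetical, and this is not IFMP's 
specific tool:

```python
# Illustrative Monte Carlo sizing of financial reserves: each risk has
# a probability of occurrence and a cost-impact range; simulation
# yields a distribution of total impact, and the reserve is read off
# at a chosen confidence level. All inputs below are hypothetical.
import random

risks = [
    # (probability of occurrence, low impact $, high impact $)
    (0.7, 200_000, 600_000),
    (0.4, 100_000, 900_000),
    (0.2, 500_000, 2_000_000),
]

def simulate_total_impact(rng):
    total = 0.0
    for prob, low, high in risks:
        if rng.random() < prob:              # does the risk occur?
            total += rng.uniform(low, high)  # cost impact if it does
    return total

rng = random.Random(42)                      # fixed seed for repeatability
trials = sorted(simulate_total_impact(rng) for _ in range(20_000))
reserve = trials[int(0.8 * len(trials))]     # 80th-percentile reserve
print(f"Reserve at 80% confidence: ${reserve:,.0f}")
```

Quantifying the cost impact of high-likelihood, high-impact risks, as 
the GAO recommendation requires, supplies exactly the probability and 
impact-range inputs this kind of analysis consumes.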

Progress Made But Cost Impact of Risk Not Quantified Consistently: 

As the GAO noted in its report, "NASA now requires that the cost 
impact of high severity risks be analyzed more consistently through the 
use of standardized risk reserves templates and be quantified through 
the use of a more rigorous methodology and probabilistic risk tool." As 
we stated above, since the time of the GAO's assessment, IFMP has fully 
utilized the risk methodology and probabilistic tool as the basis for 
reserves for all elements of the Program. Though we agree with the 
GAO's findings (based on the timing of its assessment), our position is 
that the two recommendations related to risks and reserves should be 
considered "closed." 

NASA Has Established a Clear Link Between IFMP's Risk Database and 
Financial Reserves: 

The GAO noted that NASA has successfully established linkage between 
IFMP's risk database and its financial reserves, and considered this 
recommendation closed. We agree with this assessment. 

[End of Enclosure 1] 

Enclosure 2 - Summary of NASA's Position on each GAO Recommendation: 

The tables below reflect NASA's position on each of the GAO 
recommendations. Further comments, in addition to those included in 
Enclosure 1, are also provided below. 

Recommendations to improve NASA's acquisition management practices 
(GAO-03-507): 

(1) Establish and implement a methodology for commercial system 
component dependency analysis and decision making. 
GAO's Position: Partially implemented; 
NASA Position: Closed. 

(2) Evaluate the suitability of already acquired, but not yet 
implemented IFMP component products within the context of a component 
dependency analysis methodology. 
GAO's Position: Partially implemented; 
NASA Position: Closed. 

[End of table] 

Recommendations to mitigate risk associated with relying on already- 
deployed components (GAO-03-507): 

(1) Identifying known and potential risks. 
GAO's Position: Open; 
NASA Position: Closed. 

(2) Assessing the severity of the risks on the basis of probability; 
GAO's Position: Open; 
NASA Position: Closed. 

(3) Developing risk mitigation strategies; 
GAO's Position: Open; 
NASA Position: Closed. 

(4) Assigning responsibility for implementing the strategies; 
GAO's Position: Open; 
NASA Position: Closed. 

(5) Tracking progress in implementing these strategies; 
GAO's Position: Open; 
NASA Position: Closed. 

(6) Reporting progress to relevant Congressional committees; 
GAO's Position: Open; 
NASA Position: Closed; 
Comments/Reference: The IFM Program has been reporting progress bi-
annually to the House Science Committee. 

[End of table] 

Recommendations regarding identifying program management needs and 
reengineering business processes (GAO-03-507): 

(1) Engage stakeholders--including program managers; 
GAO's Position: Closed; 
NASA Position: Closed. 

(2) Reengineer acquisition management processes; 
GAO's Position: Open; 
NASA Position: Partially implemented. 

[End of table] 

Recommendations to improve NASA's requirements management and testing 
processes (GAO-03-507): 

(1) Developing and properly documenting requirements. 
GAO's Position: Partially implemented; 
NASA Position: Partially implemented. 

(2) Conduct thorough regression testing before placing modified 
components into production. 
GAO's Position: Partially implemented; 
NASA Position: Closed. 

(3) Implement a metrics program that will identify and address the root 
causes of system defects. 
GAO's Position: Partially implemented; 
NASA Position: Partially implemented. 

[End of table] 

Recommendations to improve external financial reporting (GAO-04-151): 

Implement a corrective action plan that will produce a financial 
management system that complies substantially with the requirements of 
FFMIA. This includes capabilities to produce timely, reliable, and 
useful financial information related to: (1) property, plant, 
equipment, and materials; 
GAO's Position: Open; 
NASA Position: Open. 

(2) budgetary information including adjustments to prior year 
obligations; 
GAO's Position: Open; 
NASA Position: Open. 

(3) accounts payable and accrued costs; and the full cost of programs 
for financial reporting purposes. 
GAO's Position: Open; 
NASA Position: Open. 

[End of table] 

Recommendations regarding IFMP program life-cycle cost estimates and 
funding reserves (GAO-04-118): 

(1) Prepare a full life-cycle cost estimate for the entire IFMP that 
meets NASA's life-cycle cost and full cost guidance. 
GAO's Position: Partially implemented; 
NASA Position: Partially implemented. 

(2) Prepare cost estimates by the current Work Breakdown Structure for 
the remaining modules; 
GAO's Position: Partially implemented; 
NASA Position: Closed. 

(3) Provide a clear audit trail between detailed WBS estimates and the 
program's cost estimate for the remaining modules. 
GAO's Position: Partially implemented; 
NASA Position: Closed. 

(4) Utilize a systematic, logical, and comprehensive tool, such as 
Probabilistic Risk Assessment, in establishing the level of financial 
reserves for the remaining module projects and tailor the analysis to 
risks specific to IFMP. 
GAO's Position: Partially implemented; 
NASA Position: Closed. 

(5) Quantify the cost impact of at least all risks with a high 
likelihood of occurrence and a high magnitude of impact to facilitate 
the continuing analysis necessary to maintain adequate reserve levels. 
GAO's Position: Partially implemented; 
NASA Position: Closed. 

(6) Establish a clear link between the program's risk database and 
financial reserves. 
GAO's Position: Closed; 
NASA Position: Closed. 

[End of table] 

The table below summarizes NASA's position on each of the 
recommendations in GAO report GAO-04-43. In addition, the comments 
provided in the table summarize comments included in Enclosures 1, 3, 
and 4 of this response. 

It should be noted that several of the GAO's recommendations (e.g., GAO 
8.c, "ensure that IT investments comply with the enterprise 
architecture") can be categorized as "open ended." Given that both the 
NASA IT investment portfolio and the NASA EA are living documents, 
these can only be "closed" at a specific point in time and for a 
specific instance of the IT investment portfolio and version of the 
NASA EA. NASA's responses are based on Version 3.1 of the NASA 
Enterprise Architecture and the March 2005 version of the IT investment 
portfolio. NASA has established Capital Planning and EA processes to 
ensure that the NASA EA is current and that NASA programs and projects 
are measured proactively against the current documents. GAO 
Recommendation (3) has been split into two distinct tasks as follows: 

* (3a) Ensures that the program's plans are aligned with the initial 
version of the EA. 

* (3b) Ensures that the program's plans are aligned with subsequent 
versions of the EA. 

The original wording included two steps: alignment with the initial 
version of the EA and alignment with subsequent versions of the EA. 
Based on the original wording of the GAO recommendation, it was not 
possible to close any portion of GAO recommendation (3). The whole 
recommendation, including completed elements (i.e., alignment with the 
initial enterprise architecture), would remain open for the life of the 
IFMP. 

Recommendations regarding NASA's enterprise architecture (GAO-04-43): 

(1) Establish a NASA enterprise architecture policy and designate a 
NASA architecture board, or comparable body, that is made up of agency 
executives who are responsible and accountable for developing and 
maintaining the architecture. 
GAO's Position: Partially implemented; 
NASA Position: Closed; 
Comments/Reference: This action was closed during the development of 
version 3.0 of the NASA EA. Version 3.0 was approved by the NASA 
Executive Committee and signed by the NASA Administrator. Version 3.0 
of the NASA EA and the signed review documents were provided to the GAO 
IT Issues Team. 

(2) Ensures that the architecture content requirements identified in 
this report are satisfied by first determining the extent to which 
NASA's initial release of an enterprise architecture satisfies these 
content requirements and then developing and approving a plan for 
incorporating any content that is missing. 
GAO's Position: Open; 
NASA Position: Closed; 
Comments/Reference: This action was closed during the development of 
version 3.1 of the NASA EA. Version 3.1 of the NASA EA was provided to 
the GAO IT Issues Team. NASA has not received specific written 
feedback from GAO detailing any specific elements of this action that 
are not closed. 

(3a) Ensures that the program's plans are aligned with the initial 
versions of the enterprise architecture. 
GAO's Position: Open; 
NASA Position: Closed; 
Comments/Reference: This action was closed during the development of 
version 2.0 of the NASA EA. All IFMP elements were reviewed and 
alignment addressed. The IFMP Office provided detailed program 
formulation and project planning documentation to the GAO audit team in 
April/May of 2003 to close this action. NASA has not received specific 
written feedback from GAO outlining any specific elements of this 
recommendation that are considered not closed or completed. 

(3b) Ensures that the program's plans are aligned with subsequent 
versions of the enterprise architecture. 
GAO's Position: Open; 
NASA Position: Open; 
Comments/Reference: This is an ongoing action that will last the life 
of the IFMP. All IFMP elements are reviewed and alignment addressed as 
a part of ongoing EA efforts. The IFMP is included in all versions of 
the NASA EA, including Version 1.0 and all subsequent versions. 
Specific evidence is located in Enclosure 3, Section II, Paragraph 3, 
subparagraphs 3 and 4. 

(4) Immediately map already implemented IFMP components to the agency's 
enterprise architecture and report to the Program Executive Officer 
any instances of misalignment, the associated risks, and proposed 
corrective actions. 
GAO's Position: Open; 
NASA Position: Closed; 
Comments/Reference: This action was closed during the development of 
version 2.0 of the NASA EA. All IFMP elements were reviewed and 
alignment addressed. The IFMP is included in all subsequent versions of 
the NASA EA. NASA has not received specific feedback from GAO outlining 
any specific elements of this recommendation that are considered not 
closed or completed. 

(5) Include specific milestones, cost estimates, and detailed actions 
to be taken to align the program with the enterprise architecture. 
GAO's Position: Open; 
NASA Position: Closed; 
Comments/Reference: This action was closed during the development of 
version 2.0 of the NASA EA. All IFMP elements were reviewed and 
alignment addressed. NASA has not received specific feedback from GAO 
outlining any specific elements of this recommendation that are 
considered not closed or completed. 

(6) In developing the architecture, the board and the CIO should: 

(a) Establish a written and approved policy for architecture 
development. 
GAO's Position: Open; 
NASA Position: Partially implemented; 
Comments/Reference: NASA's Enterprise Architecture policy directive 
(NPD) and policy guidance (NPG) have been submitted to the NASA policy 
approval process. NASA also has a policy addressing the EA 
certification of all staff supporting NASA EA efforts. NASA will close 
this action when the formal NASA review process is complete. Draft 
versions of all policies have been shared with the GAO IT Issues team. 
Specific evidence is located in Enclosure 3, Section IV, Policy and 
Governance. 

(b) Place enterprise architecture products under configuration 
management. 
GAO's Position: Open; 
NASA Position: Closed; 
Comments/Reference: All EA documents and data are stored under 
configuration control in the NASA EA repository. Specific evidence is 
located in: Enclosure 3, Section II, Develop Architecture Products, 
Paragraph 1. 

(c) Ensure that progress against architecture plans is measured and 
reported. 
GAO's Position: Open; 
NASA Position: Closed; 
Comments/Reference: NASA has developed metrics to ensure that 
progress against architecture plans is measured and reported. NASA is 
measuring and reporting progress against approved architecture project 
management plans. Specific evidence is located in: Enclosure 3, Section 
II Develop Architecture Products, Paragraph 1. 

(7) In completing the architecture, the board and the CIO should: 

(a) Establish a written and approved policy for architecture 
maintenance. 
GAO's Position: Open; 
NASA Position: Closed; 
Comments/Reference: NASA's EA policy directive (NPD) and policy 
guidance (NPG) have been submitted to the NASA policy approval process. 
NASA will close this action when the formal NASA review process is 
complete. Draft versions of all policies have been shared with the GAO 
IT Issues team. Specific evidence is located in Enclosure 3, Section 
IV, Policy and Governance. 

(b) Ensure that EA products and management processes undergo 
independent verification and validation. 
GAO's Position: Partially implemented; 
NASA Position: Closed; 
Comments/Reference: NASA established an independent verification and 
validation function to review the architecture and related management 
processes. NASA contracted with SRA for a complete IV&V of the NASA EA 
Program, including the EA products, processes and the EA program 
management processes. SRA completed the IV&V in January of 2005 and its 
Remediation Plan in July of 2005, and is under contract to repeat the 
IV&V process in September of 2005. The IV&V and Remediation Plan 
have been provided to the GAO IT Issues team. Detailed evidence is 
provided in Enclosure 3, Section III, Inter-agency Interfaces and 
Reporting, Paragraph 1. 

(c) Ensure that architecture products describe the enterprise's 
business and the data, application, and technology that support it. 
GAO's Position: Open; 
NASA Position: Closed; 
Comments/Reference: The Agency has ensured that the architecture 
products describe the enterprise in terms of business, performance, 
data, application, and technology. The EA business, performance, data, 
application, and technology descriptions address security. Specific 
evidence is located in Enclosure 3, Section II, Develop Architecture 
Products, Paragraph 3, subparagraphs C and D. 

(d) Ensure that EA products describe the "As Is" environment, the "To 
Be" environment, and a sequencing plan. 
GAO's Position: Open; 
NASA Position: Closed; 
Comments/Reference: The NASA EA products describe the "As Is" 
environment, the "To Be" environment, and a sequencing plan to 
transition from the "As Is" to the "To Be." The EA business, 
performance, data, application, and technology descriptions address 
security. NASA is documenting reference architectures for key 
architectural elements and sequencing plans for the transition between 
the "As Is" and "To Be" states. Specific evidence is located in 
Enclosure 3, Section II, Develop Architecture Products, Paragraph 3, 
subparagraphs C and D. 

(e) Ensure that business, performance, data, application, and 
technology descriptions address security; 
GAO's Position: Open; 
NASA Position: Partially implemented; 
Comments/Reference: The Agency has ensured that the architecture 
products describe the enterprise in terms of business, performance, 
data, application, and technology. All elements of the NASA EA address 
security as a required cross-cutting element. 

(f) Ensure that the CIO approves the enterprise architecture. 
GAO's Position: Closed; 
NASA Position: Closed; 
Comments/Reference: NASA concurs that this action is closed. 

(g) Ensure that the steering committee and/or the investment review 
board has approved the current version of the enterprise architecture. 
GAO's Position: Partially implemented; 
NASA Position: Closed; 
Comments/Reference: This action was closed during the development of 
version 3.0 of the NASA Enterprise Architecture. Version 3.0 was 
approved by the NASA Executive Committee and signed by the NASA 
Administrator. Version 3.0 of the NASA EA and the signed review 
documents were provided to the GAO IT Issues Team. 

(h) Measure and report on the quality of EA products. 
GAO's Position: Open; 
NASA Position: Closed; 
Comments/Reference: NASA has developed metrics to ensure that progress 
against architecture plans is measured and reported. NASA is measuring 
and reporting progress against approved architecture project management 
plans. Detailed evidence is provided in Enclosure 3, Section I, 
Customer Outreach and Communications, Paragraph 3 and 4. 

(8) In implementing the architecture, the board and the CIO should: 

(a) Establish a written and approved policy for IT investment 
compliance with the enterprise architecture. 
GAO's Position: Open; 
NASA Position: Closed; 
Comments/Reference: The Agency has completed efforts to ensure that the 
EA is an integral component of IT investment management processes and 
that IT investments comply with the architecture. NASA has a clearly 
documented Capital Planning and Investment Control Process. NASA has 
established and documented a detailed review process for EA compliance 
and has completed ten EA reviews of projects and steady state services. 
Detailed evidence is provided in Enclosure 3, Section II, Develop 
Architecture Products, Paragraph 3 and Enclosure 3, Section IV, Policy 
and Governance, Paragraph 2. 

(b) Ensure that the EA is an integral component of IT investment 
management processes. 
GAO's Position: Open; 
NASA Position: Closed; 
Comments/Reference: The Agency has completed efforts intended to 
ensure that the enterprise architecture is an integral component of IT 
investment management processes and that IT investments comply with the 
architecture. NASA has clearly documented the Capital Planning and 
Investment Control Process and a set of EA review processes for 
conducting investment alignment reviews. NASA reviews proposed system 
investments for compliance with the architecture, and the results 
of these reviews are reported to the EA review sponsors and the NASA 
CIO. Detailed evidence is provided in Enclosure 3, Section II, 
Develop Architecture Products, Paragraph 3 and Enclosure 3, Section IV, 
Policy and Governance, Paragraph 2. 

(c) Ensure that IT investments comply with the enterprise architecture. 
GAO's Position: Open; 
NASA Position: Closed; 
Comments/Reference: NASA has a clearly documented set of EA review 
processes for conducting investment alignment reviews. NASA reviews 
proposed system investments for compliance with the architecture, and 
the results of these reviews are reported to the EA review sponsor 
and the NASA CIO. NASA has completed ten EA reviews of projects and 
steady state services. Detailed evidence is provided in Enclosure 3, 
Section II, Develop Architecture Products, Paragraph 3. 

(d) Obtain Administrator approval of each enterprise architecture 
version. 
GAO's Position: Partially implemented; 
NASA Position: Closed; 
Comments/Reference: This action was closed during the development of 
version 3.0 of the NASA Enterprise Architecture. Version 3.0 was 
approved by the NASA Executive Committee and signed by the NASA 
Administrator. Version 3.0 of the NASA EA and the signed review 
documents were provided to the GAO IT Issues Team. 

(e) Measure and report EA return on investment. 
GAO's Position: Open; 
NASA Position: Closed; 
Comments/Reference: The Agency measures and reports on the quality of 
EA products. ROI for the Enterprise Architecture program is reported as 
a part of the Office of Management & Budget business case submission 
and reporting process. Detailed evidence is provided in Enclosure 3, 
Section II, Develop Architecture Products, Paragraph 3. 

(f) Measure and report on EA compliance. 
GAO's Position: Open; 
NASA Position: Closed; 
Comments/Reference: NASA has a clearly documented set of EA review 
processes for conducting investment alignment reviews. NASA reviews 
proposed system investments for compliance with the architecture. 
Review results are documented and available. NASA has completed ten 
EA reviews of projects and steady state services. Detailed evidence is 
provided in Enclosure 3, Section II, Develop Architecture Products, 
Paragraph 3. 

[End of table] 

[End of Enclosure 2] 

Enclosure 3 - Synopsis of Significant Enterprise Architecture 
Accomplishments (August 2004 to Present): 

I) Customer Outreach & Communications; 

1) New EA Communications Structure: 

A. Based on a GAO recommendation, NASA selected and implemented 
Sharepoint software, an Agencywide document management system used by the Core EA 
team and anyone who wishes to engage. Documents and artifacts are under 
version and configuration control, and users must have authenticated 
accounts to add or modify content. Content is organized into broad 
categories, including documentation, news and articles, action items, 
team lists, and forums. The documentation group is organized as 
follows: 

(1) EA NPD and NPR; 
(2) NASA Federal and Legal Documentation; 
(3) NASA EA Volumes; 
(4) EA Reviews; 
(5) EA Project Management; 
(6) Reference Materials and Links of Interest. 

B. Developed an EA Web Site to better facilitate strategic outreach and 
communication. Completed pilot EA web site. Production site rollout 
planned for July 30. Content is prepared for NASA audiences who seek 
basic understanding and guidance about EA, and includes contact 
information for local and Agency EA Team members. Includes links to EA 
Reviews, policy and guidance, and more. Site conforms to all applicable 
NASA web site standards. (EA Website Initial Content Draft-2-1-1, 
7/7/2005). 

C. Now using e-mail list server lists to broadcast information to 
predefined EA communities, including the core team (52 members), leader 
team (26 members), and the overall agency EA contact team with more 
than 152 members. Push information includes citations and links to 
recent EA articles, notification of recent additions to Sharepoint and 
the EA repository, and EA teleconference announcements. 

2) Reorganized the EA core team for better efficiency. There are 52 
core team members (24 Civil Service employees and 28 contractors) 
assigned to EA activities. Most are performing EA as an additional 
duty. There are 26 executives on the strategic communications team for 
the Agency. The strategic communications team acts as the management 
interface to quickly disseminate key EA actions through the Agency. 
Assigned tactical level EA leadership to improve product quality of EA 
work products, help define deliverables and completion schedules, and 
consolidate activities among all EA contractor teams to improve 
planning and workforce utilization. 

3) Updated and loaded in Sharepoint a detailed and approved EA work 
plan. Prepared an executive overview briefing of all major work 
activities and deliverables using information derived from our 
integrated project plan (NASA Master Workplan V1.3, 6-3-2005; NASA EA 
Project Plan, 5-23-2005). The briefing, presented to the NASA CIO 
Board, presents work activities in 120-day cycles, derived from the 
integrated MS Project Plan for EA. 

4) Continued revisions and additions to the NASA EA repository. The 
repository is available to all NASA Civil Service employees and 
contains NASA information related to the BRM, SRM, TRM, and DRM. The 
ITPM server is located at GRC at 
https://itpm.grc.nasa.gov/a3/index.htm. 

Materials prepared for training NASA staff on the use of the EA 
Repository are located at: 
https://portal.nasa.gov/sites/niie/ea/Documentation/NASA_Agency_EA_Project/Training.

5) Completed the second nationwide tour of each NASA Center. The first 
was completed in FY2004 and the second was completed in FY2005. The 
tour included on-site visits to Headquarters, Glenn Research Center, 
Langley Research Center, Ames Research Center, Dryden Flight Research 
Center, Johnson Space Center, Kennedy Space Center, Marshall Space 
Flight Center, Stennis Space Center, Goddard Space Flight Center, and 
Jet Propulsion Laboratory. Visits were designed to brief each Center's 
executive and line staff on Agency EA direction and strategy, provide 
ongoing training for the EA Repository, and show tangible artifacts of 
how EA is being used around the Agency. Visits are also used as a forum 
to derive customer feedback to guide our future EA activities. Visit 
notes for each Center are posted to Sharepoint at: 
https://portal.nasa.gov/sites/niie/ea/Documentation/NASA_Agency_EA_Project/Center_Visits/FY_2004; 

https://portal.nasa.gov/sites/niie/ea/Documentation/NASA_Agency_EA_Project/Center_Visits/FY_2005; 

6) Conducts bi-weekly teleconferences with Agencywide EA team members. 
Discussion items include work in progress, planned work, issue 
tracking and resolution, and progress toward milestone completion. Open 
to all Agency Civil Service and contractor staff. Meeting notes of 
every Agency telecon are available at 
https://portal.nasa.gov/sites/niie/ea/Documentation/Forms/Allltems.aspx;

7) August 2004 Workshop. NASA (Office of the Chief Engineer) provided 
funding for 75 Civil Service employees from across the Agency to 
attend a one-week EA training class. The Core EA team prepared and 
presented the entire curriculum to this audience. The workshop 
established a baseline understanding of EA for the participants and 
provided guidance on the application of EA products. Workshop curriculum 
content, presenter briefings, and case studies are located at: 
https://portal.nasa.gov/sites/niie/ea/Documentation/Forms/Allltems.aspx/sites/niiefea/Documentation/NASA_Agency_EA_Project/EA_Workshop_July_2004
; 

8) August 2005 Workshop [PLANNED]. Anticipate 75-125 participants. The 
NASA EA Core Team will present the results of major work activities 
from the past year, propose projects for the next year, and receive input from 
participants to define and agree upon next year's work plan. Draft 
curriculum content is located at: 
https://portal.nasa.gov/sites/niie%a/Documentation/Forms/Allltems.aspx/sites/niiefea/Documentation/NASA_Agency_EA_Project/EA_Workshop August 
2005. 

II) Develop Architecture Products: 

1) Documents and data stored in the EA repository are under 
configuration control. A robust user authentication schema is used to 
assign add, modify, and delete privileges based on user profiles. A 
proposed security profile is available to assure appropriate 
protection of, and access to, sensitive NASA information, at: 
https://portal.nasa.gov/sites/niie%a/Documentation/Forms/sites/niiefea/Documentation/NASA_Agency_EA_Project/ITPM_Security_Levels.doc: 

2) March 2005 Workshop. The Core EA Team prepared training materials 
and facilitated a 3-day workshop to define and rationalize NASA's BRM 
and SRM. Created a Business sub team for FEA BRM mapping analysis, a 
Services sub team for FEA SRM mapping analysis, and a Value team for EA 
Performance Metrics. The Value team defined the EA guiding principles, 
EA benefits for multiple user communities, and success criteria. Civil 
Service Employees volunteered to lead the respective BRM and SRM sub 
teams and defined deliverables and delivery schedules. Work products 
are captured in the EA repository and will undergo additional 
refinement in subsequent work cycles. Workshop curriculum, presenter 
briefings, and work products are located at: 
https://portal.nasa.gov/sites/niie/ea/Documentation/Forms/Allltems.aspx/sites/niiefea/Documentation/NASA_Agency_EA_Project/EA_Workshop_March_2005; 

3) Instituted EA reviews based on the SRA IV&V remediation plan. Since 
starting the EA Review process, we have finished 8 reviews, and 10 more 
are in progress, for a total of more than $259 million in investments 
under review. The reviews are split between Project Reviews and Service 
Reviews. NASA has found that Service Reviews of on-going, 
sustained IT operations provide positive focus and change management 
for the bulk of annual IT investment: 

A. EA Project Reviews (EAPR) information is found at 
https://portal.nasa.gov/sites/niie/ea/Documentation/EA_Reviews/EA_Reviews_Tracking_List_7/7/2005. 

(1) To date, 12 EA Project Reviews have been initiated by the Chief 
Enterprise Architect. Six Project Reviews have been completed, and six 
are in progress. More than $93 million in project investments are 
under EA review. Completed and approved EAPRs include NAMIS, PBMA, 
WBS-LPM, N2 NIBS, CMM, and ETPM; 
(2) All completed reviews are posted to Sharepoint and the EA 
repository. The review documents are auditable, stand-alone briefings. 
A comprehensive list of EAPR's is maintained on Sharepoint at 
https://portal.nasa.gov/sites/niie/ea/Documentation/EA_Reviews; 
(3) Notable example: The EA Project Review of IFMP's Contract 
Management Module (CMM) resulted in the ROI analysis and approval of a 
$60 Million investment, and the review prompted updates to IFMP 
business plans. 

(4) Notable example: The EA Project Review of NASA Aircraft Maintenance 
Information System (NAMIS) resulted in the ROI analysis and approval of 
a $12.5 million investment. This investment was recently selected as 
the Agency Integrated Asset Management (IAM) solution for NASA 
aircraft. NAMIS was a Center-specific asset management application, and 
the EA Review analysis helped determine that it could be leveraged and 
scaled for Agencywide use. 

(5) NASA Customer feedback for EA Project Reviews. 

(a) Notable example: The Process Based Mission Assurance (PBMA) 
executive sponsor declared the EA Review process very valuable in 
preparing a business case for this IT investment that truly supports 
NASA's Safety and Quality goals. 

(b) Notable example: The NASA Budget System executive sponsor declared 
that the EA Review process helped demonstrate a tangible ROI after only 
8 months by standardizing how budgeting is done across the Agency, and 
showed how an improved forecasting capability would minimize cycle 
preparation time for POP submission. 

B. EA Service Reviews (EASR): 

(1) To date, six EA Service Reviews have been initiated by the Chief 
Enterprise Architect. Two EA Service Reviews are completed and four are 
in progress. More than $166 million in Service (on-going) investments 
are under review. The completed reviews include the NDC and NISN 
Agencywide Services; 
(2) All completed and draft EASR's are posted to Sharepoint. The review 
documents are auditable, stand-alone briefings. A comprehensive list of 
EASR's is maintained on Sharepoint at: 
https://portal.nasa.gov/sites/niie/ea/Documentation/EA_Reviews; 
(a) Notable example: The NASA Data Center (NDC) completely rationalized 
all services and investments in this services portfolio, totaling a $32 
million annual spend. Feedback from the NDC Manager indicated the EASR 
helped to renew focus and contact with customers, define customer 
densities, and refine pay-for-service cost recovery models. 

(b) Notable example: The NASA Integrated Services Network (NISN) used 
EA work to reorganize and rationalize its services portfolio, and 
create reference architectures that graphically represent operating 
environments to illustrate system components, relationships between 
components, and definitions of relationships between system components 
and elements external to the system. This activity allowed NASA to gain 
insight into this annual $100 million steady state investment. 

(c) Reference architectures for major NASA IT systems: 
(i) NASA Data Center (NDC), completed 9 Jun 2005; 
(ii) NASA Integrated Services Network (NISN), completed 17 Jun 2005; 
(iii) Marshall Space Flight Center Infrastructure, in progress; 
(iv) Marshall Space Flight Center IT Security Infrastructure, in 
progress. 

C. NASA BRM and SRM mapping. Marshall Space Flight Center (MSFC) has 
mapped its Center infrastructure services to the participating Center 
businesses in order to rationalize and understand the relationship of 
appropriate services to the businesses they support. This exercise is 
the pilot model for all NASA Centers. MSFC has also gone on to map the 
specific Center IT services back to the Center businesses supported. As 
we make gains in these types of EA mapping, we gain a better 
understanding of: 

(1) Current services as they support the current Lines of Business 
(LoB); 
(2) Gaps in services that identify the potential for new services; 
(3) Businesses that may not fully leverage available infrastructure 
support; 
(4) Gaps in future service plans as compared to LoB strategic plans. 

D. Current Agencywide effort to map NASA LoB to IT services throughout 
the Agency. We have created three EA sub-teams (working groups) to 
pioneer this work and provide a basis for discussion of NASA 
LoB investments. 

(1) Create NASA IT services reference model - August 2005; 
(2) Use NASA IT Service model for analysis - October 2005; 
(3) Create common IT definitions across the agency - August 2005; 
(4) Work toward agreed common definitions throughout the agency - October 2005. 

III) Inter-Agency Interfaces & Reporting: 

1) Based on GAO's recommendations in FY2003, the NASA Office of the CIO 
contracted with an outside vendor (SRA) to conduct an Independent 
Verification and Validation (IV&V) assessment of NASA's EA program. The 
IV&V Recommendations and Remediation Plan was completed, and the results 
are incorporated into the EA work plan. 

https://portal.nasa.gov/sites/niie/fea/Documentation/Forms/niie/fea/fDocumentation/EA_Reference_Materials_and_Links_of_Interest/NASA_IVV_Report_Final.pdf 

2) To demonstrate depth of EA understanding, NASA's Chief Enterprise 
Architect (John McManus) is involved in the following Federal EA 
working groups: 

A. Chairs the FEA Emerging Technologies working group; 
B. Member of FEA Component Based Architecture Working group; 
C. Member of FEA Governance Sub-Committee; 
D. Past chair of the 2005 Architecture.Gov forum; 
E. Member of Chief Architects Forum (CAF); 
F. Recognized guest speaker at numerous forums and inter-agency 
meetings. 

3) To demonstrate quality of leadership, the NASA Chief Enterprise 
Architect: 

A. Is the recipient of a Federal 100 award cited for EA efforts. 
B. Actively participates in OMB Enterprise Architecture activities. 
C. Frequently acts as OMB reference to other agencies for: 

(1) Instituting and guiding Capital Planning and Investment Control 
(CPIC) processes; 
(2) Making EA real to the Agency; 
(3) Instituting and guiding Earned Value Management (EVM); 
(4) Instituting and guiding IT Investment Portfolio management. 

D. NASA's Deputy Enterprise Architect is involved in: 

(1) Member of Chief Architects Forum (CAF); 
(2) Requested Speaker for NARA EA efforts; 
(3) Cited as an EA contact in industry magazines; 
(4) NASA consistently receives a high score for use of EA in the OMB 
Capability Maturity Model. 

IV) Policy & Governance: 

1) NASA has a completed EA NPD and NPR draft that was submitted to 
NODIS on July 6th for final review. Documents are located on Sharepoint 
(EA NPD rev-2, 7/7/2005; EA NPR Draft 7/7/2005). 

2) NASA has a published CPIC policy that has been proactively used 
since FY2003 to plan and manage IT investments. 

3) Version 3.0 of the NASA EA was approved by the NASA Executive 
Council on August 24, 2004 and signed by the NASA Administrator. 

4) EA Volumes 1-5 have been reviewed by OMB and GAO, and are currently 
being updated with new content to reflect changes in NASA's current and 
future IT environments. (Volume 6 is replaced by the NPR). Volumes are 
located on Sharepoint at 
https://portal.nasa.gov/sites/niie%a/Documentation/Forms/niie/fea/fDocumentation/NASA_Enterprise_Architecture_Volumes_EA_V3.0. This includes 
the following: 

A. Volume 1 Version 3.0, 4/8/2005, Overall Architecture and Governance; 
B. Volume 2 Version 3.0, 4/8/2005, OAIT Investment Category; 
C. Volume 3 Version 3.0, 4/8/2005, Program Unique and Multi 
Program/Project Investment Category; 
D. Volume 4 Version 3.0, 4/8/2005, Structures and Strategies; 
E. Volume 5 Version 3.0, 4/8/2005, EA "To-Be" Guidance; 

5) NASA has an EA Certification Policy with accompanying goals: 

A. Civil Servants goals and standards: 

(1) All Agency level EA Civil Servants to be certified by FY 2007; 
(2) At least 1 Civil Servant for each center certified by FY 2009; 
(3) Civil servants trained to lead all EA reviews by FY 2009. 

B. EA contractor goals and standards: 

(1) 30%-50% of all EA contractors certified by end of FY 2006; 
(2) 60% - 80% of all EA contractors certified by end of FY 2007; 
(3) All contractors certified by end of FY 2008. 

6) NASA's EA policy and guidance documents are prepared in parallel and 
in conjunction with other Agency policy and governance to assure 
uniformity of policy and consistency of application. Specifically, the 
EA: 

A. Is interlocked with Agency IT Security policy, NPG 2810. Scott 
Santiago is NASA's Deputy CIO for IT Security and is the interface to 
the Agency EA team. 

B. Is collaborating with NASA Technical standards team. Walter Kit is a 
senior member of the NASA CIO staff at Headquarters and is the primary 
interface for this effort with the Agency EA team. 

C. Is collaborating with the Knowledge/Records Management team led by 
Dr. Nitin Naik. Dr. Naik is the NASA deputy Chief Technology Officer 
and the primary interface to the EA Agency team for this work. 

D. Is collaborating with the Standards & Technical Information team at 
Langley Research Center. This effort is being sponsored by the Langley 
CIO, Duane Melson. 

E. Is leveraging Jet Propulsion Lab (JPL) NASA taxonomy efforts, using 
the NASA Taxonomy prepared for the NASA portal to assure that web 
content adheres to a pre-defined data schema and that content is 
highly searchable. This body of work was sponsored by the NASA CTO and 
implemented at JPL. The NASA taxonomy work is located at: 
http://nasataxonomyjpl.nasa.gov. 

F. Was developed concurrently with CPIC and Investment portfolio 
documents. Documents are available at NASA Headquarters. These efforts 
were led by the NASA headquarters financial lead Mr. Bill Tufte. 

G. Is interlocked with the NASA IT Project Management team. This 
effort, led by Scott Bair, is building a rigorous IT Project Management 
process that will mirror the NASA Provide Aerospace Products and 
Capabilities (PAPAC) process used in all major NASA flight hardware 
investments. 

V) Future Plans: 

1) Workplan for FY 2006 & 2007; 
2) EA Reviews throughout OAIT, Multi-Purpose/Program and Program Unique 
IT; 
3) Better granularity in BRM/SRM mapping; 
4) First full agency draft on DRM/Taxonomy; 
5) Greater use of Center Architects to infuse and affect Agency EA 
policy into local investments and operations. 

[End of Enclosure 3] 

The following are GAO's comments on NASA's letter dated July 27, 2005. 

GAO Comments: 

1. See the Agency Comments and Our Evaluation section of this report. 

2. We stand by our position in our November 2003 report[Footnote 21] 
and would add that in its comments on this report, NASA concurred with 
all of our recommendations. In the 2003 report, we stated that NASA had 
either implemented or was in the process of implementing six of nine 
IFMP modules, and that the enterprise architecture, which NASA had just 
recently begun to develop, lacked sufficient detail to guide and 
constrain investment decisions. Accordingly, we reported that 
significant IFMP components had been acquired and implemented outside 
the context of an enterprise architecture. At that time, NASA's CTO, 
who is currently the Deputy CIO/CTO, concurred with our position that 
the architecture products that had been used to acquire and implement 
the six IFMP modules did not contain sufficient scope and content. 

3. NASA has yet to provide us documentation to support this statement. 

4. In response to our request for the latest version of the 
architecture, NASA provided us Version 3.0. NASA has yet to provide us 
Version 3.1 of its architecture. 

5. We disagree. Based on our review of the evidence provided by NASA, 
the agency has made limited progress over the last 18 months in 
implementing our architecture recommendations. Of the 22 
recommendations, NASA has implemented 1 and partially implemented 4. 
Seventeen remain open. Further, NASA has not provided us the documents 
referred to in its comments or provided us access to the Web sites 
cited so that we could view these documents. 

6. NASA has taken these steps in each of the areas, which we 
acknowledge in our report. However, as we state in the report, the 
approval of the architecture by the architecture board and NASA 
Administrator and the review of the architecture and management 
processes by an independent verification and validation function need 
to be a recurring process before we will consider the associated 
recommendation closed. Also, as we state in the report, the current 
verification and validation function is not independent because it 
reports to the program office and not the architecture board. 

7. NASA has yet to provide documentation of these reviews and proof of 
alignment. 

8. NASA has drafted written policies governing the development, 
maintenance, and implementation of the architecture, as we state in our 
report. In May 2005, we shared our view on these drafts with the Deputy 
CIO/CTO. According to the Deputy CIO/CTO, the policies were to be 
finalized in July 2005. NASA has yet to provide us with either revised 
draft policies or approved policies. 

9. We disagree. The configuration management plan that NASA provided us 
was a draft version and it did not specifically address architecture 
products. Further, NASA has yet to provide us with configuration 
procedures or processes, and the agency's Deputy CIO/CTO stated that 
the configuration management plan was being developed and that not all 
architecture products were being managed using the automated 
configuration management tool, SharePoint. Similarly, the verification 
and validation report[Footnote 22] on the agency's architecture program 
states that NASA used both a paper-based and repository format for its 
architecture products, and that NASA was only beginning to plan for 
agency-wide use and maintenance of an architecture repository. 
Moreover, NASA has yet to provide us with documentation demonstrating 
that actual changes to architecture products were identified, tracked, 
monitored, documented, reported, and audited. 

10. NASA has yet to provide us with documentation of the metrics it 
developed and used to ensure that progress against architecture plans 
and quality of the products are measured and reported. In addition, 
NASA has yet to provide us with its approved architecture project 
management plans. 

11. See comment 4. Also, at the time of our review, the agency had not 
ensured that architecture products described the enterprise's business 
and data, applications, and technology that support it; ensured that 
the products described the "As Is" environment, the "To Be" 
environment, and a sequencing plan; or ensured that the business, 
performance, data, application, and technology descriptions address 
security. The Deputy CIO/CTO stated that the agency was currently 
developing a plan to address this recommendation. 

12. See comment 10. Also, NASA has yet to provide us with the Office of 
Management and Budget business case submission. In addition, NASA's 
Deputy CIO/CTO stated that return on investment would not be reported 
until the end of fiscal year 2005. 

13. See comment 3. Also, at the time of our review, the Deputy CIO/CTO 
stated that--while the agency recognizes that the architecture should 
be an integral part of the investment management process--the policy 
requiring that all investments be aligned with the architecture was 
still being developed and the associated procedures were in draft 
format. This official also stated that the process for conducting these 
reviews was being revised. 

14. See comment 10. According to the Deputy CIO/CTO, these plans were 
being developed and were to be finalized in May 2005. 

15. Although NASA has begun to implement our recommendations to improve 
its requirements management and cost-estimating processes, we continue 
to believe that a comprehensive corrective action plan would aid NASA 
in its effort to stabilize the system and improve the functionality of 
IFMP. Such a plan should include milestones and provide clear 
accountability for each action not completed in a timely and effective 
manner and, as such, would facilitate the expeditious implementation of 
each of our recommendations. 

16. Our conclusion that many of the system configuration problems 
caused by the agency's ineffective requirements management and testing 
processes continue to plague the core financial module is supported in 
large part by assertions made by NASA's Office of the CFO. In the notes 
to NASA's financial statement for the first and second quarter of 
fiscal year 2005 (October 1, 2004 through March 31, 2005), NASA's 
Office of the CFO disclosed, among other things, the following: 

* The financial management system is not currently designed to 
distinguish between current transactions and corrections to prior year 
transactions posted in the current year. 

* Functionality and configuration problems in SAP created inappropriate 
transactional postings, which resulted in abnormal balances and 
misstatement of unobligated and other balances. 

* The financial system as currently configured is unable to properly 
record Recovery of Prior Year Obligations (upward and downward 
obligation adjustments). 

* The configuration and data integrity issues from fiscal years 2003 
and 2004 continue to cause misstatements in accounts that contain 
trading partner data. This has limited NASA's ability to reconcile and 
resolve differences with trading partners (other federal agencies) and 
to eliminate intra-entity transactions (activity between NASA centers). 

* Data anomalies and abnormalities also caused misstatements in many 
budgetary and proprietary accounts. 

We agree that some of the significant challenges NASA faces in 
obtaining an unqualified opinion on its financial statements do not 
relate to its financial system, but clearly many of these challenges 
stem directly from the core financial system.

17. Two years after we recommended that NASA prepare a detailed plan to 
provide systems that comply with the requirements of FFMIA, according 
to NASA, the agency has begun this effort. However, NASA's FFMIA 
remediation plan is not projected to be completed until December 2005, 
and therefore, we could not review the plan and have no basis to assess 
the quality of the plan. 

18. The two business case analyses (BCA) referred to in NASA's 
response--Labor Distribution System and Contract Management Module-- 
were updated earlier this year, but did not use the new WBS. Further, 
the BCA cost estimates were based on a different life cycle than the 
estimates in the program's life-cycle cost estimate, and the amounts of 
the estimates in the BCAs and the life-cycle cost estimate differ 
substantially. During our fieldwork, IFMP officials told us that the 
BCAs did not support the program's life-cycle cost estimate but rather 
were intended to ensure that each project was well thought out from an 
investment standpoint. As we stated in the report, each of the WBS cost 
estimates provided for the remaining modules in support of the 
life-cycle cost estimate was either incomplete or incorrect, and only 
one of
them was prepared using the new WBS structure. Therefore, we continue 
to believe that IFMP needs to prepare cost estimates for the remaining 
modules using the current WBS. 

19. At the time of our assessment, NASA agreed with our position that 
the recommendations pertaining to utilizing a comprehensive risk 
assessment tool and quantifying the cost impact of risks were partially 
implemented. Since completion of our assessment, according to NASA, it 
has implemented and applied the risk methodology and probabilistic tool 
as the basis for reserves for all elements of IFMP as part of the 
fiscal year 2007 budget cycle, which is referred to in NASA's response 
as this year's budget cycle. As such, NASA considers the recommendation 
closed. We have not reviewed the actions taken by NASA since completion 
of our work, and therefore, have no basis to assess the merits of 
NASA's assertion. We reaffirm that based on the most current 
information available at the time of our assessment, the recommendation 
was partially implemented. 

20. When asked to provide an update on the status of our six 
recommendations intended to mitigate risk associated with relying on 
already-deployed components, NASA officials stated that they had an 
overall risk mitigation strategy related to IFMP that they use for this 
purpose and did not think it necessary to revise their strategy based 
on our recommendations. However, we continue to believe that a 
comprehensive corrective action plan would aid NASA in its effort to 
stabilize the system and improve the functionality of IFMP. Further, 
during the course of our work--including entrance and exit meetings in 
which we discussed each recommendation separately--NASA officials did 
not tell us that the agency reports its progress biannually to the 
House Science Committee. Therefore, we did not request and NASA did not 
provide documentation of its biannual progress briefings. 

21. We considered a recommendation to be partially implemented if the 
documentation provided indicated that NASA had made significant 
progress addressing our concerns. Because NASA was in the very early 
planning stage of implementing our recommendation to reengineer its 
acquisition management process and the details for how NASA would 
accomplish this objective were still vague, we consider this 
recommendation open. 

22. We reaffirm that in order to have an effective regression testing 
program NASA must also develop and properly document requirements. 
Complete, clear, and well-documented requirements are the foundation on 
which an effective testing program is established. Therefore, the 
weaknesses we identified in NASA's core financial module requirements 
impair the quality of NASA's regression testing program. As a result, 
we consider this recommendation partially implemented. 

23. In NASA's detailed response to the recommendation pertaining to the 
audit trail between the WBS estimate and the program's cost estimate, 
NASA agreed with our assessment that the recommendation was partially 
implemented. 

24. At the time of our assessment, NASA agreed with our position that 
the recommendations pertaining to utilizing a comprehensive risk 
assessment tool and quantifying the cost impact of risks were partially 
implemented, but NASA's status table shows them closed because the 
agency has taken additional action to close the recommendations since 
we completed our audit work. We reaffirm that based on the most current 
information available at the time of our assessment, the 
recommendations were partially implemented. 

25. We disagree that several of GAO's recommendations can be 
categorized as "open ended." For the particular example that NASA 
cited, "ensure that IT investments comply with the enterprise 
architecture," closure of this recommendation would require 
documentation showing that a process has been established and that it 
is being followed on a recurring basis. 

26. See comment 6. Also, we state in our report that according to the 
Deputy CIO/CTO, the verification and validation reviews would be 
performed on a recurring basis. However, NASA has yet to provide us 
with either the remediation plan or the evidence referred to in 
enclosure 3 of its comments. 

27. See comments 4 and 11. Also, NASA has yet to provide us with the 
evidence referred to in enclosure 3 of its comments. 

28. NASA has yet to provide us access to its Web site. 

29. See comments 7 and 13. Also, NASA has yet to provide us with the 
remediation plan. 

[End of section] 

(192159): 

FOOTNOTES 

[1] GAO, Business Modernization: Improvements Needed in Management of 
NASA's Integrated Financial Management Program, GAO-03-507 (Washington, 
D.C.: Apr. 30, 2003); Business Modernization: NASA's Integrated 
Financial Management Program Does Not Fully Address Agency's External 
Reporting Issues, GAO-04-151 (Washington, D.C.: Nov. 21, 2003); 
Information Technology: Architecture Needed to Guide NASA's Financial 
Management Modernization, GAO-04-43 (Washington, D.C.: Nov. 21, 2003); 
and Business Modernization: Disciplined Processes Needed to Better 
Manage NASA's Integrated Financial Management Program, GAO-04-118 
(Washington, D.C.: Nov. 21, 2003). 

[2] Pub. L. No. 104-208, div. A, § 101(f), title VIII, 110 Stat. 3009, 
3009-389 (Sept. 30, 1996). 

[3] GAO-03-507. 

[4] According to relevant guidance, a design process includes examining 
alternative technical solutions with the intent of selecting the 
optimum design based on established criteria. These criteria may be 
significantly different across products, depending on product type, 
operational environment, performance and support requirements, and cost 
or delivery schedules. It also includes a decision analysis and 
resolution process to ensure that alternatives are compared and the 
best one is selected to accomplish the goals of all the other processes 
(e.g., requirements development). Effective design processes use design 
patterns (i.e., recurring solutions to software design problems that 
are constantly found in application development) and iterative 
prototyping to establish the preferred design option (system 
architecture). 

[5] Carnegie Mellon University, Software Engineering Institute, 
Capability 
Maturity Model® Integration for Systems Engineering and Software 
Engineering, Version 1.1 (Pittsburgh, Pa.: December 2001); 
Carnegie Mellon University, Software Engineering Institute, The 
Capability Maturity Model: Guidelines for Improving the Software 
Process (Addison Wesley Longman, Inc., 1994); Jonathan Adams, Srinivas 
Koushik, Guru Vasudeva, and George Galambos, Patterns for e-Business: A 
Strategy for Reuse (IBM Press™, 2001); B. Craig Meyers and Patricia 
Oberndorf, Managing Software Acquisition: Open Systems and COTS 
Products (Addison-Wesley, 2001); Jeffrey A. Hoffer, Joey F. George, 
and Joseph S. Valacich, Modern Systems Analysis and Design (Addison 
Wesley Longman, Inc., 1999); and Kurt Wallnau, Scott Hissam, and Robert 
Seacord, Building Systems from Commercial Components (Addison-Wesley, 
2002). 

[6] A risk management process involves identifying potential problems 
before they occur, so that risk-handling activities may be planned and 
invoked as needed across the life of the product or project to mitigate 
adverse impacts on achieving objectives. 

[7] A requirements development and management process involves 
generating product and product-component requirements and managing all 
of the requirements received or generated by the project, including 
both technical and nontechnical requirements, as well as those 
requirements levied on the project by the organization. 

[8] GAO-04-43. 

[9] An enterprise architecture is an organizational blueprint that 
defines--in both business and technology terms--how an organization 
operates today and how it intends to operate in the future; it also 
provides a plan for transitioning to this future state. 

[10] GAO-04-43. 

[11] According to relevant guidance, an effective configuration 
management process consists of four primary elements: (1) configuration 
identification, which includes procedures for identifying, documenting, 
and assigning unique identifiers (e.g., serial number and name) to 
product types generated for the architecture program, generally 
referred to as configuration items; (2) configuration control, which 
includes procedures for evaluating and deciding whether to approve 
changes to a product's baseline configuration, generally accomplished 
through configuration control boards, which evaluate proposed changes 
on the basis of costs, benefits, and risks and decide whether to permit 
a change; (3) configuration status accounting, which includes 
procedures for documenting and reporting on the status of configuration 
items as a product evolves; and (4) configuration auditing, which 
includes procedures for determining alignment between the actual 
product and the documentation describing it, thereby ensuring that the 
documentation used to support the configuration control board's 
decision making is complete and correct. Each of these elements should 
be described in a configuration management plan and implemented 
according to the plan. 

[12] EVM goes beyond the two-dimensional approach of comparing budgeted 
costs to actuals. Instead, it attempts to compare the value of work 
accomplished during a given period with the work scheduled for that 
period. NASA requires EVM reporting and analysis for research and 
development contracts with a total anticipated final value of $70 
million or more, and for production contracts with a total anticipated 
final value of $300 million or more. 

[13] According to the Software Engineering Institute, requirements 
management is a process that establishes a common understanding between 
the customer and the software project manager regarding the customer's 
business needs that will be addressed by a project. A critical part of 
this process is to ensure that the requirements development portion of 
the effort documents, at a sufficient level of detail, the problems 
that need to be solved and the objectives that need to be achieved. 

[14] Pub. L. No. 104-208, div. A, § 101(f), title VIII, 110 Stat. 
3009, 3009-389 (Sept. 30, 1996). 

[15] NASA assumed a 10-year life cycle beginning in fiscal year 2001, 
but the actual retirement date for the system was unknown, according to 
the Deputy Program Director. 

[16] A WBS is a method of organizing a program into logical 
subdivisions at lower and lower levels of detail. 

[17] SEI is a government-funded research organization that is widely 
considered an authority on software implementations. 

[18] Wallnau, Hissam, and Seacord. 

[19] GAO-04-43. 

[20] GAO, Information Technology: A Framework for Assessing and 
Improving Enterprise Architecture Management, Version 1.1, GAO-03-584G 
(Washington, D.C.: April 2003). 

[21] GAO, Information Technology: Architecture Needed to Guide NASA's 
Financial Management Modernization, GAO-04-43 (Washington, D.C.: Nov. 
21, 2003). 

[22] SRA International, Inc., National Aeronautics and Space 
Administration, NASA EA IV&V Report (Jan. 10, 2005).