Comments on the Office of Personnel Management's February 20, 2008 Report to Congress Regarding the Retirement Systems Modernization
GAO-08-576R: Published: Mar 28, 2008. Publicly Released: Mar 28, 2008.
- Accessible Text:
The Office of Personnel Management (OPM) is modernizing the paper-intensive processes and antiquated information systems it uses to support the retirement of civilian federal employees through the Retirement Systems Modernization (RSM) program. In January 2008, we reported on the agency's management of this program, noting concerns and making recommendations for improvement in four key areas: (1) system testing, (2) system defect resolution, (3) program cost estimation, and (4) program earned value management. The explanatory statement of the House Appropriations Committee regarding the fiscal year 2008 Consolidated Appropriations Act directed OPM to submit to Congressional Committees and to GAO, not later than February 20, 2008, a report of its actions on the four areas of concern that we identified. Further, the explanatory statement directed that GAO provide to Congressional Committees and to OPM our comments on the agency's report.
Our study determined that OPM's initial RSM system test results did not provide assurance that a major system component--the Defined Benefits Technology Solution (DBTS)--would perform as intended, and that the agency's compressed and concurrent testing schedule increased the risk that it would not have sufficient resources or time to verify that the system would work as intended. We recommended that OPM ensure that sufficient resources are provided to fully test functionality, actions for mitigating the risks inherent in concurrent testing are identified, test results verify that all system components perform as expected, and test activities and results are subjected to independent verification and validation. OPM's report did not address the results of other critical tests that the agency had planned to conduct starting in December 2007 and ending in February 2008. Specifically, the report did not discuss the results of the following tests: (1) the parallel test, to verify that the new system produces the same results as existing systems; (2) the integrated product test, to confirm that system components meet functional requirements; (3) the performance test, to confirm that the new system meets performance requirements (e.g., processing volume and execution time); and (4) the business capability test, to confirm the operational readiness of the new system for end users.
Our study of RSM also determined that trends in identifying and resolving system defects indicated a growing backlog of problems to be resolved prior to deployment of the new system. We recommended that the agency monitor and review DBTS defects to ensure that all urgent and high-priority defects are resolved prior to system deployment and that the resolution of urgent and high-priority defects is subjected to independent verification and validation. OPM reported progress toward resolving defects between October 2007 and January 2008.
However, the agency did not report actual defect resolution data for February 2008; instead, it reported the projected defects it expected to remain at the time of system deployment.
Our study determined that the reliability of OPM's $421.6 million RSM life-cycle cost estimate was questionable because the agency could not support the estimate with a description of the system to be developed or a description of the methodology used to produce the estimate. We recommended that the agency develop a revised RSM cost estimate that addresses the weaknesses identified and task an independent verification and validation contractor with reviewing the process used to develop the estimate and assessing the reliability of the resulting estimate. OPM's report did not provide new information or describe the progress the agency has made in addressing the weaknesses in the RSM cost estimate, nor did it address conducting independent verification and validation of its cost estimate and the process used to develop the estimate.
Our study determined that OPM's reporting of program progress using Earned Value Management (EVM) was unreliable because the agency did not establish and validate a meaningful performance measurement baseline. We recommended that the agency establish a basis for effective use of EVM by validating the RSM performance measurement baseline through a program-level integrated baseline review, and that it task an independent verification and validation contractor with reviewing the process used to develop the baseline and assessing the reliability of the performance measurement baseline. OPM's report did not provide new information or describe the progress the agency has made toward addressing the three specific weaknesses that we identified in its use of EVM. Without addressing these weaknesses, OPM's use of EVM will continue to be unreliable.