
Military Readiness: DOD Needs to Strengthen Management and Oversight of the Defense Readiness Reporting System

GAO-09-518 Published: Sep 25, 2009. Publicly Released: Sep 25, 2009.

Highlights

The Department of Defense (DOD) reports data about the operational readiness of its forces. In 1999, Congress directed DOD to create a comprehensive readiness system with timely, objective, and accurate data. In response, DOD began developing the Defense Readiness Reporting System (DRRS). After 7 years, DOD has incrementally fielded some capabilities and, through fiscal year 2008, reported obligating about $96.5 million. GAO was asked to review the program, including the extent to which DOD has (1) effectively managed and overseen DRRS acquisition and deployment and (2) implemented features of DRRS consistent with legislative requirements and DOD guidance. GAO compared DRRS acquisition disciplines, such as requirements development, test management, and DRRS oversight activities, to DOD and related guidance, and reviewed the system's current and intended capabilities relative to legislative requirements and DOD guidance. We did not evaluate DOD's overall ability to assess force readiness or the extent to which readiness data reflect capabilities, vulnerabilities, or performance issues.

Recommendations

Recommendations for Executive Action

Agency Affected: Department of Defense
Recommendation: To address the risks facing DOD in its acquisition and deployment of DRRS, and to increase the chances of DRRS meeting the needs of the DOD readiness community and Congress, the Secretary of Defense should direct the Deputy Secretary of Defense, as the Chair of the Defense Business Systems Management Committee, to reconsider the committee's recent approval of DRRS planned investment for fiscal years 2009 and 2010, and convene the Defense Business Systems Management Committee to review the program's past performance and the DIO's capability to manage and deliver DRRS going forward.
Status: Closed – Implemented
DOD did not agree with this recommendation and noted that the Under Secretary of Defense for Personnel and Readiness (USD P&R) would ensure that the program was compliant with all acquisition requirements prior to future certification requests from the program. While the Defense Business Systems Management Committee was not convened to review DRRS' past performance or its related certified investments, the DRRS investment was approved through the investment certification process led by the Human Resource Management Investment Review Board (HRM IRB) for fiscal years 2009 and 2010 and again in March 2012. In August 2012, the DRRS program was officially realigned from the Business Mission Area to the Warfighter Mission Area and was no longer subject to Defense Business Systems Management Committee (DBSMC) review for certification. As a result of this realignment, the DBSMC is no longer required to oversee the investment. However, because the program was subject to review by an independent entity, which was the intent of our recommendation, we are closing this recommendation as implemented.
Agency Affected: Department of Defense
Recommendation: To fully inform this Defense Business Systems Management Committee review, the Secretary of Defense should direct the Deputy Secretary to have the Director of the Business Transformation Agency, using the appropriate team of functional and technical experts and the established risk assessment methodology, conduct a program risk assessment of DRRS, and to use the findings in our report and the risk assessment to decide how to redirect the program's structure, approach, funding, management, and oversight. In this regard, the Secretary should direct the Deputy Secretary to solicit the advice and recommendations of the DRRS Executive Committee.
Status: Closed – Not Implemented
DOD agreed that a program risk assessment would be constructive and stated that the DRRS Executive Committee (DEXCOM) would continue to provide senior-level review of the DRRS implementation effort. In addition, the Business Transformation Agency was invited to perform an appropriate risk assessment of the DRRS effort by mid-2010. However, program officials noted that the Business Transformation Agency was disestablished before an independent risk assessment could be made. They also noted that risk mitigation is a fundamental task of the program's governance body. We were able to review minutes from two staff meetings in which risk and/or risk mitigation activities were discussed. Further, the Acquisition Strategy includes a brief description of risk management and refers to a risk management plan, but no separate plan was provided. Program officials also noted that they do not maintain a risk register. While the program has multiple ways to discuss and assess risk, it does not have a documented risk assessment methodology, and it did not take steps to conduct a complete program risk assessment.
Agency Affected: Department of Defense
Recommendation: The Secretary of Defense, through the appropriate chain of command, should take steps to ensure that DRRS requirements are effectively developed and managed with appropriate input from the services, Joint Staff, and combatant commanders, including (1) establishing an authoritative set of baseline requirements prior to further system design and development; (2) ensuring that the different levels of requirements and their associated design specifications and test cases are aligned with one another; and (3) developing and instituting a disciplined process for reviewing and accepting changes to the baseline requirements in light of estimated costs, benefits, and risk.
Status: Closed – Not Implemented
DOD did not agree with this recommendation. In its comments on our report, the department stated that the program had an authoritative set of baseline requirements established with an effective governance process. We did not agree because 530 additional requirements had been identified and were still in the review process, without a decision about whether they should be included. Since our report was issued, the DRRS program has developed a process for reviewing and accepting changes to requirements through its governance structure and acquisition strategy. However, it has not taken steps to ensure that the system's requirements have been developed, documented, and managed effectively and with input from appropriate stakeholders. Specifically, the DRRS acquisition approach describes overall management of requirements and the program documents capability requirements; however, the program has not baselined these requirements and has not developed a traceability matrix that would allow individual requirements to be traced to system capabilities, business needs, and their associated design specifications to ensure that requirements are implemented as intended.
Agency Affected: Department of Defense
Recommendation: The Secretary of Defense, through the appropriate chain of command, should take steps to ensure that DRRS testing is effectively managed, including (1) developing test plans and procedures for each system increment test event that include a schedule of planned test activities, defined roles and responsibilities, test entrance and exit criteria, test defect management processes, and metrics for measuring test progress; (2) ensuring that all key test events are conducted on all DRRS increments; (3) capturing, analyzing, reporting, and resolving all test results and test defects of all developed and tested DRRS increments; and (4) establishing an effective test management structure that includes assigned test management roles and responsibilities, a designated test management lead and a supporting working group, and a reliable schedule of test events.
Status: Closed – Implemented
DOD did not concur with this recommendation, and in comments on our report the department noted that DRRS testing was already in place and performing effectively. However, we disagreed and stated that the program had not followed a rigorous testing regimen that included documented test plans, cases, and procedures, and that the program could not produce documentation for all testing that it said had already been conducted. In July 2013, DRRS officials forwarded a Test and Evaluation Master Plan (TEMP) that was approved in October 2012. The TEMP includes key performance parameters; a process for reporting and categorizing deficiencies found during testing; a test and evaluation strategy; a high-level integrated test program schedule; a brief description of developmental test objectives, system integration testing, system assessment testing, and operational testing objectives; and a brief description of test and evaluation management that assigns responsibility for testing to the Test and Evaluation Manager, who reports to the DRRS Implementation Office director. Further, program officials provided a developmental test report that includes results against the test objectives and key performance parameters. In addition, there was a problem report summary with the status and severity of problems identified. While we were not provided individual test plans and procedures, the detail in the TEMP and the provided reports show that DOD is taking steps to effectively manage its testing process for the latest increment and has thus met the intent of the recommendation.
Agency Affected: Department of Defense
Recommendation: The Secretary of Defense, through the appropriate chain of command, should take steps to ensure that the DRRS integrated master schedule is reliable, including ensuring that the schedule (1) captures all activities from the work breakdown structure, including the work to be performed and the resources to be used; (2) identifies the logical sequencing of all activities, including defining predecessor and successor activities; (3) reflects whether all required resources will be available when needed and their cost; (4) ensures that all activities and their duration are not summarized at a level that could mask critical elements; (5) achieves horizontal integration in the schedule by ensuring that all external interfaces (hand-offs) are established and interdependencies among activities are defined; (6) identifies float between activities by ensuring that the linkages among all activities are defined; (7) defines a critical path that runs continuously to the program's finish date; (8) incorporates the results of a schedule risk analysis to determine the level of confidence in meeting the program's activities and completion date; and (9) includes the actual start and completion dates of work activities performed so that the impact of deviations on downstream work can be proactively addressed.
Status: Closed – Implemented
DOD did not concur with this recommendation, and in comments on our report the department noted that the DRRS integrated master schedule was current, reliable, and met all of the criteria outlined in our recommendation. However, we disagreed and noted that DRRS' April 2009 schedule did not, among other things, establish a critical path for all key activities or include a schedule risk analysis. In July 2013, DOD provided screenshots of the integrated master schedule for the DRRS program. We were not able to verify that the DRRS integrated master schedule meets all requirements of a schedule as described in our recommendation because the documentation format did not allow us to perform a complete analysis and clearly identify predecessor and successor activities or identify the critical path for the program. However, in July 2013, program officials noted that developmental testing occurred in early 2013 and operational testing was completed in May 2013. The officials also noted that they expect to achieve full operating capability by December 13, 2013, with fielding in January 2014. This is consistent with both the current schedule from DRRS and the schedule that was reviewed in our original report. Since the program appears to be substantially complete and on schedule for full operating capability, we consider this recommendation closed.
Agency Affected: Department of Defense
Recommendation: The Secretary of Defense, through the appropriate chain of command, should take steps to ensure that the DRRS program office is staffed on the basis of a human capital strategy that is grounded in an assessment of the core competencies and essential knowledge, skills, and abilities needed to perform key DRRS program management functions, an inventory of the program office's existing workforce capabilities, and an analysis of the gap between the assessed needs and the existing capabilities.
Status: Closed – Not Implemented
DOD stated that the DRRS program planned to increase the number of civilian billets in the DRRS implementation office. An evaluation concluded that 5 GS-15 and 4 GS-13 billets were needed; these billets included testing, security, operations, and systems engineering personnel intended to provide greater government oversight of the program. The department had planned to convert contractor billets to civilian billets during the 2010/2011 timeframe as part of a DOD in-sourcing initiative. However, only one position was filled before department-wide restrictions and manning priorities precluded hiring any other government civilians. As of July 2013, 8 of the 9 positions, which were considered critical to the successful implementation of the program, remained filled with contractors rather than government civilians.
Agency Affected: Department of Defense
Recommendation: The Secretary, through the appropriate chain of command, should take steps to ensure that DRRS is developed and implemented in a manner that does not increase the reporting burden on units and addresses the timeliness, precision, and objectivity of metrics that are reported to Congress.
Status: Closed – Implemented
After our report was issued in 2009, the Marine Corps, like the Army and Navy before it, developed its own version of the DRRS system, DRRS-MC. Now, Marine Corps units, like Army and Navy units, enter their data directly into a single system (DRRS-MC), and the data are then transferred to the strategic DRRS system (DRRS-S). Air Force units report their DRRS data directly into DRRS-S, and the Air Force has tested functionality that automatically populates data from other authoritative databases as well as functionality that automatically calculates ratings. Going forward, the DRRS system will maintain both traditional resource and training metrics and newer capability assessments. Taken together, the two sets of data provide more precision than either set alone. As the services continue to add and refine information concerning required conditions and standards, the objectivity of their mission-essential task capability ratings will continue to increase. Finally, as the automatic population and calculation functionality of the DRRS family of systems continues to expand, the timeliness of DRRS information will continue to improve. When our report was issued in 2009, DRRS was projected to reach its full operational capability in 2014. As of July 2013, DRRS program officials were projecting that the system would reach full operational capability in December 2013.
Agency Affected: Department of Defense
Recommendation: To ensure that these and other DRRS program management improvements and activities are effectively implemented and that any additional funds for DRRS implementation are used effectively and efficiently, the Secretary of Defense should direct the Deputy Secretary to ensure that both the Human Resources Management Investment Review Board and the DRRS Executive Committee conduct frequent oversight activities of the DRRS program, and report any significant issues to the Deputy Secretary.
Status: Closed – Implemented
DOD stated that the component acquisition executive (the Under Secretary of Defense for Personnel and Readiness) would provide certifications of compliance at appropriate times to the Human Resources Management Investment Review Board and the Deputy Chief Management Officer. In addition, it stated that the existing DRRS Executive Committee governance process would continue to provide sustained functional oversight of the DRRS program and that any issues that arise would be elevated for review as appropriate. The DRRS program was transferred to the Warfighter Mission Area in July 2012. This action eliminated the requirement for DRRS to be reviewed by the Defense Business Systems Management Committee, and the program was not required to be certified for funding in fiscal year 2013. Since December 2011, the DRRS Executive Committee has increased its oversight of the program, meeting more frequently than in the past, making decisions, and providing direction to the DRRS Implementation Office (DIO) rather than just receiving briefings from the DIO. Significant issues that are raised to the DRRS Executive Committee are communicated to the Deputy Secretary through the Deputy Secretary's Management Advisory Group, which focuses on readiness issues on a regular basis. The co-chairs of the DRRS Executive Committee are members of the Deputy Secretary's Management Advisory Group.


Topics

Accountability, Agency missions, Combat readiness, Data collection, Decision making, Defense capabilities, Human capital planning, Information systems, Legacy systems, Management information systems, Military forces, Program management, Reporting requirements, Staff utilization, Strategic information systems planning, Systems design, Systems testing