Scope and Methodology
To determine GOES-R acquisition status, we evaluated various programmatic and technical plans, management reports, and other program documentation. We reviewed the cost and schedule estimates (including launch dates), planned system requirements, and monthly executive-level management briefings. We also interviewed agency officials from NOAA and NASA to determine key dates for future GOES-R acquisition efforts and milestones and progress made on current development efforts. Furthermore, we analyzed the earned value data on development efforts contained in contractor performance reports obtained from the program. To perform this analysis, we compared the cost of work completed with budgeted costs for scheduled work for a 12-month period to show trends in cost and schedule performance. To assess the reliability of the cost data, we compared them with other available supporting documents (including monthly program management reviews); electronically tested the data to identify obvious problems with completeness or accuracy; and interviewed program officials about the data. For the purposes of this report, we determined that the cost data were sufficiently reliable. We did not test the adequacy of the agency or contractor cost-accounting systems.
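The earned value comparison described above follows the standard cost and schedule variance calculations (cost variance = cost of work performed minus actual cost; schedule variance = cost of work performed minus cost of work scheduled). A minimal sketch of that calculation, using hypothetical monthly figures rather than actual GOES-R contractor data:

```python
# Illustrative earned value analysis. Monthly figures below are
# hypothetical placeholders, not actual GOES-R contractor data.
# BCWS = budgeted cost of work scheduled
# BCWP = budgeted cost of work performed (earned value)
# ACWP = actual cost of work performed

monthly = [
    # (month, BCWS, BCWP, ACWP), in millions of dollars (hypothetical)
    ("Oct", 10.0, 9.5, 10.2),
    ("Nov", 11.0, 10.1, 11.4),
    ("Dec", 12.0, 11.0, 12.6),
]

def variances(bcws, bcwp, acwp):
    """Return (cost variance, schedule variance).

    A negative cost variance means completed work cost more than
    budgeted; a negative schedule variance means less work was
    completed than was scheduled.
    """
    return bcwp - acwp, bcwp - bcws

for month, bcws, bcwp, acwp in monthly:
    cv, sv = variances(bcws, bcwp, acwp)
    print(f"{month}: cost variance {cv:+.1f}M, schedule variance {sv:+.1f}M")
```

Plotting these variances month over month is one way to show the 12-month cost and schedule trends the analysis refers to.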
To evaluate whether NOAA has established adequate contingency plans, we analyzed relevant continuity planning documentation, agreements with international partners, and meeting reports from the Coordination Group for Meteorological Satellites. In addition, we compared NOAA's continuity of operations plans to federal policy and industry best practices to determine the extent to which the plans will ensure the continuity of critical functions related to geostationary satellites in the event of a satellite failure. We met with NOAA officials responsible for continuity of operations planning and coordination with international partners, as well as GOES data users within NOAA and at other federal agencies to determine the potential impact of NOAA's plans on their data needs.
To determine the adequacy of NOAA's efforts to identify GOES users, prioritize their data needs, and communicate program status, we analyzed relevant program documents, including acquisition plans, user requirements, and GOES user group meeting minutes. We compared NOAA's efforts to industry best practices to determine the extent to which users were appropriately identified and involved in program activities. We also interviewed key users of GOES data to determine whether NOAA's efforts to identify and prioritize their data needs and communicate program status and changes were adequate. In consultation with NOAA officials, we identified key GOES users at organizations within NOAA and other federal agencies that depend on GOES data for their primary mission. We selected three organizations within NOAA that are primarily responsible for environmental satellite data acquisition, processing and exchange, and environmental research. These organizations include the National Weather Service, National Environmental Satellite, Data and Information Service, and the Office of Oceanic and Atmospheric Research. We also identified federal government users outside of NOAA with the largest funding levels for meteorological operations in fiscal year 2009. These agencies were the Department of Defense and the Department of Transportation (including the Federal Aviation Administration). On the basis of discussions with GOES-R program officials and the Office of the Federal Coordinator for Meteorology, we then selected additional federal agencies that rely extensively on GOES data to meet their mission requirements. These agencies include the Department of the Interior (including the U.S. Geological Survey and Bureau of Reclamation), and the U.S. Department of Agriculture (including the U.S. Forest Service).
We primarily performed our work at the Department of Defense, Department of the Interior, Department of Transportation, NOAA, NASA, and U.S. Department of Agriculture offices in the Washington, D.C., metropolitan area. In addition, we conducted work at Department of Defense weather agencies at Offutt Air Force Base, Nebraska, and Stennis Space Center, Mississippi. We conducted this performance audit from October 2009 to September 2010, in accordance with generally accepted government auditing standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives.
The Coordination Group for Meteorological Satellites is a forum for the international exchange of technical information on geostationary and polar orbiting meteorological satellite systems.
Department of Homeland Security, Federal Continuity Directive 1: Federal Executive Branch National Continuity Program and Requirements (February 2008); and Software Engineering Institute, Capability Maturity Model® Integration for Acquisition, Version 1.2, CMU/SEI-2007-TR-017 (Pittsburgh, Pa.: November 2007).
Software Engineering Institute, Capability Maturity Model® Integration for Acquisition, Version 1.2, CMU/SEI-2007-TR-017 (Pittsburgh, Pa.: November 2007).
Office of the Federal Coordinator for Meteorology, The Federal Plan for Meteorological Services and Supporting Research, Fiscal Year 2009, FCM-P1-2008 (Washington, D.C.: October 2008).