Recommendations Database
GAO’s recommendations database contains report recommendations that still need to be addressed. GAO’s priority recommendations are those that we believe warrant priority attention. We sent letters to the heads of key departments and agencies, urging them to continue focusing on these issues. Below you can search only priority recommendations, or search all recommendations.
Our recommendations help congressional and agency leaders prepare for appropriations and oversight activities, as well as help improve government operations. Moreover, when implemented, some of our priority recommendations can save large amounts of money, help Congress make decisions on major issues, and substantially improve or transform major government programs or agencies, among other benefits.
As of October 25, 2020, there are 4,812 open recommendations, of which 473 are priority recommendations. Recommendations remain open until they are designated as Closed-implemented or Closed-not implemented.
Browse or Search Open Recommendations
Have a Question about a Recommendation?
- For questions about a specific recommendation, contact the person or office listed with the recommendation.
- For general information about recommendations, contact GAO's Audit Policy and Quality Assurance office at (202) 512-6100 or apqa@gao.gov.
Results:
Subject Term: "Computer assisted instruction"
GAO-20-154, Nov 14, 2019
Phone: (202) 512-3489
Agency: Department of Defense: Department of the Navy
Status: Open
Comments: The Navy concurred with our recommendation. In March 2020, the Navy provided an estimated implementation date of March 2023, noting that it was considering a fleet-wide survey, timed for a later date when more Surface Warfare Officers have completed new training courses and implemented their training. In addition, the Navy listed other means it employs to collect feedback, such as student surveys at the end of training courses, leadership visits and conferences, and Commanding Officer updates. Collecting fleet-wide feedback from all Surface Warfare Officers and analyzing trends in that feedback remain critical to helping the Navy understand the value of its training programs at various career stages and in the diverse operating environments across the fleet.
Agency: Department of Defense: Department of the Navy
Status: Open
Comments: The Navy concurred with our recommendation. In March 2020, the Navy provided an estimated implementation date of March 2023, noting that it was conducting the planned fleet-wide Officer of the Deck competency checks in 2020, and that it intends to use a system of ten career milestone assessments for future performance measurement. The Navy stated that it may or may not hold subsequent rounds of the Officer of the Deck competency assessments, depending on performance indicated in other career milestone assessments. In our report we identified the importance of continuing the current Officer of the Deck competency assessments through at least 2024, because that is when new officers who complete the full set of new initial ship-driving training courses will be eligible for assessment. The Navy used the Officer of the Deck competency assessments in 2018 to establish a performance baseline, and we believe that the Navy should apply the same standard to measure performance changes for Surface Warfare Officers who complete new training courses moving forward.
Agency: Department of Defense: Department of the Navy
Status: Open
Comments: The Navy concurred with our recommendation. In March 2020, the Navy provided an estimated implementation date of March 2023. However, in its official comments on the report and in subsequent correspondence, the Navy indicated that its existing policies already meet the intent of the recommendation. Specifically, the Navy stated that its Officer of the Deck Underway Personnel Qualification Standards provide standard evaluation criteria for Officer of the Deck qualification. In our report, we noted that while the Personnel Qualification Standards provide a common list of required experiences, they do not provide a common understanding of proficiency in completing these experiences. Instead, proficiency determination is left to the discretion of the ship's Commanding Officer, which has led to wide variation in ship-driving proficiency across the fleet. Therefore, we continue to believe that the Navy should provide Commanding Officers with standard criteria to inform their evaluation of candidates for their Officer of the Deck qualification and incorporate these criteria into surface fleet guidance.
Agency: Department of Defense: Department of the Navy
Status: Open
Comments: The Navy concurred with our recommendation. In March 2020, the Navy provided an estimated implementation date of March 2023. In official comments on the report and in subsequent correspondence, the Navy stated that its Surface Warfare Career Manual establishes guidance for the implementation and use of the Mariner Skills Logbook, and that the logbook will contribute information to allow proficiency trend analysis over time. However, while the Surface Warfare Career Manual identifies the offices responsible for logbook activities, it does not include a specific plan for the use of logbook data to analyze proficiency trends over time or to benefit individual officers. Our recommendation that the Navy develop a plan to analyze and use Surface Warfare Mariner Skills Logbook data to aid decision-making remains valid and, when implemented, should assist the Navy in determining the relationship between Surface Warfare Officer experience and ship-driving proficiency.
GAO-16-636, Aug 16, 2016
Phone: (202) 512-5431
Agency: Department of Defense
Status: Open
Comments: The Department of Defense (DOD) concurred with this recommendation. As of December 2019, the Army had taken some steps to improve its guidance, as GAO recommended in August 2016, but did not plan to fully address the recommendation until 2021. Officials stated that the Army established target usage rates for existing virtual training devices and issued guidance and tracking tools for recording device usage. However, the Army had not modified the guidance, cited in GAO's August 2016 report, to require that training developers consider the amount of time available to train with or expected usage rates of new virtual training devices. According to Army officials, they will implement GAO's recommendation in a planned update to guidance on the justification and validation of new virtual training devices scheduled for 2021. By updating this guidance, the Army will have the information it requires to evaluate the amount of virtual training capabilities needed to achieve training tasks and proficiency goals during operational training.
GAO-13-698, Aug 22, 2013
Phone: (202) 512-9619
Agency: Department of Defense
Status: Open
Comments: As of August 18, 2014, the Army and Marine Corps actions for this recommendation are ongoing, and the recommendation remains open. On June 14, 2014, the DOD Inspector General reported in the Defense Audit Management Information System that "the Office of the Deputy Assistant Secretary of Defense (Readiness) developed a decision algorithm to determine which military tasks could be taught virtually and which military tasks should only be taught in classroom or field environments (i.e., live). The algorithm was provided to the Services for peer-review and possible implementation. The Army is reviewing its progressive training models through a process called Training Summit IV (TS IV). These models establish how virtual and constructive based training is integrated with live training to optimize training readiness. The TS IV will include training model review by proponent schools, as well as a cross-section of unit commanders and leaders. This effort will be completed in Fiscal Year (FY) 2014 and presented for validation and G-3/5/7 approval at the Army Training General Officer Steering Committee in November 2014. Also, the Marine Corps initiated a request for an internal servicewide study of existing and potential approaches to this topic (4th Quarter FY 2013). The initial focus is in determining how metrics can be better used to assess the impact of simulation based on meeting Marine Corps Training Standards. Furthermore, a targeted study began in the 1st Quarter FY 2014 and is focused initially on enhancing the methodology for assessing individual based simulators against Training and Readiness (T&R) Standards. In FY 2015, the study results will shape policy on how future T&R manuals will identify the appropriateness of simulators and simulations for training."
Agency: Department of Defense
Status: Open
Comments: As of August 18, 2014, the Army and Marine Corps actions for this recommendation are ongoing, and the recommendation remains open. On June 14, 2014, the DOD Inspector General reported in the Defense Audit Management Information System that "the Office of the Deputy Assistant Secretary of Defense (Readiness) has coordinated with the Army and Marine Corps to identify standard approaches to capture costs and cost benefit analysis that could be used DoD-wide. The Army has undertaken a "cost of training" analysis that is an on-going action to determine cost of readiness and/or training. One area of concentration is to look at the "Other Burdened Resources Required for Training Readiness." This area is further broken down into two areas: Investment/Modernization and Installation Services. The Investment/Modernization area will look at Non-System Training Aids, Devices, Simulators and Simulations while Installation Services will look at Post Deployment Software Support. In addition, the Army is gathering data to validate an existing model developed by the Simulations to Mission Command Interoperability Director (Program Executive Office for Simulation, Training, and Instrumentation) for the Value of Simulation Study consisting of five phases: Phase one focused on development of a working methodology to assess both quantitative and qualitative value of simulations used to support collective training (completed). Phase two is currently gathering data for model validation. Phase three will be an expansion to other simulation capabilities. Phase four is data gathering and validation. Phase five is expanded testing/methodology use case study/validation for return on investment use. The Marine Corps established a study, described in response to Recommendation 1, which will evaluate and propose the initial cost factors not currently captured during Programming yet would be relevant in determining the appropriate mix of live and simulated training. The initial results are expected in FY 2015."