Quality of DOD Status of Forces Surveys Could Be Improved by Performing Nonresponse Analysis of the Results
GAO-10-751R: Published: Jul 12, 2010. Publicly Released: Jul 12, 2010.
The Defense Manpower Data Center (DMDC) conducts a series of Web-based surveys called Status of Forces surveys, which enable decision makers within the Department of Defense (DOD) to (1) evaluate existing programs and policies, (2) establish baselines before implementing new programs and policies, and (3) monitor the progress of programs and policies and their effects on the total force. In recent years, we have discussed the results of these surveys in several of our reports. While we have generally found the survey results to be sufficiently reliable for the purposes of our reporting, several of our reports have discussed low response rates and the potential for bias in the survey results. Nonresponse analysis is an established practice in survey research that helps determine whether nonresponse bias (i.e., survey results that do not accurately reflect the population) might occur due to under- or overrepresentation of some respondents' views on survey questions. When nonresponse analysis is performed, survey researchers can use the results to select and adjust the statistical weighting techniques they use to help ensure that survey results accurately reflect the survey population. Because we have noted, in reports referring to the Status of Forces surveys, the potential for bias and because of DMDC's role in supporting DOD decision making, we initiated this review under the Comptroller General's statutory authority to conduct evaluations on his own initiative. Specifically, our objective was to determine the extent to which DMDC performs nonresponse analysis of the results of its Status of Forces surveys to determine whether reported results of respondents' views might be under- or overrepresented.
To address our objective, a team that included GAO social science analysts with survey research expertise and GAO's Chief Statistician (1) reviewed relevant documentation provided by DMDC regarding the survey methods used for the Status of Forces surveys, (2) interviewed DMDC survey officials who had knowledge of or were involved in the development and administration of the surveys, and (3) reviewed the response rates for the Status of Forces surveys conducted since 2003. We conducted this performance audit between November 2009 and May 2010 in accordance with generally accepted government auditing standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives.
Although DMDC has in the past conducted some research to assess and monitor the effects of nonresponse bias in its Status of Forces surveys, it lacks guidance specifying when and how additional analysis of the results of its Status of Forces surveys should be performed in order to determine the extent of differences between survey respondents and nonrespondents. Leading survey research professional organizations, such as the American Association for Public Opinion Research, recognize nonresponse analysis as a sound method for assessing whether nonresponse bias might cause under- or overrepresentation of respondents' views on survey questions. Further, survey research guidelines issued by the Office of Management and Budget state that nonresponse analysis should be performed when a survey's response rate is below 80 percent, so as to identify the possibility of bias in a survey's results. Although these guidelines are not mandated for internal personnel surveys such as the Status of Forces surveys, as we have previously reported, they reflect generally accepted best practices in the field of survey research and are relevant for the purposes of assessing whether the results of a survey are representative of the population being surveyed. In addition to our prior work discussing low response rates and the potential for bias in the Status of Forces surveys, we have also noted the need for caution when interpreting the results of federal surveys with low response rates. In our review of the various Status of Forces surveys conducted since 2003, we found that the response rates have been between 28 percent and 40 percent for the Status of Forces Active Duty Survey; between 25 percent and 42 percent for the Status of Forces Reserve Survey; and between 55 percent and 64 percent for the Status of Forces Survey of Civilian Employees.
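The nonresponse analysis described above typically begins by comparing respondents and nonrespondents on characteristics already known from the sampling frame (administrative personnel records), since large differences in response rates across subgroups signal potential bias. The following sketch illustrates that comparison in simplified form; the data, field names, and pay-grade categories are hypothetical and do not reflect DMDC's actual files or methods.

```python
from collections import Counter

def response_rate_by_group(frame, group_key):
    """Compute response rates within subgroups of a sampling frame.

    Markedly different rates across subgroups suggest that survey
    estimates may under- or overrepresent those groups' views unless
    the weights are adjusted.
    """
    totals = Counter()
    responded = Counter()
    for record in frame:
        group = record[group_key]
        totals[group] += 1
        if record["responded"]:
            responded[group] += 1
    return {g: responded[g] / totals[g] for g in totals}

# Hypothetical frame records: junior enlisted respond less often
# than junior officers in this illustrative example.
frame = [
    {"pay_grade": "E1-E4", "responded": False},
    {"pay_grade": "E1-E4", "responded": False},
    {"pay_grade": "E1-E4", "responded": True},
    {"pay_grade": "O1-O3", "responded": True},
    {"pay_grade": "O1-O3", "responded": True},
    {"pay_grade": "O1-O3", "responded": False},
]
print(response_rate_by_group(frame, "pay_grade"))
```

In practice, such comparisons would be run on full administrative data and supplemented with statistical tests, but the disparity in subgroup response rates is the core signal a nonresponse bias study looks for.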
While response rates alone are not sufficient indicators for determining the quality of survey results, we note--and DMDC survey officials recognize--that the Status of Forces surveys have had generally low response rates as compared with some other federal surveys. By not performing nonresponse analysis to identify the possibility of nonresponse bias in the results of its various Status of Forces surveys, DMDC survey officials may not have the information needed to adjust their statistical weighting techniques to help ensure that their survey results reflect the population being surveyed. As mentioned previously, DMDC lacks guidance specifying when and how agency staff should assess the results of the Status of Forces surveys for nonresponse bias. Further, we found that since DMDC last conducted research on nonresponse bias in its Status of Forces surveys--in a study it conducted in 2007--DMDC has taken no steps to strengthen its understanding of the effects of nonresponse bias, even though its study noted that performing nonresponse analysis should be a priority for the agency. To better determine the effects of nonresponse bias on the Status of Forces survey results, we recommend that the Secretary of Defense direct the Director of DMDC to develop and implement guidance both for conducting nonresponse analysis and for using the results of nonresponse analysis to inform DMDC's statistical weighting techniques, as part of the collection and analysis of the Status of Forces survey results.
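One common way findings from a nonresponse analysis feed into statistical weighting is a weighting-class adjustment: within each adjustment class, respondents' base weights are inflated by the inverse of the class response rate so that respondents also represent the nonrespondents in their class. The sketch below illustrates this adjustment with hypothetical records and field names; it is not a description of DMDC's actual weighting procedures.

```python
def adjust_weights(records):
    """Weighting-class nonresponse adjustment.

    Each respondent's base weight is multiplied by the ratio of the
    class's total base weight to its respondents' base weight, so the
    adjusted weights still sum to the full sampled total.
    """
    totals, resp_totals = {}, {}
    for r in records:
        c = r["cls"]
        totals[c] = totals.get(c, 0.0) + r["base_weight"]
        if r["responded"]:
            resp_totals[c] = resp_totals.get(c, 0.0) + r["base_weight"]
    adjusted = []
    for r in records:
        if r["responded"]:
            factor = totals[r["cls"]] / resp_totals[r["cls"]]
            adjusted.append({**r, "weight": r["base_weight"] * factor})
    return adjusted

# Hypothetical records: class A has a 50 percent response rate,
# class B a 100 percent rate, with equal base weights.
records = [
    {"cls": "A", "base_weight": 10.0, "responded": True},
    {"cls": "A", "base_weight": 10.0, "responded": False},
    {"cls": "B", "base_weight": 10.0, "responded": True},
    {"cls": "B", "base_weight": 10.0, "responded": True},
]
print(adjust_weights(records))
```

Here the class A respondent's weight doubles to cover the nonrespondent, while class B weights are unchanged; the adjusted weights preserve the total weight of the original sample.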
Recommendation for Executive Action
Status: Closed - Implemented
Comments: DOD concurred with our recommendation. In February 2018, DOD's Office of People Analytics (OPA, formerly the Defense Manpower Data Center (DMDC)) indicated that it began conducting nonresponse bias studies in 2010 in response to our recommendation. Since that time, the office has conducted approximately 17 nonresponse bias studies on its surveys. Additionally, OPA has developed a schedule that it follows for the different types of nonresponse analyses that the office typically conducts on a variety of its surveys. According to OPA, survey response rates have been declining for the past 15 years and are currently about 20 percent. By implementing these nonresponse bias studies at specific intervals for its surveys and investigating the presence of nonresponse bias using different methods, the department is better positioned to improve the quality of its surveys, strengthen the quality of its survey results over time, and provide decision makers with usable survey results to better understand the perspectives of DOD personnel, despite the existence of low response rates.
Recommendation: To better determine the effects of nonresponse bias on the Status of Forces survey results, the Secretary of Defense should direct the Director of DMDC to develop and implement guidance both for conducting nonresponse analysis and for using the results of nonresponse analysis to inform DMDC's statistical weighting techniques, as part of the collection and analysis of the Status of Forces survey results.
Agency Affected: Department of Defense