A review of major online U.S. military surveys found very low response rates among young enlisted members. The authors identify possible explanations and note that low response rates do not necessarily yield biased results. The authors propose ways to assess how well surveys are capturing a representative sample and to choose recruitment strategies that target the actual source of the problem rather than exacerbate it.
Understanding Low Survey Response Rates Among Young U.S. Military Personnel
- What is the relationship between age or rank and response rates in major online surveys of U.S. military populations?
- What could explain lower response rates among younger service members in the junior ranks?
- Do those differences in response rates affect the representativeness of the data enough to warrant significant investments in raising response rates?
- What strategies could be used to detect nonresponse bias?
- What strategies could be used to increase response rates among young personnel?
Because both scholars and policymakers draw on military personnel survey results, survey participants need to be representative of the population. This research examined response rates for several major online U.S. Department of Defense military personnel surveys and found that younger service members, particularly younger enlisted personnel, tend to have very low response rates, even when a survey that is only 50 percent complete is counted as a completed response. The authors identify possible explanations, including situational and technological barriers and motivational factors, both military-specific and not.
Low response rates do not necessarily yield biased results. No minimum response rate has ever been established as a scientific threshold for minimizing nonresponse bias. Strategies to increase response rates can be costly, and previous research shows that they might not change results in any perceptible or practically significant way. Thus, the authors propose ways to first understand how well surveys are capturing a representative sample of service members. Weighting the data along demographic characteristics might correct for some biases, but significant gaps in attitudes and experiences could remain. If nonresponse biases are present, the authors recommend reporting these limitations along with the results and identifying the factors that contribute to the bias (e.g., lack of access or trust) so that survey researchers and sponsors invest only in recruitment strategies that target the actual source of the problem rather than exacerbate it.
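The demographic weighting mentioned above can be illustrated with a minimal post-stratification sketch. All figures below are hypothetical, not drawn from the surveys in the report; each group's weight is its known population share divided by its share among respondents, so underrepresented groups count more in weighted estimates.

```python
# Minimal post-stratification weighting sketch.
# All proportions are hypothetical, for illustration only.

# Known population shares by rank group (hypothetical)
population_share = {"E1-E4": 0.44, "E5-E9": 0.38, "O1-O10": 0.18}

# Observed shares among survey respondents (hypothetical)
respondent_share = {"E1-E4": 0.20, "E5-E9": 0.50, "O1-O10": 0.30}

# Weight = population share / respondent share; groups that responded
# at lower rates receive weights above 1.0.
weights = {g: population_share[g] / respondent_share[g] for g in population_share}

for group, w in weights.items():
    print(f"{group}: weight {w:.2f}")
```

Note that weighting along observed demographics can only correct for differences captured by those variables; as the text cautions, gaps in attitudes and experiences between respondents and nonrespondents within a demographic cell would remain.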
The Low Response Pattern Appears Across Several U.S. Department of Defense Surveys
- Lower response rates for younger and more-junior members appear across several online surveys despite differences in survey topics, survey sponsors, sampling strategies, recruitment methods, and survey administration.
- The Air Force is already attempting to reduce the number of surveys it administers and limit their overlap.
Low Response Does Not Necessarily Yield Biased Results
- Survey administrators commonly aim to reduce the risk of nonresponse bias by working to increase response rates, but higher response rates do not rule out nonresponse bias. Extra recruiting measures might succeed only in engaging more participants with similar experiences and views. Studies have found similar or statistically indistinguishable findings between surveys with higher and lower response rates. No minimum response rate has ever been established as a scientific threshold for minimizing nonresponse bias.
- The military should seek ways to better understand how well its surveys are capturing a representative sample of the chosen population.
- The military should undertake additional efforts to identify factors potentially contributing to lack of survey participation, so that strategies designed to address nonparticipation actually target the source of the problem rather than simply increase responses among the types of people already well represented in the survey.
- Survey analysts should explore whether younger members' noncompletion results from not having begun a survey at all or from beginning it but dropping out early.
- The military should focus efforts to increase response rates on strategies that could benefit the force in other ways, such as ensuring that junior enlisted airmen have routine access to their military email accounts and that their contact information on file is accurate.
- The military should test for nonresponse bias in online surveys.
- The military should not invest significant resources in efforts solely to increase response rates without first testing whether there is any value in doing so.
- Surveys should become mobile-friendly, a shift that will have implications for layout and survey length.
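One simple screen for the kind of differential nonresponse the recommendations above describe is to compare each group's response rate with the overall rate and flag large gaps. The sketch below uses hypothetical invitation and response counts, not figures from the surveys in the report, and the one-half threshold is an arbitrary illustrative cutoff.

```python
# Sketch of a differential-nonresponse screen.
# All counts are hypothetical, for illustration only.

invited = {"E1-E4": 5000, "E5-E9": 4000, "O1-O10": 2000}
responded = {"E1-E4": 400, "E5-E9": 1200, "O1-O10": 800}

overall_rate = sum(responded.values()) / sum(invited.values())

flagged = []
for group in invited:
    rate = responded[group] / invited[group]
    # Flag groups responding at less than half the overall rate
    # (an illustrative cutoff, not an established standard).
    if rate < 0.5 * overall_rate:
        flagged.append(group)
    print(f"{group}: {rate:.1%} (overall {overall_rate:.1%})")

print("Potentially underrepresented:", flagged)
```

A flag of this kind only shows that a group is underrepresented, not that its absence biases the results; as the findings note, establishing nonresponse bias requires comparing what respondents and nonrespondents would actually report.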
Table of Contents
Response Rates on the 2012 RAND Survey of Airmen on Information and Communication Technologies and Well-Being
Response Patterns by Age or Rank Group in Other Large Recent Surveys of U.S. Military Personnel
Low Response Rates in Survey Research and Their Implications
Conclusion and Recommendations for Future Air Force and Other Military Surveys
Strategies Used to Promote Participation in the 2012 RAND Survey of Airmen on Information and Communication Technologies and Well-Being
Survey Invitation for the 2012 RAND Survey of Airmen on Information and Communication Technologies and Well-Being