
Research Questions

  1. What is the relationship between age or rank and response rates in major online surveys of U.S. military populations?
  2. What could explain lower response rates among younger service members in the junior ranks?
  3. Do those differences in response rates affect the representativeness of the data enough to warrant significant investments to increase response rates?
  4. What strategies could be used to detect nonresponse bias?
  5. What strategies could be used to increase response rates among young personnel?

Because both scholars and policymakers draw on military personnel survey results, survey participants need to be representative of the population. This research examined response rates for several major online U.S. Department of Defense military personnel surveys and found that younger service members, particularly younger enlisted personnel, tend to have very low response rates, even when a survey is counted as complete once a respondent has finished only 50 percent of it. The authors identify possible explanations, including situational and technological barriers, both military-specific and not, as well as motivational factors.

Low response rates do not necessarily yield biased results. No minimum response rate has ever been established as a scientific threshold for minimizing nonresponse bias. Strategies to increase response rates can be costly, and previous research shows that they might not change the results in any perceptible or practically significant way. Thus, the authors propose ways to first understand how well surveys are capturing a representative sample of service members. Weighting the data along demographic characteristics might correct for some biases, but significant gaps in attitudes and experiences could remain. If nonresponse biases are present, the authors recommend reporting the limitations along with the results and identifying the factors that contribute to the bias (e.g., lack of access, lack of trust) so that survey researchers and sponsors invest only in recruitment strategies that target the source of the problem rather than exacerbate it.
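To illustrate the demographic weighting the authors mention, a minimal poststratification sketch follows. The age groups and their population and respondent shares are made-up assumptions for illustration, not figures from the report: each cell's weight is its known population share divided by its share among respondents.

```python
# Poststratification weighting sketch. The age groups and shares below are
# hypothetical, not drawn from the report.

population = {"18-24": 0.30, "25-34": 0.40, "35+": 0.30}  # known force mix
sample     = {"18-24": 0.10, "25-34": 0.40, "35+": 0.50}  # respondent mix

# Each respondent in a cell gets weight = population share / sample share,
# so underrepresented groups (here, the youngest) count proportionally more.
weights = {group: population[group] / sample[group] for group in population}

for group, w in weights.items():
    print(f"{group}: weight = {w:.2f}")
```

Note that such weighting can only correct the demographic imbalance it models; as the authors caution, within each cell the respondents may still differ in attitudes and experiences from the nonrespondents they stand in for.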

Key Findings

The Low Response Pattern Appears Across Several U.S. Department of Defense Surveys

  • Lower response rates for younger and more-junior members appear across several online surveys despite differences in survey topics, survey sponsors, sampling strategies, recruitment methods, and survey administration.
  • The Air Force is already attempting to reduce the number of surveys it administers and limit their overlap.

Low Response Does Not Necessarily Yield Biased Results

  • Survey administrators commonly aim to reduce the risk of nonresponse bias by working to increase response rates, but higher response rates do not rule out nonresponse bias. Extra recruiting measures might succeed only in engaging more participants with similar experiences and views. Studies have found similar or statistically identical findings between surveys with higher and lower response rates. No minimum response rate has ever been established as a scientific threshold for minimizing nonresponse bias.


Recommendations

  • The military should seek ways to better understand how well its surveys are capturing a representative sample of the chosen population.
  • The military should undertake additional efforts to identify the factors potentially contributing to lack of survey participation, so that strategies designed to address it target the source of the problem rather than simply increase response among the types of people already well represented in the survey.
  • Survey analysts should explore whether younger members' noncompletion results from not having begun a survey at all or from beginning it but dropping out early.
  • The military should focus efforts to increase response rates on strategies that could benefit the force in other ways, such as ensuring that junior enlisted airmen have routine access to their military email accounts and that their contact information on file is accurate.
  • The military should test for nonresponse bias in online surveys.
  • The military should not invest significant resources in efforts solely to increase response rates without first testing whether there is any value in doing so.
  • Surveys should become mobile-friendly, a shift that will have implications for layout and survey length.

Research conducted by

The research reported here was commissioned by the Air Force Office of the Surgeon General (AF/SG) and conducted within the Manpower, Personnel, and Training Program of RAND Project AIR FORCE.

This report is part of the RAND research report series. RAND reports present research findings and objective analysis that address the challenges facing the public and private sectors. All RAND reports undergo rigorous peer review to ensure high standards for research quality and objectivity.

This document and trademark(s) contained herein are protected by law. This representation of RAND intellectual property is provided for noncommercial use only. Unauthorized posting of this publication online is prohibited; linking directly to this product page is encouraged. Permission is required from RAND to reproduce, or reuse in another form, any of its research documents for commercial purposes.

RAND is a nonprofit institution that helps improve policy and decisionmaking through research and analysis. RAND's publications do not necessarily reflect the opinions of its research clients and sponsors.