United States Service Academy Admissions

Selecting for Success at the Air Force Academy and as an Officer

Chaitra M. Hardison, Susan Burkhauser, Lawrence M. Hanser

Research report published July 22, 2016

This report examines admission standards at the United States Air Force Academy (USAFA) and officer outcomes to evaluate whether the USAFA admissions formula should be adjusted. USAFA admissions decisions are determined in part by scores calculated as a weighted combination of three elements: academic composite, leadership composite, and selection panel score. The authors explored relationships between various admissions factors (SAT/ACT composite scores, high school rank divided by class size, selection composite score, leadership composite score, academic composite score, and selection panel score) and USAFA and officer outcomes (grade point average; failure to graduate for academic reasons; failure to graduate because of a desire for a career change; military performance average; overall performance average; and promotion to O-4, O-5, and O-6). The data included records on the nearly 35,000 cadets who attended USAFA from 1980 to 2011, drawn from three Air Force data sources: USAFA registrar admissions records, USAFA cadet records, and Air Force personnel records. Recommendations include removing the selection panel score from the selection composite formula, collecting additional information on applicants, and improving data retention and maintenance.
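
To make the weighted-combination idea concrete, the sketch below computes a notional selection composite in Python. The weights and score scales are hypothetical placeholders chosen for illustration; they are not USAFA's actual values.

    # Hypothetical weighted selection composite (illustrative weights only;
    # not USAFA's actual formula).
    WEIGHTS = {
        "academic_composite": 0.6,
        "leadership_composite": 0.2,
        "selection_panel_score": 0.2,
    }

    def selection_composite(scores):
        """Return the weighted sum of the three admissions elements."""
        return sum(WEIGHTS[name] * scores[name] for name in WEIGHTS)

    # Example applicant, with each element on a notional 0-800 scale.
    applicant = {
        "academic_composite": 710,
        "leadership_composite": 640,
        "selection_panel_score": 580,
    }
    print(selection_composite(applicant))  # 670.0

Under the report's recommendation to drop the selection panel score, its term would simply be removed from such a formula and the remaining weights renormalized.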

Key Findings

Most of the Admissions Formula Elements Are Useful in Predicting Outcomes at the United States Air Force Academy (USAFA); Selection Panel Score Is the One Exception

  • SAT, academic composite, leadership composite, and selection composite are significant predictors of failure to graduate for academic reasons, graduation versus choosing a career change, grade point average (GPA), military performance average (MPA), and overall performance average (OPA).
  • In the overwhelming majority of the analyses, the selection panel score was significant but in the wrong direction: higher scores were associated with a lower likelihood of success on many of the outcomes. (A simplified sketch of this type of regression follows this list.)
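
A minimal sketch of the kind of regression these findings describe is shown below: a logistic regression of one outcome (failure to graduate for academic reasons) on the admissions factors, fit on simulated data with statsmodels. The variable names, scales, and simulated effects are assumptions for illustration only, not the report's data or coefficients.

    # Sketch: logistic regression of an outcome on the admissions factors.
    # Data are simulated; names, scales, and effect sizes are illustrative.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 5000
    factors = pd.DataFrame({
        "sat_composite": rng.normal(1200, 150, n),
        "academic_composite": rng.normal(3000, 300, n),
        "leadership_composite": rng.normal(1700, 200, n),
        "selection_panel_score": rng.normal(500, 50, n),
    })
    # Simulate an outcome in which stronger academic scores lower the odds
    # of failing to graduate for academic reasons.
    logit = 5.0 - 0.002 * factors["academic_composite"] - 0.001 * factors["sat_composite"]
    failed_academic = rng.binomial(1, 1 / (1 + np.exp(-logit)))

    X = sm.add_constant(factors)
    fit = sm.Logit(failed_academic, X).fit(disp=False)
    print(fit.summary())  # sign and significance of each coefficient

A predictor that is significant but in the opposite of the expected direction, as the findings describe for the selection panel score, would appear here as a significant coefficient with the "wrong" sign.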

The Analyses Showed Interesting Differences Depending on Which Outcome Was Predicted

  • The admissions factors do a much better job of predicting GPA and OPA than they do of predicting MPA. They also do a better job of predicting graduation versus failure for academic reasons than they do of predicting graduation versus choosing a career change.
  • For predicting failure for academic reasons, the academic composite should be weighted more heavily than the leadership composite, and the selection panel score should be excluded. For predicting graduating versus choosing a career change, the results suggest weighting the leadership composite slightly less than the academic composite.
  • The academic composite should be weighted at twice the leadership composite for predicting MPA and at nine times the leadership composite for predicting OPA and GPA (the sketch after this list shows how such ratios translate into weights).
  • For predicting promotions to O-4, O-5, and O-6, USAFA outcomes (GPA, MPA, and OPA) were much better predictors than the admissions factors, and MPA was a slightly better predictor than USAFA GPA.
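
The relative weights in the preceding bullets can be read as ratios. The short sketch below simply normalizes those ratios into percentage weights; the 2:1 and 9:1 ratios come from the findings, but the normalization itself is only illustrative.

    # Normalize the academic-to-leadership ratios noted above into weights.
    def normalize(ratios):
        total = sum(ratios.values())
        return {name: round(value / total, 2) for name, value in ratios.items()}

    # 2:1 ratio for predicting MPA.
    print(normalize({"academic": 2, "leadership": 1}))  # {'academic': 0.67, 'leadership': 0.33}
    # 9:1 ratio for predicting OPA and GPA.
    print(normalize({"academic": 9, "leadership": 1}))  # {'academic': 0.9, 'leadership': 0.1}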

Recommendations

  • Adjust the selection composite formula. The authors saw repeated evidence that the selection panel score (generated from the selection panel's evaluation of the candidate, which includes a review of admissions liaison officer evaluations, writing samples, teacher evaluations, recommendations, and the candidate fitness assessment) was not helping to select the best candidates and, in fact, was hindering that goal. They recommend removing the selection panel score from the selection composite formula.
  • Collect additional information on applicants. A wide variety of measures (including personality tests, situational judgment tests, critical thinking performance tasks, and writing tests) have been identified as useful tools for predicting performance in workplace settings. Some of these measures may add value to the United States Air Force Academy (USAFA) selection process as well.
  • Improve data retention and maintenance. USAFA's archival data sources should contain scores on all possible measures taken at the time of application, should include data on applicants who are not admitted to USAFA, and should track cadet and officer performance over time. The authors also recommend that the Air Force consider developing a method, for research purposes only, of capturing systematic performance evaluations of its officers at various points in their careers. If the Air Force hopes to conduct a study such as this one again, it should ensure that the data collected and retained today are maintained in a way that can support data-driven recommendations for future policy changes.

Document Details

  • Availability: Available
  • Year: 2016
  • Print Format: Paperback
  • Paperback Pages: 94
  • Paperback Price: $17.50
  • Paperback ISBN/EAN: 978-0-8330-9182-6
  • DOI: https://doi.org/10.7249/RR744
  • Document Number: RR-744-OSD

Citation

RAND Style Manual
Hardison, Chaitra M., Susan Burkhauser, and Lawrence M. Hanser, United States Service Academy Admissions: Selecting for Success at the Air Force Academy and as an Officer, RAND Corporation, RR-744-OSD, 2016. As of October 8, 2024: https://www.rand.org/pubs/research_reports/RR744.html
Chicago Manual of Style
Hardison, Chaitra M., Susan Burkhauser, and Lawrence M. Hanser, United States Service Academy Admissions: Selecting for Success at the Air Force Academy and as an Officer. Santa Monica, CA: RAND Corporation, 2016. https://www.rand.org/pubs/research_reports/RR744.html. Also available in print form.

This research was sponsored by the Director of Accession Policy in the Office of the Under Secretary of Defense for Personnel and Readiness and conducted within the Forces and Resources Policy Center of the RAND National Defense Research Institute, a federally funded research and development center sponsored by the Office of the Secretary of Defense, the Joint Staff, the Unified Combatant Commands, the Navy, the Marine Corps, the defense agencies, and the defense Intelligence Community.

This publication is part of the RAND research report series. Research reports present research findings and objective analysis that address the challenges facing the public and private sectors. All RAND research reports undergo rigorous peer review to ensure high standards for research quality and objectivity.

This document and trademark(s) contained herein are protected by law. This representation of RAND intellectual property is provided for noncommercial use only. Unauthorized posting of this publication online is prohibited; linking directly to this product page is encouraged. Permission is required from RAND to reproduce, or reuse in another form, any of its research documents for commercial purposes. For information on reprint and reuse permissions, please visit www.rand.org/pubs/permissions.

RAND is a nonprofit institution that helps improve policy and decisionmaking through research and analysis. RAND's publications do not necessarily reflect the opinions of its research clients and sponsors.