Do Differing Analyses Change the Decision?

Using a Game to Assess Whether Differing Analytic Approaches Improve Decisionmaking

Published Apr 8, 2019

by Elizabeth M. Bartels, Igor Mikolic-Torreira, Steven W. Popper, Joel B. Predd


Research Questions

  1. Can a structured comparison game provide evidence about the relative value of different analytic products?
  2. In the tabletop game conducted, did teams make different decisions when presented with scenario-based analysis versus Robust Decision Making (RDM) analysis? How did the decisionmaking processes and resulting decisions differ?

The decision analysis community faces obstacles in moving new methods, tools, and paradigms from the academy to the boardroom or the White House Situation Room. Change requires investments of time and resources that are hard to justify if one cannot show the value proposition of an innovative approach — that is, that it truly improves decisionmaking processes and the resulting decisions. However, without an application to evaluate, such evidence may be in short supply, and the approach may therefore not be adopted.

In this report, RAND researchers examine that value proposition by comparing the effects of two different analytic inputs on a sample U.S. Department of Defense decision about future force structure. The researchers designed a structured comparison game in which groups of mid-level and senior players simulating defense officials were briefed with two different types of pregame analysis: (1) a traditional, scenario-based analysis, common in force planning efforts, and (2) a novel analysis using RAND's Robust Decision Making (RDM) method. The type of analysis presented appeared to influence the decisionmaking process and resulting decisions; however, the influence of player experience seemed to be larger than that of the analysis presented.

The researchers found that the analytic community could use such structured comparison games to assess how policymakers ingest and use analysis. This kind of approach could increase the utility of policy recommendations in the future by providing a window into how analysis is interpreted and used by decisionmakers.

Key Findings

  • Type of analysis appeared to affect the decisionmaking process. In the game prompted by scenario-based analysis, players spent considerable time debating the validity of underlying assumptions, arguing that the assumed demands for, and supply of, forces were optimistic and that shortfalls were larger than presented. In contrast, in the game informed by RDM analysis, players debated the likelihood of different scenarios, the importance of the different scenarios, and the implied risk of the options.
  • The type of analysis — scenario-based versus RDM — also had some effect on what decision was eventually made, but the effect was not strong.
  • Player experience seemed to have a larger effect on the decisionmaking process and resulting decisions than did the type of analysis presented to the players. In the experimental and control conditions, both the content of the debate and the decisions differed between groups of mid-level and senior officials.
  • Games of this design could be used to help research teams hone the communication of analytical approaches and important findings to decisionmakers.

Recommendations

  • The game described here focuses solely on the final decisionmaking process and resulting decisions. In principle, future work could examine the effect of a new approach to decision analysis on elements of force planning, such as scenario selection, staffing during baseline assessments, and staffing in the lead-up to a major decision meeting.
  • More repetitions of the game will be required to better understand the value proposition of analysis for decisionmaking. In addition, access to a wider population of players who better represent the population of decisionmakers would improve the credibility of game results.
  • The analytic community could use such structured comparison games to assess how policymakers ingest and use analysis. This kind of approach could increase the utility of policy recommendations in the future by providing a window into how analysis is interpreted and used by decisionmakers.

This project is a RAND Venture. Funding was provided by gifts from RAND supporters and income from operations. The research was conducted within the RAND National Security Research Division (NSRD) of the RAND Corporation.

This report is part of the RAND research report series. RAND reports present research findings and objective analysis that address the challenges facing the public and private sectors. All RAND reports undergo rigorous peer review to ensure high standards for research quality and objectivity.

This document and trademark(s) contained herein are protected by law. This representation of RAND intellectual property is provided for noncommercial use only. Unauthorized posting of this publication online is prohibited; linking directly to this product page is encouraged. Permission is required from RAND to reproduce, or reuse in another form, any of its research documents for commercial purposes. For information on reprint and reuse permissions, please visit www.rand.org/pubs/permissions.

RAND is a nonprofit institution that helps improve policy and decisionmaking through research and analysis. RAND's publications do not necessarily reflect the opinions of its research clients and sponsors.