Do Differing Analyses Change the Decision?
Using a Game to Assess Whether Differing Analytic Approaches Improve Decisionmaking
Research report published Apr 8, 2019
The decision analysis community faces obstacles in moving new methods, tools, and paradigms from the academy to the boardroom or the White House Situation Room. Change requires investments of time and resources that are hard to justify if one cannot show the value proposition of an innovative approach — that is, that it truly improves decisionmaking processes and the resulting decisions. However, without an application to evaluate, such evidence may be in short supply, and the approach may therefore not be adopted.

In this report, RAND researchers examine that value proposition by comparing the effects of two different analytic inputs on a sample U.S. Department of Defense decision about future force structure. The researchers designed a structured comparison game in which groups of mid-level and senior players simulating defense officials were briefed with two different types of pregame analysis: (1) a traditional, scenario-based analysis, common in force planning efforts, and (2) a novel analysis using RAND's Robust Decision Making (RDM) method.

The type of analysis presented appeared to influence the decisionmaking process and resulting decisions; however, the influence of player experience seemed to be larger than that of the analysis presented. The researchers found that the analytic community could use such structured comparison games to assess how policymakers ingest and use analysis. This kind of approach could increase the utility of policy recommendations in the future by providing a window into how analysis is interpreted and used by decisionmakers.
This project is a RAND Venture. Funding was provided by gifts from RAND supporters and income from operations. The research was conducted within the RAND National Security Research Division (NSRD) of the RAND Corporation.
This publication is part of the RAND research report series. Research reports present research findings and objective analysis that address the challenges facing the public and private sectors. All RAND research reports undergo rigorous peer review to ensure high standards for research quality and objectivity.
This document and trademark(s) contained herein are protected by law. This representation of RAND intellectual property is provided for noncommercial use only. Unauthorized posting of this publication online is prohibited; linking directly to this product page is encouraged. Permission is required from RAND to reproduce, or reuse in another form, any of its research documents for commercial purposes. For information on reprint and reuse permissions, please visit www.rand.org/pubs/permissions.
RAND is a nonprofit institution that helps improve policy and decisionmaking through research and analysis. RAND's publications do not necessarily reflect the opinions of its research clients and sponsors.