Download eBook for Free
Full Document
| Format | File Size | Notes |
| --- | --- | --- |
| PDF file | 3.7 MB | Use Adobe Acrobat Reader version 10 or higher for the best experience. |
Summary Only
| Format | File Size | Notes |
| --- | --- | --- |
| PDF file | 0.2 MB | Use Adobe Acrobat Reader version 10 or higher for the best experience. |
Public health emergencies often involve difficult decisions: when to notify the public of threats, when to close schools or suspend public events, when to dispense medication, and how to allocate scarce resources. Yet public health practitioners often have little experience or training in crisis decision making and can be uncomfortable making decisions under short timelines with incomplete information. Unfortunately, there are no established tools for identifying, measuring, and improving public health crisis decision making.
This technical report describes the development and first generation of a tool to measure key aspects of crisis decision making in public health emergencies, based on performance in exercises (e.g., tabletops, functional exercises, full-scale exercises) and real incidents (e.g., outbreaks of waterborne disease). The tool is a paper-and-pencil assessment form intended to allow public health practitioners to assess their baseline crisis decision-making capabilities and identify shortcomings that may represent opportunities for internal process improvements.

The items in the tool focus on the processes that the scientific and practical literatures identify as key components of effective crisis decision making: developing situational awareness, action planning, and using process controls. Taken together, these processes represent a continuous loop within public health emergency preparedness decision making. The tool focuses on the quality of decision-making processes (how decisions are made) rather than on the quality of the decisions themselves (which is exceedingly difficult to determine, except in retrospect) or the characteristics of the individuals and organizations involved (which tell us little about the ability to actually make decisions).

To allow for objective observation and coding of performance, the tool focuses on group decision making and overt behaviors, such as explicit discussion among decision makers and completion of Incident Command System (ICS) forms. Its use therefore requires decisions that involve deliberation among two or more individuals, at a location where decision-making processes can be directly observed. The assessment-tool items assess the execution of specific observable activities, which can be categorized within the three general processes.
The assessment tool is currently best suited for exploratory analysis and process improvement. Further field-testing, more formal pilot-testing, and refinement might yield a version of the tool suitable for accountability purposes. Although the tool was designed to measure crisis decision making for public health emergencies, primarily at the local and state levels, it could ultimately have applications at the federal level and beyond public health emergency preparedness, such as in other areas of homeland security and emergency management.
In the future, the tool might also be adapted to serve as a real-time decision aid or operational tool, or as a complement to computer-based simulation approaches to measuring crisis decision making in public health emergencies. Its value as a process-improvement resource could also be enhanced by pairing it with decision aids and suggested strategies for overcoming the problems it reveals.
This document will be of primary interest to those in public health. However, those involved in homeland security and emergency response will recognize familiar structures (e.g., the Incident Command System) and challenges (e.g., the need for contingency planning). The tool was designed for use with a wide variety of decision-making group sizes and structures, including groups in Emergency Operations Centers and distributed groups (to the extent that group discussion can be observed).
Table of Contents
Chapter One: Introduction
Chapter Two: Which Aspects of Crisis Decision Making Are Worth Measuring?
Chapter Three: How Are the Key Processes Measured?
Chapter Four: How Should the Assessment Tool Be Used?
Chapter Five: Conclusion and Next Steps
Appendix A: Draft Public Health Emergency Response Decision-Making Assessment Tool
Appendix B: Summary of the Empirical Evidence Base Behind the Assessment Tool
Appendix C: Additional Technical Detail
This work was prepared for the U.S. Department of Health and Human Services Office of the Assistant Secretary for Preparedness and Response. The research was conducted in RAND Health, a division of the RAND Corporation.
This report is part of the RAND Corporation Technical report series. RAND technical reports may include research findings on a specific topic that is limited in scope or intended for a narrow audience; present discussions of the methodology employed in research; provide literature reviews, survey instruments, modeling exercises, guidelines for practitioners and research professionals, and supporting documentation; or deliver preliminary findings. All RAND reports undergo rigorous peer review to ensure that they meet high standards for research quality and objectivity.
This document and trademark(s) contained herein are protected by law. This representation of RAND intellectual property is provided for noncommercial use only. Unauthorized posting of this publication online is prohibited; linking directly to this product page is encouraged. Permission is required from RAND to reproduce, or reuse in another form, any of its research documents for commercial purposes. For information on reprint and reuse permissions, please visit www.rand.org/pubs/permissions.
The RAND Corporation is a nonprofit institution that helps improve policy and decisionmaking through research and analysis. RAND's publications do not necessarily reflect the opinions of its research clients and sponsors.