Research Questions

  1. What kinds of planning and assessment, monitoring, and evaluation (AME) processes and practices are being conducted inside and outside the U.S. Department of Defense (DoD) that are relevant to security cooperation? What are their strengths and weaknesses?
  2. How can AME methods be applied, integrated, and implemented by major security cooperation organizations so that they conform as closely as possible to analytic best practices and existing DoD policies, plans, and processes?

At a time when the United States is increasingly relying on foreign partners for its security and attempting to build their military capacity, security cooperation activities and expenditures can no longer be justified with anecdotal evidence. This report seeks to address the challenge of creating a U.S. Department of Defense–wide (DoD-wide) system for security cooperation assessment, monitoring, and evaluation (AME): first, by analyzing existing planning and AME processes and practices inside and outside DoD to understand what works and what does not in contexts relevant to security cooperation; and second, by presenting a conceptual framework that explains how AME methods might be applied, integrated, and implemented by major security cooperation organizations so that they conform as closely as possible to analytic best practices and existing DoD policies, plans, and processes.

Without leadership from the Office of the Secretary of Defense (OSD), the results of improved AME will be almost impossible to regularize and aggregate in a manner useful for security cooperation planning and management at various levels or for coordination and collaboration with security sector assistance partners outside of DoD. OSD should clarify AME roles, responsibilities, and reporting relationships with respect to security cooperation. OSD should work with the Defense Security Cooperation Agency (DSCA) to develop general theories of change and a set of logic models for common partner capability development areas, such as engagements, exercises, education, train and equip activities, and institution-building. OSD and DSCA should identify funding for a centralized, independent evaluation organization, as well as an organization to support and synchronize performance and effectiveness monitoring.

Key Findings

Combatant Command Planning and AME

  • The AME systems of the combatant commands (CCMDs) are not comprehensive or standardized; they currently focus more on theater-level functional and mission objectives than they do on partner country activities and objectives.

Relevant AME Frameworks

  • Inside DoD, the Army and Air Force have struggled to develop reliable and comprehensive partner country capability assessments that are useful for service, joint, and interagency planning.
  • The Office of the Secretary of Defense has developed an AME process for the counterterrorism-focused Section 1206/2282 program that contains several useful elements, including a handbook, a logic model, and a team of independent evaluators.
  • The U.S. Agency for International Development's Interagency Security Sector Assessment Framework identifies capability and capacity shortfalls and barriers to change and prioritizes responses.
  • The World Bank Group dedicates 1.3 percent of program funding to AME and uses decentralized self-evaluations reviewed by independent experts.

DoD-Wide AME Framework

  • The framework proposes a "hybrid" AME management model, in which the assessment and monitoring functions are executed by security cooperation stakeholders — guided by policy, training, technical assistance, guidance, tools, and templates — while the evaluation function is centralized in OSD to ensure independence and to prioritize evaluations according to set criteria.
  • The framework includes a five-step cycle for integrating AME into larger security cooperation processes, beginning with an initial environmental assessment, then incorporation of AME results into planning (step two) and program design (step three), then monitoring of plan and program implementation (step four), and finally centralized evaluation (step five).

Recommendations

  • OSD should update planning guidance to direct the development of AME reporting in support of civilian oversight requirements, while allowing CCMDs and services to tailor aspects of reporting for their own needs.
  • OSD should adjust DoD security cooperation guidance and leverage planning and programming reviews to increase senior DoD leader focus on getting useful reporting from CCMDs.
  • Joint Staff should work with service security cooperation planners to develop an approach for injecting service partner country assessments into joint and interagency planning and programming.
  • OSD should task the Defense Security Cooperation Agency (DSCA) to develop a handbook for program-level AME that could be modeled on the Section 1206/2282 handbook.
  • OSD should work with DSCA to develop general theories of change and a set of logic models for common capability development areas, such as engagements, exercises, education, train and equip activities, and institution-building.
  • OSD should consider incorporating aspects of the Millennium Challenge Corporation's approach to producing a monitoring and evaluation plan developed jointly by the United States and partner nation at the start of any program to strengthen host nation participation and political will.
  • OSD should expedite issuance of security cooperation AME policy guidance, including establishment of roles and responsibilities.
  • OSD and DSCA should identify funding for a centralized, independent evaluation organization, as well as an organization to support and synchronize performance and effectiveness monitoring.
  • OSD should lead an effort to develop a template with a small, focused set of standardized SMART objectives and performance/effectiveness indicators to be used as a model.

Table of Contents

  • Chapter One: Introduction
  • Chapter Two: Combatant Command Planning and AME
  • Chapter Three: Analysis of Relevant AME Frameworks
  • Chapter Four: Proposed DoD-Wide Security Cooperation AME Framework
  • Chapter Five: Recommendations for Implementing an AME Framework

This research was sponsored by the Deputy Assistant Secretary of Defense for Security Cooperation and conducted within the International Security and Defense Policy Center of the RAND National Defense Research Institute, a federally funded research and development center sponsored by the Office of the Secretary of Defense, the Joint Staff, the Unified Combatant Commands, the Navy, the Marine Corps, the defense agencies, and the defense Intelligence Community.

This report is part of the RAND Corporation research report series. RAND reports present research findings and objective analysis that address the challenges facing the public and private sectors. All RAND reports undergo rigorous peer review to ensure high standards for research quality and objectivity.

Permission is given to duplicate this electronic document for personal use only, as long as it is unaltered and complete. Copies may not be duplicated for commercial purposes. Unauthorized posting of RAND PDFs to a non-RAND Web site is prohibited. RAND PDFs are protected under copyright law. For information on reprint and linking permissions, please visit the RAND Permissions page.

The RAND Corporation is a nonprofit institution that helps improve policy and decisionmaking through research and analysis. RAND's publications do not necessarily reflect the opinions of its research clients and sponsors.