Assessing the Use of Data Analytics in Department of Defense Acquisition

Philip S. Anton, Megan McKernan, Ken Munson, James G. Kallimani, Alexis Levedahl, Irv Blickstein, Jeffrey A. Drezner, Sydne J. Newberry

Research Summary
Published Aug 13, 2019

Key Points

  1. U.S. Department of Defense data analytics currently support acquisition decisionmaking across a broad spectrum of traditional acquisition functions and newly emerging areas.
  2. Continuing to advance the appropriate application of data analytics will require strategic planning and long-term investments; overcoming barriers to data sharing; easing the installation of modern analytic software; and realistic assessments of what data analytics can and cannot do.

In 2016, Congress raised concerns about whether the U.S. Department of Defense (DoD) is making optimal use of data analytics in its acquisition decisionmaking. The Joint Explanatory Statement of the Committee of Conference accompanying the fiscal year (FY) 2017 National Defense Authorization Act directed the Secretary of Defense "to brief the Armed Services Committees of the Senate and House of Representatives on the use of data analysis, measurement, and other evaluation-related methods in DOD acquisition programs."[1]

As part of this effort, the Office of the Under Secretary of Defense for Acquisition and Sustainment asked the National Defense Research Institute (NDRI), a federally funded research and development center (FFRDC) operated by the RAND Corporation, to inform the secretary's briefing to the committees. In its study, NDRI took a broad view of the role of — and support for — data analytics in defense acquisition,[2] reaching the following conclusions:

  • The DoD has made progress in improving its data and analytic capabilities. Data analytics currently support acquisition decisionmaking across a broad spectrum of traditional acquisition functions (e.g., market research, cost estimation, risk analysis, basic science and engineering, test and evaluation, security, supply chain management, contracting, production, auditing, and sustainment). DoD research is exploring other possible acquisition applications (e.g., early detection of program problems, data integration for risk analysis, supply-chain network analysis, and text understanding of news stories). Data governance is maturing, and pockets of analytic capabilities exist in the Office of the Secretary of Defense (OSD) and the military departments (e.g., for analysis of program status, cost estimation, contracting, contractor performance, the industrial base, and logistics). Training in data analytics is expanding. Attempts to apply more-advanced commercial data analytics approaches to DoD acquisition data are just beginning.
  • Some of the biggest barriers to expanding and refining the use of data analytics in the acquisition sphere include the lack of data sharing because of cultural, security, and micromanagement concerns; inconsistent data access across the DoD and for FFRDCs and support contractors; and difficulty installing modern analytic software because of security concerns.
  • Long-term investments and strategic planning are needed — both for data governance and for analytic capabilities — as well as concerted efforts by Congress and the DoD to address the culture of not sharing data.
  • Expectations of what data analytics can do for DoD acquisition need to be moderated. Most of the problematic programs examined had issues stemming from strategic acquisition decisions rather than from a lack of data analytics; data analysis is only one input, which DoD leadership must weigh against other factors when making decisions.

Scope

The scope of this research was determined by the definition of the terms acquisition and data analytics, which may mean different things to different people. NDRI embraced broad definitions of both, reflecting the issues framed in the conference report and DoD parlance.[3] In particular, NDRI adopted the definition of acquisition used by the Defense Acquisition University (DAU):

The conceptualization, initiation, design, development, test, contracting, production, deployment, integrated product support (IPS), modification, and disposal of weapons and other systems, supplies, or services (including construction) to satisfy DoD needs, intended for use in, or in support of, military missions.[4]

Similarly, based on Congress's conference report, NDRI adopted a broad conception of data analytics for acquisition: data analysis, measurement, and other evaluation-related methods (i.e., techniques to assess and analyze data) to inform acquisition decisions, policymaking, program management, evaluation, and learning. Notably, the focus was not on "big data" or advanced analytics, nor on specific data elements or techniques the DoD should be using. Rather, the study was scoped to focus on data and analytics in their broadest sense across the acquisition system.

Research Approach

NDRI relied on a mixed-method approach to address the broad scope. NDRI reviewed and synthesized an array of policy, legislation, defense budgets, published literature, research findings, data on IT systems supporting acquisition, and educational institutions' course curricula. NDRI also conducted semistructured interviews with a variety of subject-matter experts throughout the DoD. NDRI used multiple analyses to measure the overall extent of DoD data analytics, including a functional decomposition and a map of data and applied analytics to acquisition functions and decisions; examinations of the availability and use of data analytics in selected major programs; quantitative analysis of budgets for the analytic workforce, major information systems, and R&D for analytic capabilities; examination of progress and trends in acquisition information and analytic systems; and assessment of the maturity of DoD efforts relative to various maturity models. NDRI assessed the DoD relative to published best practices.

This research approach embraces the breadth of the congressional inquiry at the expense of depth. NDRI did not try to assess what specific acquisition data or analytic techniques are needed. A survey (a data call) was proposed to solicit specific examples of data analytics underway in the DoD acquisition community, but it was deemed infeasible within the available time and resources and unlikely to produce sufficient insight. Instead, the authors drew on their experience, knowledge, and judgment to synthesize and analyze available information and to fill gaps in primary data, published research, and other secondary data.

Study Results

Question 1: What is the extent to which data analytics capabilities have been implemented across the DoD to provide technical support for acquisition program management?

Conclusion 1.1: The DoD is applying a breadth of data analytics across the whole acquisition life cycle.

NDRI found that data analytics techniques of some kind are being applied across the whole acquisition life cycle, including market research, cost estimation, risk analysis, basic science and engineering, test and evaluation, security, supply-chain concerns, contracting, production, auditing, and sustainment. Techniques vary widely and include quantitative analysis, qualitative analysis, predefined formulas and forms, systems analysis, data mining, statistical analysis, classification, clustering, outlier detection, filtering, text analytics, visual analysis, and machine learning. Alongside other considerations, data analytics contribute to major program decisions throughout the chain of command, from program managers to acquisition executives and other stakeholders across the DoD and Congress.
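
As a minimal illustration of one technique on this list, the following hypothetical sketch applies simple z-score outlier detection to notional program cost-growth figures; the program names and numbers are invented and do not come from the study.

```python
# Hypothetical sketch: flagging outlier programs by cost growth using a
# simple z-score test. All program names and figures are invented.
from statistics import mean, stdev

# Percentage cost growth for a set of notional programs
cost_growth = {
    "Program A": 4.2, "Program B": 6.1, "Program C": 5.5,
    "Program D": 38.0, "Program E": 3.9, "Program F": 7.2,
}

values = list(cost_growth.values())
mu, sigma = mean(values), stdev(values)

# Flag any program more than two standard deviations above the mean
outliers = {name: growth for name, growth in cost_growth.items()
            if (growth - mu) / sigma > 2.0}
print(outliers)  # {'Program D': 38.0}
```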

Conclusion 1.2: The DoD has implemented an array of data governance and management practices needed for data analytics, but major challenges remain.

The DoD has implemented some aspects of data governance and management needed to enable analytics. These include strategizing and planning; establishing data requirements and use cases; authoritative sourcing; archiving, curating, and data sharing; managing security issues; working on backups and recovery; developing training and support; establishing data definitions and standards; and assessing, auditing, cleaning, transforming, and purging data. However, the maturity of these practices varies across DoD acquisition organizations.

One challenge in data management across the DoD is ensuring common data definitions to allow cross-organizational data analysis. Although some business practices provide standardization, other domains need more-active governance and management. A particular challenge involves the collection and use of unstructured data, that is, data held not in fixed locations but in free-form text, in contrast to structured data, which are easily identified and located within an electronic structure, such as a relational database.
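
To make the distinction concrete, the sketch below pulls a few structured fields out of a hypothetical free-form contract note using regular expressions; the note text, field names, and patterns are all invented for illustration.

```python
# Hypothetical sketch: extracting structured fields from unstructured
# free-form text. The note and patterns are invented for illustration.
import re

note = ("Contract W912-XX-2019 awarded to Acme Systems on 2019-03-14 "
        "for $4.5M; delivery slipped two months due to supplier issues.")

record = {
    "contract_id": re.search(r"Contract\s+(\S+)", note).group(1),
    "award_date":  re.search(r"on\s+(\d{4}-\d{2}-\d{2})", note).group(1),
    "value":       re.search(r"\$([\d.]+)M", note).group(1),
}
print(record)
# {'contract_id': 'W912-XX-2019', 'award_date': '2019-03-14', 'value': '4.5'}
```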

Conclusion 1.3: The maturity of DoD data analytics capabilities ranges from simple data archives and data plotting to integrated data with statistical analytic tools to research on advanced applications.

Applications of data analytics in the acquisition environment are continuously evolving and span a range of maturity levels, from the use of simple isolated systems for archiving data about procurement to research on more-advanced analytics, such as machine learning and predictive (risk) analysis. Modern commercial off-the-shelf analytic software, such as business intelligence tools, is increasingly replacing preexisting analytic and visualization tools and dashboards.

Many data analytics capabilities have been implemented across OSD and the individual military services in recent years; these examples illustrate the trends:

  • OSD has moved to the Defense Acquisition Visibility Environment (DAVE) for acquisition program information, which contains a recently added "analytic layer" for data scientists to directly apply statistical and other analytic functions and visualization to the acquisition data in the system.
  • OSD has also matured its cost analysis capabilities with the Cost Assessment Data Enterprise (CADE) over the past several years. For example, CADE archives historical manufacturing cost data to enable the user to directly employ cost-analysis algorithms and approaches to estimate costs of proposed weapon systems.
  • The Air Force has moved toward an advanced business intelligence capability for program data called Project Management Resource Tools (PMRT).
  • The Army and the Navy are leveraging existing systems and are pursuing options to improve data availability and the analytic capabilities for their acquisition workforces.
  • With its DIBNow system, OSD Industrial Policy has created an ability to combine and visualize program, contract, and contractor data to assess industrial-base status and performance.

Exploratory research efforts — including advanced analytics — are being pursued at the Defense Advanced Research Projects Agency (DARPA), DoD labs, FFRDCs, university-affiliated research centers, and universities.

Conclusion 1.4: The DoD spends an estimated $11–$15 billion per year on the analytic-related workforce and about $3 billion per year on information systems for acquisition.

Separately measuring the extent of analytic capabilities supporting acquisition is difficult, given that they are not accounted for as such in the DoD's workforce and operations budgets. However, NDRI developed estimates based on parametric analysis of the size of the acquisition workforce, its functions, and readily available budgetary data. This analysis suggests the DoD spends about $11–$15 billion per year on analytic workforce capabilities. The DoD also spends about $3 billion per year (about $0.5 billion for acquisition systems and about $2.5 billion for logistics and supply-chain systems) on major information systems supporting acquisition and sustainment (not desktop computing). These systems involve a mix of acquisition process support, data collection and archiving, and data analytic layers, shedding light on the resources and capabilities that ultimately inform acquisition decisions during execution, management, and oversight.
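
The brief does not publish the parameters behind these estimates, but the general shape of a parametric workforce-cost estimate can be sketched as follows; every figure in the sketch is an invented placeholder, not an NDRI input.

```python
# Hypothetical sketch of a parametric workforce-cost estimate.
# All parameters are invented placeholders, not NDRI's actual inputs.
workforce_size = 160_000        # notional acquisition workforce headcount
analytic_share = (0.15, 0.20)   # assumed share doing analytic work (low, high)
cost_per_fte = 450_000          # assumed fully burdened annual cost per FTE ($)

low, high = (workforce_size * share * cost_per_fte / 1e9
             for share in analytic_share)
print(f"Estimated analytic workforce cost: ${low:.1f}B-${high:.1f}B per year")
```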

Question 2: What is the potential to increase the use of analytic capabilities to improve acquisition outcomes?

Conclusion 2.1: Expanded data analytics have the potential to address some acquisition challenges.

NDRI proposed some example topics where expanded analysis could potentially improve acquisition outcomes:

  • Assessing the role of externalities: Some existing metrics do not distinguish effects of external and internal factors (e.g., whereas fuel efficiency is a cost factor internal to a weapon system's design, the cost of purchasing the fuel is external). Analysis might differentiate these factors for decisionmakers.
  • Assessing program performance at the mission level (versus program level): The DoD is exploring how to shift from assessing individual program performance in isolation to assessing performance as it pertains to the integrated set of systems that field mission-level capabilities.
  • Fully implementing "framing assumptions" analysis to enable policymakers to analyze key conceptual risks when approving major defense acquisition programs: This analysis is actually codified in current DoD policy, but expanding its use could enable Congress to better understand risks in newly authorized and funded programs.[5]
  • Conducting performance analysis: The DoD could continue applying data analytics to understand significant trends at the institutional performance level.
  • Assessing data needs: Analyze what data are actually needed, and then determine the comparative costs and benefits of various ways of collecting and managing those data.

Conclusion 2.2: Some recent advanced data analytics might not be applicable to military acquisition problems.

Recent highly publicized advances in commercial data analytics (including artificial intelligence, machine learning, and big data) make it tempting to consider applying these techniques to acquisition program management. But for a variety of reasons, DoD acquisition programs are not easily amenable to such applications. For example, DoD programs tend to fail each for their own reasons, and their numbers are low compared with the huge "training" data sets that predictive analytics require. In addition, commercial successes with data analytics tend to stem from deliberate, top-down efforts planned by leadership.
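
A toy experiment illustrates the small-sample point: with only a handful of notional programs to learn from, cross-validated accuracy swings widely from fold to fold. The sketch below uses scikit-learn on purely synthetic data; no real program data are involved.

```python
# Hypothetical sketch: why predictive analytics struggle with few examples.
# Synthetic data only; labels are random, standing in for scarce outcomes.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_programs = 20                       # few programs, unlike commercial big data
X = rng.normal(size=(n_programs, 5))  # five notional risk indicators
y = rng.integers(0, 2, n_programs)    # notional success/failure labels

scores = cross_val_score(LogisticRegression(), X, y, cv=5)
print(scores.round(2))  # accuracy varies widely fold to fold at this sample size
```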

Conclusion 2.3: Developing a data analytics strategy that bridges acquisition domains could enable more DoD-wide acquisition analyses.

Many of the individual acquisition functional domains have developed their own data management strategies. However, an overarching data analytics strategy is needed that provides key strategic questions and identifies the data needed to address those questions.

Conclusion 2.4: Continuing to grow and mature data collection, access, and analytic layers within systems requires data governance that could enable greater data sharing.

By leveraging private-sector best practices, the DoD has made progress in maturing data collection, access, and analysis in existing systems, although further progress has been hampered by concerns about data sharing. The importance of data governance in such areas as standardizing data definitions has been recognized. The DoD's program information managers recognize the importance of developing use cases to illustrate the need for data collection and analysis.

A persistent barrier to improving acquisition analytics uniformly and sharing data across the various functional communities is the stovepiping of acquisition data management.

Conclusion 2.5: Cybersecurity concerns have hampered the use of commercially available analytic tools, but partial solutions are available.

Concerns about cybersecurity limit the expanded use of commercial software that would increase analytic capabilities. One partial solution is to test commercial software more extensively and disseminate lists of analytic tools approved as safe. Alternatively, virtual computing environments can run commercial software in isolation from DoD networks; this isolates the security concerns but impedes the flow of data and information into and out of the virtual environments.

Conclusion 2.6: Mechanisms are needed to authorize and ensure protected access to data for both the DoD and external analysts.

Security concerns, as well as concerns about excessive oversight and distractions, have limited access to and sharing of data — not only with contractors who conduct data analytics for DoD acquisition domains but even across programs within the DoD and between the DoD and Congress. Although some recognize the need for data sharing, statutory authorities may be needed to establish and enforce sharing.

Data accessibility can be increased through several mechanisms. For example, Congress could grant permanent access to analysts in FFRDCs. However, other nongovernment analysts also need access to particular data sources. An alternative is to develop DoD-wide data access categories, under which appropriate government officials would grant analysts blanket access.

Conclusion 2.7: Improving incentives and understanding of data analytics could encourage decisionmakers to make better use of data in decisionmaking.

Decisionmakers may benefit from ensuring that they have the incentives and authorities needed to appropriately balance insights from data analytics against other strategic considerations (e.g., related to policies, strategies, budgets, missions, urgency, and threats). Also, providing rising decisionmakers with the training and tools to understand how to interpret, weigh the strengths and limitations of, and apply relevant data to decisions could help strengthen the benefits of data analytics for decisionmaking.

Question 3: What is the amount of funding for intramural and extramural R&D activities to develop and implement data analytics capabilities in support of improved acquisition outcomes?

Conclusion 3.1: NDRI identified roughly $200 million per year in program element budgets, and about $520 million per year in major information system budgets, to develop new acquisition data analytics capabilities.

The DoD's chart of accounts for research, development, test, and evaluation does not specifically track R&D for acquisition data analytics. NDRI therefore analyzed the DoD FY 2019 budget request for indications of program elements involving data analytics for acquisition. Based on the extent of data analytics in 31 such program elements, NDRI estimated that approximately $200 million was requested.

As for information technology systems related to acquisition, about $520 million was requested in FY 2019, an increase of $207 million from FY 2017.

Four topics related to acquisition data analytics were also identified in the January 2019 Small Business Innovation Research (SBIR) and Small Business Technology Transfer (STTR) solicitations. NDRI also found anecdotal evidence of exploratory research on acquisition analytics applications across the DoD.

These investments do not include R&D for military operations or other areas outside acquisition (e.g., budgeting, requirements, or intelligence).

Question 4: What potential improvements, based on private-sector best practices, in the efficiency of current data collection and analysis processes could minimize collection and delivery of data by, from, and to government organizations?

Conclusion 4.1: A number of private-sector best practices could improve DoD efficiency by minimizing the collection and delivery of data.

NDRI studied the findings of consulting companies that assess, survey, and review the field for lessons learned and noted a fairly consistent set of common practices, including the following:

  • Develop a data strategy (i.e., proactively plan and prioritize which data need to be collected, weigh the costs and benefits of the various alternatives, make informed decisions about the most pressing questions, and develop use cases).
  • Identify the critical data needed and establish common data definitions across the organizations. Implement automatic data collection from operational systems for subsequent analysis. Automatic data collection can provide more accurate, current data than would manual data reporting.
  • Designate which data system is the single authoritative source for a particular datum,[6] then share that datum via technical means with other systems that need to use it (a minimal sketch of this pattern follows this list). This practice increases transparency, ensures that everyone is using the same data, and reduces duplicative, and potentially erroneous, data entry.
  • Perhaps most importantly, recognize data as enterprise-wide assets that should be shared, with appropriate privacy protections in place to improve the efficiency of the organization.
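
As a minimal sketch of the authoritative-source pattern described in the third item above, one system of record can hold each datum while downstream systems read it on demand rather than re-entering it; all class and field names here are hypothetical.

```python
# Hypothetical sketch of the single-authoritative-source pattern:
# one system of record holds each datum; consumers read, never re-enter.
class ContractRegistry:
    """Notional authoritative source for contract award values."""
    def __init__(self):
        self._awards = {"W912-XX-2019": 4_500_000}

    def award_value(self, contract_id: str) -> int:
        return self._awards[contract_id]

class CostDashboard:
    """Notional downstream system: reads from the registry instead of
    keeping its own, potentially divergent, copy of the datum."""
    def __init__(self, registry: ContractRegistry):
        self._registry = registry

    def report(self, contract_id: str) -> str:
        value = self._registry.award_value(contract_id)
        return f"{contract_id}: ${value:,}"

print(CostDashboard(ContractRegistry()).report("W912-XX-2019"))
# W912-XX-2019: $4,500,000
```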

Conclusion 4.2: Although DoD information managers implement many of these practices, the level of maturity of these practices varies widely.

Information managers seek use cases to identify what data are needed and for what purposes. Designating authoritative data sources and sharing data across acquisition systems are becoming more common. The use of common program management software suites that can automatically share project or program data could be expanded.

Conclusion 4.3: Opportunities for improvement lie in continuing to improve data sharing and security issues.

Although the DoD has made progress in opening its acquisition data systems and sharing data, challenges to sharing remain. The most difficult problem is a culture that resists sharing. This resistance stems from a number of concerns, including security (both elevated classification caused by data aggregation and unauthorized release of sensitive information), trust in how data are used, and appropriate data labeling. The DoD could encourage data sharing by emphasizing that these data are DoD enterprise assets, developing approaches to resolve security and sensitivity issues, and ensuring that oversight staff will not use data to micromanage programs.

Question 5: What steps are being taken to expose anonymized acquisition data to researchers and analysts?

Conclusion 5.1: The DoD provides some anonymized personnel data.

Some anonymized personnel data (including acquisition workforce data) — which would otherwise be sensitive, personally identifiable information (PII) with legal releasability restrictions — are being made available through the Defense Manpower Data Center and the Office of Personnel Management.

Conclusion 5.2: Although the DoD has made some progress in improving data sharing, for various reasons it is not generally anonymizing data.

Practical reasons explain why anonymization has not been widespread. Anonymization is not always reliable: Advances in analytic tools can sometimes re-identify anonymized data. Also, much of the metadata that would be removed in anonymization are important for analyzing potential causes of identified trends. In addition, DoD data generally lack data-sensitivity metadata at the data-element level, making it hard to determine which data cannot be shared and why. Furthermore, government procedures for categorizing and handling sensitive data are complicated, slow, and not well understood by staff, and incentives drive a conservatism that blocks sharing (e.g., uncertainty about what exactly a contractor can and cannot assert as proprietary information, how markings can be changed, and what personal risks are involved). Finally, some data, including some publicly released program and budget data, are available without being anonymized.
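
To make the re-identification concern concrete, the following sketch counts how many records in a notional "anonymized" table share each combination of quasi-identifiers; any combination held by only one record is effectively re-identifiable. The records are invented.

```python
# Hypothetical sketch: checking a notional "anonymized" table for
# unique quasi-identifier combinations (a basic k-anonymity test).
from collections import Counter

records = [  # names removed, but quasi-identifiers remain
    {"grade": "GS-13", "office": "Cost", "years": 12},
    {"grade": "GS-13", "office": "Cost", "years": 12},
    {"grade": "SES",   "office": "Policy", "years": 25},  # unique combination
]

combos = Counter((r["grade"], r["office"], r["years"]) for r in records)
reidentifiable = [c for c, k in combos.items() if k == 1]
print(reidentifiable)  # [('SES', 'Policy', 25)] -- k=1, so re-identifiable
```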

Question 6: Do training institutions include appropriate courses on data analytics and other methods and their application to defense acquisition?

Conclusion 6.1: The primary DoD acquisition training institutions offer at least some data analytics courses with acquisition applications.

NDRI reviewed the curricula at four defense institutions: DAU, the Naval Postgraduate School, the Air Force Institute of Technology, and the National Defense University. Three of the four schools (DAU, the Air Force Institute of Technology, and the Naval Postgraduate School) offer a broad array of acquisition courses, ranging in depth and applicability from courses in acquisition theory and processes to hands-on applied data analytics courses (e.g., cost analysis), which make up the majority of the courses offered. These institutions also offer courses in general-purpose data analytics. The National Defense University focuses primarily on defense strategy, not acquisition.

DAU also has official partnerships with a number of civilian-sector universities and private-sector companies to offer classes to the DoD workforce, such as more-advanced coursework in data analytics. For example, partnerships with four universities in the District of Columbia area, Stanford University, the University of Michigan, and the Georgia Institute of Technology offer a wide selection of courses related to data analytics for acquisition, ranging from applied training to courses in policy. Private-sector partnerships include Google and IBM.

Conclusion 6.2: Not everyone in acquisition can (or should) become a deep data scientist.

These applied and general-purpose courses should increase the ability of the acquisition workforce to conduct simple analysis while becoming smart consumers of analysis conducted by specialists. Still, it is unreasonable to expect or want most acquisition personnel to become experts in data analytics.

Conclusion 6.3: Successful application of data analytics requires expertise in both data analytics and acquisition, which is hard to find.

Personnel with expertise in both data analytics and the application domain are a rarity — not only in the DoD but in the private sector as well. Thus, a more achievable goal may be to develop an acquisition workforce that possesses the necessary range of skills and expertise to conduct, understand, and apply the findings of acquisition data analysis while growing a cadre of application specialists.

Summing Up: Steps Congress and the DoD Can Take to Improve the Data Analytics Capabilities for Defense Acquisition

According to the findings of the report, DoD leaders need to identify what they want data analytics to accomplish, which will help define what specific acquisition data and analytic capabilities they need and what Congress and others can do to help. In the spirit of helping to address those questions, NDRI offers several suggested opportunities and next steps, categorized by stakeholder group.

Congress

Congress can take the following steps to help the DoD move acquisition data analytics forward.

Opportunities and actions

  • Clarify in 10 U.S.C. 2222(e) that all acquisition and sustainment data are common enterprise data and thus available across the DoD.[7]
  • Make permanent FFRDCs' access to sensitive data under Section 235 of the FY 2017 National Defense Authorization Act.[8]
  • Identify DoD acquisition leadership structures that streamline acquisition while balancing conflicting incentives and other strategic motivations (to minimize instances in which acquisition decisions contradict the data).
  • Determine the changes in statutes needed to allow efficient access to sensitive data for university-affiliated research centers, contractors working for DoD labs, and other support contractors while ensuring appropriate data protections.

The Under Secretary of Defense for Acquisition and Sustainment, Chief Management Officer, Chief Information Officers, Chief Data Officers, and Service Acquisition Executives

Opportunities and actions

  • Address disincentives to data sharing.
  • Enable appropriate DoD-wide access to sensitive data for analysts.
  • Facilitate access to analytic tools through virtual computing environments and an approved list of software for installation on DoD computers.
  • Continue R&D on improving data and analytic systems and new acquisition applications.
  • Develop a data analytics strategy across acquisition domains.
  • Identify how to address disincentives to data sharing.
  • Perform policy and process analysis on data aggregation and classification upgrades to ensure more-consistent application.
  • Analyze policies and approaches for granting DoD-wide access to various DoD information systems for government and contractor analysts.
  • Identify the minimum data needed, at what level, and for what purposes, given costs and benefits.
  • Conduct detailed analysis to create a cross-domain DoD data analysis strategy.

DoD Information Managers

Opportunities and actions

  • Continue to pursue project and program management and process software suites with data outputs that feed oversight information systems.
  • Continue to mature data collection, access, and analytics.
  • Continue to compile and share catalogs of available data.

Defense Acquisition Training Institutions

Opportunities and actions

  • Continue to offer courses in data science and applied data analytics for staff, management, and rising leaders.
  • Assess the quality and practical utility of data analytics courses.

DoD Data Analysts

Finally, NDRI recommends that DoD data analysts consider developing or expanding five areas of data analysis:

  • Explore better ways to objectively separate effects of uncertainties and externalities in sustainment metrics.
  • Explore mission-level analyses.
  • Optimize use of framing assumptions and their metrics.
  • Analyze institutional performance.
  • Identify the core data needed to answer important questions.

Although some of these recommended efforts are well underway, some will require further research to develop implementation options.

Notes

  • [1] U.S. House of Representatives, National Defense Authorization Act for Fiscal Year 2017: Conference Report to Accompany S. 2943, Washington, D.C., Report 114-840, November 30, 2016, pp. 1125–1126.
  • [2] The Joint Explanatory Statement of the Committee of Conference said, "The briefing shall address the extent to which data analytics capabilities have been implemented within the military services, DOD laboratories, test centers, and Federally Funded Research and Development Centers to provide technical support for acquisition program management; the potential to increase the use of analytical capabilities for acquisition programs and offices to improve acquisition outcomes; the amount of funding for intramural and extramural research and development activities to develop and implement data analytics capabilities in support of improved acquisition outcomes; any potential improvements, based on private-sector best practices, in the efficiency of current data collection and analysis processes that could minimize collection and delivery of data by, from, and to government organizations; steps being taken to appropriately expose acquisition data in an anonymized fashion to researchers and analysts; and an assessment of whether the curriculum at the National Defense University, the Defense Acquisition University, and appropriate private-sector academic institutions includes appropriate courses on data analytics and other evaluation-related methods and their application to defense acquisitions" (U.S. House of Representatives, 2016, p. 1126).
  • [3] U.S. House of Representatives, 2016, pp. 1125–1126.
  • [4] DAU, DAU Glossary, Fort Belvoir, Va., February 9, 2017.
  • [5] Department of Defense Instruction 5000.02, Operation of the Defense Acquisition System, incorporating Change 3, Washington, D.C.: U.S. Department of Defense, January 7, 2015, effective August 10, 2017.
  • [6] Leandro DalleMule and Thomas H. Davenport, "What's Your Data Strategy?" Harvard Business Review, May–June 2017, pp. 112–121.
  • [7] United States Code, Title 10, Section 2222, Defense Business Systems: Architecture, Accountability, and Modernization.
  • [8] Public Law 114-328, National Defense Authorization Act for Fiscal Year 2017, December 23, 2016. Note that this study was conducted by an FFRDC.
