Report
Velocity Management: An Approach for Improving the Responsiveness and Efficiency of Army Logistics Processes
Jan 1, 1995
The Change Methodology That Has Propelled the Army's Successful Velocity Management Initiative
Velocity Management (VM)—a term coined by logistics analysts at the RAND Arroyo Center and adopted by the U.S. Army—brings a new way of doing business to U.S. Army logistics. By combining a renewed focus on the customers of the logistics system with a powerful improvement methodology, in just a few years Army logisticians have achieved dramatic and continuous performance gains in key logistics processes.
Today, at facilities and installations across the United States and abroad, a high-velocity, streamlined order and ship process delivers Army repair parts in well under half the time it took to deliver them just three years earlier. The repair process is also experiencing fewer and shorter delays, and cost savings from improved inventory management mean that a broader array of ordered items is more readily available.
In recognition of these achievements, in 1998 the Combined Arms Support Command received Vice President Al Gore's Hammer Award from the National Partnership for Reinventing Government.
This research brief focuses on a key element of the Velocity Management story: the Define-Measure-Improve methodology that has propelled the Army's successful change.
There are two components to successful change: The first is to begin, and the second is to persist. Given a powerful initial stimulus, it is relatively easy for an organization to begin change. For example, when improvement is called for by a powerful constituency, a common organizational response is to launch a number of immediate initiatives. A list of ongoing initiatives can be used to demonstrate responsiveness and to promise quick results. However, most improvement initiatives that are undertaken without sufficient preparation either end inconclusively or quietly fail once the initial enthusiasm passes.
The purpose of the Army's Define-Measure-Improve (D-M-I) methodology is to enable change efforts not only to begin but to persist. Under D-M-I, change is begun by sharpening the organization's ability to improve, focusing on a particular process. This is done by increasing knowledge about the targeted process, in terms of both the expertise of specific individuals and the quantitative data on process performance. Only when this groundwork has been laid are specific improvements implemented.
The performance of processes is improved by applying three readily understood and executed steps: define the process and the needs of its customers, measure process performance with metrics that reflect those needs, and improve the process by implementing and evaluating changes.
This cycle is repeated continuously. Figure 1 indicates the key activities under each of the three steps.
Many firms have demonstrated the benefits of using an institutionalized improvement methodology to help sustain change. Perhaps the best known example is Motorola's use of the Six Sigma approach. Embedding an improvement method into an organization's culture makes the expectation and search for improvement into a standard operating procedure, the accustomed way of doing business.
As Figure 2 illustrates, D-M-I is a streamlined version of similar methods that have propelled successful change initiatives in large commercial firms. Powerful, simple, and well tailored to the Army's highly diverse and predominantly young workforce, it has proved instrumental in building consensus, achieving results, and sustaining the momentum required for successful change.
"Define," the first step, emphasizes the need to understand explicitly the needs of customers and the outputs of the particular logistics process under study. Detailed walkthroughs of the process by teams of technical experts are important for improving the organization's understanding of what actual procedures and actions are currently followed and how they affect performance. Since logistics processes can be complex, they need to be understood in great detail if the sources of performance problems are to be accurately identified and solutions correctly developed and implemented.
Group walkthroughs that draw membership from across organizational units also foster friendly competition to improve while pooling expertise, experience, and insights. It is eye-opening for the participating experts to discover that they all have different and limited views of the same process. Shared questions and suggestions lead to collective insights and develop a common sense of purpose. Alongside the friendly competition, a broader organizational cohesion develops that helps to break down stovepipes and reduce disruptive process handoffs among units. As a result, improvement teams learn that previous improvement efforts focused on particular segments of the process may have been working at cross-purposes, suboptimizing the effectiveness and efficiency of the process as a whole.
Many simple and easy-to-fix issues may be immediately exposed during walkthroughs of specific sites, and these "low-hanging fruit" should of course be gathered. More important is the development of a cadre of individuals who collectively embody a new level of expertise. They share an end-to-end understanding of the process, a common framework for assessing process performance that focuses on customer satisfaction, well-informed hypotheses about the sources of persistent performance deficits, and the collective authority to devise and recommend innovations to improve process performance. Such a cadre constitutes in and of itself an enhanced capability to change that the organization formerly lacked.
Walkthroughs need to be repeated periodically for several reasons. First, even seemingly simple changes to a process may change the flows of materiel, information, and funds, sometimes in unexpected ways. As improvements are made, additional opportunities for improvement are exposed. Also, because of Army rotation policies and other sources of attrition, there is always the need to train new and replacement members of the local Site Improvement Teams. Repeated walkthroughs provide training opportunities for new team members. Finally, there is the need to reemphasize that D-M-I represents a "continuous improvement" effort and that as each local goal is achieved, a new goal should be set to drive toward even better performance. A periodic walkthrough provides visible evidence to the whole organization of this continuing commitment.
Like the Define step, the "Measure" step of the D-M-I method represents an investment that must be made in an organization's capability to improve before a dramatically higher level of performance can be reasonably expected or achieved.
Measurement is the central activity for fostering improvement: it is essential to driving change in the right direction and then sustaining that change. The choice of metrics is critical because what gets measured is what gets attended to. But performance measurement must go beyond that: it must enable change agents to diagnose the drivers of weak performance. And as changes are made, measurement must continue in order to determine which of the changes lead to improvement.
The most critical aspect of measurement is the development and implementation of appropriate metrics that span the full process and reflect key customer values. Metrics are the lingua franca by which all the stakeholders in a process communicate with one another about the goals and status of their improvement efforts. VM advocates the use of multiple metrics to guide improvement on all dimensions of process performance—time, quality, and cost. And because improvement aims to reduce the variability in process performance, metrics as a rule should measure median performance and variance, not only average performance.
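To make this concrete, the following fragment is a minimal sketch (in Python, with purely illustrative numbers) of reporting one set of order-ship-time observations on several dimensions at once: the median, the spread, and the slow tail, rather than the average alone. The sample values and the nearest-rank percentile helper are assumptions for illustration, not outputs of any Army data system.

```python
# Minimal sketch: summarizing order-ship time (OST) on more than the average.
# The sample values are illustrative, not drawn from Army systems.
import math
from statistics import mean, median, pstdev

# Hypothetical order-to-receipt times, in days, for one month of requisitions.
ost_days = [4, 5, 5, 6, 7, 7, 8, 9, 12, 15, 21, 34]

def percentile(values, pct):
    """Return the pct-th percentile using the nearest-rank method."""
    ordered = sorted(values)
    rank = max(1, math.ceil(pct / 100 * len(ordered)))
    return ordered[rank - 1]

print(f"mean OST:   {mean(ost_days):.1f} days")
print(f"median OST: {median(ost_days):.1f} days")
print(f"std dev:    {pstdev(ost_days):.1f} days")      # spread, the target of variability reduction
print(f"95th pct:   {percentile(ost_days, 95)} days")  # the slow tail that customers actually feel
```

Reporting a tail percentile alongside the median highlights exactly the variability that improvement efforts aim to squeeze out.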
Data sources to support the metrics must be identified and evaluated. To date, the VM implementation has been able to proceed utilizing data that are available from standard Army information systems, though frequently these data have been combined and used in innovative ways. A beneficial byproduct of using data to support process improvement is that the quality of the data improves very quickly: those who are trying to use the data uncover previously unnoticed data quality problems, and those who are responsible for inputting and maintaining the data are alerted to the importance of their accuracy, completeness, and timeliness. These data improvements, in turn, often benefit process performance by increasing the rate of successful transactions.
Since trends in performance are more interesting and useful than single snapshots, data must be archived and continually reanalyzed. Analysis is important not only for determining the sources of performance deficits, but also for monitoring and evaluating the effects of improvement efforts. Establishing baseline performance is essential for gauging improvement accurately.
It is best to combine short feedback cycles with longer-term trend data to maximize the use of the data available. Prompt feedback permits a focus on controlling variability and implementing changes as intended, while the trend data help to identify opportunities for improvement and provide the historical perspective to track improvement.
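As a purely illustrative sketch of combining prompt feedback with a longer-term reference point, the fragment below compares each week's median order-ship time against a fixed pre-intervention baseline; the week labels, baseline value, and times are all hypothetical.

```python
# Minimal sketch: quick weekly feedback against a fixed pre-intervention baseline.
# Week labels, the baseline value, and the times are illustrative only.
from statistics import median

baseline_median = 26.0  # median order-ship time (days) established before any change

weekly_ost = {           # hypothetical order-to-receipt times, in days, grouped by week
    "week 1": [22, 25, 30, 28, 24],
    "week 2": [20, 23, 27, 21, 25],
    "week 3": [18, 19, 24, 22, 20],
}

for week, times in weekly_ost.items():
    wk_median = median(times)
    change = 100 * (wk_median - baseline_median) / baseline_median
    print(f"{week}: median {wk_median:.1f} days ({change:+.0f}% vs. baseline)")
```

In practice the weekly figures would be drawn from the archived transaction data described above, but the pattern of short-cycle reporting against a stable baseline is the same.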
Measurement includes reporting, another activity critical to sustaining continuous improvement. By displaying evidence of the value of an improvement initiative, measurement reporting helps to build support and maintain momentum over a long period.
Measurement offers maximum benefit when the results are widely shared among stakeholders in the process. Customers need to know the level of service they are getting from the different sources they use, and providers need to understand the process-wide effects of their improvement efforts. Improvement is difficult to guide and sustain unless performance feedback is consistent and rapid. As reporting time lengthens, it becomes increasingly difficult to link a specific intervention to an outcome. In work settings where the activity to be improved occurs many times a day, daily feedback is most useful. For activities that are less frequent or more extended, weekly or even monthly feedback may be preferable. Feedback less frequent than monthly is often very difficult to use to evaluate specific improvement actions.
"Improve," the third step of the process, capitalizes on the knowledge developed during the first two steps. By conducting the Define and Measure steps of the D-M-I method, teams develop a much improved understanding of the performance deficits in a process and where improvement efforts might most profitably begin. Through direct observation of the process as it is actually performed, they acquire the detailed knowledge needed to develop innovative alternatives to current ways of doing business and to identify sites where these alternatives can be implemented for initial demonstration.
Through the use of mutually agreed upon metrics, the improvement teams are able to measure whether performance improves at these sites after the implementation of process changes. Moreover, by comparing performance trends at these implementation sites to those at similar sites, they are able to create quasi-experimental demonstrations of the beneficial effects of a given intervention. When data are reported that show a clear performance advantage of participating sites, a compelling motivation is created for other sites to join the process improvement initiative.
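The underlying comparison can be sketched very simply. In the illustrative fragment below, the change in median order-ship time at hypothetical implementation sites is set against the change at similar comparison sites over the same period, so the comparison sites serve as a rough control for factors unrelated to the intervention; every site name and figure is invented.

```python
# Minimal sketch: a quasi-experimental comparison of implementation and comparison
# sites, netting out the trend at the comparison sites. All names and figures are
# hypothetical.
sites = {
    # site: (median OST before, median OST after, implemented the change?)
    "Site A": (27.0, 14.0, True),
    "Site B": (25.0, 13.5, True),
    "Site C": (26.0, 24.0, False),
    "Site D": (28.0, 26.5, False),
}

def avg_change(implemented):
    deltas = [after - before for before, after, flag in sites.values() if flag == implemented]
    return sum(deltas) / len(deltas)

pilot_effect = avg_change(True)        # change at implementation sites
comparison_effect = avg_change(False)  # change at similar, non-implementing sites
print(f"implementation sites: {pilot_effect:+.1f} days")
print(f"comparison sites:     {comparison_effect:+.1f} days")
print(f"estimated effect of the intervention: {pilot_effect - comparison_effect:+.1f} days")
```

A real demonstration would rest on the agreed metrics and archived data rather than hard-coded values, but the logic of differencing out the comparison-site trend is the same.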
Interventions that succeed should be quickly propagated across the organization. Speed is important to process improvement for several reasons. One is to end as quickly as possible at least some of the delays, errors, and waste associated with the current way of doing business. Another is to keep the organization from perpetuating two or more competing ways of doing business that are inconsistent and may evolve into de facto competing standards. A third is to prepare the entire organization for the next wave of innovation, which may come quickly as the D-M-I method is reapplied to the improved process. And a fourth is to keep the organization's change capabilities exercised, so that the improvement of process performance is understood as a continuous goal, not a one-time transition to a new target state.
To implement a demonstrated process improvement, other sites need the same capabilities that motivated the early innovators to change their process and enabled them to succeed in the effort to improve performance: these capabilities include an improvement team willing to apply the D-M-I method locally; access to the data needed to support the pertinent metrics (and perhaps assistance in analyzing those data); and documentation of the successful innovation that is sufficiently detailed to permit them to replicate it. This documentation can also be used in the training and schooling of personnel so that the improved process is not continuously infused with new personnel who know only the old ways of doing business.
This report is part of the RAND Corporation Research brief series. RAND research briefs present policy-oriented summaries of individual published, peer-reviewed documents or of a body of published work.