Graph Theoretic Algorithms for the Ground Based Strategic Deterrent Program
Prioritization and Scheduling
Research Questions
- What are quantitative methods that can be used to make the unified certification process for nuclear systems more rigorous and efficient? Specifically, what numerical methods are faster than traditional Monte Carlo simulations and can be used efficiently on very large graphs?
- How does the critical path for program execution depend on the overall structure of how tasks are scheduled?
- How can the reallocation of resources change the risk of meeting expected schedules?
During program execution, managers must orchestrate the planning and performance of tens of thousands of interdependent tasks. Many of these tasks are linked such that failure to meet scheduling expectations in one task can affect the execution of many others. As programs grow larger and the interdependencies among program activities become more complex, tools are needed to introduce more rigor into program management and reduce schedule risk.
The Ground Based Strategic Deterrent (GBSD) currently under development, a complete replacement for the aging LGM-30 Minuteman III intercontinental ballistic missile system, is a good example of a large, complex program with numerous interdependent tasks. Given the tight schedule by which GBSD is bound and the numerous participating organizations, the required certification process for nuclear systems must proceed efficiently. In this report, the authors present the underlying rationale behind algorithms already delivered to the GBSD program office that were designed to reduce the likelihood of rework in program execution, provide better insight into schedule risk, and show how to restructure task dependencies to manage that risk. Although the novel methods described in this report were developed for the GBSD program, they are general and can be applied to any program.
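To give a concrete sense of the kind of task-network analysis involved, the sketch below builds a small random task graph and computes its critical-path length by dynamic programming over a topological order. This is a generic illustration, not the algorithm delivered to the GBSD program office; the task durations, edge probability, and graph size are all hypothetical.

```python
import random

def random_dag(n_tasks, p, rng):
    """Random DAG: edge i -> j (with i < j) exists with probability p,
    so the forward orientation guarantees acyclicity.
    Task durations are hypothetical, drawn uniformly from [1, 5]."""
    durations = [rng.uniform(1.0, 5.0) for _ in range(n_tasks)]
    edges = {i: [j for j in range(i + 1, n_tasks) if rng.random() < p]
             for i in range(n_tasks)}
    return durations, edges

def critical_path_length(durations, edges):
    """Longest finish time over all tasks. Nodes 0..n-1 are already in
    topological order because edges only point forward."""
    finish = [0.0] * len(durations)
    for i in range(len(durations)):
        # A task starts when its slowest predecessor finishes.
        start = max((finish[j] for j in range(i) if i in edges[j]), default=0.0)
        finish[i] = start + durations[i]
    return max(finish)

rng = random.Random(2)
durations, edges = random_dag(20, 0.15, rng)
print(round(critical_path_length(durations, edges), 2))
```

The critical path is the chain of dependent tasks whose total duration bounds the whole program; any slip on it slips the program.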
Key Findings
The authors examined three classes of graph topology: a simple topology of parallel chains, random graphs, and scale-free graphs.
- For the simple chain topology, although performing tasks in parallel can theoretically reduce the overall completion time, it does not do so as much as expected: the likelihood that at least one chain runs late increases with the number of chains.
- Parallelizing development reduces variance, but it also increases end-to-end time more than expected.
- Although working tasks in parallel remains desirable, the project manager's expectations for time savings should be tempered.
- For Erdős-Rényi topology (random graphs), analysis shows the same trade-off between variance and average completion time as in the simple chain topology, though the trade-off is subtler. In this case, interlinking paths make it less likely that a single chain will dominate.
- For the scale-free graph topology, analysis shows that graphs with a single, dominant longest path are much more likely to complete on time than graphs with several parallel paths that may each be critical paths. This result means that the project manager can reduce the overall completion time with only a marginal increase in the corresponding risk by placing important tasks in series with one another.
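The parallel-chain trade-off described in the findings above can be sketched with a small Monte Carlo simulation: a project of parallel chains finishes only when its slowest chain finishes, so adding chains raises the expected finish time, shrinks its spread, and lowers the odds of meeting a fixed deadline. The triangular duration distribution, chain lengths, and deadline below are hypothetical stand-ins, not values from the report.

```python
import random
import statistics

def chain_time(n_tasks, rng):
    """Total duration of one chain of sequential tasks.
    Hypothetical durations: triangular(low=0.5, high=2.0, mode=1.0)."""
    return sum(rng.triangular(0.5, 2.0, 1.0) for _ in range(n_tasks))

def project_stats(n_chains, n_tasks, deadline, rng, trials=3000):
    """The project finishes when the slowest parallel chain finishes.
    Returns mean finish time, its std deviation, and the on-time fraction."""
    finishes = [max(chain_time(n_tasks, rng) for _ in range(n_chains))
                for _ in range(trials)]
    on_time = sum(f <= deadline for f in finishes) / trials
    return statistics.mean(finishes), statistics.stdev(finishes), on_time

rng = random.Random(0)
deadline = 12.5  # a hypothetical schedule target for a 10-task chain
for n_chains in (1, 4, 16):
    mean, sd, on_time = project_stats(n_chains, 10, deadline, rng)
    print(f"{n_chains:2d} chains: mean={mean:.2f} sd={sd:.2f} on_time={on_time:.2f}")
```

Running this shows the mean finish time rising and its standard deviation falling as chains are added, while the on-time fraction drops sharply, which is the maximum-order-statistic effect behind the tempered expectations for parallelization.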
The authors present a novel numerical approach to find the critical path in a large graph.
- The fastest, and preferred, method to find the critical path uses Chebyshev polynomials as a basis. The alternative uses trapezoidal integration and is preferred only when software packages limit the ability to manipulate Chebyshev polynomials.
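Both methods rest on numerically convolving task-duration densities: the duration of a serial pair of tasks is the convolution of their individual densities. The sketch below shows only the trapezoidal-integration alternative, which is the simpler of the two to write down; it is not the report's implementation, and the Beta(2, 2) task-duration density is a hypothetical stand-in (beta distributions are the customary PERT duration model).

```python
import numpy as np

def trapezoid(y, dx):
    """Trapezoid-rule integral of samples y on a uniform grid."""
    return dx * (y.sum() - 0.5 * (y[0] + y[-1]))

def beta22_pdf(t):
    """Beta(2,2) density on [0, 1], zero elsewhere (stand-in task duration)."""
    return np.where((t >= 0.0) & (t <= 1.0), 6.0 * t * (1.0 - t), 0.0)

# Density of the serial sum of two tasks: (f * g)(s) = integral f(u) g(s-u) du,
# with the integral evaluated by the trapezoid rule at each grid point s.
u = np.linspace(0.0, 2.0, 2001)  # support of the two-task sum
du = u[1] - u[0]
pdf_sum = np.array([trapezoid(beta22_pdf(u) * beta22_pdf(s - u), du) for s in u])

mass = trapezoid(pdf_sum, du)      # total probability, should be ~1.0
mean = trapezoid(u * pdf_sum, du)  # mean of the sum, should be ~1.0 (0.5 + 0.5)
print(round(mass, 3), round(mean, 3))
```

Repeating the convolution along each path yields the path-duration density, from which on-time probabilities and criticality indices can be read off; the Chebyshev-basis method computes the same quantities faster by manipulating polynomial coefficients instead of grids.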
Table of Contents
Analytic Approach for Calculating the Criticality Index
Equivalence of Representations
The PERT and CPM Algorithms
Proof That Convolution of Independent, Identical Beta Distributions Converges to a Normal Distribution
Derivation of Maximum Order Statistic F