An Optimal Termination Testing Procedure for Discrete Event Simulations

Published in: Mathematics and Computers in Simulation, Vol. 44, 1997, pp. 81-98

Posted on RAND.org on January 01, 1997

by Phillip M. Feldman, Adviti Muni, Glen Swindle

In this paper the authors consider discrete-event simulations that yield results until a termination condition is satisfied. The simulation can proceed beyond this time, but no useful information is generated. The time at which the termination condition will be satisfied is not known in advance and is modeled as a random variable with a prescribed density. It is therefore necessary to check the termination condition periodically, and each check consumes CPU time. The question addressed in this paper is how to distribute the checking times so as to minimize the expected CPU expenditure. The authors do this by taking the limit in which the cost of a single check is small, and then minimizing the limiting expected CPU expenditure. In general, uniformly spaced checking times are not optimal: the schedules of checking times generated by the minimization procedure can significantly outperform constant checking intervals.
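To make the cost model concrete, the following Python sketch evaluates the expected CPU expenditure of an arbitrary checking schedule, so that constant intervals can be compared against non-uniform alternatives. This is an illustration of the setup described in the abstract, not the authors' optimization procedure; the cost model, the example density, and all names are assumptions chosen for the example.

    from typing import Callable, Sequence

    def expected_cost(check_times: Sequence[float],
                      cdf: Callable[[float], float],
                      check_cost: float) -> float:
        """Expected CPU expenditure E[t_K + K*c] for one simulation run.

        T is the (random) time at which the termination condition is first
        satisfied, with CDF `cdf`. The run is detected as finished at the
        first check time t_K >= T, after K checks costing `check_cost` (c)
        each. The schedule must be strictly increasing and should (nearly)
        exhaust the support of T; any residual tail probability is ignored.
        """
        total, prev = 0.0, 0.0
        for k, t in enumerate(check_times, start=1):
            p = cdf(t) - prev  # P(T in (t_{k-1}, t_k]): run ends at check k
            total += p * (t + k * check_cost)
            prev = cdf(t)
        return total

    # Illustrative example: termination time with increasing density
    # f(t) = t/50 on [0, 10] (so the CDF is t^2/100), and a per-check
    # cost of 0.01 CPU units.
    def cdf(t: float) -> float:
        return min(t * t / 100.0, 1.0)

    n = 20
    uniform = [10.0 * k / n for k in range(1, n + 1)]  # constant intervals
    equal_mass = [10.0 * (k / n) ** 0.5 for k in range(1, n + 1)]  # equal probability per interval

    print("constant intervals :", expected_cost(uniform, cdf, 0.01))
    print("equal-mass schedule:", expected_cost(equal_mass, cdf, 0.01))

In this toy example the equal-mass schedule, which checks more densely where termination is more likely, comes out slightly cheaper than constant intervals (roughly 7.00 versus 7.05 CPU units); the paper's small-cost limit is what produces the schedule that actually minimizes this expectation.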

This report is part of the RAND Corporation External publication series. Many RAND studies are published in peer-reviewed scholarly journals, as chapters in commercial books, or as documents published by other organizations.

The RAND Corporation is a nonprofit institution that helps improve policy and decisionmaking through research and analysis. RAND's publications do not necessarily reflect the opinions of its research clients and sponsors.