A Markov Decision Process to Dynamically Match Hospital Inpatient Staffing to Demand
Published in: IIE Transactions on Health Systems Engineering, Vol. 1, No. 2, October 2011, pp. 116-130
Posted on RAND.org on September 30, 2011
Access further information on this document at IIE Transactions on Health Systems Engineering
This article was published outside of RAND. The full text of the article can be found at the link above.
Appropriate inpatient staffing levels minimize hospital cost and increase patient safety. Hospital inpatient units dynamically adjust premium staffing levels (above base staffing) in an attempt to match daily demand. Historically, inpatient managers have adjusted daily staffing subjectively, based on observing the morning inpatient inventory. Inpatient units strive to match staff with demand in a complex patient throughput environment where service rates and non-stationary profiles are not explicitly known. The related queue control and throughput modeling literature does not directly match staffing with demand, requires explicit knowledge of the service process, and is not formulated for an inpatient unit. This paper presents a Markov decision process (MDP) for dynamic inpatient staffing. The MDP explicitly attempts to match staffing with demand; it has a statistical discrete-time Markov chain foundation that estimates the service process and predicts transient inventory, and it is formulated for an inpatient unit. Lastly, applying the MDP to a telemetry unit reveals a computationally tractable myopic policy, an approximate stationary policy, and a finite-horizon optimal policy, each validated through hospital expert experience. The application also reveals difficult-to-staff inventory levels and shows that removing discharge seasonality can drastically decrease the required size of the premium staffing pool and the probability of full occupancy, thus improving the inpatient unit's patient flow.
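To illustrate the general technique the abstract describes, the sketch below solves a toy finite-horizon MDP for daily premium staffing by backward induction. This is not the paper's model: the census levels, staffing actions, transition matrices, and cost function here are all hypothetical placeholders, whereas the paper estimates the transition dynamics statistically from a discrete-time Markov chain of patient throughput.

```python
# Minimal sketch (hypothetical parameters, not the paper's model):
# finite-horizon MDP for daily premium staffing, solved by backward induction.
import numpy as np

n_states = 5          # discretized inpatient census (inventory) levels
actions = [0, 1, 2]   # premium staff added above base staffing
horizon = 7           # days in the planning horizon

rng = np.random.default_rng(0)

# Hypothetical census transition matrices, one per action (row-stochastic).
# In the paper, transitions come from a statistically estimated
# discrete-time Markov chain of the service process.
P = {}
for a in actions:
    M = rng.random((n_states, n_states)) + a * np.eye(n_states)
    P[a] = M / M.sum(axis=1, keepdims=True)

def cost(state, a):
    # Staffing cost plus a penalty for staff-demand mismatch (both made up).
    staff_cost = 10.0 * a
    mismatch = abs(state - 2 * a)
    return staff_cost + 5.0 * mismatch

# Backward induction:
#   V[t, s] = min_a { cost(s, a) + sum_s' P[a][s, s'] * V[t + 1, s'] }
V = np.zeros((horizon + 1, n_states))
policy = np.zeros((horizon, n_states), dtype=int)
for t in range(horizon - 1, -1, -1):
    for s in range(n_states):
        q = [cost(s, a) + P[a][s] @ V[t + 1] for a in actions]
        policy[t, s] = int(np.argmin(q))
        V[t, s] = min(q)

print(policy[0])  # chosen staffing action for each census level on day 1
```

A myopic policy, by contrast, would minimize `cost(s, a)` alone at each day, ignoring the value-to-go term; the abstract reports that such a policy can be computed and compared against the finite-horizon optimum.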