A MARKOV DECISION MODEL TO OPTIMIZE HOSPITAL BED CAPACITY UNDER STOCHASTIC DEMAND

 

Paul Kizito Mubiru

 

Abstract

 

Hospitals continually face the challenge of planning and managing bed capacity in an environment of demand uncertainty. In this paper, an optimization method for allocating hospital bed capacity is proposed. The model, based on a Markov decision process approach, matches demand to the bed-availability levels of a health care system. Under this approach, the states of a Markov chain represent possible states of demand for bed occupancy, and the decision of whether or not to admit additional patients is made using dynamic programming. The approach demonstrates the existence of an optimal state-dependent capacity level and produces an optimal admission policy for patients, together with the corresponding total capacity costs.
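To make the abstract's backward-induction idea concrete, the following is a minimal sketch of a finite-horizon Markov decision process with demand states and an admit/do-not-admit action, solved by dynamic programming. All state labels, transition probabilities, and costs here are hypothetical illustrations, not the paper's data.

```python
import numpy as np

# Hypothetical two-state demand chain: state 0 = low demand for beds,
# state 1 = high demand. Actions: 0 = do not admit additional patients,
# 1 = admit additional patients.
P = {
    0: np.array([[0.7, 0.3],    # demand transition matrix when not admitting
                 [0.4, 0.6]]),
    1: np.array([[0.5, 0.5],    # demand transition matrix when admitting
                 [0.2, 0.8]]),
}
# Immediate capacity cost c[action][state] (illustrative figures only)
c = {
    0: np.array([120.0, 300.0]),   # e.g. idle-bed / turn-away costs
    1: np.array([200.0, 250.0]),   # e.g. cost of operating extra beds
}

def optimal_admission_policy(P, c, horizon):
    """Backward induction: minimize expected total capacity cost.

    Returns the period-1 value function V[s] and a list `policy`
    where policy[t][s] is the cost-minimizing action in period t
    when the demand state is s.
    """
    n_states = len(next(iter(c.values())))
    V = np.zeros(n_states)                    # terminal value: no future cost
    policy = []
    for _ in range(horizon):
        # Q[a, s] = immediate cost + expected cost-to-go under action a
        Q = np.array([c[a] + P[a] @ V for a in (0, 1)])
        policy.append(Q.argmin(axis=0))       # best action per state
        V = Q.min(axis=0)
    policy.reverse()                          # index policies by period 1..N
    return V, policy

V, policy = optimal_admission_policy(P, c, horizon=2)
```

With these numbers, the recursion yields a state-dependent rule (admit only in the high-demand state, in every period), illustrating how an optimal admission policy and its total capacity cost fall out of the same computation.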

 

Lecture Notes in Management Science (2011) Vol. 3: 425-434

3rd International Conference on Applied Operational Research, Proceedings

© Tadbir Operational Research Group Ltd. All rights reserved.

www.tadbir.ca

 

ISSN 2008-0050 (Print)

ISSN 1927-0097 (Online)

 

ARTICLE OUTLINE

 

·         Introduction

·         Literature Review

·         Model Development

·         Problem Description, Assumptions and Notation

·         Model Variables and Parameters

·         Finite-Period Dynamic Programming Formulation

·         Computing an Optimal Admission Policy

·         Optimisation Policy during Period 1

·         Optimisation Policy during Period 2

·         Implementation

·         Case Description

·         Data Collection

·         Computation of Model Parameters

·         The Optimal Admission Policy for Patients

·         Conclusion

·         References

 
