State-of-the-art infrastructure management systems use Markov decision processes (MDPs) as a methodology for maintenance and rehabilitation (M&R) decision making. The underlying assumptions in this methodology are that an inspection is performed at the beginning of every year and that inspections reveal the true condition state of the facility without error. After an inspection, the decision maker can therefore apply the activity prescribed by the optimal policy for the observed condition state. Previous research developed a methodology for M&R activity selection that accounts for both forecasting and measurement uncertainty: the latent Markov decision process (LMDP), an extension of the traditional MDP that does not assume the measurement of facility condition to be error-free. Both a transient and a steady-state formulation of the facility-level LMDP are presented. The methodology is then extended to include network-level constraints by formulating the network-level LMDP through the use of randomized policies, again in both a transient and a steady-state version. A case study application demonstrates the expected savings in life-cycle costs that result from increasing the measurement accuracy used in facility inspections and from scheduling inspection decisions optimally.
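The key distinction between the MDP and the LMDP is that the inspected condition becomes a noisy observation of a latent state rather than a perfect measurement. A minimal sketch of the Bayesian belief update such a model relies on, using a hypothetical three-state condition scale and illustrative transition/observation matrices (not taken from the paper):

```python
import numpy as np

# Hypothetical 3-state facility condition model (0 = good, 2 = poor).
# P[i, j]: annual probability of deteriorating from state i to state j.
P = np.array([[0.8, 0.15, 0.05],
              [0.0, 0.70, 0.30],
              [0.0, 0.00, 1.00]])

# O[i, z]: probability that an inspection reports state z when the true
# state is i. Under the error-free MDP assumption, O would be the identity.
O = np.array([[0.9, 0.1, 0.0],
              [0.1, 0.8, 0.1],
              [0.0, 0.1, 0.9]])

def belief_update(belief, observation):
    """One LMDP-style step: propagate the belief through deterioration
    (forecasting uncertainty), then condition on the possibly erroneous
    inspection result (measurement uncertainty)."""
    predicted = belief @ P                       # one year of deterioration
    posterior = predicted * O[:, observation]    # weight by observation likelihood
    return posterior / posterior.sum()           # renormalize to a distribution

# Start certain the facility is in good condition; inspection reports state 1.
b = belief_update(np.array([1.0, 0.0, 0.0]), 1)
```

With an error-free inspection the posterior would collapse onto the reported state; with measurement error, the decision maker instead selects the M&R activity against this residual distribution over latent states, which is what motivates the randomized-policy formulation at the network level.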