Markov chains are commonly applied in OR modeling because of the intuitive appeal of the Markov property and the computational conveniences it affords in applications such as queueing, inventory, maintenance, and manpower planning decisions. Under certain lumpability conditions, the state space of a Markov chain can be partitioned into subsets of states to form a smaller chain that retains the Markov property, providing even greater simplicity in modeling operational systems. In this presentation we will discuss methods for constructing lumped chains and their application, with examples drawn from manpower planning and DNA testing.
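The lumping idea can be sketched concretely. The example below is a hypothetical illustration, not drawn from the talk: a partition is (strongly) lumpable when, for every block, the total transition probability into that block is the same from every state within any given block; the lumped chain's transition matrix is then built from those common row sums.

```python
# Illustrative sketch of strong lumpability (hypothetical example chain,
# not from the talk). P is a row-stochastic transition matrix; a partition
# is a list of blocks, each block a list of state indices.
import numpy as np

def is_lumpable(P, partition, tol=1e-12):
    """Check strong lumpability: for every pair of blocks (A, B), the
    probability of moving from a state in A into B must not depend on
    which state of A we start in."""
    for A in partition:
        for B in partition:
            row_sums = [P[i, B].sum() for i in A]
            if max(row_sums) - min(row_sums) > tol:
                return False
    return True

def lump(P, partition):
    """Build the lumped chain's transition matrix (assumes lumpability)."""
    k = len(partition)
    Q = np.zeros((k, k))
    for a, A in enumerate(partition):
        for b, B in enumerate(partition):
            # By lumpability, this sum is the same for every state in A.
            Q[a, b] = P[A[0], B].sum()
    return Q

# A 3-state chain in which states 1 and 2 can be lumped together:
P = np.array([[0.5, 0.25, 0.25],
              [0.3, 0.40, 0.30],
              [0.3, 0.20, 0.50]])
partition = [[0], [1, 2]]
```

Here the lumped chain has two states (the original state 0, and the merged states 1 and 2), and its 2x2 transition matrix is again row-stochastic, so Markov-chain analysis can proceed on the smaller chain.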
Thursday, March 1
Marlin U. Thomas is professor of Operational Sciences and dean of the Graduate School of Engineering and Management at the Air Force Institute of Technology. He received his B.S.E. from the University of Michigan-Dearborn, and his M.S.E. and Ph.D. from the University of Michigan. He has held several other appointments in academia, industry, and government, including program director for operations research at the National Science Foundation. He has served in a number of editorial roles, including area editor for Operations Research, consulting editor for McGraw-Hill, and department editor for IIE Transactions. He has also served on numerous government and university advisory panels for research and academic program reviews, and is a past member of the Army Science Board. His primary research area is reliability and stochastic modeling, and he is the author or co-author of over 80 papers and textbook contributions. He is a Fellow of ASQ, INFORMS, and IIE, and a Captain (retired), Civil Engineer Corps, U.S. Navy Reserve.
Decision Systems Seminar presented by the School of Computing, Informatics, and Decision Systems Engineering