This book presents a unified theory of dynamic programming and Markov decision processes and their application to a major field of operations research and operations management: inventory control.
Bibliography
Includes bibliographical references.
Contents
Static Problems -- Markov Chains -- Optimal Control in Discrete Time -- Inventory Control Without Set Up Cost -- Ergodic Control in Discrete Time -- Optimal Stopping Problems -- Impulse Control -- Inventory Control With Set Up Cost -- Ergodic Control of Inventories With Set Up Cost -- Dynamic Inventory Models With Extensions -- Inventory Control With Markov Demand -- Lead Times and Delays -- Continuous Time Inventory Control -- Inventory Control With Diffusion Demand -- Mean-Reverting Inventory Control -- Two Band Impulse Control Problems.
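As a flavor of the inventory models listed above (an illustrative sketch, not taken from the book), the simplest model without set-up cost reduces to the classical newsvendor critical-fractile rule: with per-unit shortage cost p and holding cost h, the optimal order-up-to level is the p/(p+h) quantile of demand. The parameter values and function name below are hypothetical.

```python
from statistics import NormalDist

def newsvendor_base_stock(mu, sigma, p, h):
    """Order-up-to level for normally distributed demand (hypothetical
    example): the critical fractile p/(p+h) of the demand distribution
    minimizes expected one-period shortage-plus-holding cost."""
    fractile = p / (p + h)
    return NormalDist(mu, sigma).inv_cdf(fractile)

# Illustrative parameters (not from the book): mean demand 100,
# std. dev. 20, shortage cost 4, holding cost 1.
S = newsvendor_base_stock(mu=100, sigma=20, p=4, h=1)
print(round(S, 1))  # level sits above the mean because p > h
```

With a set-up cost added, the optimal policy instead takes the (s, S) form studied in the later chapters: order up to S only when stock falls to or below s.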