Optimal Dynamic Control of a Useful Class of Randomly Jumping Processes

Vermes, D. (1980). Optimal Dynamic Control of a Useful Class of Randomly Jumping Processes. IIASA Professional Paper. IIASA, Laxenburg, Austria: PP-80-015


Abstract

The purpose of the paper is to present a complete theory of optimal control of piecewise linear and piecewise monotone processes. The theory consists of a description of the processes, necessary and sufficient optimality conditions, and existence and uniqueness results, as well as extremal and regularity properties of the optimal strategy. Mathematical proofs are only outlined (they will appear elsewhere), but hints concerning efficient determination of the optimal strategy are included.

Piecewise linear (monotone) processes are discontinuous Markov processes whose state components stay constant or change linearly (monotonically) between two consecutive jumps. All processes of inventory, storage, queuing, reliability and risk theory belong to these classes. The processes will be controlled by feedback (Markov) strategies based on complete state observations. The expected value of a performance functional of integral type with additional terminal costs is to be minimized.
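A schematic form of the optimization criterion described above may help fix ideas; the notation here (state x_t, feedback strategy u, running cost c, terminal cost g, horizon T) is illustrative and is not taken from the paper itself:

\[
  \min_{u \in \mathcal{U}} \;
  \mathbb{E}^{u}_{x_0}\!\left[ \int_{0}^{T} c\bigl(x_t, u(x_t)\bigr)\, dt
  \;+\; g\bigl(x_T\bigr) \right],
\]

where u ranges over feedback (Markov) strategies, i.e. measurable maps from the fully observed state space into the set of admissible control actions, and x_t is the controlled piecewise linear (monotone) process.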

The semigroup theory of Markov processes will be used as the uniform mathematical tool for the whole theory, and the control problem will be reduced to the integration of a system of ordinary differential equations. Special emphasis will be given to the description of the processes by their infinitesimal characteristics, which are available explicitly in applied models; no finite-dimensional distributions are used.
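For processes of this type the infinitesimal characteristics are typically a drift field v, a jump intensity \lambda and a post-jump distribution Q. In generic notation (again illustrative, not the paper's own), the controlled generator acts on smooth test functions f as

\[
  (A^{u} f)(x) \;=\; v\bigl(x, u(x)\bigr)\cdot \nabla f(x)
  \;+\; \lambda\bigl(x, u(x)\bigr) \int_{E} \bigl[ f(y) - f(x) \bigr]\, Q\bigl(dy \mid x, u(x)\bigr).
\]

Between jumps the state follows the deterministic (here piecewise linear or monotone) flow generated by v, so the dynamic-programming equation A^{u}V + c = 0 can be integrated along these deterministic trajectories; this is one standard way to arrive at the system of ordinary differential equations mentioned above.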

Item Type: Monograph (IIASA Professional Paper)
Research Programs: System and Decision Sciences - Core (SDS)
Depositing User: IIASA Import
Date Deposited: 15 Jan 2016 01:48
Last Modified: 27 Aug 2021 17:10
URI: https://pure.iiasa.ac.at/1510
