Topical issue (end)
Optimal control of a discrete-time stochastic system with a probabilistic criterion and a non-fixed terminal time
V. M. Azanov, Moscow Aviation Institute (National Research University), Moscow, Russia
Abstract:
This paper considers an optimal control problem for a discrete-time stochastic system whose optimality criterion is the probability of first reaching the boundaries of a given domain. Sufficient conditions of optimality based on dynamic programming are formulated and proved. The isobells of levels 1 and 0 of the Bellman function are used to obtain two-sided estimates of the right-hand side of the dynamic programming equation, of the Bellman function, and of the optimal-value function of the probabilistic criterion. A suboptimal control design method is proposed. Conditions of equivalence to an optimal control problem with a probabilistic terminal criterion are established. An illustrative example is given.
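To make the dynamic programming recursion behind such probabilistic criteria concrete, the sketch below computes, by backward recursion on a state grid, the maximal probability of first reaching a target set within a finite horizon for a simple scalar system x_{k+1} = x_k + u_k + ξ_k. The system, the target set, the noise model, and all names are hypothetical illustrations of the general technique and are not taken from the paper; this is a minimal sketch of a simplified variant (a single absorbing target set), not the author's formulation.

```python
# Hypothetical sketch: backward dynamic programming for a probabilistic
# "first reaching" criterion on the scalar system
#   x_{k+1} = x_k + u_k + xi_k,   xi_k ~ N(0, sigma^2),
# target set G = [g_lo, g_hi], control set U = {-1, 0, 1}, horizon N.
# B_k(x) is the maximal probability of reaching G at some step in {k, ..., N};
# states already in G are absorbing (B = 1).
import numpy as np

sigma = 0.5
g_lo, g_hi = 2.0, 3.0             # hypothetical target set G
controls = np.array([-1.0, 0.0, 1.0])
N = 10
xs = np.linspace(-5.0, 5.0, 401)  # state grid

# simple discretization of the Gaussian noise for the expectation
noise = np.linspace(-3 * sigma, 3 * sigma, 41)
w = np.exp(-noise**2 / (2 * sigma**2))
w /= w.sum()

in_G = (xs >= g_lo) & (xs <= g_hi)

B = in_G.astype(float)            # B_N(x) = indicator of the target set
policy = []                        # greedy (suboptimal) feedback policy per step

for k in range(N - 1, -1, -1):
    best_val = np.zeros_like(xs)
    best_u = np.zeros_like(xs)
    for u in controls:
        # E[B_{k+1}(x + u + xi)] via quadrature over the noise and
        # linear interpolation of B_{k+1} on the grid
        nxt = xs[:, None] + u + noise[None, :]
        val = np.interp(nxt, xs, B, left=0.0, right=0.0) @ w
        better = val > best_val
        best_val = np.where(better, val, best_val)
        best_u = np.where(better, u, best_u)
    B = np.where(in_G, 1.0, best_val)   # absorption on the target set
    policy.append(best_u)

policy.reverse()
print("B_0(0) =", np.interp(0.0, xs, B))  # criterion value for x_0 = 0
```

The recursion illustrates why the right-hand side of the dynamic programming equation is a conditional expectation of the next-step Bellman function maximized over the control, and why level-1 and level-0 sets of B_k (points where the probability is already 1 or 0) can be used to bound that right-hand side without evaluating it everywhere.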
Keywords:
discrete-time systems, stochastic optimal control, probabilistic criterion, dynamic programming, Bellman function.
Citation:
V. M. Azanov, “Optimal control of a discrete-time stochastic system with a probabilistic criterion and a non-fixed terminal time”, Avtomat. i Telemekh., 2020, no. 12, 3–23; Autom. Remote Control, 81:12 (2020), 2143–2159
Linking options:
https://www.mathnet.ru/eng/at15612
https://www.mathnet.ru/eng/at/y2020/i12/p3