Avtomatika i Telemekhanika, 2017, Issue 10, Pages 139–154
(Mi at14617)
This article is cited in 8 scientific papers (total in 8 papers)
Stochastic Systems
On the existence of optimal strategies in the control problem for a stochastic discrete time system with respect to the probability criterion
A. I. Kibzun, A. N. Ignatov (Moscow Aviation Institute, Moscow, Russia)
Abstract:
We study a control problem for a stochastic system with discrete time. The optimality criterion is the probability of the event that the terminal state function does not exceed a given limit. To solve the problem, we use dynamic programming. The loss function is assumed to be lower semicontinuous with respect to the terminal state vector, and the transition function from the current state to the next is assumed to be continuous in all its arguments. We establish that in this case the dynamic programming algorithm yields optimal positional control strategies, which turn out to be measurable. As an example, we consider a two-step problem of constructing a securities portfolio. We establish that in this special case the future loss function at the second step is continuous everywhere except at one point.
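The backward recursion underlying the approach described above can be sketched on a toy scalar system: the value at the horizon is the indicator of the event that the terminal loss stays within the limit, and each backward step maximizes the conditional probability of that event over the control set. Everything concrete here (the dynamics, the noise law, the grid, and the threshold) is an illustrative assumption, not the model studied in the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-step scalar system x_{t+1} = x_t + u_t + xi_t,
# xi_t ~ N(0, 0.5^2); illustration only, not the paper's model.
PHI_LIMIT = 1.0                         # loss threshold phi
CONTROLS = np.linspace(-1.0, 1.0, 21)   # finite control set
GRID = np.linspace(-3.0, 3.0, 121)      # discretized state space
N_MC = 2000                             # Monte Carlo samples per expectation

def step(x, u, xi):
    """Transition function, continuous in all its arguments."""
    return x + u + xi

def terminal_loss(x):
    """Terminal loss Phi(x_N) = |x_N| (lower semicontinuous)."""
    return np.abs(x)

# Bellman value at the horizon: indicator of {Phi(x_N) <= phi},
# so V_t(x) is the best achievable probability of that event from x.
V = (terminal_loss(GRID) <= PHI_LIMIT).astype(float)

policy = []
for t in range(2):                      # two backward induction steps
    xi = rng.normal(0.0, 0.5, N_MC)
    V_new = np.empty_like(V)
    best_u = np.empty_like(V)
    for i, x in enumerate(GRID):
        # E[V_{t+1}(f(x, u, xi))] estimated by Monte Carlo,
        # with V_{t+1} evaluated by linear interpolation on the grid.
        vals = [np.interp(step(x, u, xi), GRID, V).mean() for u in CONTROLS]
        j = int(np.argmax(vals))
        V_new[i], best_u[i] = vals[j], CONTROLS[j]
    V, policy = V_new, [best_u] + policy

print(f"P(|x_2| <= {PHI_LIMIT}) from x_0 = 0: {np.interp(0.0, GRID, V):.3f}")
```

The tabulated `policy` arrays give the optimal control as a function of the current state at each step, i.e. a positional (feedback) strategy in the sense of the abstract; here it is measurable by construction, since it is piecewise constant on the grid.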
Keywords:
dynamic programming, stochastic system, discrete time, measurable positional strategy.
Citation:
A. I. Kibzun, A. N. Ignatov, “On the existence of optimal strategies in the control problem for a stochastic discrete time system with respect to the probability criterion”, Avtomat. i Telemekh., 2017, no. 10, 139–154; Autom. Remote Control, 78:10 (2017), 1845–1856
Linking options:
https://www.mathnet.ru/eng/at14617
https://www.mathnet.ru/eng/at/y2017/i10/p139
Statistics & downloads:
Abstract page: 319; Full-text PDF: 59; References: 45; First page: 26