Program Systems: Theory and Applications, 2011, Volume 2, Issue 1, Pages 63–70
(Mi ps28)
This article is cited in 1 scientific paper.
Optimization Methods and Control Theory
Sufficient conditions of optimality for optimal control problems of logic-dynamic systems
N. S. Maltugueva (a, b)
(a) Institute for System Dynamics and Control Theory SB RAS, Irkutsk
(b) Ailamazyan Program Systems Institute of the Russian Academy of Sciences, Pereslavl-Zalessky
Abstract:
This article deals with logic-dynamic systems, a special class of discrete-continuous control systems. The discrete component of such a system is an integer-valued function with a finite number of discontinuity points. An optimal control problem is formulated for systems of this kind. The problem under consideration differs from the classical optimal control problem in that the right-hand sides of the differential equations and the functional depend on the discrete variables. In papers by A. S. Bortakovskii, sufficient conditions of optimality were proved for the Bellman function; the author of the present work shows that the corresponding theorem holds for an arbitrary Krotov function. The article also describes an approach to constructing computational procedures for this problem.
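For orientation, the Krotov sufficient conditions referred to in the abstract can be sketched in their standard continuous-time form; this is the conventional textbook statement, not the logic-dynamic formulation developed in the paper, and the symbols $\varphi$, $R$, $G$ are standard notation rather than notation taken from the article.

For the system $\dot{x} = f(t, x, u)$, $x(t_0) = x_0$, $u(t) \in U$, and the functional
\[
  J(u) = F\bigl(x(t_1)\bigr) + \int_{t_0}^{t_1} f^0\bigl(t, x(t), u(t)\bigr)\,dt \to \min,
\]
fix a smooth (Krotov) function $\varphi(t, x)$ and set
\[
  R(t, x, u) = \varphi_x(t, x)\, f(t, x, u) + \varphi_t(t, x) - f^0(t, x, u),
  \qquad
  G(x) = F(x) + \varphi(t_1, x) - \varphi(t_0, x_0).
\]
If an admissible process $(x^*, u^*)$ satisfies
\[
  R\bigl(t, x^*(t), u^*(t)\bigr) = \sup_{x,\; u \in U} R(t, x, u) \ \ \text{for a.e. } t,
  \qquad
  G\bigl(x^*(t_1)\bigr) = \inf_{x} G(x),
\]
then $(x^*, u^*)$ is optimal, because $J(u) = G\bigl(x(t_1)\bigr) - \int_{t_0}^{t_1} R\bigl(t, x(t), u(t)\bigr)\,dt$ for every admissible process. The Bellman function corresponds to one particular choice of $\varphi$; the point emphasized in the abstract is that the sufficiency argument does not require this special choice.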
Key words and phrases:
control systems, nonlocal improvement.
Received: 13.02.2011 Accepted: 11.03.2011
Citation:
N. S. Maltugueva, “Sufficient conditions of optimality for optimal control problems of logic-dynamic systems”, Program Systems: Theory and Applications, 2:1 (2011), 63–70
Linking options:
https://www.mathnet.ru/eng/ps28
https://www.mathnet.ru/eng/ps/v2/i1/p63