Trudy Instituta Matematiki i Mekhaniki UrO RAN, 1998, Volume 5, Pages 301–318
(Mi timm482)
This article is cited in 7 scientific papers (total in 7 papers)
Mathematical theory of optimal control and differential games
Hamilton–Jacobi–Bellman equation for a nonlinear impulse control problem
A. V. Stefanova
Abstract:
A minimization problem for a functional of Bolza type along trajectories of nonlinear systems of differential equations governed by impulse controls with integral constraints is considered. Solutions of such systems are defined via the closure of the set of absolutely continuous trajectories in the topology of pointwise convergence. It is shown that the value function of such a system is Lipschitz continuous and is the unique viscosity solution of a first-order partial differential equation (a Hamilton–Jacobi–Bellman equation). Boundary conditions satisfied by this solution are obtained.
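The abstract gives no formulas, but the class of problems it describes can be sketched in standard notation (this is a generic formulation, not necessarily the paper's own). The dynamics are driven by an impulse control of bounded variation subject to an integral constraint, and the cost is of Bolza type:

\[
\dot x(t) = f\bigl(t, x(t)\bigr) + g\bigl(t, x(t)\bigr)\,\dot v(t),
\qquad x(t_0) = x_0,
\qquad \operatorname{Var}_{[t_0,T]} v \le K,
\]
\[
J(t_0, x_0; v) \;=\; \sigma\bigl(x(T)\bigr) + \int_{t_0}^{T} F\bigl(t, x(t)\bigr)\,dt
\;\longrightarrow\; \min_{v}.
\]

Because the impulse control \(v\) may have jumps, trajectories need not be absolutely continuous; hence the definition of a solution via closure in the topology of pointwise convergence mentioned above. The value function \(V(t_0,x_0) = \inf_v J(t_0,x_0;v)\) is then the object shown to be Lipschitz continuous and to solve the Hamilton–Jacobi–Bellman equation in the viscosity sense.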
Received: 08.09.1997
Citation:
A. V. Stefanova, “Hamilton–Jacobi–Bellman equation for a nonlinear impulse control problem”, Trudy Inst. Mat. i Mekh. UrO RAN, 5, 1998, 301–318
Linking options:
https://www.mathnet.ru/eng/timm482
https://www.mathnet.ru/eng/timm/v5/p301