On a discrete approximation of the Hamilton-Jacobi equation of dynamic programming / CAPUZZO DOLCETTA, Italo. - In: APPLIED MATHEMATICS AND OPTIMIZATION. - ISSN 0095-4616. - 10:1(1983), pp. 367-377. [10.1007/bf01448394]
On a discrete approximation of the Hamilton-Jacobi equation of dynamic programming
CAPUZZO DOLCETTA, Italo
1983
Abstract
An approximation of the Hamilton-Jacobi-Bellman equation associated with the infinite horizon discounted optimal control problem is proposed. The approximate solutions are shown to converge uniformly to the viscosity solution, in the sense of Crandall-Lions, of the original problem. Moreover, the approximate solutions are interpreted as value functions of a discrete time control problem. This makes it possible to construct, by dynamic programming, a minimizing sequence of piecewise constant controls. © 1983 Springer-Verlag New York Inc.
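For orientation, the kind of scheme described in the abstract can be sketched as follows; the notation (state dynamics b, running cost f, discount rate λ, control set A, time step h) is not given in this record and is introduced here only as an illustrative assumption, not as the paper's own statement.

\[
\begin{aligned}
&\text{Continuous problem: minimize } J(x,a) = \int_0^{\infty} f\bigl(y(t),a(t)\bigr)\,e^{-\lambda t}\,dt
\quad\text{subject to } y'(t) = b\bigl(y(t),a(t)\bigr),\ y(0)=x,\ a(t)\in A;\\[2pt]
&\text{its value function } u \text{ solves the Hamilton-Jacobi-Bellman equation }
\lambda u(x) + \sup_{a\in A}\bigl\{-b(x,a)\cdot Du(x) - f(x,a)\bigr\} = 0.\\[2pt]
&\text{Discrete approximation with time step } h>0:\quad
u_h(x) = \min_{a\in A}\bigl\{\,h\,f(x,a) + (1-\lambda h)\,u_h\bigl(x + h\,b(x,a)\bigr)\,\bigr\}.
\end{aligned}
\]

Under this reading, $u_h$ is the value function of a discrete time control problem, the uniform convergence claimed in the abstract is $u_h \to u$ as $h \to 0^{+}$, and optimal policies of the discrete problem yield the piecewise constant minimizing controls mentioned above.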


