Article
Hamilton–Jacobi–Bellman Equations and Approximate Dynamic Programming on Time Scales
IEEE Transactions on Systems, Man, and Cybernetics, Part B (2008)
  • John Seiffertt, Providence College
  • S. Sanyal
  • Donald C. Wunsch, Missouri University of Science and Technology
Abstract
The time scales calculus is a key emerging area of mathematics due to its potential use in a wide variety of multidisciplinary applications. We extend this calculus to approximate dynamic programming (ADP). The core backward induction algorithm of dynamic programming is extended from its traditional discrete case to all isolated time scales. Hamilton–Jacobi–Bellman equations, the solution of which is the fundamental problem in the field of dynamic programming, are motivated and proven on time scales. By drawing together the calculus of time scales and the applied area of stochastic control via ADP, we have connected two major fields of research.
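To make the abstract's central idea concrete, the following is a minimal sketch of backward induction carried out over an isolated time scale (a set of isolated time points with non-uniform gaps), rather than the usual uniformly spaced discrete grid. It is an illustration only, not the paper's formulation: the time points, dynamics f, stage cost L, and terminal cost below are hypothetical choices, and the state and control grids are coarse discretizations.

# Backward induction on an isolated time scale (illustrative sketch).
import numpy as np

# Isolated time scale: strictly increasing, non-uniformly spaced points.
T = np.array([0.0, 0.5, 1.5, 2.0, 3.5])     # t_0 < t_1 < ... < t_N
mu = np.diff(T)                             # graininess mu(t_k) = sigma(t_k) - t_k

X = np.linspace(-2.0, 2.0, 41)              # discretized state grid (assumed)
U = np.linspace(-1.0, 1.0, 21)              # discretized control grid (assumed)

def f(t, x, u):                             # delta-dynamics: x^sigma = x + mu * f(t, x, u)
    return -x + u

def L(t, x, u):                             # running (stage) cost
    return x**2 + 0.1 * u**2

def terminal(x):                            # terminal cost at t_N
    return x**2

# V[k, i] approximates the value function at (t_k, X[i]).
V = np.empty((len(T), len(X)))
V[-1] = terminal(X)
for k in range(len(T) - 2, -1, -1):
    for i, x in enumerate(X):
        best = np.inf
        for u in U:
            x_next = x + mu[k] * f(T[k], x, u)        # step along the time scale
            v_next = np.interp(x_next, X, V[k + 1])   # interpolate next-stage value
            best = min(best, mu[k] * L(T[k], x, u) + v_next)
        V[k, i] = best

print("V(t_0, x=0) ~", V[0, np.argmin(np.abs(X))])

Note how the graininess mu weights both the cost and the state update; when every gap equals 1 this reduces to ordinary discrete dynamic programming, which is the sense in which the time-scales formulation generalizes the classical backward induction.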
Publication Date
June 27, 2008
DOI
10.1109/TSMCB.2008.923532
Citation Information
John Seiffertt, S. Sanyal, and Donald C. Wunsch. "Hamilton–Jacobi–Bellman Equations and Approximate Dynamic Programming on Time Scales." IEEE Transactions on Systems, Man, and Cybernetics, Part B, Vol. 38, Iss. 4 (2008), pp. 918–923. ISSN: 1083-4419.
Available at: http://works.bepress.com/john-seiffertt/6/