URN to cite this document: urn:nbn:de:bvb:703-epub-7284-4
Title data
Sperl, Mario; Mysliwitz, Jonas; Grüne, Lars:
Approximation of Separable Control Lyapunov Functions with Neural Networks.
Bayreuth, 2023. - 12 pp.
Download (3MB)
Project information
Project's official title: Curse-of-dimensionality-free nonlinear optimal feedback control with deep neural networks. A compositionality-based approach via Hamilton-Jacobi-Bellman PDEs
Project's id: 463912816
Project financing: Deutsche Forschungsgemeinschaft
Abstract
In this paper, we investigate the ability of deep neural networks to provide curse-of-dimensionality-free approximations of control Lyapunov functions. To achieve this, we first prove an error bound for the approximation of separable functions with neural networks. Subsequently, we discuss conditions on the existence of separable control Lyapunov functions, drawing upon tools from nonlinear control theory. This enables us to bridge the gap between neural networks and the approximation of control Lyapunov functions as we identify conditions that allow neural networks to effectively mitigate the curse of dimensionality when approximating control Lyapunov functions. Moreover, we present a suitable network architecture and a corresponding training algorithm. The training process is illustrated using two 10-dimensional control systems.
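To illustrate the separable ansatz the abstract describes, the sketch below builds a candidate function V(x) as a sum of independent low-dimensional subnetworks, one per state block. This is only a minimal illustration: the subnetwork size, the block partition, and the trick of squaring the shifted subnetwork outputs to enforce V(0) = 0 and V ≥ 0 are assumptions for this sketch, not the paper's architecture or training algorithm, and the weights are untrained placeholders.

```python
import math
import random

random.seed(0)


def make_subnet(dim, hidden=8):
    """One-hidden-layer tanh subnetwork acting on a low-dimensional state
    block. Weights are random placeholders; in the paper's setting they
    would be obtained by training."""
    W1 = [[random.uniform(-1, 1) for _ in range(dim)] for _ in range(hidden)]
    b1 = [random.uniform(-1, 1) for _ in range(hidden)]
    W2 = [random.uniform(-1, 1) for _ in range(hidden)]

    def net(x):
        h = [math.tanh(sum(w * xi for w, xi in zip(row, x)) + b)
             for row, b in zip(W1, b1)]
        return sum(w * hi for w, hi in zip(W2, h))

    return net


def separable_ansatz(dims):
    """Separable candidate V(x) = sum_j v_j(z_j), where z_j are the state
    blocks of sizes `dims`. Squaring the shifted subnetwork outputs makes
    V(0) = 0 and V >= 0 hold by construction (an assumption of this sketch)."""
    nets = [make_subnet(d) for d in dims]
    offsets = [net([0.0] * d) for net, d in zip(nets, dims)]

    def V(x_blocks):
        return sum((net(z) - o) ** 2
                   for net, z, o in zip(nets, x_blocks, offsets))

    return V


# Example: a 10-dimensional state split into five 2-dimensional blocks,
# mirroring the dimension of the examples mentioned in the abstract.
V = separable_ansatz([2] * 5)
print(V([[0.0, 0.0]] * 5))        # exactly 0.0 at the origin
print(V([[0.5, -0.3]] * 5) >= 0)  # non-negative elsewhere
```

Because each subnetwork only sees a fixed, low-dimensional block, the number of parameters grows linearly in the number of blocks rather than exponentially in the state dimension, which is the structural reason such architectures can sidestep the curse of dimensionality.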
Further data
Available Versions of this Item
- Approximation of Separable Control Lyapunov Functions with Neural Networks. (deposited 06 Nov 2023 07:09)