URN to cite this document: urn:nbn:de:bvb:703epub67618
Title data
Grüne, Lars ; Sperl, Mario:
Examples for separable control Lyapunov functions and their neural network approximation.
Bayreuth, 2022. 6 pp.
This is the latest version of this item.



Project information
Project's official title: Curse-of-dimensionality-free nonlinear optimal feedback control with deep neural networks. A compositionality-based approach via Hamilton-Jacobi-Bellman PDEs
Project's id: GR 1569/23-1

Project financing: 
Deutsche Forschungsgemeinschaft
Abstract
In this paper, we consider nonlinear control systems and discuss the existence of a separable control Lyapunov function. To this end, we assume that the system can be decomposed into subsystems and formulate conditions such that a weighted sum of Lyapunov functions of the subsystems yields a control Lyapunov function of the overall system. Since deep neural networks are capable of approximating separable functions without suffering from the curse of dimensionality, we can thus identify systems where an efficient approximation of a control Lyapunov function via a deep neural network is possible. A corresponding network architecture and training algorithm are proposed. Further, numerical examples illustrate the behavior of the algorithm.
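The central construction described in the abstract — a weighted sum of subsystem Lyapunov functions serving as a candidate control Lyapunov function for the overall system — can be sketched numerically. The following is a minimal illustration, not the paper's algorithm: the quadratic subsystem functions, the state partition, and the weights are all hypothetical choices made for the example.

```python
# Illustrative sketch (hypothetical choices, not the paper's method):
# a separable candidate Lyapunov function V(x) = sum_i w_i * V_i(x_i),
# where the overall state x is split into subsystem states x_i.

def V_sub(x_i):
    """Quadratic Lyapunov function V_i(x_i) = x_i^T x_i for one subsystem."""
    return sum(v * v for v in x_i)

def V_separable(x, blocks, weights):
    """Candidate V(x) = sum_i w_i * V_i(x_i); `blocks` lists the index
    ranges (a, b) that partition x into the subsystem states x[a:b]."""
    return sum(w * V_sub(x[a:b]) for (a, b), w in zip(blocks, weights))

# Example: a 4-dimensional state split into two 2-dimensional subsystems.
x = [1.0, -2.0, 0.5, 3.0]
blocks = [(0, 2), (2, 4)]
weights = [1.0, 0.5]
print(V_separable(x, blocks, weights))  # 1*(1+4) + 0.5*(0.25+9) = 9.625
```

Because each summand depends only on a low-dimensional subsystem state, a deep network that mirrors this additive structure (one subnetwork per subsystem) can approximate such a function without the parameter count growing exponentially in the overall state dimension — the property the abstract exploits.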
Further data
Available Versions of this Item

Examples for existence and nonexistence of separable control Lyapunov functions. (deposited 27 Sep 2022 07:08)
 Examples for separable control Lyapunov functions and their neural network approximation. (deposited 16 Nov 2022 11:09) [Currently Displayed]