An Introduction to the Mathematical Theory of Nonlinear Control Systems
by
Alberto Bressan
S.I.S.S.A., Via Beirut 4, Trieste 34014 Italy
and
Department of Mathematical Sciences, NTNU, N-7491 Trondheim, Norway
bressan@sissa.it
Contents:
1. Definitions and examples of nonlinear control systems
2. Relations with differential inclusions
3. Properties of the set of trajectories
4. Optimal control problems
5. Existence of optimal controls
6. Necessary conditions for optimality: the Pontryagin Maximum Principle
7. Viscosity solutions of Hamilton-Jacobi equations
8. Bellman’s Dynamic Programming Principle and sufficient conditions for optimality