
Optimal Control Theory: Introduction to the Special Issue

Games

Abstract

Optimal control theory is a modern extension of the classical calculus of variations [...]

Key takeaways

  • Optimal control theory is a modern extension of the classical calculus of variations.
  • An optimal control problem involves both computing the optimal control and synthesizing the optimal control system.
  • The issue also presents investigations of new classes of optimization problems, optimal control of nonlinear systems, and the reconstruction of input signals.
  • This issue collects papers that apply analytical methods to various problems of optimal control and its evaluation, as well as applications of optimal control and differential games to the description of complex nonlinear phenomena.
  • This approach makes it possible to apply efficient optimal control methods to the analysis of the resulting optimal control problem.