Optimal Control: Calculus of Variations, Optimization (mathematics), Control Theory, Continuous Signal, Discrete Time, Dynamic Programming, Bellman Equation, Trajectory Optimization - Softcover

ISBN: 9786130306885

Synopsis

Please note that the content of this book primarily consists of articles available from Wikipedia or other free sources online. Optimal control deals with the problem of finding a control law for a given system such that a certain optimality criterion is achieved. A control problem includes a cost functional that is a function of the state and control variables. An optimal control law, together with the corresponding optimal state trajectory, satisfies a set of differential equations that minimize the cost functional. The optimal control can be derived using Pontryagin's maximum principle (a necessary condition), or by solving the Hamilton-Jacobi-Bellman equation (a sufficient condition).
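As a concrete illustration of the dynamic-programming (Bellman) approach the synopsis mentions, the sketch below solves a scalar discrete-time linear-quadratic regulator by backward Riccati recursion. The system coefficients, weights, and horizon are illustrative assumptions, not taken from the book.

```python
# Scalar discrete-time LQR: a minimal sketch of dynamic programming.
# Backward pass computes the quadratic value function V_k(x) = P_k * x^2
# and the optimal feedback gains; forward pass applies u_k = -K_k * x_k.
# All numbers here are illustrative assumptions.

def lqr_gains(a, b, q, r, q_f, horizon):
    """Backward Riccati recursion for x_{k+1} = a x_k + b u_k with
    stage cost q x^2 + r u^2 and terminal cost q_f x^2."""
    p = q_f
    gains = []
    for _ in range(horizon):
        k = (b * p * a) / (r + b * b * p)                  # optimal gain
        p = q + a * a * p - (a * b * p) ** 2 / (r + b * b * p)
        gains.append(k)
    gains.reverse()  # gains[k] now applies at time step k
    return gains

def simulate(a, b, gains, x0):
    """Forward pass: roll the closed loop x+ = (a - b K) x from x0."""
    x = x0
    traj = [x]
    for k in gains:
        x = a * x + b * (-k * x)
        traj.append(x)
    return traj

if __name__ == "__main__":
    gains = lqr_gains(a=1.1, b=1.0, q=1.0, r=1.0, q_f=1.0, horizon=20)
    traj = simulate(1.1, 1.0, gains, x0=1.0)
    print(traj[-1])  # open loop is unstable (a > 1); feedback drives x to ~0
```

The backward pass is a direct instance of the Bellman equation: at each step the value function of the remaining problem determines the optimal action now.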

"Sinopsis" puede pertenecer a otra edición de este libro.
