What is the Hamilton-Jacobi-Bellman equation?
In optimal control theory, the Hamilton-Jacobi-Bellman (HJB) equation gives a necessary and sufficient condition for optimality of a control with respect to a given cost function. It is, in general, a nonlinear partial differential equation whose unknown is the value function; solving it yields the value function, from which the optimal control can be recovered.
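As a sketch of the general form, assume dynamics \(\dot{x} = f(x,u)\), a running cost \(\ell(x,u)\), a terminal cost \(\phi(x)\), and value function \(V(x,t)\) (this notation is introduced here for illustration, not taken from the text). The finite-horizon HJB equation then reads:

```latex
% Finite-horizon HJB equation (notation assumed for illustration):
% dynamics \dot{x} = f(x,u), running cost \ell(x,u), terminal cost \phi(x).
-\frac{\partial V}{\partial t}(x,t)
  = \min_{u}\left\{ \ell(x,u) + \nabla_x V(x,t)^{\top} f(x,u) \right\},
\qquad V(x,T) = \phi(x).
```

The minimizing \(u\) at each state and time gives the optimal control in feedback form.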
What is the Riccati equation in control systems?
The algebraic Riccati equation determines the solution of the infinite-horizon time-invariant Linear-Quadratic Regulator (LQR) problem, as well as that of the infinite-horizon time-invariant Linear-Quadratic-Gaussian (LQG) control problem. These are two of the most fundamental problems in control theory.
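As a minimal sketch of the LQR case (the matrices A, B, Q, R below are illustrative values chosen here, not from the text), SciPy's solve_continuous_are solves the continuous-time algebraic Riccati equation, and the optimal state-feedback gain follows as K = R^{-1} B^T P:

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Illustrative double-integrator system: x_dot = A x + B u  (values assumed)
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[0.0],
              [1.0]])

# Quadratic cost weights for the integral of x'Qx + u'Ru  (values assumed)
Q = np.diag([1.0, 1.0])
R = np.array([[1.0]])

# Solve the continuous-time algebraic Riccati equation:
#   A'P + P A - P B R^{-1} B' P + Q = 0
P = solve_continuous_are(A, B, Q, R)

# Infinite-horizon LQR feedback gain: u = -K x
K = np.linalg.solve(R, B.T @ P)
print("Riccati solution P:\n", P)
print("LQR gain K:\n", K)
```

The same Riccati solution P also gives the optimal cost x0' P x0 from an initial state x0, which is why solving the algebraic Riccati equation is effectively equivalent to solving the infinite-horizon LQR problem.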