Least squares
Least squares, known as ordinary least squares (OLS) in the linear-regression setting, is a method that estimates the unknown parameters of a statistical model by minimizing the sum of the squared residuals (the differences between the observed and predicted values). The method was first published by Adrien-Marie Legendre in 1805, although Carl Friedrich Gauss claimed to have used it earlier. Under the assumptions of the Gauss–Markov theorem (errors with zero mean, constant variance, and no correlation), the least-squares estimator is optimal in the sense that it has the smallest variance among all linear unbiased estimators.
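In one common notation (chosen here for illustration, not fixed by the text above), with observations $(\mathbf{x}_i, y_i)$ for $i = 1, \dots, n$ and parameter vector $\boldsymbol\beta$, the criterion is

$$
\hat{\boldsymbol\beta} \;=\; \arg\min_{\boldsymbol\beta} \sum_{i=1}^{n} \bigl( y_i - \mathbf{x}_i^\top \boldsymbol\beta \bigr)^2 .
$$

When the design matrix $X$ (whose rows are $\mathbf{x}_i^\top$) has full column rank, the minimizer has the closed form given by the normal equations:

$$
X^\top X \, \hat{\boldsymbol\beta} = X^\top \mathbf{y}
\quad\Longrightarrow\quad
\hat{\boldsymbol\beta} = (X^\top X)^{-1} X^\top \mathbf{y} .
$$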
A related method is the least mean squares (LMS) method. It arises when the measurements are processed one at a time and a gradient-descent step is taken to reduce the squared residual of each single measurement; in effect, it is a stochastic-gradient variant of least squares.
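As a sketch, with weight vector $\mathbf{w}_n$, input $\mathbf{x}_n$, desired output $d_n$, and step size $\mu$ (all notational choices for this example, not taken from the text), the LMS update on each new measurement is

$$
e_n = d_n - \mathbf{w}_n^\top \mathbf{x}_n ,
\qquad
\mathbf{w}_{n+1} = \mathbf{w}_n + \mu \, e_n \, \mathbf{x}_n ,
$$

which is a gradient-descent step on the instantaneous squared residual $e_n^2$ (its gradient with respect to $\mathbf{w}$ is $-2 e_n \mathbf{x}_n$; the factor of 2 is absorbed into $\mu$).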
Many other types of optimization problems can be expressed in least-squares form, for example by minimizing an energy or maximizing an entropy. The least-squares method is therefore a basic tool for estimating model parameters from measured data.
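As a minimal illustration of estimating model parameters from data, the following sketch fits a straight line with NumPy's least-squares solver; the synthetic data and the particular model are assumptions made for this example, not part of the article.

```python
import numpy as np

# Synthetic measurements from y = 2.0 * x + 1.0 plus noise (illustrative values only).
rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 50)
y = 2.0 * x + 1.0 + rng.normal(scale=0.5, size=x.size)

# Design matrix for the linear model y ~ b0 + b1 * x:
# a column of ones (intercept) next to the x values.
X = np.column_stack([np.ones_like(x), x])

# Solve the least-squares problem: minimize ||X @ beta - y||^2.
beta, residuals, rank, _ = np.linalg.lstsq(X, y, rcond=None)
print("intercept, slope:", beta)  # should be close to (1.0, 2.0)
```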