'''Least squares''', also known as '''ordinary least squares''' analysis, is a method for [[linear regression]] that determines the values of unknown quantities in a statistical model by minimizing the sum of the squared [[Residual (mathematics)|residuals]] (the differences between the predicted and the observed values). The method is traditionally credited to [[Carl Friedrich Gauss]], although [[Adrien-Marie Legendre]] was the first to publish it. By the [[Gauss-Markov theorem]], the least-squares estimator is optimal in the sense that, under the theorem's assumptions on the measurement errors, it has the smallest variance among all linear unbiased estimators.

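In the standard matrix formulation (the notation here is introduced for illustration and is not part of the original article), the observations are collected in a vector <math>y</math> and the predictor values in a design matrix <math>X</math>, and one seeks the parameter vector <math>\beta</math> minimizing the sum of squared residuals

:<math>S(\beta) = \sum_{i=1}^{n} \left( y_i - x_i^\top \beta \right)^2 = \lVert y - X\beta \rVert^2.</math>

Setting the gradient of <math>S</math> to zero gives the ''normal equations'' <math>X^\top X \hat{\beta} = X^\top y</math>; when <math>X^\top X</math> is invertible, the estimate has the closed form <math>\hat{\beta} = (X^\top X)^{-1} X^\top y</math>.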

A related method is the '''[[least mean squares]]''' (LMS) method. It arises when the squared residual is minimized by [[gradient descent]] using a single measured data point at a time.

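As a sketch of one LMS step (the symbols below are introduced for illustration), a single measurement consists of an input <math>x_k</math> and a desired output <math>d_k</math>, giving the residual <math>e_k = d_k - w_k^\top x_k</math>. One gradient-descent step on the squared residual <math>e_k^2</math>, with the conventional factor of 2 absorbed into the step size <math>\mu</math>, updates the weight vector as

:<math>w_{k+1} = w_k + \mu\, e_k\, x_k.</math>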

Many other types of optimization problems can be expressed in least-squares form by either minimizing [[energy]] or maximizing [[Information entropy|entropy]]. The least squares method is particularly important in the estimation of model parameters from measured data.

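As a minimal illustration of estimating model parameters from measured data (the data values and variable names below are hypothetical), the following Python sketch fits a straight line to noisy measurements with NumPy's least-squares solver:

<syntaxhighlight lang="python">
import numpy as np

# Hypothetical measured data: inputs t and noisy observations y.
t = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])

# Model y ≈ a*t + b: design matrix with a column for the slope a
# and a column of ones for the intercept b.
X = np.column_stack([t, np.ones_like(t)])

# Solve min ||X @ beta - y||^2 for beta = (a, b).
beta, residuals, rank, sv = np.linalg.lstsq(X, y, rcond=None)
a, b = beta
print(f"fitted line: y = {a:.3f} * t + {b:.3f}")
</syntaxhighlight>

Here <code>np.linalg.lstsq</code> solves the minimization numerically (via a singular value decomposition), so it remains usable even when <math>X^\top X</math> is ill-conditioned.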