Least squares

Least squares, also known as ordinary least squares analysis, is a method for linear regression that determines the values of unknown quantities in a statistical model by minimizing the sum of the squared residuals (the differences between the predicted and observed values). This method was first described by Carl Friedrich Gauss. It can be shown that the least-squares approach to regression analysis is optimal in the sense that it satisfies the Gauss-Markov theorem.

A related method is the least mean squares (LMS) method, which arises when the data are processed one measured point at a time and gradient descent is used to minimize the squared residual of each point.
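
As a rough sketch (not part of the original article): assume a model that is linear in the parameters, y ≈ a·x, and a fixed step size; one LMS update then moves the parameter vector against the gradient of the single squared residual. The function name, step size, and toy data below are all illustrative choices.

<syntaxhighlight lang="python">
import numpy as np

def lms_update(a, x, y, step=0.01):
    """One LMS step: gradient descent on the single squared residual (y - a.x)^2."""
    residual = y - a @ x                 # error of the current prediction
    return a + 2 * step * residual * x   # step opposite the gradient of residual**2

# Toy stream of data generated from true parameters (2, -1)
rng = np.random.default_rng(0)
true_a = np.array([2.0, -1.0])
a = np.zeros(2)
for _ in range(2000):
    x = rng.normal(size=2)
    a = lms_update(a, x, x @ true_a)
print(a)  # converges toward [2, -1]
</syntaxhighlight>

Because each update sees only a single observation, LMS can be run on streaming data, in contrast with the batch minimization of S(a) described below.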

Many other types of optimization problems can be expressed in least squares form, for example by minimizing energy or maximizing entropy. The least squares method is particularly important in the estimation of model parameters from measured data.

Problem statement

Consider the problem of adjusting a model function to best fit a data set. The chosen model function has adjustable parameters. The data set consists of the n points

:<math>(\bold{x}_i, y_i), \qquad i = 1, \dots, n .</math>

The model function has the form

:<math>y=f(\bold{x};\bold{a}) ,</math>

where y is the dependent variable, x is the vector of independent variables, and a is the vector of adjustable parameters of the model. We wish to find the values of these parameters for which the model best fits the data according to a defined error criterion. The least squares method minimizes the sum of squared errors,

:<math> S(\bold{a}) = \sum_{i=1}^n (y_i - f(\bold{x}_i;\bold{a}))^2 ,</math>

with respect to the adjustable parameters a.
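
As an illustration (the straight-line model, the data values, and the use of NumPy's solver are assumptions, not part of the original article), here is a minimal sketch of this minimization for a model that is linear in the parameters, f(x; a) = a0 + a1*x:

<syntaxhighlight lang="python">
import numpy as np

# Made-up data set of n = 4 points (x_i, y_i)
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.1, 2.9, 5.2, 6.8])

# Straight-line model f(x; a) = a0 + a1*x, written as X @ a
# with design matrix X whose rows are (1, x_i)
X = np.column_stack([np.ones_like(x), x])

# lstsq returns the a that minimizes S(a) = ||y - X @ a||^2
a, residual_ss, rank, sv = np.linalg.lstsq(X, y, rcond=None)
print(a)            # fitted parameters (a0, a1)
print(residual_ss)  # value of S(a) at the minimum
</syntaxhighlight>

np.linalg.lstsq solves the problem via a singular value decomposition, which is numerically more stable than explicitly forming the normal equations.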

See also

*Moving least squares
*Regression analysis
*Function approximation