We consider the problem $Ax \approx b$, where $A$ is an $m \times n$ matrix ($m > n$) with rank $n$, $b$ is an $m \times 1$ vector, and $x$ is the $n \times 1$ vector to be determined.
The sign $\approx$ stands for the least squares approximation, i.e. a minimization of the norm of the residual $r = b - Ax$,

$$\|r\|_2 = \left(\sum_{i=1}^{m} r_i^2\right)^{1/2} = \min,$$

or the square

$$\|r\|_2^2 = \|b - Ax\|_2^2 = (b - Ax)^T(b - Ax),$$

i.e. a differentiable function of $x = (x_1, \dots, x_n)^T$. The necessary condition for a minimum is:

$$\frac{\partial}{\partial x_j}\|r\|_2^2 = 0 \qquad (j = 1, \dots, n).$$

These equations are called the normal equations, which become in our case:

$$A^T A\, x = A^T b.$$
The solution is usually computed with the following algorithm: first (the lower triangular portion of) the symmetric matrix $M = A^T A$ is computed, then its Cholesky decomposition $M = LL^T$. Thereafter one solves $Ly = A^T b$ for $y$, and finally $x$ is computed from $L^T x = y$.
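The four steps above can be sketched in NumPy as follows; the matrix $A$ and vector $b$ here are hypothetical example data (a small overdetermined line-fitting problem), not part of the original entry:

```python
import numpy as np

# Hypothetical example data: overdetermined system with m = 4, n = 2
# (fitting a line y = x0 + x1*t to four points).
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0],
              [1.0, 4.0]])
b = np.array([6.0, 5.0, 7.0, 10.0])

# Step 1: form the symmetric matrix M = A^T A and the right-hand side A^T b.
M = A.T @ A
c = A.T @ b

# Step 2: Cholesky decomposition M = L L^T (L lower triangular).
L = np.linalg.cholesky(M)

# Step 3: solve L y = A^T b for y (forward substitution).
y = np.linalg.solve(L, c)

# Step 4: solve L^T x = y for x (back substitution).
x = np.linalg.solve(L.T, y)

print(x)  # the least squares solution
```

For this data the result agrees with `np.linalg.lstsq(A, b, rcond=None)`, which solves the same minimization without forming $A^T A$.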
Unfortunately $A^T A$ is often ill-conditioned and strongly influenced by roundoff errors (see [Golub89]). Other methods, which do not compute $A^T A$ but work with $A$ directly, are based on the QR decomposition and the singular value decomposition.
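A minimal sketch of why forming $A^T A$ is risky and how the alternatives avoid it; the nearly collinear matrix below is an invented illustration (forming $A^T A$ squares the condition number, while QR and SVD operate on $A$ itself):

```python
import numpy as np

# Hypothetical ill-conditioned example: the two columns of A are
# nearly collinear, so A^T A is much worse conditioned than A.
A = np.array([[1.0, 1.0],
              [1.0, 1.0001],
              [1.0, 1.0002]])
b = np.array([2.0, 2.0001, 2.0002])  # exactly A @ [1, 1]

print(np.linalg.cond(A))        # condition number of A
print(np.linalg.cond(A.T @ A))  # roughly cond(A)**2 -- much larger

# QR decomposition: A = Q R with Q orthonormal, R upper triangular;
# the normal equations reduce to R x = Q^T b, and A^T A is never formed.
Q, R = np.linalg.qr(A)
x_qr = np.linalg.solve(R, Q.T @ b)

# SVD-based solution (this is what np.linalg.lstsq uses internally).
x_svd = np.linalg.lstsq(A, b, rcond=None)[0]

print(x_qr, x_svd)
```

Both routes recover the exact solution $x = (1, 1)^T$ here; with the normal-equations route, accuracy degrades once $\operatorname{cond}(A)^2$ approaches the reciprocal of the machine precision.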
Originally from The Data Analysis Briefbook (http://rkb.home.cern.ch/rkb/titleA.html)
Date of creation: 2013-03-22 12:04:28
Last modified on: 2013-03-22 12:04:28
Last modified by: akrowne