- In this chapter we present the most important methods for solving
fitting problems by the method of least squares.
However, since this site is meant as a help for beginners in numerical analysis,
no solvers for large sparse problems are included here.
You may play with our prepared examples or
provide your own data and models.
For linear least squares problems we present:
- The QR decomposition
and the normal equations method, applied to the same problem;
a small sketch contrasting the two follows below.
The QR decomposition is also used in many other fields,
e.g. in optimization, in order to compute orthonormal bases of the
null space and the range space.
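A minimal sketch (not the codes used on this site) of both approaches in
Python/NumPy, applied to the same problem ||Ax - b|| = min; note that the
normal equations square the condition number of A, while QR works on A directly:

    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((20, 3))        # 20 observations, 3 parameters
    b = rng.standard_normal(20)

    # Normal equations: solve (A^T A) x = A^T b
    x_ne = np.linalg.solve(A.T @ A, A.T @ b)

    # QR decomposition: A = Q R, then solve R x = Q^T b
    Q, R = np.linalg.qr(A)
    x_qr = np.linalg.solve(R, Q.T @ b)

    print(np.allclose(x_ne, x_qr))          # both agree on this well-conditioned problem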
- The
singular value decomposition (SVD), the ultimate tool for this
task, which also has other applications, e.g. the definition of a
''numerical rank'' and regularization problems.
It also yields the Moore-Penrose generalized
inverse.
As a simple application you might try orthogonal distance fitting of
a plane in 3-space, as sketched below.
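A minimal sketch, with synthetic data, of both uses of the SVD: the
least squares solution via the generalized inverse, and the orthogonal
distance fit of a plane (after centering, the unit normal of the best
plane is the right singular vector belonging to the smallest singular value):

    import numpy as np

    rng = np.random.default_rng(1)
    A = rng.standard_normal((20, 3))
    b = rng.standard_normal(20)

    # Least squares solution x = A^+ b from the SVD A = U S V^T
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    x = Vt.T @ ((U.T @ b) / s)              # equals np.linalg.pinv(A) @ b

    # Orthogonal distance fit of a plane to 3-D points P
    P = rng.standard_normal((100, 3))
    c = P.mean(axis=0)                      # the plane passes through the centroid
    n = np.linalg.svd(P - c)[2][-1]         # unit normal of the best-fit plane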
- The principle of least squares has applications far beyond curve fitting
in the plane; here we show the fitting of an explicitly defined surface in 3-space.
The prepared example is a polynomial surface, but you may equally well
try your own model with your own data (see the sketch below).
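A minimal sketch of such a fit, assuming a quadratic model
z = c0 + c1*x + c2*y + c3*x^2 + c4*x*y + c5*y^2 and synthetic data (the
prepared example is similar in spirit); the surface is linear in its
coefficients, so a single linear least squares solve suffices:

    import numpy as np

    rng = np.random.default_rng(2)
    x, y = rng.uniform(-1, 1, (2, 200))
    z = 1.0 + 0.5*x - 2.0*y + 0.3*x*x + x*y + 0.1*rng.standard_normal(200)

    # Design matrix: one column per basis function
    A = np.column_stack([np.ones_like(x), x, y, x*x, x*y, y*y])
    coef, *_ = np.linalg.lstsq(A, z, rcond=None)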
- The use of orthonormal bases is also helpful in least squares: if
A = U R with U orthonormal and R upper triangular,
then the problem ||Ax - b|| = min over x is solved by R x = U^T b.
We present here three well known methods for computing this decomposition:
the classical Gram-Schmidt method, the modified Gram-Schmidt method, and
once more Householder's reduction to triangular form.
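As an illustration, a minimal sketch of the modified Gram-Schmidt variant
(the stabler of the two Gram-Schmidt forms), which builds U and R column
by column and subtracts each projection immediately:

    import numpy as np

    def mgs(A):
        n = A.shape[1]
        U, R = A.astype(float), np.zeros((n, n))
        for k in range(n):
            R[k, k] = np.linalg.norm(U[:, k])
            U[:, k] /= R[k, k]
            for j in range(k + 1, n):       # orthogonalize remaining columns against u_k
                R[k, j] = U[:, k] @ U[:, j]
                U[:, j] -= R[k, j] * U[:, k]
        return U, R

    A = np.random.default_rng(3).standard_normal((6, 4))
    U, R = mgs(A)
    print(np.allclose(U @ R, A), np.allclose(U.T @ U, np.eye(4)))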
- The oldest method for nonlinear least squares problems is the
Gauss-Newton method, which consists of the iterative application of
the linearization principle. We present it here in an advanced version
guaranteeing global convergence under reasonable conditions by means of a
step size control, as sketched below.
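A minimal sketch of such a damped Gauss-Newton iteration for
||r(x)||^2 = min (not the code used here): the direction comes from the
linearized problem, the step size from an Armijo-type test:

    import numpy as np

    def gauss_newton(r, J, x, iters=50, tol=1e-12):
        for _ in range(iters):
            rx, Jx = r(x), J(x)
            d, *_ = np.linalg.lstsq(Jx, -rx, rcond=None)  # linearized LS step
            slope = 2.0 * rx @ (Jx @ d)                   # derivative of ||r||^2 along d
            f0, t = rx @ rx, 1.0
            while t > 1e-12 and np.sum(r(x + t*d)**2) > f0 + 1e-4 * t * slope:
                t *= 0.5                                  # step size control
            x = x + t*d
            if np.linalg.norm(t*d) < tol:
                break
        return x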
- The Levenberg-Marquardt method is the most popular method for
nonlinear least squares, providing an automatic regularization
technique for rank-deficient Jacobians. In contrast to the preceding
code this is a trust region method: it again uses linearization, but adds
a bound on the size of the solution of the linearized problem.
This makes the direction of the correction depend on that size control.
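A minimal sketch of the classical textbook variant (not the specific code
used here): the damping parameter lam simultaneously regularizes the
Jacobian and bounds the correction, and is adapted according to the
actual decrease:

    import numpy as np

    def levenberg_marquardt(r, J, x, lam=1e-3, iters=50):
        f = lambda x: np.sum(r(x)**2)
        for _ in range(iters):
            rx, Jx = r(x), J(x)
            # Regularized normal equations: (J^T J + lam I) d = -J^T r
            d = np.linalg.solve(Jx.T @ Jx + lam * np.eye(Jx.shape[1]), -Jx.T @ rx)
            if f(x + d) < f(x):
                x, lam = x + d, lam / 10.0  # success: trust the linearization more
            else:
                lam *= 10.0                 # failure: shorten the step
        return x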
- Both the Gauss-Newton and the Levenberg-Marquardt method may perform very badly if the
residuals are large. Hence a method which comes closer to a full regularized
Newton method without the need of computing second derivatives of the model function
is desirable. Here we present a specialized quasi-Newton method for this purpose, the
code nl2sol by Dennis, Gay and Welsch (with a modernized FORTRAN coding).
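The idea, in a minimal sketch: the Hessian of f(x) = 0.5*||r(x)||^2 is
J^T J + S with S = sum_i r_i * Hess(r_i), and nl2sol approximates the
second-order part S by a sized symmetric secant update instead of
computing second derivatives. The plain PSB-type update below only
illustrates the principle and is not nl2sol's actual formula:

    import numpy as np

    def update_S(S, s, y):
        # Symmetric rank-two update enforcing the secant condition S_new @ s = y;
        # per step, s = x_new - x_old and y = (J_new - J_old).T @ r_new.
        w = y - S @ s
        ss = s @ s
        return (S + (np.outer(w, s) + np.outer(s, w)) / ss
                  - (w @ s) * np.outer(s, s) / ss**2)

The Newton-like step then solves (J^T J + S) d = -J^T r, which remains
cheap because only first derivatives enter.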
- The standard least squares problem assumes random errors only in the
''dependent'' variable (the right-hand side of the (linearized) problem).
In more general cases, with random errors in all variables, orthogonal
distance regression is the adequate tool. We present it here in the implementation
ODRPACK from netlib, but with restrictions on the model to make it easy to use.
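A minimal sketch using scipy.odr, which wraps the same ODRPACK code from
netlib (the model y = b0*exp(b1*x) and the error levels are assumptions
made for this example):

    import numpy as np
    from scipy.odr import ODR, Model, RealData

    def f(beta, x):
        return beta[0] * np.exp(beta[1] * x)

    rng = np.random.default_rng(4)
    x = np.linspace(0.0, 1.0, 30) + 0.02 * rng.standard_normal(30)
    y = 2.0 * np.exp(0.5 * x) + 0.02 * rng.standard_normal(30)

    data = RealData(x, y, sx=0.02, sy=0.02)   # uncertainties in both variables
    out = ODR(data, Model(f), beta0=[1.0, 1.0]).run()
    print(out.beta)                           # estimates of (b0, b1)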
- As an application, we present orthogonal distance regression
for fitting an ellipse to two-dimensional data, using another solver.
Here you will learn much about the sensitivity these problems can exhibit
when the data cover the ellipse only incompletely.
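A minimal sketch of such an implicit orthogonal distance fit with
scipy.odr (the page itself uses another solver); the model
b0*(x-b2)^2 + 2*b1*(x-b2)*(y-b3) + b4*(y-b3)^2 - 1 = 0 follows the
ellipse example of the ODRPACK user's guide, and the short arc in t
mimics an incomplete data range:

    import numpy as np
    from scipy.odr import ODR, Data, Model

    def ellipse(beta, xy):
        x, y = xy
        return (beta[0]*(x - beta[2])**2 + 2*beta[1]*(x - beta[2])*(y - beta[3])
                + beta[4]*(y - beta[3])**2 - 1.0)

    t = np.linspace(0.3, 2.5, 25)             # only part of the ellipse is covered
    xy = np.vstack([3 + 2*np.cos(t), 1 + np.sin(t)])
    out = ODR(Data(xy, y=1), Model(ellipse, implicit=True),
              beta0=[0.2, 0.0, 2.5, 0.5, 0.8]).run()
    print(out.beta)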
- There is another application of orthogonal distance regression:
fitting a plane ellipse to data in three-dimensional space. This is interesting
for its special choice of model equations and also invites further experiments.
- In applications the problem of ''parameter identification'' occurs quite often.
Here one has data (t_i, y_i) and knows that y
obeys a physical law y' = f(t,y,p) (an ODE with parameter p)
with the structure of f
known, but the value of the parameter unknown. Using least squares for identifying
p is a natural choice. We realize this here using a combination of donlp2
and dassl; a sketch of the idea with generic tools follows below.
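A minimal sketch with generic tools (scipy's integrator and least squares
solver instead of dassl and donlp2; the decay law y' = -p*y and the data
are assumptions made for this example):

    import numpy as np
    from scipy.integrate import solve_ivp
    from scipy.optimize import least_squares

    t_data = np.linspace(0.0, 2.0, 15)
    y_data = 3.0 * np.exp(-1.7 * t_data)      # synthetic data, true p = 1.7

    def residuals(p):
        # Solve the ODE for the trial parameter and compare with the data
        sol = solve_ivp(lambda t, y: -p[0]*y, (0.0, 2.0), [3.0],
                        t_eval=t_data, rtol=1e-8, atol=1e-10)
        return sol.y[0] - y_data

    fit = least_squares(residuals, x0=[1.0])
    print(fit.x)                              # approximately 1.7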