Application: orthogonal distance regression: fitting an ellipse

- Reconstructing curves or surfaces in the plane or in space from given
discrete measurements is a frequently needed task. Normally the data cannot be
represented exactly by the given model, so one must agree on a
method of measuring the error of the reconstruction. There are many ways
to do this, but in a purely geometric setting the most natural one is
orthogonal distance regression: the sum of squares of the orthogonal
distances from the given points to the geometric object is minimized.
- This leads to a nonlinear least squares problem of high dimension,
since the unknowns comprise not only the parameters which describe the
object but also the projection points on it.
- Here we present a prepared numerical experiment, which also shows the limitations
of nonlinear least squares solvers.
- You begin by specifying the data of an ellipse in the plane: the
center point (z1,z2), the lengths a, b of the semi-axes and the rotation angle α.
The points (xe,ye) of the ellipse are computed from
xe = z1 + cos(α)*a*cos(φ) + sin(α)*b*sin(φ)
ye = z2 - sin(α)*a*cos(φ) + cos(α)*b*sin(φ)
(a rotation of an axis-parallel ellipse with center (0,0) by the angle
α, followed by a shift (z1,z2); φ is the curve parameter ranging from 0
to 2*π).
Then you specify a number N of points and a range for φ.
N values of φ are drawn randomly from this range and the points (xe,ye)
are computed. The smaller this range is chosen, the harder the problem becomes.
You also specify a level δ of perturbation for the data.
It is used to perturb the coordinates with random numbers as
x_i = xe_i + δ*rx_i,
y_i = ye_i + δ*ry_i,
where rx_i, ry_i are drawn randomly from ]-1,1[.
Clearly, a large δ combined with a small range for the angle
makes the problem ill-defined.
Afterwards, an ellipse is fitted back to the data. This is done using the model
β3*(x-β1)**2 + β4*(x-β1)*(y-β2) + β5*(y-β2)**2 - 1 = 0. (**)
- The definiteness of the quadratic form in (**), which an actual ellipse requires, is not enforced in the minimization.
- The orthogonal distance regression is realized by minimizing
∑_{i=1}^{N} ( dx_i^2 + dy_i^2 )
subject to the N nonlinear equality constraints that the points
(x_i+dx_i, y_i+dy_i) satisfy the model equation (**) exactly.
The total number of unknowns hence is 2N+5 and the problem is highly nonlinear.
- The minimization routine used is based on the Gauss-Newton method; the code is
nlscon from the eLib of ZIB Berlin.
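The data generation described above can be sketched in a few lines (a minimal illustration; the function names and the use of a uniform random generator are our assumptions, not part of the experiment's code):

```python
import math
import random

def ellipse_point(z1, z2, a, b, alpha, phi):
    """Point on the rotated, shifted ellipse, exactly as in the formulas above."""
    xe = z1 + math.cos(alpha)*a*math.cos(phi) + math.sin(alpha)*b*math.sin(phi)
    ye = z2 - math.sin(alpha)*a*math.cos(phi) + math.cos(alpha)*b*math.sin(phi)
    return xe, ye

def make_data(z1, z2, a, b, alpha, phi_lo, phi_hi, N, delta, rng=random):
    """Draw N parameter values from [phi_lo, phi_hi] and perturb each
    coordinate with uniform noise delta*r, r from ]-1,1[."""
    pts = []
    for _ in range(N):
        phi = rng.uniform(phi_lo, phi_hi)
        xe, ye = ellipse_point(z1, z2, a, b, alpha, phi)
        pts.append((xe + delta*rng.uniform(-1.0, 1.0),
                    ye + delta*rng.uniform(-1.0, 1.0)))
    return pts
```

Choosing a small interval [phi_lo, phi_hi] reproduces the "small range for φ" situation that makes the fit hard.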
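The experiment itself solves the equality-constrained problem with nlscon. Purely to illustrate the structure (2N+5 unknowns, N constraints), here is a rough sketch that replaces the exact constraints by a quadratic penalty and applies a damped Gauss-Newton iteration with a finite-difference Jacobian; this is not the nlscon algorithm, and all names, the penalty weight mu and the iteration limits are our assumptions:

```python
import numpy as np

def model(beta, x, y):
    """Implicit ellipse model (**): zero on the fitted curve."""
    b1, b2, b3, b4, b5 = beta
    return b3*(x-b1)**2 + b4*(x-b1)*(y-b2) + b5*(y-b2)**2 - 1.0

def odr_fit(xs, ys, beta0, mu=1e4, iters=30):
    """Minimize sum(dx_i^2 + dy_i^2) with the constraints
    model(beta, x_i+dx_i, y_i+dy_i) = 0 softened into a penalty term.
    Unknowns: u = (beta1..beta5, dx_1..dx_N, dy_1..dy_N), i.e. 2N+5 values."""
    xs, ys = np.asarray(xs, float), np.asarray(ys, float)
    N = xs.size
    u = np.concatenate([np.asarray(beta0, float), np.zeros(2*N)])

    def residual(u):
        beta, dx, dy = u[:5], u[5:5+N], u[5+N:]
        return np.concatenate([dx, dy, np.sqrt(mu)*model(beta, xs+dx, ys+dy)])

    def jacobian(u, h=1e-7):
        # forward differences; cheap enough at this problem size
        r0 = residual(u)
        J = np.empty((r0.size, u.size))
        for j in range(u.size):
            up = u.copy(); up[j] += h
            J[:, j] = (residual(up) - r0) / h
        return J

    for _ in range(iters):
        r = residual(u)
        step, *_ = np.linalg.lstsq(jacobian(u), -r, rcond=None)
        t, f0 = 1.0, float(r @ r)
        while t > 1e-12:  # damping: accept only if the penalized objective drops
            u_try = u + t*step
            r_try = residual(u_try)
            if float(r_try @ r_try) < f0:
                u = u_try
                break
            t /= 2
        else:
            break  # no descent direction found: stop
    return u[:5], u[5:5+N], u[5+N:]
```

With a large mu the constraints hold only approximately at the solution, whereas nlscon treats them as exact equality constraints.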

Input

- The semi-axes a, b of the original ellipse. In order
to keep the problem from becoming too intricate, these are restricted to a >= 0.001·b and b >= 0.001·a !
- The rotation of the ellipse in the plane: α around (0,0) and the shift
of the center point z1, z2.
- The range for the curve parameter φ.
- The number N of data points: 5 <= N <= 50 !
- The level δ by which the original data points are perturbed.
Of course δ <= sqrt(a**2+b**2), but you should use only much smaller
values.
- As for all nonlinear problems, an initial guess for the
parameters must be specified. Either you use the data from which the original
ellipse was generated, or you specify the input data
β1,..,β5
for the model (**) yourself. β1 and
β2 must be given via a guess of the center point.
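If you want to derive starting values β1,..,β5 from the geometric data yourself, the conversion can be obtained from the parametrization above (this is our own derivation, shown for illustration; it is not taken from the experiment's code):

```python
import math

def beta_from_geometry(z1, z2, a, b, alpha):
    """Convert center (z1,z2), semi-axes a, b and rotation angle alpha
    into the coefficients beta1..beta5 of the implicit model (**)."""
    c, s = math.cos(alpha), math.sin(alpha)
    beta3 = (c/a)**2 + (s/b)**2
    beta4 = 2.0*c*s*(1.0/b**2 - 1.0/a**2)
    beta5 = (s/a)**2 + (c/b)**2
    return z1, z2, beta3, beta4, beta5
```

For an axis-parallel ellipse (α = 0) this reduces to β3 = 1/a², β4 = 0, β5 = 1/b², as expected.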

Output

- An evaluation statistic for the minimization process, the termination reason
and the parameters found.
- A plot which shows the computed ellipse, the perturbed data points and their
projections onto the ellipse.

What may be interesting?

- How much can you reduce the range for the angle without causing a breakdown of the process?
- How does the performance depend on the perturbation and on the quality of the initial guess?