Newton's method

- Newton's method is the most widely used method for finding zeroes of general
  smooth functions: f(x*) = 0.
- Here we have a solver for scalar real equations.
- Assuming that f is twice continuously differentiable, that a solution x*
  exists and that f'(x*) ≠ 0, we get that f has an inverse function f^(-1)
  which, at least locally, is twice continuously differentiable too. We assume
  that our current approximation x_k is sufficiently near to the solution in
  order to apply this. Then, by the mean value theorem,

      f(x_k) = f(x*) + f'(ξ_k)(x_k - x*) = O(|x_k - x*|)

  with ξ_k between x_k and x*, and since

      (f^(-1))'(f(x_k)) = 1/f'(x_k),

  Taylor expansion of f^(-1) around f(x_k) gives

      x* - x_k = f^(-1)(f(x*)) - f^(-1)(f(x_k))
               = (f^(-1))'(f(x_k)) (f(x*) - f(x_k))
                 + (1/2) (f^(-1))''(η) (f(x*) - f(x_k))^2
               = -f(x_k)/f'(x_k) + O(f(x_k)^2)

  and hence, setting

      x_{k+1} = x_k - f(x_k)/f'(x_k)

  (the zero of the tangent to the graph of f at x_k), we obtain

      x_{k+1} - x* = O(f(x_k)^2) = O(|x_k - x*|^2).

  Hence from our assumptions there exists a constant C such that

      |x_{k+1} - x*| <= C |x_k - x*|^2.
- Hence the method converges if the error of the initial guess is less than
  1/C, and it then converges with order two. If f is merely once continuously
  differentiable, only superlinear convergence can be shown.
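The iteration above translates directly into a short loop. As a minimal sketch (in Python, purely illustrative and not part of the applet), here it is applied to f(x) = x^2 - 2, whose simple zero sqrt(2) satisfies f'(x*) ≠ 0, so the error is roughly squared in every step:

```python
import math

def newton(f, fprime, x0, tol=1e-12, max_steps=50):
    """Newton's method: repeatedly jump to the zero of the tangent."""
    x = x0
    for _ in range(max_steps):
        step = f(x) / fprime(x)   # the Newton correction f(x_k)/f'(x_k)
        x -= step                  # x_{k+1} = x_k - f(x_k)/f'(x_k)
        if abs(step) < tol:
            break
    return x

# f(x) = x^2 - 2 has the simple zero x* = sqrt(2).
f = lambda x: x * x - 2.0
fp = lambda x: 2.0 * x
root = newton(f, fp, x0=2.0)
print(root)   # close to sqrt(2) = 1.41421356...
```

Printing the errors |x_k - sqrt(2)| along the way (0.5, 8.6e-2, 2.5e-3, 2.1e-6, ...) makes the doubling of correct digits per step visible.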
- For bad choices of the initial guess anything can occur: outright divergence,
  divergence by cycling, chaotic behaviour. However, if f is strictly monotonic
  and convex to the right of x*, or strictly monotonic and concave to the left
  of x*, and the initial guess lies on that same side, then convergence occurs
  regardless of the size of the initial error. The convergence can, however, be
  quite slow initially.
  Example: f(x) = exp(x) - x - w with w >> 0 and initial guess x_0 >> 0.
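For this example the convexity guarantees convergence from any guess to the right of the zero, but far out the correction (exp(x) - x - w)/(exp(x) - 1) is close to 1, so the iterates creep down by roughly 1 per step before the fast final phase. A small Python sketch (illustrative values chosen here, not from the applet):

```python
import math

w = 1.0e6                                  # large shift, as in the example
f = lambda x: math.exp(x) - x - w
fp = lambda x: math.exp(x) - 1.0

x = 40.0                                   # guess far to the right of the zero
steps = 0
while abs(f(x)) > 1e-6 * w and steps < 99:
    x -= f(x) / fp(x)                      # Newton step
    steps += 1

# The zero is near log(w) ~ 13.8; getting there from 40 takes
# roughly one unit per step, followed by a few quadratic steps.
print(x, steps)
```

Despite the guaranteed convergence, a count of the steps shows how expensive the slow initial phase is compared with the quadratic endgame.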
- If the zero has higher multiplicity, a more detailed analysis shows that the
  local convergence is merely linear. In addition, numerical problems may occur
  near the zero, because in the limit the correction takes the form 0/0.
  However, f tends to zero faster than f', so the method often works in this
  situation too.
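The linear behaviour is easy to observe. For the double zero of f(x) = x^2 at 0, the Newton correction exactly halves the iterate, so the error shrinks only by a fixed factor per step (a Python sketch, not part of the applet):

```python
# f(x) = x^2 has a double zero at 0; the Newton step is
# x_{k+1} = x_k - x_k^2 / (2 x_k) = x_k / 2,
# i.e. linear convergence with rate 1/2 instead of order two.
x = 1.0
errors = []
for _ in range(10):
    x = x - (x * x) / (2.0 * x)   # the 0/0 danger shows up as x -> 0
    errors.append(x)
print(errors[:3])   # [0.5, 0.25, 0.125]
```

Note that the correction x^2/(2x) is a small number divided by a small number near the zero, which is exactly the 0/0 situation mentioned above; here it happens to be harmless because f vanishes faster than f'.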
- If f(x*) = 0, f'(x*) ≠ 0 and f''(x*) = 0, then the order of convergence is
  even cubic (for example for arctan at x = 0).
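The cubic order for arctan can be checked directly: for small x_k one has x_{k+1} = x_k - arctan(x_k)(1 + x_k^2) ≈ -(2/3) x_k^3, so the error is roughly cubed in every step (a Python sketch, illustrative only):

```python
import math

# f(x) = arctan(x): f(0) = 0, f'(0) = 1 != 0, f''(0) = 0,
# hence third-order local convergence toward the zero x* = 0.
x = 0.5
for _ in range(3):
    x = x - math.atan(x) * (1.0 + x * x)   # Newton step for arctan
print(abs(x))   # roughly 0.08, 3e-4, 2e-11 over the three steps
```

(For arctan this cubic behaviour is strictly local: a too-large starting value makes the same iteration cycle or diverge, as noted above.)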


Input

- We have prepared four cases for f, and you also have the possibility to
  specify f yourself by a piece of code. In this case your input must obey the
  rules of FORTRAN. There is no need to provide the derivative, since we
  compute it by high-precision numerical differentiation.
- The interval [a,b] in which the zero should lie. We use this in order to
  detect possible divergence of the method.
- The initial guess x0 in [a,b].
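The applet's internal differentiation routine is not shown here; one common approach, sketched below in Python purely as an illustration (the applet's actual method may differ), is a central difference with a step size that balances truncation against rounding error:

```python
import math

def num_derivative(f, x):
    """Central difference f'(x) ~ (f(x+h) - f(x-h)) / (2h).

    Truncation error is O(h^2), rounding error is O(eps/h), so
    h ~ eps**(1/3) roughly balances the two (eps = machine epsilon).
    """
    eps = 2.2e-16
    h = eps ** (1.0 / 3.0) * max(1.0, abs(x))
    return (f(x + h) - f(x - h)) / (2.0 * h)

# Example: the derivative of exp at x = 1 is e = 2.71828...
print(num_derivative(math.exp, 1.0))
```

With this choice of h the result is typically accurate to about 10 significant digits, which is ample for the Newton correction.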


Output

- A table showing x_k, f(x_k), f'(x_k).
- An animated picture which shows you the course of the iteration, with
  zooming: you will see several iterates (as bars), the graph of the function
  and the tangents. Internally we work with rather strict precision
  requirements: the difference of successive iterates must be less than 1.0e-8
  times the length of the bounding interval, the final length of the zoomed
  interval shown must be less than 1/200 of the length of the bounding
  interval, and the difference of function values in the zoom area must be
  less than 0.00004. These values are motivated by the possibility to zoom
  into the solution and by the limitations of the screen output. Hence you
  will sometimes see little in the last pictures, since the speed of
  convergence does not allow showing several iterates in one picture.
- In order to limit the amount of output, we terminate if 99 steps are not
  sufficient to reach our criteria.
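The termination logic described above can be pictured roughly as follows (a Python sketch using the tolerances stated in the text; the applet's actual code is not shown here):

```python
def newton_with_safeguards(f, fprime, x0, a, b, max_steps=99):
    """Newton iteration with the safeguards described in the text:
    stop when successive iterates differ by less than 1e-8 * (b - a),
    report divergence when an iterate leaves [a, b],
    and give up after 99 steps."""
    x = x0
    for k in range(max_steps):
        x_new = x - f(x) / fprime(x)
        if not (a <= x_new <= b):
            return x_new, k + 1, "diverged: iterate left [a, b]"
        if abs(x_new - x) < 1.0e-8 * (b - a):
            return x_new, k + 1, "converged"
        x = x_new
    return x, max_steps, "no convergence within 99 steps"

root, steps, status = newton_with_safeguards(
    lambda x: x * x - 2.0, lambda x: 2.0 * x, x0=2.0, a=0.0, b=3.0)
print(root, steps, status)
```

Leaving the interval [a,b] is how the applet detects the "anything can occur" cases described earlier, without having to analyse the function itself.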


Questions?!

- Verify the graphical interpretation of the method.
- For which initial guesses and which functions will trouble occur?
- Which kinds of problems might be encountered?
- In which cases does the method work independently of the initial guess?