
The Modified Gauss-Newton Method.

From a first guess of the parameters $a^{(1)}$, a sequence $a^{(2)}, a^{(3)}, \ldots$ is generated and is intended to converge to a local minimum of $\chi^2(a)$. At each iteration, one computes

\begin{displaymath}
a^{(k+1)} = a^{(k)} + \alpha^{(k)} d^{(k)}
\end{displaymath}

where $d^{(k)}$ is a certain descent direction and $\alpha^{(k)}$ is a real coefficient chosen such that $\chi^2(a^{(k)} + \alpha^{(k)} d^{(k)})$ is approximately minimized. The direction $d^{(k)}$ is ideally the solution of the Newton equation

\begin{displaymath}
H(a^{(k)})\, d^{(k)} = -g(a^{(k)})
\end{displaymath}

which can also be rewritten

\begin{displaymath}
\left[ J(a^{(k)})^T J(a^{(k)}) + B(a^{(k)}) \right] d^{(k)} = -J(a^{(k)})^T r(a^{(k)}) \; .
\end{displaymath}
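Here $r(a)$ denotes the vector of residuals and $J(a)$ its Jacobian. The rewrite follows from differentiating $\chi^2(a) = r(a)^T r(a)$ twice (an overall factor of 2 appears on both sides of the Newton equation and cancels):

\begin{displaymath}
g(a) = J(a)^T r(a) , \qquad
H(a) = J(a)^T J(a) + B(a) , \qquad
B(a) = \sum_i r_i(a)\, \nabla^2 r_i(a) \; .
\end{displaymath}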

Neglecting the second-derivative matrix $B(a^{(k)})$, we obtain the ``normal equations'' and the Gauss-Newton direction

\begin{displaymath}
J(a^{(k)})^T J(a^{(k)})\, d^{(k)} = -J(a^{(k)})^T r(a^{(k)}) \; .
\end{displaymath}

This so-called Gauss-Newton method is intended for problems where $\Vert B(a)\Vert$ is small. If the Jacobian $J(a)$ is singular or nearly singular, or if $\Vert r(a)\Vert$ is very large (the so-called large-residuals problem), the Gauss-Newton direction is not a good approximation of the Newton direction and convergence is not guaranteed.
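The following is a minimal sketch of such a damped Gauss-Newton iteration, written in Python/NumPy purely for illustration; it is not the MIDAS implementation. The residual function r, its Jacobian jac, the backtracking rule for $\alpha^{(k)}$, and the test problem are all assumptions of the example.

import numpy as np

def gauss_newton(r, jac, a0, max_iter=50, tol=1e-10):
    a = np.asarray(a0, dtype=float)
    for _ in range(max_iter):
        res = r(a)                      # residual vector r(a)
        J = jac(a)                      # Jacobian matrix J(a)
        # Solving min ||J d + r|| in the least-squares sense is
        # equivalent to the normal equations J^T J d = -J^T r.
        d, *_ = np.linalg.lstsq(J, -res, rcond=None)
        # Backtracking line search: halve alpha until
        # chi^2(a + alpha d) decreases (a crude stand-in for the
        # approximate minimization described above).
        alpha, chi2 = 1.0, res @ res
        while alpha > 1e-8:
            trial = r(a + alpha * d)
            if trial @ trial < chi2:
                break
            alpha *= 0.5
        a = a + alpha * d
        if np.linalg.norm(alpha * d) < tol:
            break
    return a

# Hypothetical test problem: fit y = a1 * exp(a2 * x) to exact data.
x = np.linspace(0.0, 1.0, 20)
y = 2.0 * np.exp(-1.5 * x)
r = lambda a: a[0] * np.exp(a[1] * x) - y
jac = lambda a: np.column_stack([np.exp(a[1] * x),
                                 a[0] * x * np.exp(a[1] * x)])
print(gauss_newton(r, jac, np.array([1.0, -1.0])))   # ~ [2.0, -1.5]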

The algorithm implemented here is a modification of that Gauss-Newton method which allows convergence even for rank-deficient Jacobians or for large residuals. The Gauss-Newton direction is computed in $V_1 = \mathrm{Im}\,[J(a^{(k)})^T J(a^{(k)})]$, the invariant subspace corresponding to the non-null eigenvalues. If the decrease of the objective function at the last iteration is considered too small, a correction based on the second derivatives is taken in $V_2$, the orthogonal complement of $V_1$. The Hessian matrix is estimated using finite differences of the gradient.
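A minimal sketch of the range-space restriction, again in Python/NumPy and again only illustrative: the eigenvalue threshold eps and the criterion for adding the $V_2$ correction are assumptions of the example, not the MIDAS criteria.

import numpy as np

def modified_gn_direction(J, res, eps=1e-8):
    """Gauss-Newton direction restricted to V1 = Im[J^T J]."""
    JTJ = J.T @ J
    w, V = np.linalg.eigh(JTJ)          # eigenvalues w, eigenvectors V
    keep = w > eps * w.max()            # non-null eigenvalues span V1
    # Invert J^T J on V1 only; the component in V2 (near-null
    # eigenvectors) is left at zero here, whereas the actual method
    # would fill it with a second-order correction when needed.
    inv_w = np.where(keep, 1.0 / np.where(keep, w, 1.0), 0.0)
    d = -V @ (inv_w * (V.T @ (J.T @ res)))
    return d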

This method requires the availability of the derivatives, and since it needs roughly $p$ gradient evaluations at each iteration, it is recommended for problems with a small number of parameters, say $p \leq 10$.
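For illustration, a finite-difference Hessian estimate of the kind alluded to above costs about $p$ extra gradient evaluations; the sketch below assumes a user-supplied gradient function grad and a fixed step h, neither of which is specified here.

import numpy as np

def fd_hessian(grad, a, h=1e-6):
    """Estimate the Hessian by forward differences of the gradient
    (p extra gradient evaluations, one per parameter)."""
    p = a.size
    g0 = grad(a)
    H = np.empty((p, p))
    for i in range(p):
        e = np.zeros(p)
        e[i] = h
        H[:, i] = (grad(a + e) - g0) / h
    return 0.5 * (H + H.T)              # symmetrize the estimate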

