From: Alessandro on
Hi,
To make a long story short: I am working on a project where I need to find a matrix defined by a third-degree polynomial. The solution can be found iteratively using a gradient-descent technique, and I am using the golden-section line search already implemented in MATLAB (code below).
The algorithm looks powerful (it automatically finds a good step), but unfortunately the golden-section line search does not avoid getting stuck in local minima. How can I implement this more efficiently? The problem is that sometimes it converges and sometimes it doesn't (sometimes it is not science :-).

%Initial guess for the matrix defined by the third-degree polynomial
Cest = initial_guess;
normdev = inf;
stepsize = inf;

%Stopping condition
stopping_condition = 10^(-5) *norm(X*X'/no_samples,'fro');

while abs(normdev*stepsize) > stopping_condition
%Third degree polynomial
dnew = Cest - 1/no_samples*(X*X' - 2/sigma^2 * (Cest*Cest'*Cest-Cest*B'*Cest));
%Find the best stepsize via golden-section line search (fminbnd)
stepsize = fminbnd(@(stepsize) step(stepsize,Cest,dnew,X*X',B,sigma,no_samples),-.1,.1);

%Update
Cest = Cest + stepsize*dnew;
normdev = norm(dnew,'fro');
end

function error = step(stepsize,Cest,dnew,XX,B,sigma,no_samples)
%Residual norm after a trial step of the given size
%(uses the same update expression as the main loop)
Cest = Cest + stepsize*dnew;
error = norm(Cest - 1/no_samples*(XX - 2/sigma^2 * (Cest*Cest'*Cest - Cest*B'*Cest)),'fro');
end
I tried :
%Quasi-Newton
stepsize = fminunc( @(stepsize) step(stepsize,Cest,dnew,X*X',B,sigma,no_samples),dnew);
But MATLAB gets stuck (runs out of heap memory), probably because this function is supposed to be used when we don't have a trust region.
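A likely cause of the memory blow-up: the second argument of fminunc is the initial guess for the variable being optimized, and here that variable is the scalar stepsize, but the matrix dnew is passed instead, so fminunc tries to optimize over a matrix-sized unknown. A minimal sketch of the corrected call, assuming the same step function as above:

```matlab
% Start the scalar stepsize search from 0 rather than from the matrix dnew
stepsize = fminunc(@(stepsize) step(stepsize,Cest,dnew,X*X',B,sigma,no_samples), 0);
```

Note that fminunc is unconstrained, so unlike fminbnd it will not keep the stepsize inside a fixed interval.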

Any suggestions ?
From: Matt J on
Alessandro <alecrimi(a)diku.dk> wrote in message <1733159734.251693.1275258056984.JavaMail.root(a)gallium.mathforum.org>...

> The algorithm looks powerful (it automatically finds a good step), but unfortunately the golden-section line search does not avoid getting stuck in local minima. How can I implement this more efficiently? The problem is that sometimes it converges and sometimes it doesn't (sometimes it is not science :-).
==============

The only way to avoid local minima is

(1) To have a good initial guess

(2) To have a good guess about the size of the capture basin where your initial guess lies, so that you don't step out of it.

You can then limit your line searches to a fixed maximum step size, i.e., minimize f(x + t*d) over t with t confined to an interval [0, s] at every iteration. If the maximum stepsize s is selected in accordance with (2) above, it should reduce your chances of stepping out of the local capture basin.
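A minimal sketch of this bounded search using your existing step function, where s is a hypothetical estimate of the capture-basin radius (you have to pick it for your problem):

```matlab
% Restrict the golden-section search to [0, s] so no single step can
% leave the capture basin around the current iterate. The value of s
% here is an assumption -- tune it using your guess from point (2).
s = 0.1;
stepsize = fminbnd(@(t) step(t,Cest,dnew,X*X',B,sigma,no_samples), 0, s);
```

This replaces the symmetric interval [-.1, .1] in your loop with [0, s], so the search only moves in the descent direction and never farther than s.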