From: Alessandro on 30 May 2010 18:18

Hi,

This is a long story, so I will keep it short: I am working on a project where I need to find a matrix defined by a third-degree polynomial. The solution can be found iteratively with a gradient descent technique, and I am using the golden section line search already implemented in MATLAB (fminbnd), with the code below. The algorithm looks powerful (it finds the right step size automatically), but unfortunately the golden section line search does not prevent it from getting stuck in local minima. How can I implement this more robustly? The problem is that sometimes it converges and sometimes it does not (sometimes it is not science :-).

% Initial guess for the matrix
Cest = initial_guess;
normdev = inf;
stepsize = inf;

% Stopping condition
stopping_condition = 10^(-5) * norm(X*X'/no_samples, 'fro');

while abs(normdev*stepsize) > stopping_condition
    % Descent direction from the third-degree polynomial
    dnew = Cest - 1/no_samples*(X*X' - 2/sigma^2 * (Cest*Cest'*Cest - Cest*B'*Cest));
    % Find the best step size with the golden section line search
    stepsize = fminbnd(@(stepsize) step(stepsize,Cest,dnew,X*X',B,sigma,no_samples), -.1, .1);
    % Update
    Cest = Cest + stepsize*dnew;
    normdev = norm(dnew, 'fro');
end

function error = step(stepsize, Cest, dnew, XX, B, sigma, no_samples)
% Residual norm after taking a trial step of the given size
Cest = Cest + stepsize*dnew;
error = norm(Cest - 1/no_samples*(XX - 2/sigma^2 * (Cest^3 - Cest*B*Cest)), 'fro');
end

I tried a quasi-Newton search instead:

% Quasi-Newton
stepsize = fminunc(@(stepsize) step(stepsize,Cest,dnew,X*X',B,sigma,no_samples), dnew);

But MATLAB gets stuck (out of heap memory), probably because this function is supposed to be used when we don't have a trust region. Any suggestions?
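For reference, this is roughly the call shape I would expect if the step size is kept as a single scalar rather than a matrix (an untested sketch on my side; starting the search at 0 is just a guess):

% Sketch, not tested: minimise over the scalar step size only,
% starting from a scalar 0 rather than passing the whole matrix dnew as x0.
stepsize0 = 0;
stepsize = fminunc(@(s) step(s,Cest,dnew,X*X',B,sigma,no_samples), stepsize0);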