From: Srimal Jayawardena on
Hi

I'm optimising a function of a 6-dimensional variable using fminsearch.

Even when I give the starting point as a known optimum, it still takes about 130 iterations to return the solution - which is of course the same as the starting point.

For example if I start with,

X0 = [320.000 491.7420 372.6962 -131.4394 -0.6187 -0.5948];
X = X0;

My output says this after about 5 minutes:

iterations: 130
 funcCount: 219
 algorithm: 'Nelder-Mead simplex direct search'
   message: [1x194 char]


I store the history of the X values using an output function (OutputFcn), and the values are always the starting-point values.

E.g.:
320.0000 491.7420 372.6962 -131.4394 -0.6187 -0.5948 0.0054
320.0000 491.7420 372.6962 -131.4394 -0.6187 -0.5948 0.0054
320.0000 491.7420 372.6962 -131.4394 -0.6187 -0.5948 0.0054
320.0000 491.7420 372.6962 -131.4394 -0.6187 -0.5948 0.0054
320.0000 491.7420 372.6962 -131.4394 -0.6187 -0.5948 0.0054
320.0000 491.7420 372.6962 -131.4394 -0.6187 -0.5948 0.0054
320.0000 491.7420 372.6962 -131.4394 -0.6187 -0.5948 0.0054

for 219 rows... (I tried plotting it and I see constant flat lines.)
(The 7th column is the value of the objective function, which also seems to remain constant.)

I use the following to store the history,

history = [];
function stop = keephistory(x, optimvalues, state)
    % Record the current point and objective value at each iteration.
    stop = false;
    if strcmp(state, 'iter')   % compare strings with strcmp, not ==
        history = [history; x optimvalues.fval];
    end
end
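For reference, a minimal sketch of how an output function like this can be wired up. The function name merit is a placeholder for the actual objective; the key points are that keephistory must be nested inside the calling function so it can see the shared history variable, and that it is registered via optimset's 'OutputFcn' option:

```matlab
function [x, history] = run_search(X0)
    % history is shared with the nested output function below.
    history = [];

    % Register the output function; fminsearch calls it once per iteration.
    options = optimset('OutputFcn', @keephistory);
    x = fminsearch(@merit, X0, options);   % merit is a placeholder objective

    function stop = keephistory(x, optimvalues, state)
        stop = false;                      % never request early termination here
        if strcmp(state, 'iter')           % string comparison needs strcmp
            history = [history; x optimvalues.fval];
        end
    end
end
```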


Why does fminsearch take so long to terminate when the starting value is in fact the optimum point (and fminsearch doesn't seem to change it)?

I tried setting 'TolX',1 and 'TolFun',1 but this didn't seem to help.

Changing MaxIter stops the optimisation after the specified number of iterations, but that is not what I want.

I need the optimisation to stop at the identified optimum. When I give the starting point as the identified optimal point, I don't want fminsearch to waste 5 minutes just to give me back the same values as the output!

Any advice would be appreciated very much.

Thanks

Srimal
From: Jomar Bueyes on
On Jan 26, 9:23 pm, "Srimal Jayawardena" <srim...(a)gmail.com> wrote:
> [full original post quoted above - snipped]

Hi Srimal,

fminsearch uses an algorithm that does not require derivatives of the
merit function, namely the Nelder-Mead simplex algorithm. The algorithm
starts its search from the initial guess, looking for a neighbouring
point where the merit function decreases. As the iterations progress,
the search volume shrinks. However, when the algorithm starts at a
point close to the optimum, it has a hard time finding a point that is
better than the initial estimate. It keeps reducing the search
hypervolume and searching around the initial estimate until the
hypervolume is smaller than the tolerance you set for the search. In
other words, the simplex optimization not only fails to benefit from an
initial estimate close to the minimum; such an estimate actually hurts
the algorithm. If you have access to the Optimization Toolbox, use the
'fsolve' function, which uses much better algorithms that benefit from
an initial estimate close to the minimizing point. I'd suggest using
the Levenberg-Marquardt option.
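One practical consequence of this shrink-until-tolerance behaviour: in MATLAB's implementation, fminsearch builds its initial simplex by perturbing each component of the starting guess by about 5%, and it terminates only once the simplex has contracted below the absolute tolerances TolX and TolFun. With coordinates of order 100-500, the default tolerance of 1e-4 forces many shrink steps even from a perfect start, so loosening the tolerances should speed up termination. A sketch, with merit again a placeholder objective:

```matlab
% Loosen the absolute termination tolerances so the simplex does not
% have to contract all the way to the 1e-4 default in every coordinate.
options = optimset('TolX', 1e-2, ...    % stop when the simplex spread is this small
                   'TolFun', 1e-4, ...  % ...and fval varies by less than this
                   'Display', 'iter');  % print per-iteration progress
[x, fval, exitflag, output] = fminsearch(@merit, X0, options);
```

Note that both option names must be quoted strings; a syntax slip such as a missing quote would make the settings silently fail to take effect.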

HTH

Jomar
From: Srimal Jayawardena on
Hi

Thanks for the information.

I wish to change the size of the initial simplex. How can I do this with fminsearch?

I've read in Numerical Recipes that when the algorithm gets stuck at a local minimum, it helps to reinitialize the simplex at the point where you got stuck. However, this did not seem to work with fminsearch. Is there an alternative for this with fminsearch?
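The restart strategy described above can be approximated with fminsearch simply by calling it again from the returned point, since each call constructs a fresh initial simplex around its starting guess. A sketch, with merit a placeholder for the actual objective:

```matlab
% Restart Nelder-Mead a few times; each call rebuilds the initial
% simplex around the best point found by the previous run.
x = X0;
for k = 1:3
    [x, fval] = fminsearch(@merit, x);
end
```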

My merit function is evaluated from a set of data points with no derivative information available. Hence the choice of simplex.

Thanks

Srimal.
Jomar Bueyes <jomarbueyes(a)hotmail.com> wrote in message <93b5cfe6-9adf-4d3c-9293-98504156b878(a)e37g2000yqn.googlegroups.com>...
> [Jomar's reply, quoting the original post - snipped]