From: student melaku fekadu on
Hi John,

I used fminunc (instead of fminsearch), and got the following message:

>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
{ Warning: Gradient must be provided for trust-region algorithm;
using line-search algorithm instead.}
> In fminunc at 347
In minestimation_simplex at 31
                                                      First-order
 Iteration  Func-count        f(x)        Step-size    optimality
     0          61          46.4514                        0

Initial point is a local minimum.

Optimization completed because the size of the gradient at the initial point
is less than the selected value of the function tolerance.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>

My optimization set is:
my_options = optimset;
my_options.Display = 'iter';
my_options.TolFun = 1e-3;
my_options.MaxIter = 500;
my_func = @(x) objective_function(x, bla, blabla, blablabla);
disp(sprintf('now minimizing (simplex) %d parameters...', numel(x0)))
[x fval exitflag output] = fminunc(my_func, x0, my_options)

It did not do any iteration. What can I do to fix this? I removed a random-number draw from my function to avoid complications when fminunc computes the gradient (I don't know if it matters). There remains a use of max inside my function, to compare choices and pick the largest value (I don't know if that complicates things).
John, can I send you my function by mail so you can see what the problem is? It would be a great help to me.

Thanks a lot,
melaku
From: Matt J on

> >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
> { Warning: Gradient must be provided for trust-region algorithm;
> using line-search algorithm instead.}
> > In fminunc at 347
> In minestimation_simplex at 31
>                                                       First-order
>  Iteration  Func-count        f(x)        Step-size    optimality
>      0          61          46.4514                        0
>
> Initial point is a local minimum.


fminunc thinks your initial point is already a local minimum. Why don't you make some plots of your function in the neighbourhood of this point to see whether that is true?
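
For example, something like this (a rough sketch, assuming the my_func and x0 from your post; adjust the offset range to the scale of your problem):

% Sketch: plot a 1-D slice of the objective along each of the first
% few coordinates, holding the other coordinates fixed at x0.
t = linspace(-0.5, 0.5, 101);        % offsets around x0
for k = 1:min(4, numel(x0))
    fvals = zeros(size(t));
    for j = 1:numel(t)
        xp = x0;
        xp(k) = xp(k) + t(j);        % perturb one coordinate at a time
        fvals(j) = my_func(xp);
    end
    subplot(2, 2, k)
    plot(t, fvals)
    xlabel(sprintf('offset in x(%d)', k))
    ylabel('f(x)')
end

If the slices look flat or jagged near offset 0, that tells you a lot about why fminunc stopped where it did.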


> There remains a use of max inside my function, to compare choices and pick the largest value (I don't know if that complicates things).
=======

It sounds like a bad thing. fminunc is designed for differentiable functions, and functions built from max operations are not differentiable everywhere.
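
If you need something fminunc can digest, one common trick is to replace max with a smooth log-sum-exp approximation. Just a sketch, not specific to your function:

% Smooth, everywhere-differentiable stand-in for max(a,b). Larger k
% tracks the true max more closely but sharpens the curvature.
% (Watch out for overflow in exp() if k*a or k*b gets large.)
k = 20;
smoothmax = @(a, b) log(exp(k*a) + exp(k*b)) / k;
% e.g. smoothmax(1, 2) is about 2.0000, but is smooth even at a == b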
From: John D'Errico on
"student melaku fekadu" <melaku.fekadu(a)gmail.com> wrote in message <i2fj4v$l4f$1(a)fred.mathworks.com>...
> Hi John,
>
> I used fminunc (instead of fminsearch), and got the following message:
>
> >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
> { Warning: Gradient must be provided for trust-region algorithm;
> using line-search algorithm instead.}
> > In fminunc at 347
> In minestimation_simplex at 31
>                                                       First-order
>  Iteration  Func-count        f(x)        Step-size    optimality
>      0          61          46.4514                        0
>
> Initial point is a local minimum.
>
> Optimization completed because the size of the gradient at the initial point
> is less than the selected value of the function tolerance.
> >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>
> My optimization set is:
> my_options = optimset;
> my_options.Display = 'iter';
> my_options.TolFun = 1e-3;
> my_options.MaxIter = 500;
> my_func = @(x) objective_function(x, bla, blabla, blablabla);
> disp(sprintf('now minimizing (simplex) %d parameters...', numel(x0)))
> [x fval exitflag output] = fminunc(my_func, x0, my_options)
>
> It did not do any iteration. What can I do to fix this? I removed a random-number draw from my function to avoid complications when fminunc computes the gradient (I don't know if it matters).

YES IT DOES MATTER.


> There remains a use of max inside my function, to compare choices and pick the largest value (I don't know if that complicates things).
> John, can I send you my function by mail so you can see what the problem is? It would be a great help to me.

You had a random component in this function?

You do realize that this makes your function not only
non-differentiable, but not even continuous? No
optimizer will handle that problem well. Even
stochastic optimizers, like genetic algorithms,
simulated annealing, particle swarms, etc., may have
serious problems with such an objective.
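
If the randomness genuinely belongs in the model (e.g. a
simulated objective), the usual cure is to draw the random
numbers once, outside the objective, and reuse the same draws
at every evaluation, so the function is deterministic in x.
A sketch, with made-up names (nSim, simulated_objective):

% Hypothetical sketch: nSim and simulated_objective's signature are
% made up. The point is that 'noise' is drawn once and frozen, so
% calling my_func twice at the same x returns the same value.
nSim  = 1000;
noise = randn(nSim, 1);                 % fixed draws, reused every call
my_func = @(x) simulated_objective(x, noise);

With frozen draws the function is at least deterministic;
whether it is then smooth enough for fminunc is a separate
question.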

Similarly, the use of max will make your function
non-differentiable. Again, this precludes the use of
fminunc. Even fminsearch may have difficulties, and in
any event you simply cannot use it on a problem with
54 dimensions.

Other things that will cause any optimizer to fail are
parameters which must be integer, rounding operations
inside the objective, etc. Again, these will break any
optimizer that tries to differentiate your function.
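
A toy illustration of why (nothing to do with your actual
objective): a rounded objective is piecewise constant, so the
finite-difference derivative estimate is exactly zero, and the
optimizer declares the start point a local minimum, which is
precisely the message you saw.

% Piecewise-constant toy objective: flat almost everywhere.
f  = @(x) sum(round(x).^2);
x0 = [3.2; -1.7];
h  = 1e-8;                         % a typical finite-difference step
g1 = (f(x0 + [h; 0]) - f(x0)) / h  % estimated df/dx(1): exactly 0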

You can send it to me by mail, but I give no assurance
that I will be able to dig through a complex objective
function with any success.

John
From: student melaku fekadu on
Hi Matt,

Thanks for your response. I will check to see if it really is a local minimum, which I suspect it is not. I also need to think about how to get rid of the max operator in my function. I don't know if that will help, but I will try.

Thanks a lot,
Melaku


"Matt J " <mattjacREMOVE(a)THISieee.spam> wrote in message <i2fk2v$ik7$1(a)fred.mathworks.com>...
>
> > >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
> > { Warning: Gradient must be provided for trust-region algorithm;
> > using line-search algorithm instead.}
> > > In fminunc at 347
> > In minestimation_simplex at 31
> >                                                       First-order
> >  Iteration  Func-count        f(x)        Step-size    optimality
> >      0          61          46.4514                        0
> >
> > Initial point is a local minimum.
>
>
> fminunc thinks your initial point is already a local minimum. Why don't you make some plots of your function in the neighbourhood of this point to see whether that is true?
>
>
> > There remains a use of max inside my function, to compare choices and pick the largest value (I don't know if that complicates things).
> =======
>
> It sounds like a bad thing. fminunc is designed for differentiable functions, and functions built from max operations are not differentiable everywhere.