From: Mads on
Hi,

I am trying to minimize a non-linear function in two variables. To test whether it works, I have tried using a function to which I know the solution. When my starting guess IS the solution, fminunc of course returns the starting guess, as it should. However, when I vary the starting guess just slightly, the precision of the result is poor. Is there a way to increase the precision of the result?
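
For example, with a simple stand-in function (not my real one, but the same pattern), what I am doing looks like this:

fun = @(x) (x(1) - 1)^2 + 10*(x(2) - 2)^4;  % test function with known minimum at [1; 2]
x0  = [1.05; 2.05];                         % starting guess perturbed slightly from the solution
[x, fval] = fminunc(fun, x0);               % default options
disp(norm(x - [1; 2]))                      % distance from the known solution is larger than I expect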

Best regards and thanks a lot!

Mads
From: Marcus M. Edvall on
You can set up the problem with tomSym, for example as shown on this page:
http://tomsym.com/nlp_programming_matlab.html

First- and second-order derivatives will then be generated automatically,
and the precision will be as good as the solver can deliver.
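
For a two-variable problem the setup is only a few lines, something along these lines (the exact calls and solver options are on the linked page):

toms x y                              % symbolic decision variables
f = 100*(y - x^2)^2 + (1 - x)^2;      % objective written symbolically
solution = ezsolve(f);                % derivatives generated automatically from the symbolic form

The returned solution struct contains the optimal values of x and y.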

Best wishes, Marcus
http://tomopt.com/
http://tomdyn.com/
From: Alan Weiss on
Mads wrote:
> Hi,
>
> I am trying to minimize a non-linear function in two variables. To test whether it works, I have tried using a function to which I know the solution. When my starting guess IS the solution, fminunc of course returns the starting guess, as it should. However, when I vary the starting guess just slightly, the precision of the result is poor. Is there a way to increase the precision of the result?
>
> Best regards and thanks a lot!
>
> Mads
There are some ideas on improving results here:
http://www.mathworks.com/access/helpdesk/help/toolbox/optim/ug/br44i2r.html

In particular, try changing to central finite differences
http://www.mathworks.com/access/helpdesk/help/toolbox/optim/ug/br44i2r.html#br544um-1
or, even better, supply a gradient and Hessian if you can
http://www.mathworks.com/access/helpdesk/help/toolbox/optim/ug/br44i2r.html#br544vw-1
http://www.mathworks.com/access/helpdesk/help/toolbox/optim/ug/br44i2r.html#br544qb-1
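
For example (using Rosenbrock's function as a stand-in, since we haven't seen your function; the relevant options are FinDiffType, GradObj, and Hessian):

% In a file rosenboth.m -- returns value, gradient, and Hessian
function [f, g, H] = rosenboth(x)
f = 100*(x(2) - x(1)^2)^2 + (1 - x(1))^2;
if nargout > 1                        % gradient requested
    g = [-400*x(1)*(x(2) - x(1)^2) - 2*(1 - x(1));
          200*(x(2) - x(1)^2)];
end
if nargout > 2                        % Hessian requested
    H = [1200*x(1)^2 - 400*x(2) + 2, -400*x(1);
         -400*x(1),                    200];
end

% Central finite differences instead of the default forward differences:
opts = optimset('FinDiffType', 'central');
% Or, better, tell fminunc that the objective supplies its own derivatives:
opts = optimset('GradObj', 'on', 'Hessian', 'on');
[x, fval] = fminunc(@rosenboth, [-1.2; 1], opts)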

Of course, you can always fool around with tolerances
http://www.mathworks.com/access/helpdesk/help/toolbox/optim/ug/br44i2r.html#br5440b-1
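
For example:

opts = optimset('TolFun', 1e-12, 'TolX', 1e-12, 'MaxFunEvals', 1e4);
[x, fval] = fminunc(fun, x0, opts);   % fun and x0 are your own objective handle and starting point

Keep in mind that if the gradient is estimated by finite differences, its accuracy is limited (roughly sqrt(eps) for forward differences), so tightening the tolerances alone may not buy you much.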

Alan Weiss
MATLAB mathematical toolbox documentation