From: Marco on 25 Mar 2010 12:40

Hi,

I'm using the interior-point algorithm in FMINCON with an analytical expression for the Hessian, setting the following options:

....'Hessian','user-supplied','HessFcn',@funHess...

To reduce memory usage, I build the Hessian as a sparse matrix inside the function @funHess.

I also notice that it is possible to supply the sparsity pattern of the Hessian through the "HessPattern" option. Is that always worthwhile, even when the analytic (sparse) Hessian is provided? I have tried it on a few problems and haven't noticed any improvement in computing speed.

By the way, the MATLAB help mentions the KNITRO solver, which seems to be more powerful than FMINCON for problems with large, sparse Hessians. Has anyone compared the two on large problems?

Best,
Marco
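P.S. For concreteness, here is a stripped-down sketch of the kind of setup I mean. The function name sparseHessDemo, the objective, the single constraint, and the problem size are just placeholders, not my real problem; only @funHess and the option names match what I described above.

function sparseHessDemo
% Toy example: n-dimensional quadratic objective, one nonlinear inequality,
% analytic gradients, and a sparse user-supplied Hessian of the Lagrangian.
n  = 100;                                  % placeholder problem size
x0 = ones(n,1);

opts = optimset('Algorithm','interior-point', ...
                'GradObj','on', 'GradConstr','on', ...
                'Hessian','user-supplied', 'HessFcn',@funHess);

[x,fval] = fmincon(@obj, x0, [],[],[],[],[],[], @nonlcon, opts);
end

function [f,g] = obj(x)
f = sum(x.^2) + x(1)*x(2);
g = 2*x;  g(1) = g(1) + x(2);  g(2) = g(2) + x(1);
end

function [c,ceq,gc,gceq] = nonlcon(x)
c    = x(1)^2 - 1;                         % one inequality: x(1)^2 <= 1
ceq  = [];
gc   = sparse(1, 1, 2*x(1), numel(x), 1);  % constraint gradient as a column
gceq = [];
end

function H = funHess(x, lambda)
% Hessian of the Lagrangian, assembled directly as a sparse matrix
n = numel(x);
H = 2*speye(n) + sparse([1 2], [2 1], [1 1], n, n);     % objective part
H = H + lambda.ineqnonlin(1) * sparse(1, 1, 2, n, n);   % constraint part
end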