From: Matt J on 27 Jul 2010 17:19

"Samuel Edwards" <DJeter1234(a)AOL.com> wrote in message <i2nauv$bfo$1(a)fred.mathworks.com>...
> I have rescaled the problem so fmincon inputs (mean, s.d.1, s.d.2, s.d.3, constant)
> with a neutral starting guess of [0,1,1,1,0]. I run fmincon, and then rescale by the
> last guess so that fmincon starts at [1,1,1,1,1] and minimizes new_guess.*(mean,
> s.d.1, s.d.2, s.d.3, constant). Is this a good way to deal with flat functions? Is
> there a better way? I'm fairly new to modeling, so I apologize if this is trivial.
==============
The ideal way to correct for bad scaling is to supply a user-defined analytical Hessian to fmincon. Since this is only a 5-parameter problem, that doesn't seem to be a very difficult thing to do.

A cheaper knock-off would be to evaluate the Hessian at your neutral starting guess and use the diagonal of its inverse as scale factors.
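For concreteness, here is a minimal sketch (not from the thread) of the cheaper option in MATLAB. It assumes a scalar objective sumsq() standing in for the poster's sum-of-squares function, and bound vectors lb and ub standing in for whatever bounds the real problem uses; the Hessian at the neutral starting guess is estimated by central finite differences, and the diagonal of its inverse supplies one scale factor per parameter, as Matt J suggests.

% Sketch only: sumsq, lb, ub are hypothetical stand-ins for the poster's
% objective and bounds.
x0 = [0 1 1 1 0];                  % neutral starting guess from the thread
n  = numel(x0);
h  = 1e-4;                         % finite-difference step (problem dependent)
H  = zeros(n);
for i = 1:n
    for j = 1:n
        ei = zeros(1,n);  ei(i) = h;
        ej = zeros(1,n);  ej(j) = h;
        % central-difference estimate of d^2f/(dxi dxj) at x0
        H(i,j) = ( sumsq(x0+ei+ej) - sumsq(x0+ei-ej) ...
                 - sumsq(x0-ei+ej) + sumsq(x0-ei-ej) ) / (4*h^2);
    end
end
scale = abs(diag(inv(H)))';        % diagonal of the inverse Hessian as scale factors
                                   % (abs() guards against an indefinite H at x0)
scaledObj = @(y) sumsq(y.*scale);  % fmincon works in the scaled variable y
y0 = x0./scale;
yBest = fmincon(scaledObj, y0, [],[],[],[], lb./scale, ub./scale);
xBest = yBest.*scale;              % back to (mean, s.d.1, s.d.2, s.d.3, constant)

The more direct route Matt J describes, an analytical Hessian, goes in through fmincon's options (the interior-point algorithm accepts one via the 'Hessian'/'HessFcn' options), but it requires writing out the second derivatives of the 5-parameter objective by hand.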
From: Samuel Edwards on 27 Jul 2010 17:22
Thanks for the input. I had tried running iterated lsqnonlin, but for some reason or another it seems to converge on a worse answer than the iterated fmincon, albeit more quickly. The difference in sum squared is very small, but the difference in estimated parameters is large. Neither seems to find an actual minimum, because increasing FunTol and increasing the iterations find better answers.

Alan Weiss <aweiss(a)mathworks.com> wrote in message <i2ndqb$k7c$1(a)fred.mathworks.com>...
> Your procedure seems OK.
>
> However, I wonder why you don't use one of the least squares solvers,
> such as lsqnonlin, which can do a better job of overcoming the
> well-known problem of flatness near the solution. Perhaps you need some
> nonlinear constraints. But if you only need bounds, try lsqnonlin.
> http://www.mathworks.com/access/helpdesk/help/toolbox/optim/ug/brhkghv-18.html#brhkghv-19
>
> Alan Weiss
> MATLAB mathematical toolbox documentation
>
> On 7/27/2010 3:10 PM, Samuel Edwards wrote:
> > I am running a least squares minimization with fmincon to solve a least
> > squares problem. Essentially, a 5-tuple of parameters maps to an 8-tuple
> > of percentages. The parameters are of the form (mean, 1/variance1,
> > 1/variance2, 1/variance3, constant scaled to mean). Unfortunately, it
> > seems that the sum of squares function is very flat around the minimum
> > over a pretty big range of variance1,2,3 and constant.
> >
> > I have rescaled the problem so fmincon inputs (mean, s.d.1, s.d.2,
> > s.d.3, constant) with a neutral starting guess of [0,1,1,1,0]. I run
> > fmincon, and then rescale by the last guess so that fmincon starts at
> > [1,1,1,1,1] and minimizes new_guess.*(mean, s.d.1, s.d.2, s.d.3,
> > constant). Is this a good way to deal with flat functions? Is there a
> > better way? I'm fairly new to modeling, so I apologize if this is trivial.
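To make Alan Weiss's suggestion concrete, here is a minimal sketch (again not from the thread) of the lsqnonlin formulation. model() and observed are hypothetical stand-ins for the 5-parameter-to-8-percentage mapping and the measured percentages, and lb/ub for the bounds; the key change from the fmincon setup is that the supplied function returns the 8-element residual vector, not its sum of squares.

% Sketch only: model(), observed, lb, ub are hypothetical stand-ins.
residuals = @(p) model(p) - observed;   % 8-element residual vector, NOT a scalar
p0 = [0 1 1 1 0];                       % same neutral starting guess
opts = optimset('Display','iter', 'TolFun',1e-10, 'TolX',1e-10);
[pBest, resnorm] = lsqnonlin(residuals, p0, lb, ub, opts);

Because lsqnonlin sees the individual residuals, it can approximate the curvature from the Jacobian (a Gauss-Newton-style model), which is typically what helps on objectives that are flat near the minimum; tightening TolFun and TolX as above is also worth trying before concluding that the two solvers genuinely disagree on the answer.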