From: Yi Cao on 20 Mar 2010 09:04

You can always scale your problem. For example, if your current problem requires the solution x to be discrete between 1 and 1000, you can scale it as y = x / 1000 so that the solution range for y is 0 to 1. When you calculate the cost or constraints, convert y back to x as x = round(1000 * y). Following the same principle, you can even scale your problem to a range 0 to a << 1 so that you get more resolution from fmincon.

Although fmincon is for continuous variables, it should still be feasible to solve a discrete problem this way; all you have to do is integer relaxation. Unfortunately, as far as I know, the MATLAB Optimization Toolbox does not have any solver for integer problems.

HTH
Yi

"Jan " <fremenzone(a)poczta.onet.pl> wrote in message <ho2brc$510$1(a)fred.mathworks.com>...
> Which minimizer would be the best for such a task?
>
> Let me clarify that the minimized function returns doubles, not integers; however, this doesn't change much since these doubles are calculated from integers. I'm afraid that scaling of my problem is not possible.
>
> "Bruno Luong" <b.luong(a)fogale.findmycountry> wrote in message <ho27cq$1b9$1(a)fred.mathworks.com>...
> > "Jan " <fremenzone(a)poczta.onet.pl> wrote in message <ho25kj$66m$1(a)fred.mathworks.com>...
> > > How can I solve this problem (removing rounding is not an option in the final version of my algorithm, I've done it only for debugging purposes).
> >
> > Then FMINCON is the wrong tool for your problem. FMINCON assumes a differentiable cost function. If you have rounding, then you should use another minimizer.
> >
> > Bruno
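[Editor's note: a minimal MATLAB sketch of the scaling Yi describes, assuming a hypothetical cost function myCost that expects an integer x between 1 and 1000, and a starting guess x0; both are placeholders for the poster's actual problem.]

    % Sketch only: myCost and x0 stand in for the poster's actual cost
    % function and starting point.
    costScaled = @(y) myCost(round(1000 * y));   % y lives in [0, 1]

    y0 = x0 / 1000;                              % scale the initial guess
    lb = zeros(size(y0));                        % bounds on the scaled variable
    ub = ones(size(y0));

    yOpt = fmincon(costScaled, y0, [], [], [], [], lb, ub);
    xOpt = round(1000 * yOpt);                   % map back to the integer solution

Note that the round() inside costScaled still produces the flat, stair-step regions discussed later in the thread, so the scaling changes the resolution but not the differentiability issue.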
From: Matt J on 20 Mar 2010 09:55

"Jan " <fremenzone(a)poczta.onet.pl> wrote in message <ho25kj$66m$1(a)fred.mathworks.com>...
> How can I solve this problem (removing rounding is not an option in the final version of my algorithm, I've done it only for debugging purposes).
=================

It might be worth hearing why you can't remove rounding. You might have a false impression that this is insurmountable.

Other than that, you could try fminsearch(), which doesn't require derivatives.
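[Editor's note: a bare fminsearch call of that shape might look like the following; myCost and x0 are placeholders for the poster's objective and starting point, and the tolerance values are only illustrative.]

    % Derivative-free Nelder-Mead search on the rounded objective
    % (myCost and x0 are placeholders; tolerances are illustrative).
    opts = optimset('TolX', 1e-3, 'TolFun', 1e-3);
    xOpt = fminsearch(@(x) myCost(round(x)), x0, opts);

As the next post points out, fminsearch itself cannot enforce the constraints Jan mentions.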
From: James Allison on 22 Mar 2010 12:10

The issue with using fmincon here is the rounding aspect of the objective function. If you were to plot the objective function against the optimization variable, you would probably see a stair-step looking response instead of a smooth (i.e., differentiable) curve. You might have to zoom in to see this.

fmincon is a gradient-based algorithm, which requires that the objective and constraint functions be differentiable with respect to the optimization variable(s). Any function that involves rounding is not differentiable, so fmincon is not appropriate for solving this problem. When you start fmincon, it takes a small step to calculate a finite-difference approximation to the gradient. With default algorithm settings these steps are very small (as you have observed), and it looks like the step is taken in a small flat region of your objective function; fmincon sees this as a zero gradient and thinks that you are at a stationary point (potential local optimum).

I would suggest trying patternsearch from the Global Optimization Toolbox (previously the GADS toolbox):

http://www.mathworks.com/products/global-optimization/

It can handle non-smooth functions, and typically converges faster than the genetic algorithm (ga), a heuristic optimization algorithm in the Global Optimization Toolbox that also handles non-smooth problems.

I can think of two other alternatives:

1) Adjust fmincon parameters to handle this problem using optimset. You will need to increase the bounds on the finite-difference step size, as well as loosen the tolerances on the optimization variable and the objective/constraint functions. This will help the algorithm 'skip over' the non-smoothness in the response, and you may be able to get fmincon to converge very quickly. The tradeoff is that your solution will not be as precise. You may want to compare results from this approach with patternsearch to see if the precision is acceptable (a rough sketch of both this and the patternsearch call follows this post).

2) Use a successive surrogate modeling approach. Sample the objective and constraint functions (using lhsdesign, etc.), and fit a smooth surrogate function to the data (such as a polynomial surface or a neural network model). Find the minimum of the surrogate model using fmincon. Resample the objective and constraint functions near the approximate solution, and fit a new model. Repeat this iteratively, shrinking the sampling region, until you have sufficient precision. This is a more advanced approach, but if you are developing an algorithm that will be used repeatedly, this might be a worthwhile investment.

I hope this helps,

-James

Jan wrote:
> Which minimizer would be the best for such a task?
>
> Let me clarify that the minimized function returns doubles, not integers;
> however, this doesn't change much since these doubles are calculated
> from integers. I'm afraid that scaling of my problem is not possible.
>
> "Bruno Luong" <b.luong(a)fogale.findmycountry> wrote in message
> <ho27cq$1b9$1(a)fred.mathworks.com>...
>> "Jan " <fremenzone(a)poczta.onet.pl> wrote in message
>> <ho25kj$66m$1(a)fred.mathworks.com>...
>> > How can I solve this problem (removing rounding is not an option in
>> > the final version of my algorithm, I've done it only for debugging
>> > purposes).
>>
>> Then FMINCON is the wrong tool for your problem. FMINCON assumes a
>> differentiable cost function. If you have rounding, then you should
>> use another minimizer.
>>
>> Bruno
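[Editor's note: a rough MATLAB sketch of the two routes James suggests (loosened fmincon settings and patternsearch). The option names are standard optimset/psoptimset options, but myCost, myConstraints, x0, lb, and ub are placeholders for the poster's problem, and the numerical values are only illustrative.]

    % Alternative 1: widen the finite-difference step and loosen tolerances
    % so fmincon steps over the stair-step flats (values are illustrative).
    opts = optimset('DiffMinChange', 1e-2, 'TolX', 1e-2, ...
                    'TolFun', 1e-2, 'TolCon', 1e-2);
    x1 = fmincon(@myCost, x0, [], [], [], [], lb, ub, @myConstraints, opts);

    % patternsearch (Global Optimization Toolbox) does not need gradients,
    % so the rounding in the objective is not a problem.
    psopts = psoptimset('TolMesh', 1e-4);
    x2 = patternsearch(@myCost, x0, [], [], [], [], lb, ub, @myConstraints, psopts);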
From: James Allison on 22 Mar 2010 12:15

fminsearch applies only to unconstrained problems; Jan mentioned that the optimization problem has constraints. I suggest patternsearch from the Global Optimization Toolbox. See my more detailed remarks in response to Jan's second post.

-James

Matt J wrote:
> "Jan " <fremenzone(a)poczta.onet.pl> wrote in message
> <ho25kj$66m$1(a)fred.mathworks.com>...
> > How can I solve this problem (removing rounding is not an option in the
> > final version of my algorithm, I've done it only for debugging purposes).
> =================
>
> It might be worth hearing why you can't remove rounding. You might have
> a false impression that this is insurmountable.
>
> Other than that, you could try fminsearch(), which doesn't require
> derivatives.
From: Matt J on 22 Mar 2010 13:05
James Allison <james.allison(a)mathworks.com> wrote in message <ho8533$p5k$2(a)fred.mathworks.com>...
> fminsearch applies only to unconstrained problems; Jan mentioned that
> the optimization problem has constraints.
==========

Then perhaps this constrained version from the FEX?

http://www.mathworks.com/matlabcentral/fileexchange/13469-fminsearchcon

In any case, I think we need to hear from the OP whether this problem originated by discretizing a differentiable cost function. If so, then it might be possible to pass an analytical derivative to fmincon(), possibly post-discretized in the same way (cf. Yi's suggestion).
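[Editor's note: if that FEX submission is used, the call would look roughly like the sketch below; the exact argument order should be checked against the File Exchange documentation, and myCost, x0, lb, ub, and myConstraints are again placeholders.]

    % Rough sketch of a constrained, derivative-free call via the FEX
    % fminsearchcon submission (check its documentation for the exact
    % argument order; the names below are placeholders).
    xOpt = fminsearchcon(@(x) myCost(round(x)), x0, lb, ub, [], [], @myConstraints);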