From: Ritesh on
Hi,

I am trying to apply the Nelder-Mead algorithm to an optimization problem using a derivative-free approach. The problem is that I have a large number of parameters (around ten thousand), so I would have to create a simplex with "number of parameters + 1" vertices and evaluate the function at each one. This is very costly, since each function evaluation takes around 10 seconds. I have used a projected gradient algorithm on the same problem and am trying to develop a quicker algorithm for this specific case. Is it possible to use Nelder-Mead with some variation? If not, is there another derivative-free method that could speed up the process in this particular case?
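For scale, a back-of-the-envelope calculation with the numbers above (ten thousand parameters, roughly 10 s per evaluation) shows why even building the initial simplex is prohibitive:

```python
# Rough cost of just evaluating the initial Nelder-Mead simplex,
# using the figures from the post (illustrative arithmetic only).
n_params = 10_000              # number of optimization variables
eval_time_s = 10               # seconds per objective evaluation
simplex_evals = n_params + 1   # one vertex per parameter, plus one

hours = simplex_evals * eval_time_s / 3600
print(f"{simplex_evals} evaluations ~ {hours:.1f} hours for the initial simplex alone")
```

That is more than a day of computation before a single reflection step has been taken.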

Thanks
From: Matt J on


1. More info about the problem would help in picking a good algorithm.

2. Why exactly are you looking for a derivative-free method? The fact that you also tried projected gradient implies that your objective function is differentiable, so why not exploit that?

From: Ritesh on
Hi Matt

I have modeled my optimization problem as a convex model, which is amenable to convex optimization techniques such as projected gradient. However, the accuracy of the solution cannot be guaranteed, though in practice it works fine. It is a fluence map optimization problem, in which we have a large number of beams whose intensities we have to optimize. The objective value is calculated by calling an objective function, which returns the objective value and, if required, the gradient.
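The projected-gradient scheme described above can be sketched roughly as follows (a minimal sketch: the nonnegativity constraint on beam intensities is my assumption, since the actual feasible set is not stated in the thread, and the function names are illustrative):

```python
import numpy as np

def projected_gradient(f_and_grad, x0, step=1e-2, iters=100):
    """Minimal projected-gradient sketch. The objective callback returns
    (value, gradient); after each gradient step we project back onto the
    assumed feasible set x >= 0 (nonnegative beam intensities)."""
    x = x0.copy()
    for _ in range(iters):
        f, g = f_and_grad(x)               # objective value and gradient
        x = np.maximum(x - step * g, 0.0)  # gradient step, then projection
    return x

# Toy usage: minimize ||x - c||^2; the negative target is clipped to 0
# by the nonnegativity projection.
c = np.array([1.0, -2.0, 3.0])
f_and_grad = lambda x: (np.sum((x - c) ** 2), 2 * (x - c))
x_star = projected_gradient(f_and_grad, np.zeros(3), step=0.1, iters=200)
print(x_star)  # approaches [1, 0, 3]
```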

I was trying derivative-free optimization (DFO) techniques to check whether they could give better results. But with so many beams (i.e., parameters), having to compute the function value so many times is causing problems.
From: Matt J on
When you use projected gradient, what sort of preconditioning are you using? If none, that would explain why you're getting poor results.
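For illustration, a diagonal scaling is one simple form of preconditioning (the diagonal here is hypothetical; in practice it might come from the Hessian diagonal or the column norms of the system matrix, neither of which is specified in this thread). On a badly scaled quadratic it can make an enormous difference:

```python
import numpy as np

# Badly scaled toy quadratic f(x) = 0.5 * x' D x - b' x with D = diag(d).
d = np.array([1.0, 100.0, 10000.0])  # widely varying curvatures
b = np.array([1.0, 2.0, 3.0])
grad = lambda x: d * x - b           # gradient of the toy quadratic

x = np.zeros(3)
# A plain gradient step would need step <= 1/max(d) to be stable, making
# progress along the flat directions painfully slow. Dividing the gradient
# by the (here exactly known) curvature solves this toy problem in one
# projected step:
x = np.maximum(x - grad(x) / d, 0.0)
print(x)  # equals b/d, the minimizer of this toy problem
```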