From: James Allison on
If your objective is to find specific parameter values that produce the
best image quality, I would suggest using an optimization approach
instead of an exhaustive full-factorial search like you are performing.
You will be able to identify a better solution with fewer function
evaluations.

The fminsearch algorithm might work, but if you have bounds on the
variables, or if the number of parameters is too large for fminsearch to
handle, you will need to try something from the Optimization Toolbox,
such as lsqnonlin or fmincon. If the objective function is non-smooth or
has multiple optima, you may need functions from the Global
Optimization Toolbox:

http://www.mathworks.com/products/global-optimization/
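For instance, with two continuous parameters and known bounds, a bounded search could look something like this (evaluate_quality, the starting point, and the bounds are placeholders standing in for your actual pipeline, not anything from your code):

```matlab
% Hypothetical sketch: minimize the negative of the image-quality score Q
% over two bounded parameters. evaluate_quality stands in for the
% simulation + image-processing steps that return Q for one parameter set.
objective = @(p) -evaluate_quality(p(1), p(2));  % fmincon minimizes, so negate Q

p0 = [0.5, 10];          % initial guess (placeholder values)
lb = [0,   1];           % lower bounds on the two parameters
ub = [1, 100];           % upper bounds

% fmincon handles bound constraints directly (Optimization Toolbox)
popt = fmincon(objective, p0, [], [], [], [], lb, ub);
```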

If for some reason you need more than just the set of parameters that
produces the best image, I would recommend using a more efficient design
of experiments technique than full-factorial. Something like lhsdesign
can help you extract more information with fewer function evaluations
(and no loop).
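As a sketch of what that might look like (the number of samples and the parameter ranges below are placeholders, and evaluate_quality again stands in for your pipeline):

```matlab
% Hypothetical sketch: 50 Latin hypercube samples of 3 parameters,
% scaled to each parameter's range, evaluated with one flat loop
% instead of three nested ones.
n = 50;                            % number of experiments
X = lhsdesign(n, 3);               % n-by-3 points in the unit cube (Statistics Toolbox)
lb = [0 1 10]; ub = [1 5 200];     % per-parameter ranges (placeholders)
P = repmat(lb, n, 1) + X .* repmat(ub - lb, n, 1);  % scale columns to real ranges

Q = zeros(n, 1);
for k = 1:n
    Q(k) = evaluate_quality(P(k,1), P(k,2), P(k,3));  % placeholder objective
end
```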

Best Regards,

-James

Daphne wrote:
>
> Sorry for being so vague. I really didn't know where to start, as in the
> loops there are about 400 lines of code in the main function + about 5
> subfunctions (the simulation, image processing, iterations on various
> parameters and other goodies). I guess what I would like to do is not
> vectorize the code itself (I've vectorized as much as I could think of),
> but perhaps find a way to reduce the number of loops needed to send the
> parameters list into the main subfunction. I'm guessing that's not
> possible. I really don't know what to put here, and can't post the whole
> code (it's too long, and I can't reveal it).
> I do make sure to preallocate and try to clear any unnecessary variables
> between runs, I also use the ~ for unneeded variables.
> What the function basically does is collect the parameters from the
> loops, call the main subfunction to generate an image according to
> specifications, then run image processing procedures on a thresholded
> image (using a previously determined threshold), and compare the
> processed image to the original to measure quality (another
> time-consuming bit, small enough to post, so here it is)
>
> A = matrix;
> B = estimated_matrix;
> W = weight_factor;
> Q = W *(sum(sum(and( A, B)))/sum(sum( A))) + ...
> (1-W)*(sum(sum(and(1-A,1-B)))/sum(sum(1-A)));
>
> Once good quality is obtained, many parameters are calculated and saved
> into a file. Sorry I can't be more specific...
>
> About the files, yes, I open (fopen) one file in the beginning and use
> fprintf to write the line of final data to it after each run. Perhaps it
> would be better to just collect it all into a matrix and dump it to a
> file every few hundreds of lines (~60 columns worth of numerical data
> each run).
> dlmwrite (as opposed to fprintf) doesn't require keeping the file open,
> does it?
> Daphne
From: Daphne on

Thanks for the suggestions!
Unfortunately, in this case I do need all the permutations of the variables, as I am testing parameters.
What you sent is great for later experiments of mine, thanks!
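If every combination really is needed, the nested loops can still be collapsed into a single flat loop by enumerating the grid up front with ndgrid (the parameter levels and evaluate_quality below are placeholders, not the actual code):

```matlab
% Hypothetical sketch: flatten a full-factorial sweep into one loop.
a = 0:0.1:1; b = 1:5; c = [10 50 100];   % placeholder parameter levels
[A, B, C] = ndgrid(a, b, c);             % every combination of a, b, c
params = [A(:), B(:), C(:)];             % one row per experiment

Q = zeros(size(params, 1), 1);
for k = 1:size(params, 1)
    Q(k) = evaluate_quality(params(k,1), params(k,2), params(k,3));  % placeholder
end
```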

Does anyone know how much slower fprintf is compared to dlmwrite?
I wonder if it's worth my time to convert...
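One way to settle that for the actual data would be a quick timing test along these lines (the matrix size, filenames, and format string here are placeholders):

```matlab
% Hypothetical sketch: time fprintf (file held open, row by row) against
% dlmwrite (one call for the whole matrix) for ~60 numeric columns per row.
data = rand(500, 60);                    % placeholder results matrix

tic;
fid = fopen('out_fprintf.txt', 'w');
for k = 1:size(data, 1)
    fprintf(fid, '%g\t', data(k, :));    % one row of tab-separated values
    fprintf(fid, '\n');
end
fclose(fid);
t_fprintf = toc;

tic;
dlmwrite('out_dlmwrite.txt', data, 'delimiter', '\t');  % opens/closes internally
t_dlmwrite = toc;
```

Note that dlmwrite called once on a whole matrix avoids per-row file overhead entirely, which is a different trade-off from calling it with '-append' after every run.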

Daphne


James Allison <james.allison(a)mathworks.com> wrote in message <i2cbb8$bsv$1(a)fred.mathworks.com>...