From: ashutosh srivastava on
Dear readers,

I need to know how much system memory is being used while executing a particular *.m file.
Is there a command for this, or is there a suitable method for determining the computational complexity involved?

-----------
ashutosh srivastava
From: Walter Roberson on
ashutosh srivastava wrote:

> I need to know how much system memory is being used while
> executing a particular *.m file.
> Is there a command for this,
> or is there a suitable method for
> determining the computational complexity involved?

See my reply in
http://www.mathworks.com/matlabcentral/newsreader/view_thread/286357
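In short, the options discussed there amount to something like the following sketch. Note the caveats: the `memory` function is Windows-only, and the `-memory` option to `profile` is undocumented and may not be available in every release; `myScript` stands in for whatever *.m file is being measured.

```matlab
% Windows-only: report MATLAB's current memory use
m = memory;
fprintf('In use by MATLAB arrays: %d bytes\n', m.MemUsedMATLAB);

% Undocumented, release-dependent: per-function memory profiling
profile('-memory', 'on');
myScript;                    % the *.m file being measured
profile('off');
stats = profile('info');     % FunctionTable entries carry allocation figures
```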

> or is there a suitable method for
> determining the computational complexity involved?

If your memory storage is indefinite precision but each element to be
stored has a finite range, then you only need a single memory location
to store everything. You can even Gödelize the values, if minimizing
the number of storage locations is somehow important.
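To make that point concrete, here is a toy sketch (my own illustration, not from the linked thread): if each element is an integer in a finite range 0..b-1, an entire vector can be packed into one number by positional encoding, which is the same idea as Gödelization, just with a base instead of prime powers.

```matlab
b = 10;                        % each element lies in 0..b-1
v = [3 1 4 1 5];               % the "array" to store
code = sum(v .* b.^(0:numel(v)-1));   % one number now holds everything

% recover element i without unpacking the rest
i = 3;
vi = mod(floor(code / b^(i-1)), b);   % vi == 4, i.e. v(3)
```

With exact (indefinite-precision) arithmetic this works for any length; with MATLAB doubles it only works while `code` stays below 2^53, which is exactly the finite-precision limitation the next paragraph is about.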

If your memory storage is finite precision, then unless you have an
infinite number of such locations you will eventually run out of memory
on a large enough (non-trivial) problem. However, whether you *will* run
out of memory on a particular run depends upon implementation
details in addition to algorithmic complexity. For example, your
algorithmic complexity might be completely linear in the problem size,
but you might run out of actual memory if you represent the data as
double precision in a situation where single precision would be adequate.
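For example (a minimal illustration), the same million-element vector takes exactly twice the storage in double precision:

```matlab
x  = rand(1e6, 1);             % double precision: 8 bytes per element
xs = single(x);                % single precision: 4 bytes per element
w  = whos('x', 'xs');
fprintf('%s: %d bytes\n', w(1).name, w(1).bytes);   % x:  8000000 bytes
fprintf('%s: %d bytes\n', w(2).name, w(2).bytes);   % xs: 4000000 bytes
```

The algorithm is identical in both cases; only the representation changed, yet the memory footprint halved.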

At this point you should read my earlier reply.... Go ahead, I'll wait.

http://i51.photobucket.com/albums/f373/wildflowerco62/EATERY/hands_twiddling_thumbs_lg_nwm.gif


Conclusion: "system memory usage" is a totally inadequate method of
measuring algorithmic complexity.