From: Naved Nouyed on
Hi,

I've been trying to load LGBP-encoded local histogram information for about 1000 images into a cell array. Each image is Gabor filtered at 5 scales and 8 orientations, so we have 40 Gabor images per input image; each Gabor image is then divided into 64 regions and a local histogram is extracted from each region. Each histogram has 256 bins. So, for each image we have a 5x8x64x256 array of histogram counts. I am trying to store all this information in a cell array of size 1000:
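
As a rough back-of-the-envelope check (assuming every bin count is stored as a double; my actual layout may differ), the raw data alone is substantial:

bytesPerImage = 5 * 8 * 64 * 256 * 8; % 655,360 doubles -> 5,242,880 bytes, about 5 MB
bytesTotal = 1000 * bytesPerImage; % about 5 GB for all 1000 images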

LH_q = cell(1000, 1);

and then I try to preallocate each element like this:

scale = 5; % scales
orient = 8; % orientations
region = 64; % number of regions
bin = 256; % the number of histogram bins (gray-level range)

for i = 1:1000
    LH_q{i} = cell(scale, orient, region);
    for v = 1:scale
        for mu = 1:orient
            for r = 1:region
                LH_q{i}{v,mu,r} = zeros(1, bin); % preallocate one 256-bin histogram
            end
        end
    end
    % whos
end

But I get an out-of-memory error after storing the information for about 250 images.

chkmem recommends using the /3GB switch to increase the available memory, but that doesn't work for me: the computer's performance deteriorates and MATLAB doesn't start at all if I use this switch in my boot.ini.

Since I can't periodically append to a single variable saved on disk, I wonder if there is any way I can increase the available memory. Here's my memstats report:

>> feature memstats

Physical Memory (RAM):
In Use: 654 MB (28e6a000)
Free: 1383 MB (5678e000)
Total: 2037 MB (7f5f8000)
Page File (Swap space):
In Use: 534 MB (2161d000)
Free: 4405 MB (11351b000)
Total: 4939 MB (134b38000)
Virtual Memory (Address Space):
In Use: 478 MB (1dec4000)
Free: 1569 MB (6211c000)
Total: 2047 MB (7ffe0000)
Largest Contiguous Free Blocks:
1. [at 320b5000] 924 MB (39c5b000)
2. [at 202c5000] 285 MB (11d3b000)
3. [at 6bd58000] 65 MB ( 4148000)
4. [at 1c010000] 63 MB ( 3ff0000)
5. [at 7d1d4000] 37 MB ( 251c000)
6. [at 6fea8000] 27 MB ( 1ba8000)
7. [at e6f0000] 24 MB ( 18e0000)
8. [at 74dfb000] 21 MB ( 1585000)
9. [at 71ac7000] 21 MB ( 1539000)
10. [at 73026000] 18 MB ( 126a000)
======= ==========
1489 MB (5d19a000)

Any suggestion will be helpful. Let me know if any further information is needed.

Thanks in advance.
From: Malcolm Lidierth on
It's likely you do not have a large enough block of contiguous memory. On Windows, try
>> feature('memstats')
at the command line.
From: Malcolm Lidierth on
Ah - I should have read it all. You have an upper limit of 924 MB, hence the problem.
See http://www.mathworks.com/support/tech-notes/1100/1106.html
From: Naved Nouyed on
"Malcolm Lidierth" <ku.ca.lck(a)htreidil.mloclam> wrote in message <i04b7t$fst$1(a)fred.mathworks.com>...
> Ah - I should have read it all. You have an upper limit of 924 MB, hence the problem.
> See http://www.mathworks.com/support/tech-notes/1100/1106.html

Hi,

Thanks for replying. I've read the help sections, and the only thing I've found that could increase the upper limit on the largest contiguous block of memory is adding the /3GB switch to boot.ini:

http://www.mathworks.com/support/tech-notes/1100/1107.html#_Toc170182655

But this doesn't work: after booting with it, the OS becomes extremely slow and MATLAB fails to load at all. My page file size is 3055 MB and I have 2 GB of physical memory.
From: Naved Nouyed on
"Malcolm Lidierth" <ku.ca.lck(a)htreidil.mloclam> wrote in message <i04b7t$fst$1(a)fred.mathworks.com>...
> Ah - I should have read it all. You have an upper limit of 924 MB, hence the problem.
> See http://www.mathworks.com/support/tech-notes/1100/1106.html

I think I found something here:
http://www.mathworks.com/access/helpdesk/help/techdoc/matlab_prog/br04bw6-98.html

Here it says:
"If you increase the number of cells in a cell array over time, the size of the header also grows, thus using more of this segment in memory. This can eventually lead to "out of memory" errors."

Any ideas on optimizing my cell array preallocation? I'm working on this; one thing I'm trying is sketched below.
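
As a sketch (not yet tested at full scale), dropping the nested cells and keeping one contiguous 4-D numeric array per image would remove the per-cell headers entirely, and if the bin counts fit in 16 bits then uint16 would also cut the data size by a factor of four compared to double:

scale = 5; orient = 8; region = 64; bin = 256;

LH_q = cell(1000, 1);
for i = 1:1000
    % one contiguous 4-D array instead of 2560 separate 1x256 vectors
    LH_q{i} = zeros(scale, orient, region, bin, 'uint16');
end

% a single histogram is then LH_q{i}(v, mu, r, :) instead of LH_q{i}{v,mu,r}

Even then, 1000 such arrays come to about 1.3 GB, so I may still have to process the images in batches.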

Thanks