From: Stefano pioli on
I have a script that has to load a 1 GB file. Memory usage reaches 5 GB. I use Windows 7, but this script will most likely be used on computers with 32-bit XP, and when I ran a test there I got an "out of memory ..." error. I tried:

http://www.mathworks.com/support/tech-notes/1100/1106.html

http://www.mathworks.com/support/tech-notes/1100/1107.html

http://www.mathworks.com/support/solutions/en/data/1-1HE4G5/index.html?solution=1-1HE4G5

http://technet.microsoft.com/en-us/library/bb124810%28EXCHG.65%29.aspx

Increasing the paging file is useless.

From what I understand, since 64-bit Vista the available address space has increased from 2-3 GB to 8 TB, which makes it possible to find sufficient "contiguous free blocks". Given that it is not possible to increase the RAM, how can I fix this?


For loading, use TEXTSCAN. The MATLAB Help, under "Importing Large Data Sets", says:

This example opens a large data file and reads the file one segment at a time in a for loop. The code calls textscan to read a particular pattern of data (as specified by format), 10,000 lines at a time for each segment. Following each read, the subfunction process_data processes the data collected in the cell array segarray:

format = '%s %n %s %8.2f %8.2f %8.2f %8.2f %u8';
file_id = fopen('largefile.dat', 'r');
for k = 1:segcount
    segarray = textscan(file_id, format, 10000);
    process_data(segarray);
end
fclose(file_id);



Could this be the solution to the problem? This is my loading script:


fid = fopen(filename, 'rt');
data = textscan(fid, '%s', 'delimiter', '', 'headerlines', 9);
fclose(fid);
data = regexprep(cat(1, data{:}), '^\d{2}:\d{2}:\d{2} ', '');
data = cellfun(@(x) sscanf(x, '%f').', data, 'uni', false);
data = cat(1, data{:});

I work with an ASCII file of about 10 million lines (double-precision numbers). How can I avoid the "out of memory ..."?
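The Help example could be combined with the script above by reading the file in blocks. This is only a sketch: the block size of 10,000 lines is taken from the Help example, and process_block is a placeholder for whatever is done with each block.

```matlab
% Sketch: read the file in blocks of 10,000 lines instead of all at once,
% so only one block is held in memory at a time.
blocksize = 10000;                              % assumed block size
fid = fopen(filename, 'rt');
for h = 1:9                                     % skip the 9 header lines once
    fgetl(fid);
end
while ~feof(fid)
    seg   = textscan(fid, '%s', blocksize, 'delimiter', '');
    lines = regexprep(seg{1}, '^\d{2}:\d{2}:\d{2} ', '');   % strip timestamps
    nums  = cellfun(@(x) sscanf(x, '%f').', lines, 'uni', false);
    block = cat(1, nums{:});
    process_block(block);                       % placeholder subfunction
end
fclose(fid);
```

Each pass converts one block to numbers and hands it off, so the full 1 GB file never has to sit in RAM at once.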



Thanks, Stefano
From: Stefano pioli on
Maybe I could split the file and read it in pieces, so as not to fill the RAM?

> Thanks, Stefano
From: Stefano pioli on
Looking at:

http://www.mathworks.com/support/tech-notes/1600/1602.html

Reading a file with low-level functions can avoid the "out of memory ...". How can I change my import script?
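A minimal sketch of the low-level approach from that tech note, using fgetl to hold only one line at a time. The line and column counts are assumptions (known in advance here) so the result can be preallocated as a single numeric matrix rather than grown inside a cell array:

```matlab
% Sketch: low-level line-by-line reading with fgetl.
nlines = 10e6;                 % assumed number of data lines
ncols  = 7;                    % assumed number of columns per line
data = zeros(nlines, ncols);   % preallocate one flat numeric matrix
fid = fopen(filename, 'rt');
for h = 1:9                    % skip the header lines
    fgetl(fid);
end
k = 0;
while ~feof(fid)
    tline = fgetl(fid);
    if ~ischar(tline), break; end
    tline = regexprep(tline, '^\d{2}:\d{2}:\d{2} ', '');  % strip timestamp
    k = k + 1;
    data(k, :) = sscanf(tline, '%f', ncols).';
end
fclose(fid);
data = data(1:k, :);           % trim unused rows
```

Preallocating a double matrix also avoids the temporary duplication that cat(1, data{:}) causes, which is often what pushes a 1 GB load to 5 GB of memory.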

Thanks, Stefano
From: Stefano pioli on
Perhaps with memmapfile or fread it is possible to resolve the memory problem, but how can I do that?
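memmapfile works on binary data, not text, so one possibility (a sketch; the file name 'data.bin' and the column count are assumptions) is to convert the ASCII file to a flat binary file once, then map it, so the OS pages records in on demand instead of MATLAB holding the whole array:

```matlab
% One-time conversion: stream the ASCII file into a flat binary file,
% one line at a time (no large array in memory).
ncols = 7;                                       % assumed columns per line
fin  = fopen(filename, 'rt');
fout = fopen('data.bin', 'w');
for h = 1:9, fgetl(fin); end                     % skip header lines
while ~feof(fin)
    tline = fgetl(fin);
    if ~ischar(tline), break; end
    tline = regexprep(tline, '^\d{2}:\d{2}:\d{2} ', '');
    fwrite(fout, sscanf(tline, '%f', ncols), 'double');
end
fclose(fin); fclose(fout);

% Map the binary file: only the pages actually touched are read into RAM.
m = memmapfile('data.bin', 'Format', {'double', [ncols, 1], 'row'});
firstrow = m.Data(1).row;      % access one record without loading the file
```

The conversion costs one pass over the file, but after that any record can be read through m.Data without an "out of memory" error.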

...please help me...


> Thanks, Stefano