From: Craig on
Hi,

I'm trying to read a large Excel file into MATLAB for use in a neural network; it is approximately 40 columns by 650,000 rows. After the M-file has run for a while, I get a "not enough storage" error message. Is there any way to read in a single row at a time? The input file could be either *.xlsx or *.csv. I am looking for a way to read a row, perform some action on it, then load the next row, and so on; this would save me from having the whole file loaded into a single variable in MATLAB.

Thanks
From: us on
"Craig " <p09312600(a)dmu.ac.uk> wrote in message <hsc8hl$jda$1(a)fred.mathworks.com>...
> Hi,
>
> I'm trying to read a large Excel file into MATLAB for use in a neural network; it is approximately 40 columns by 650,000 rows. After the M-file has run for a while, I get a "not enough storage" error message. Is there any way to read in a single row at a time? The input file could be either *.xlsx or *.csv. I am looking for a way to read a row, perform some action on it, then load the next row, and so on; this would save me from having the whole file loaded into a single variable in MATLAB.
>
> Thanks

well... did you carefully read

help xlsread; % <- in particular, examples five to nine
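
for example, something along these lines (a rough sketch only; the file
name, sheet, and block size are placeholders, and it assumes the data are
numeric and that xlsread can read ranges on your setup):

nrows = 10000;                                 % rows per call, tune to taste
for r0 = 1:nrows:650000
    rng = sprintf('A%d:AN%d', r0, r0+nrows-1); % 40 columns span A..AN
    blk = xlsread('yourfile.xlsx', 1, rng);    % read one block of rows
    % ... process blk here ...
end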

us
From: Craig on
"us " <us(a)neurol.unizh.ch> wrote in message <hsc9a1$8gm$1(a)fred.mathworks.com>...
> "Craig " <p09312600(a)dmu.ac.uk> wrote in message <hsc8hl$jda$1(a)fred.mathworks.com>...
> > Hi,
> >
> > I'm trying to read a large Excel file into MATLAB for use in a neural network; it is approximately 40 columns by 650,000 rows. After the M-file has run for a while, I get a "not enough storage" error message. Is there any way to read in a single row at a time? The input file could be either *.xlsx or *.csv. I am looking for a way to read a row, perform some action on it, then load the next row, and so on; this would save me from having the whole file loaded into a single variable in MATLAB.
> >
> > Thanks
>
> well... did you carefully read
>
> help xlsread; % <- in particular, examples five to nine
>
> us

Hi,

Thanks for your response. The problem I have with this method is that the whole file is reloaded every time one row is read from the Excel file; is there any automatic way to buffer this data transfer, or would I need to look at doing it manually?

Thanks.
From: Walter Roberson on
Craig wrote:
> "us " <us(a)neurol.unizh.ch> wrote in message
> <hsc9a1$8gm$1(a)fred.mathworks.com>...
>> "Craig " <p09312600(a)dmu.ac.uk> wrote in message
>> <hsc8hl$jda$1(a)fred.mathworks.com>...

>>> I'm trying to read a large Excel file into MATLAB for use in a
>>> neural network; it is approximately 40 columns by 650,000 rows.
>>> After the M-file has run for a while, I get a "not enough storage"
>>> error message. Is there any way to read in a single row at a time?
>>> The input file could be either *.xlsx or *.csv. I am looking for a
>>> way to read a row, perform some action on it, then load the next
>>> row, and so on; this would save me from having the whole file
>>> loaded into a single variable in MATLAB.

>> help xlsread; % <- in particular, examples five to nine

> Thanks for your response. The problem I have with this method is that
> the whole file is reloaded every time one row is read from the Excel
> file; is there any automatic way to buffer this data transfer, or
> would I need to look at doing it manually?

When you said "buffer" I though, "Well, there's mmapfile()" -- but if the file
is large, you would possibly run out of virtual address space when you went to
map it.

So, what I would suggest is that you use .csv files, fopen() them with 'rt'
(read, text mode), and use textscan(), telling it to read only one line's
worth of data at a time. You might want to play with the 'Delimiter' or
'Whitespace' parameters, especially if some of the entries might be strings
that contain spaces.
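
A minimal sketch of that loop, assuming a plain comma-separated file of 40
numeric columns (the file name below is just a placeholder):

fid = fopen('bigdata.csv', 'rt');          % open in read-text mode
fmt = repmat('%f', 1, 40);                 % 40 numeric fields per row
while true
    row = textscan(fid, fmt, 1, 'Delimiter', ',');  % read one row
    if isempty(row{1})                     % nothing left -> stop
        break
    end
    rowvec = cell2mat(row);                % 1-by-40 double
    % ... process rowvec here ...
end
fclose(fid);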

If you run into trouble getting textscan() to read only one line at a time,
you could instead use fgetl() to read a single line into a char variable, and
then pass that char variable to textscan() where you would normally specify
the fid (the file identifier returned by fopen).
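
Again just a sketch, under the same assumptions as above:

fid = fopen('bigdata.csv', 'rt');
tline = fgetl(fid);                        % read the first line
while ischar(tline)                        % fgetl returns -1 at end of file
    vals = textscan(tline, '%f', 'Delimiter', ',');
    rowvec = vals{1}.';                    % 1-by-40 double
    % ... process rowvec here ...
    tline = fgetl(fid);                    % next line
end
fclose(fid);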