From: Rune Allnor on
On 20 Feb, 11:35, "BHARATH " <bhushabhu...(a)gmail.com> wrote:
> Hi,
>
> I am working with a hyperspectral image. I converted it to .tif format and read it into MATLAB using imread.
>
> I then converted it into a 2D matrix by appending each band as one column.
>
> I run out of memory, mostly while doing the SVD.
>
> My image dimensions are 145*145*220

4.6 million elements, possibly some 37 MBytes, if each
element is stored as an 8-byte double precision number...

> and the other one is
> 1280*308*191

75 million elements / 600 MBytes...

> and with a multispectral image I have the same problem while doing SVD, for 169*169*7.
>
> Can you please tell me how to tackle this problem?

1) Make sure you store the data in an efficient binary format.

2) Break the data down into smaller batches, either by reducing
the number of channels or by reducing the covered area; a rough
sketch of that idea is below.
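
Something like this rough sketch is what I have in mind (the variable
name cube, the file name cube.bin and the batch size are made up, and I
assume single precision is accurate enough for your data):

% Write the cube (145x145x220, read e.g. with imread) to a flat binary
% file in single precision. MATLAB stores arrays column-major, so each
% band ends up contiguous in the file.
fid = fopen('cube.bin', 'w');
fwrite(fid, single(cube), 'single');
fclose(fid);

% Later, process the data a handful of bands at a time instead of
% holding the whole 2D matrix in memory.
nrows = 145; ncols = 145; nbands = 220;
pixelsPerBand = nrows*ncols;
batch = 20;                                  % bands per batch, made up
fid = fopen('cube.bin', 'r');
for b = 1:batch:nbands
    nb = min(batch, nbands - b + 1);
    fseek(fid, (b-1)*pixelsPerBand*4, 'bof');   % 4 bytes per single
    X = fread(fid, [pixelsPerBand, nb], 'single');
    % ... work on X, which is pixelsPerBand-by-nb ...
end
fclose(fid);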

Rune
From: BHARATH on
Rune Allnor <allnor(a)tele.ntnu.no> wrote in message <d6609bbe-39ea-4930-922a-60f5de978997(a)o3g2000yqb.googlegroups.com>...
Hi sir,
I have a matrix of size 30,000*220 and I have a problem doing the SVD: the error is "out of memory".
How large a matrix can MATLAB handle?

From: Steven Lord on

"BHARATH " <bhushabhusha(a)gmail.com> wrote in message
news:hltaep$cqs$1(a)fred.mathworks.com...
> Hi sir,
> I have a matrix of size 30,000*220 and I have a problem doing the SVD:
> the error is "out of memory".
> How large a matrix can MATLAB handle?

Read the Memory Management Guide for the answer to your second question:

http://www.mathworks.com/support/tech-notes/1100/1106.html

As for your first question: if you call SVD on a 30000-by-220 matrix, the
resulting three matrices will be 30000-by-30000, 30000-by-220, and
220-by-220. That first matrix alone needs about 7.2 GB as doubles, and it
is what's causing you to run out of memory.

Instead, compute the economy SVD using svd(A, 0) or svd(A, 'econ') as
described in HELP SVD.
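
Something along these lines (a minimal sketch, with A standing for your
30000-by-220 matrix):

% Full svd(A) would try to allocate a 30000-by-30000 U (about 7.2 GB).
% The economy-size decomposition only keeps the first 220 columns of U.
[U, S, V] = svd(A, 'econ');

size(U)   % 30000-by-220
size(S)   % 220-by-220
size(V)   % 220-by-220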

--
Steve Lord
slord(a)mathworks.com
comp.soft-sys.matlab (CSSM) FAQ: http://matlabwiki.mathworks.com/MATLAB_FAQ


From: BHARATH on
Thanks, sir. I used the economy-size SVD and it works now.