From: Tom Nimbus on
Hi,

I need to deal with a very large matrix in Matlab, e.g. image(21600, 43200). With a floating point type, this matrix will take more than 3 GB of memory. On a 64-bit machine and OS, that amount of memory can be allocated, but Matlab does not seem to be able to allocate it automatically. Does anybody have successful experience with a similar situation?

Thanks in advance.
From: Matt J on
"Tom Nimbus" <ravensalo(a)gmail.com> wrote in message <hht5hc$evt$1(a)fred.mathworks.com>...
> Hi,
>
> I need to deal with a very large matrix in Matlab, i.e. image(21600, 43200). With a floating point type, this matrix will take more than 3 GB memory. With a 64-bit machine and OS, this amount of memory can be allocated. But Matlab seems not be able to allocate that much of memory automatically. Does anybody have successful experience with similar situation?
===============================

If it works on a 64-bit machine, why not use that?

Generally speaking though, loading such a large matrix into memory sounds like it could be a brute force solution. If you tell us more about the structure of the matrix, people may be able to suggest more efficient, structured solutions. In particular, is the matrix sparse? See "help sparse", if you're not familiar with sparse MATLAB arrays.
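A small-scale sketch of the idea (not from the thread): a sparse matrix pays memory only for its nonzero entries, not for the full m-by-n grid, so a mostly-blank image of these dimensions can be tiny in sparse form.

```matlab
% Illustration: sparse storage cost scales with nnz, not with m*n.
m = 21600; n = 43200;        % the full image dimensions from the post
S = sparse(m, n);            % all-zero sparse matrix: a few bytes, not GBs
S(100, 200)     = 1.5;       % set a handful of scattered "cluster" pixels
S(15000, 40000) = 2.0;
whos S                       % reported Bytes reflects nnz(S), not m*n
```

Note that each entry of a sparse double matrix costs roughly 16 bytes (value plus row index) plus per-column overhead, so this only wins when the image really is mostly zeros.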
From: Tom Nimbus on
"Matt J " <mattjacREMOVE(a)THISieee.spam> wrote in message <hht9s4$p0i$1(a)fred.mathworks.com>...
> "Tom Nimbus" <ravensalo(a)gmail.com> wrote in message <hht5hc$evt$1(a)fred.mathworks.com>...
> > Hi,
> >
> > I need to deal with a very large matrix in Matlab, i.e. image(21600, 43200). With a floating point type, this matrix will take more than 3 GB memory. With a 64-bit machine and OS, this amount of memory can be allocated. But Matlab seems not be able to allocate that much of memory automatically. Does anybody have successful experience with similar situation?
> ===============================
>
> If it works on a 64-bit machine, why not use that?
>
> Generally speaking though, loading such a large matrix into memory sounds like it could be a brute force solution. If you tell us more about the structure of the matrix, people may be able to suggest more efficient, structured solutions. In particular, is the matrix sparse? See "help sparse", if you're not familiar with sparse MATLAB arrays.
> ===============================

Matt, thanks for your reply. Yes, this is a sparse array. It contains data from a large image in which most pixels are blank, with scattered clusters of nonzero values. It's impossible to divide the image without breaking the integrity of some cluster. My problem is loading this image into Matlab. Due to the limitations of a 32-bit system, it's not possible to allocate more than 4 GB of memory. So I tried a 64-bit system, but Matlab still gave me an "out of memory" error.
From: Matt J on
"Tom Nimbus" <ravensalo(a)gmail.com> wrote in message <hhu380$3f4$1(a)fred.mathworks.com>...
> "Matt J " <mattjacREMOVE(a)THISieee.spam> wrote in message <hht9s4$p0i$1(a)fred.mathworks.com>...
> > "Tom Nimbus" <ravensalo(a)gmail.com> wrote in message <hht5hc$evt$1(a)fred.mathworks.com>...
> > > Hi,
> > >
> > > I need to deal with a very large matrix in Matlab, i.e. image(21600, 43200). With a floating point type, this matrix will take more than 3 GB memory. With a 64-bit machine and OS, this amount of memory can be allocated. But Matlab seems not be able to allocate that much of memory automatically. Does anybody have successful experience with similar situation?
> > ===============================
> >
> > If it works on a 64-bit machine, why not use that?
> >
> > Generally speaking though, loading such a large matrix into memory sounds like it could be a brute force solution. If you tell us more about the structure of the matrix, people may be able to suggest more efficient, structured solutions. In particular, is the matrix sparse? See "help sparse", if you're not familiar with sparse MATLAB arrays.
> > ===============================
>
> Matt, thanks for your replying. Yes, this is a sparse array. It contains data from a large image, in which a lot of pixels are blank with scattered clusters. It's impossible to find good ways to divide the image without affecting the integrity of any cluster in the image. My problem is to load this image into Matlab. Due to the limitation of a 32-bit system, it's not possible to allocate memory above 4 GB. So I tried the 64-bit system, but Matlab still gave me the "out of memory" error.
=========================

You might be able to use memmapfile() to access the image file on disk from MATLAB. You could then convert the data to a sparse matrix, which should occupy far less memory, and save the sparse matrix to a .mat file.
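A hedged sketch of that approach: map the raw image file and walk it one row at a time, collecting only the nonzero pixels, so the dense array never has to exist in memory. This assumes a headerless file of single-precision pixels called 'image.dat' stored row by row; the filename and on-disk layout are placeholders, not details from the thread.

```matlab
m = 21600; n = 43200;                      % image dimensions from the post
% Map the file as m records of n singles each (one record per image row).
mm = memmapfile('image.dat', ...
                'Format', {'single', [n, 1], 'row'}, ...
                'Repeat', m);

% Accumulate (row, col, value) triplets for the nonzero pixels only,
% then build the sparse matrix in one call (much faster than growing
% a sparse matrix by indexed assignment inside the loop).
rows = []; cols = []; vals = [];
for i = 1:m
    r = double(mm.Data(i).row).';          % one row; sparse needs double
    j = find(r);                           % columns of the nonzero pixels
    rows = [rows, repmat(i, 1, numel(j))]; %#ok<AGROW>
    cols = [cols, j];                      %#ok<AGROW>
    vals = [vals, r(j)];                   %#ok<AGROW>
end
S = sparse(rows, cols, vals, m, n);

% -v7.3 uses an HDF5-based MAT-file that supports variables over 2 GB.
save('image_sparse.mat', 'S', '-v7.3');
```

If the nonzero count is large, preallocating the triplet arrays (or collecting them per-row in cell arrays and concatenating once at the end) avoids the quadratic cost of growing them in the loop.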
From: Malcolm Lidierth on
For an example of something similar using memory mapping see
http://www.mathworks.com/matlabcentral/fileexchange/17992-3d-cube-slice