From: James on
Hi, I'm trying to get the auto-correlation of a 10512 x 120 matrix with 6 shifts. The result is a 13 x 10512^2 matrix, which seems to kill my computer. I'm using a Mac with 32 GB of RAM and it's still crashing! Is there a way to reduce the memory footprint of xcorr? The dataset is a time series, and I'm trying to get the cross-correlation between every pair of rows with lags of up to 6 months.
The only other solution that I found was to use a nested for-loop:
cc = zeros(13, 10512^2);   % preallocate the full 13 x 10512^2 result (~10.7 GB of doubles)
for i = 1:10512
    for j = 1:10512
        cc(:, (i-1)*10512 + j) = xcorr(data(i,:), data(j,:), 6, 'coeff');
    end
end

But this is so slow that it takes more than a day to compute the cc matrix! Is there another way to perform the row-wise cross-correlation without the nested for-loop?
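
In case it helps to show what I mean by row-wise cross-correlation, here is a lag-at-a-time version (just a sketch, untested; it mimics the 'coeff' scaling, keeps only one 10512 x 10512 slice, roughly 884 MB of doubles, in memory at a time, and writes each lag to disk):

% Sketch only (untested). xcorr's 'coeff' option scales by the row norms
% (it does not remove the mean), so divide each row by its 2-norm first.
X = bsxfun(@rdivide, data, sqrt(sum(data.^2, 2)));    % 10512 x 120
n = size(X, 2);
maxlag = 6;
for lag = 0:maxlag
    % C(i,j): row i against row j shifted by 'lag' samples; negative lags
    % are the transpose of the corresponding positive-lag slice. Check the
    % sign convention against xcorr on one small pair before relying on it.
    C = X(:, 1:n-lag) * X(:, 1+lag:n)';               % 10512 x 10512, ~884 MB
    save(sprintf('cc_lag%d.mat', lag), 'C', '-v7.3'); % one slice per file
end
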
Thank you.
James
From: Steven Lord on

"James " <jfaghm(a)googlemail.com> wrote in message
news:i14stm$k24$1(a)fred.mathworks.com...
> Hi, I'm trying to get the auto-correlation of a 10512 x 120 matrix with 6
> shifts. The result is a 13 x 10512^2 matrix, which seems to kill my
> computer.

A 13-by-110502144 (= 10512^2) real, full, double-precision matrix requires a
contiguous block of memory of approximately 10.7 GB. Since you can't work
with a contiguous block that large on a 32-bit system, I'm assuming your
machine is a 64-bit system. Even on such a system, allocating that much
memory is going to take TIME.
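
For reference, the arithmetic behind that figure:

>> 13 * 10512^2 * 8 / 2^30    % 13 lags x 10512^2 pairs x 8 bytes per double, in GiB
ans =
   10.7030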

> I'm using a Mac with 32 GB of RAM and it's still crashing!

If it's crashing reproducibly, please send the crash report (and whatever
commands/data are necessary to reproduce the crash) to Technical Support.

If it's throwing an error indicating that MATLAB is out of memory, that's
different.

--
Steve Lord
slord(a)mathworks.com
comp.soft-sys.matlab (CSSM) FAQ: http://matlabwiki.mathworks.com/MATLAB_FAQ
To contact Technical Support use the Contact Us link on
http://www.mathworks.com


From: Godzilla on
"Steven Lord" <slord(a)mathworks.com> wrote in message <i153t5$qnf$1(a)fred.mathworks.com>...
> [earlier message snipped]

Have you considered low-pass filtering the data and then decimating it?
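
Something along these lines, just as a sketch (decimate is in the Signal Processing Toolbox; r = 3 is only an example factor):

r = 3;                                         % example decimation factor
nRows = size(data, 1);
small = zeros(nRows, ceil(size(data, 2) / r));
for k = 1:nRows
    % decimate low-pass filters (8th-order Chebyshev Type I by default)
    % and then keeps every r-th sample, so 120 samples per row become 40
    small(k, :) = decimate(data(k, :), r);
end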
From: James on
Steven,
Thank you for the response. There is no crash per se. After a while, Mac OS just opens a prompt telling me I'm running out of memory and must "Force Quit" MATLAB. I also get MATLAB's out-of-memory error from time to time.
James


"Steven Lord" <slord(a)mathworks.com> wrote in message <i153t5$qnf$1(a)fred.mathworks.com>...
> [earlier message snipped]
From: James on
Thank you, Godzilla, for your response. I'm still getting acquainted with time series.
Are you referring to the y = decimate(x,r) function?
Thanks.
James

"Godzilla " <godzilla(a)tokyo.edu> wrote in message <i164oj$og8$1(a)fred.mathworks.com>...
> "Steven Lord" <slord(a)mathworks.com> wrote in message <i153t5$qnf$1(a)fred.mathworks.com>...
> >
> > "James " <jfaghm(a)googlemail.com> wrote in message
> > news:i14stm$k24$1(a)fred.mathworks.com...
> > > Hi I'm trying to get auto-correlation of a 10512 x 120 matrix with 6
> > > shifts. The result is 13 x 10512^2 matrix which it seems kills my
> > > computer.
> >
> > A 13-by-110502144 (= 10512^2) real full double precision matrix requires a
> > contiguous block of memory of size approximately 10.7 GB. Since you can't
> > work with that large a contiguous block of memory on a 32-bit system I'm
> > assuming your machine is a 64-bit system. Even on such a system, allocating
> > that much memory is going to take TIME.
> >
> > > I'm using a MAC with 32GB of RAM and it's still crashing!
> >
> > If it's crashing reproducibly, please send the crash report (and whatever
> > commands/data are necessary to reproduce the crash) to Technical Support.
> >
> > If it's throwing an error indicating that MATLAB is out of memory, that's
> > different.
> >
> > --
> > Steve Lord
> > slord(a)mathworks.com
> > comp.soft-sys.matlab (CSSM) FAQ: http://matlabwiki.mathworks.com/MATLAB_FAQ
> > To contact Technical Support use the Contact Us link on
> > http://www.mathworks.com
> >
>
> Have you considered low-pass filtering the data and then decimating it?