From: ImageAnalyst on 19 Feb 2010 12:12

Jean: I don't know what your code is, but I made two 2.5-million-element arrays and cross-correlated and plotted them in only 3.18 seconds with no error messages whatsoever. And I just have a notebook computer running 32-bit Windows XP. Here's the code:

clc;
clear all;
close all;
workspace;

% Generate two 2.5-million-element arrays.
tic;
data1 = rand(2500000, 1);
data2 = rand(2500000, 1);
% Cross correlate them.
result = xcorr(data1, data2);
toc; % Replies "Elapsed time is 3.178348 seconds."

% Plot the various arrays.
% Like Rune says, there are too many to see distinct points.
subplot(2, 2, 1);
plot(data1, 'r');
title('data1');
set(gcf, 'Position', get(0, 'Screensize')); % Maximize figure.
subplot(2, 2, 2);
plot(data2, 'r');
title('data2');
subplot(2, 2, 3);
plot(result, 'r');
title('result');

% Show memory stats:
memory

And here are the results:

Elapsed time is 3.178348 seconds.
Maximum possible array:            753 MB (7.900e+008 bytes) *
Memory available for all arrays:  1206 MB (1.264e+009 bytes) **
Memory used by MATLAB:             581 MB (6.096e+008 bytes)
Physical Memory (RAM):            3036 MB (3.184e+009 bytes)
*  Limited by contiguous virtual address space available.
** Limited by virtual address space available.
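One way to stretch that same 32-bit address space further: if reduced precision is acceptable for the signals, single-precision arrays take half the memory of the default doubles. A minimal sketch, assuming the downstream processing tolerates single precision:

% Same experiment with single-precision data; each array costs
% 4 bytes per element instead of 8.
data1 = rand(2500000, 1, 'single');
data2 = rand(2500000, 1, 'single');
result = xcorr(data1, data2);   % xcorr accepts single inputs
whos data1 data2 result         % confirm the halved byte counts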
From: ImageAnalyst on 19 Feb 2010 12:25

On Feb 19, 11:30 am, "Jean " <domnul_...(a)yahoo.com> wrote:
> If with your 64 bit box you can handle 35gb of datasets, please tell me what and how many datasets I can handle with my 32 bit box and 4Gb of RAM. Then I will try this to convince myself if I am wrong or not.
---------------------------------------------------------------
Jean: Since I showed above that I can run functions on semi-large arrays (just 2.5 million elements), I'm wondering if you have a bunch of arrays hanging around in memory that you don't need anymore. Is it possible that you have some intermediate arrays that were used earlier in your function but are no longer needed? If so, they are just taking up memory, and you can get rid of them using the clear command:

clear('oldUnneededArray', 'otherUnneededArray');

Perhaps that will help regain your memory.
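To see which variables are actually eating the address space before clearing anything, whos reports per-variable byte counts. A small sketch (run it at the point where memory runs out; which variables to clear depends on your code):

% List workspace variables sorted by size, largest first.
s = whos;
[~, idx] = sort([s.bytes], 'descend');
for k = idx(:)'
    fprintf('%-20s %10.1f MB\n', s(k).name, s(k).bytes / 1e6);
end
% Then clear the ones no longer needed, e.g.:
% clear('oldUnneededArray', 'otherUnneededArray');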
From: John D'Errico on 19 Feb 2010 12:36

"Jean " <domnul_jan(a)yahoo.com> wrote in message <hlmene$mvr$1(a)fred.mathworks.com>...
> Thank you for your reply. Unfortunately I think you misunderstood what I said. Plotting one 10-million-sample signal on my computer is possible, but if I want to plot another one on the same figure I cannot. I do not think I am doing sloppy work; my frustration results from the fact that Matlab is the only tool I employ in what I do because of its ability to adapt to any kind of data. In my work I am faced with different types of data in many formats and sizes.

As has been pointed out, you are complaining about the tool, when it is the user who is at fault here.

If I hit my thumb with a hammer, should I sue the manufacturer? Should I throw the hammer away because I cannot use it properly? Yes, it may make me feel better to curse the stupid hammer, but really, who is at fault here?

Learn to use the tool. Don't be a lazy programmer.

John
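On the specific plotting complaint: a figure does not need every sample to look the same on screen. A hedged sketch of putting two long signals on one axes by decimating first (the factor of 100 is an arbitrary assumption, and the two signals are assumed to be the same length; pick a factor that keeps the features you care about):

% Plot every 100th sample of both signals on the same figure.
d = 100;                          % decimation factor (illustrative)
i = 1:d:numel(data1);
plot(i, data1(i), 'b', i, data2(i), 'r');
legend('data1', 'data2');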
From: big data on 19 Feb 2010 12:49

"Jean " <domnul_jan(a)yahoo.com> wrote in message <hlmeb0$rge$1(a)fred.mathworks.com>...
> If with your 64 bit box you can handle 35gb of datasets, please tell me what and how many datasets I can handle with my 32 bit box and 4Gb of RAM. Then I will try this to convince myself if I am wrong or not.

One thing you can do is to enable the 3GB switch in your boot.ini file. Win32 XP can see 4 GB. Normally 2 GB is allocated to the kernel and 2 GB to applications. Adding the /3GB switch to your boot.ini allows 1 GB to be allocated to the kernel and 3 GB to applications. This definitely helped when I was running 32-bit processes with "large address aware" applications like Matlab.

http://msdn.microsoft.com/en-us/library/ms791558.aspx
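For reference, the switch goes at the end of the OS line in boot.ini. A sketch of what the edited entry might look like; the disk/partition path shown here is just the common XP default, so edit your machine's existing line rather than copying this one:

[operating systems]
multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Microsoft Windows XP Professional" /fastdetect /3GB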
From: Steve Amphlett on 19 Feb 2010 13:02
"John D'Errico" <woodchips(a)rochester.rr.com> wrote in message <hlmi64$98h$1(a)fred.mathworks.com>... > "Jean " <domnul_jan(a)yahoo.com> wrote in message <hlmene$mvr$1(a)fred.mathworks.com>... > > Thank you for your reply. Unfortunately I think you missunderstood what I said. Plotting one 10 million samples signal on my computer is possible, but if I want to plot another one on the same figure I cannot. I do not think I am doing sloppy work; my frustration results from the fact that Matlab is the only tool I employ in what I do because of it's ability to adapt to any kind of data. In my work I am faced with different types of data in many formats and sizes. > > > > As has been pointed out you are complaining about > the tool, when it is the user who is at fault here. > > If I hit my thumb with a hammer, should I sue the > manufacturer? Should I throw the hammer away > because I cannot use it properly? Yes, it may make > me feel better to curse the stupid hammer, but > really, who is at fault here? > > Learn to use the tool. Don't be a lazy programmer. > > John Some more fuel on the fire... Have you ever thought about block processing? Is there really a need to do FFT, XCORR, PLOT, etc on the entire dataset? Almost certainly not. When prototyping and testing algorithms, it's natural to use datasets that are workable. But when you get real data, you need to realise that there will always be a point where you can't work on the whole dataset in a single operation. In my experience, dataset sizes have grown as fast as the memory available to process them. It stands to reason, the systems used to measure and store them also grow at the same rate. When I was in the field of analysing measured data, we always used to have an input of "memory" or equivalent. If you had more, you could use bigger blocks. Otherwise, it'd just take a bit longer. |