From: Oleg Komarov on 22 Feb 2010 05:50

"Jean " <domnul_jan(a)yahoo.com> wrote in message <hltl06$8kj$1(a)fred.mathworks.com>...
> Hello,
>
> I am still waiting for your answer.
>
> "big data" <bigdata(a)bigdata.com> wrote in message <hlmdg3$3k8$1(a)fred.mathworks.com>...
> > "Jean " <domnul_jan(a)yahoo.com> wrote in message <hlm7d4$oru$1(a)fred.mathworks.com>...
> > > Although Matlab is an excellent tool in signal processing, it has totally disappointed me these days when I tried even the SIMPLEST function on large signals (let's say 2.5 million samples or higher). Not only must you say goodbye to ffts, xcorrs, etc., but even simple PLOTS cannot be shown, because the "OUT OF MEMORY" error appears (in spite of my 4 GB of RAM).
> > > If you want to "play" research, then Matlab is the perfect tool for you. If you want to do some advanced signal processing on real signals, then orient yourself to another tool.
> > > Thank you Mathworks for disappointing me!
> >
> > I run a 64-bit box with a 64-bit OS, 64-bit Matlab, and 16 GB of RAM. I routinely work with large datasets right up to a 35 GB page file size. Matlab is perfectly happy with large data provided you have an adequate computer.
> >
> > Coming in here muttering about wanting your money back, without having done the most basic research into the subject, is lame. You owe TMW an apology.

Don't wait for somebody's answer; read the documentation on how to avoid memory problems. Many CSSMers have already shown you that the problem is probably due to poor coding. If you need help pinpointing where the out-of-memory error occurs, post here what you've done so far.

Oleg
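[Editor's aside, not part of the original thread: some back-of-the-envelope arithmetic supports Oleg's point. 2.5 million double-precision samples occupy only about 20 MB, so running out of memory at that size usually means the code is making many temporary copies, e.g. growing an array inside a loop. A minimal sketch of the usual fix, preallocation:]

```matlab
% Back-of-the-envelope: 2.5e6 samples * 8 bytes per double ~= 20 MB.
n = 2.5e6;
fprintf('Raw signal size: %.1f MB\n', n*8/2^20);

% Anti-pattern: growing an array forces repeated reallocation and copying.
% y = [];
% for k = 1:n
%     y(end+1) = sin(k);   % each step may copy the whole array so far
% end

% Preallocate instead: one ~20 MB block, filled in place.
y = zeros(1, n);
for k = 1:n
    y(k) = sin(k);
end
```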
From: David R. on 23 Feb 2010 04:47

"Jean " <domnul_jan(a)yahoo.com> wrote in message <hltl06$8kj$1(a)fred.mathworks.com>...
> [same quoted exchange as above, snipped]

Hi. Though I am not an expert on Matlab's internals, here is my guess:

- If you plot an array, Matlab has to copy (duplicate) the data into memory associated with the figure, so that you can still zoom in even after you delete the array from the workspace.
- Trying to plot large datasets therefore consumes large amounts of memory.
- Since you cannot see more than, say, 2000 points at once (there aren't enough pixels), you could plot, say, yourdata(1:1000:end) and make your zooming responsive, i.e. update the properties of your line object with only the data you actually need at a given zoom level.
(See zoom object, line object, axes properties, etc.) Does that make sense? Best regards, David
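[Editor's aside, not part of the original thread: a minimal sketch of David's idea. The variable names, the decimation factor, and the ~2000-point target are illustrative choices, not from the thread. The plot shows a decimated copy, and a zoom callback refills the line from the full dataset each time the axes limits change, so the line object never holds more points than the screen can show.]

```matlab
x = linspace(0, 100, 1e7);            % full-resolution data stays in the workspace
y = sin(x) + 0.1*randn(size(x));

step  = 5000;                         % ~2000 points drawn initially
hLine = plot(x(1:step:end), y(1:step:end));

% After each zoom, refill the line with ~2000 points from the visible range.
hZoom = zoom(gcf);
set(hZoom, 'ActionPostCallback', @(obj, evd) redraw(evd.Axes, hLine, x, y));

function redraw(ax, hLine, x, y)
    xl  = get(ax, 'XLim');
    idx = find(x >= xl(1) & x <= xl(2));
    s   = max(1, floor(numel(idx)/2000));   % keep roughly 2000 visible points
    set(hLine, 'XData', x(idx(1:s:end)), 'YData', y(idx(1:s:end)));
end
```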
From: Eric on 23 Feb 2010 14:12
Anybody bragging that their research is "real" while everybody else here is just playing ought to:

1. Understand why plotting 10 million data points on a monitor might be impractical. Say you want to print or display a plot of 10 million data points at 300 data points per inch. That takes 33,333 inches = 0.53 miles = 0.85 km of linear space. Scale the ordinate appropriately and you could see it from space!

2. Have a better computer. You can get a Dell T5500 with two quad-core Xeon processors, 24 GB of RAM, and an NVidia Quadro FX 3800 video card for $5143. Combine this with a skilled analyst and Matlab x64, and large data sets can be processed effectively. If you're doing "real" research, I'm guessing your customer would rather pay for this than for your time playing with the /3GB switch.

I'm guessing that for many of the people in this forum, 10 million data points is small. I routinely work with data sets consisting of 30,000 images that are each 128x128 (491 million data points), or 60 images that are each 2048x2048 (251 million data points). These don't seem that big to me, and I'm guessing many people here work with data sets orders of magnitude larger.

-Eric
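[Editor's aside, not part of the original thread: Eric's unit conversions check out. Spelled out as plain arithmetic:]

```matlab
pts    = 10e6;                 % data points
dpi    = 300;                  % data points per inch
inches = pts / dpi;            % 33,333 inches of paper
miles  = inches / (12*5280);   % ~0.53 miles
km     = inches * 2.54 / 1e5;  % ~0.85 km
fprintf('%.0f in = %.2f mi = %.2f km\n', inches, miles, km);
```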