From: Ashish Uthama on 18 Feb 2010 21:14

On Thu, 18 Feb 2010 20:48:04 -0500, Vanessa Lim <v.lim(a)auckland.ac.nz> wrote:

> "Ashish Uthama" <first.last(a)mathworks.com> wrote in message
> <op.u8b0js1ja5ziv5(a)uthamaa.dhcp.mathworks.com>...
>> On Wed, 17 Feb 2010 15:33:18 -0500, Vanessa Lim <v.lim(a)auckland.ac.nz>
>> wrote:
>>> Hi,
>>> I have 2 machines available at work to do EEG processing. I've tried
>>> to increase the memory on both machines, as I have files greater than
>>> 200 MB that won't get processed by MATLAB. I seem to have a limit of
>>> reading in files of about 68 MB using EEGLAB
>>> (http://sccn.ucsd.edu/eeglab/). I have tried to increase the limits by
>>> checking the allocated memory on my Vista laptop (Intel Core 2 Duo CPU,
>>> P8600 @ 2.4 GHz, 4 GB memory, 32-bit OS); it says paging file size:
>>> allocated 3871 MB, recommended 5356 MB, minimum 16 MB. I have it set
>>> to automatically manage the paging file size for all drives, as the
>>> MATLAB documentation on memory suggests. When I do this:
>>>
>>> x = rand(1000,1000,200)
>>> ??? Error using ==> rand
>>> Out of memory. Type HELP MEMORY for your options.
>>>
>>> feature memstats gives:
>>>
>>> Physical Memory (RAM):
>>>   In Use:  2097 MB ( 83115000)
>>>   Free:    1473 MB ( 5c1eb000)
>>>   Total:   3571 MB ( df300000)
>>> Page File (Swap space):
>>>   In Use:  2056 MB ( 80801000)
>>>   Free:    5270 MB (1496fd000)
>>>   Total:   7326 MB (1c9efe000)
>>> Virtual Memory (Address Space):
>>>   In Use:   486 MB ( 1e621000)
>>>   Free:    1561 MB ( 619bf000)
>>>   Total:   2047 MB ( 7ffe0000)
>>> Largest Contiguous Free Blocks:
>>>   1. [at 1a790000] 1323 MB (52b30000)
>>>   2. [at 7c41b000]   50 MB ( 32d5000)
>>>   3. [at 6ec23000]   26 MB ( 1aed000)
>>>   4. [at 6d2c9000]   24 MB ( 18b7000)
>>>   5. [at 772cb000]   16 MB ( 1015000)
>>>   6. [at 1812c000]   16 MB ( 1004000)
>>>   7. [at 70713000]   10 MB (  a3d000)
>>>   8. [at 7f7f0000]    7 MB (  7b5000)
>>>   9. [at 7355f000]    7 MB (  791000)
>>>  10. [at 73d62000]    7 MB (  76e000)
>>>                   =======   ==========
>>>                   1490 MB  ( 5d2b3000)
>>>
>>> Any ideas how to change or increase my memory, please? Many thanks.
>>> Vanessa Lim
>>
>> You have 1473 MB free and the ability to create a 1323 MB array.
>> How would increasing this help you in reading a 68 MB file? Don't you
>> already have enough?
>> I suspect bad code somewhere. If you know where in the code this error
>> is thrown, try putting a breakpoint before it and see if you notice
>> anything interesting.
>
> The software I am using to do the analyses is MATLAB based, but it says
> "??? Out of memory. Type HELP MEMORY for your options." for files that
> are bigger than 70 MB, even though I have, as you say, 1473 MB free in
> MATLAB.

Again: I suspect bad code somewhere. If you know where in the code this
error is thrown, try putting a breakpoint before it and see if you notice
anything interesting.

Even if the file itself is only 70 MB, the program you are using might be
creating multiple copies internally, or maybe the data is compressed and
you run out of memory while uncompressing it. You could either contact
the authors of the tool with your data and explain your situation, or you
could look at their code and figure out what's happening.

Aside: your RAND call above would need 1000*1000*200*8/1024/1024 MB,
which is roughly 1526 MB. That is greater than the largest contiguous
free block (1323 MB), so it is no surprise that this call results in an
out-of-memory error.
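One way to sanity-check an allocation before attempting it, as a minimal
sketch: it assumes 32-bit MATLAB on Windows, where the built-in MEMORY
function is available, and the dimensions are just the ones from the RAND
call above.

% Minimal sketch (assumes Windows, where MEMORY is available). Estimate
% the bytes a double array needs and compare against the largest
% contiguous free block before allocating.
sz        = [1000 1000 200];   % intended array dimensions
bytesNeed = prod(sz) * 8;      % 8 bytes per double element
m         = memory;            % Windows-only built-in
if bytesNeed < m.MaxPossibleArrayBytes
    x = rand(sz);              % should fit in one contiguous block
else
    fprintf('Need %.0f MB, largest block is %.0f MB -- too big.\n', ...
        bytesNeed/2^20, m.MaxPossibleArrayBytes/2^20);
end

On the machine described above this should report that about 1526 MB is
needed against a 1323 MB largest block, so the allocation is skipped
instead of erroring out.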
From: Walter Roberson on 18 Feb 2010 21:40

Vanessa Lim wrote:
> The software I am using to do the analyses is MATLAB based, but it says
> "??? Out of memory. Type HELP MEMORY for your options." for files that
> are bigger than 70 MB, even though I have, as you say, 1473 MB free in
> MATLAB.

Is the software creating covariance matrices?
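To illustrate why that question matters, here is a hypothetical sketch
(not EEGLAB's actual code; the channel and sample counts are made-up
numbers that happen to add up to roughly 70 MB of doubles). EEG data is
often stored as channels x samples, and calling COV on it the wrong way
round produces a samples-by-samples matrix.

% Hypothetical illustration: a modest 70 MB data matrix whose covariance
% is tiny one way round and astronomically large the other way round.
nChan = 64; nSamp = 143000;    % 64*143000*8 bytes ~ 70 MB of doubles
X = randn(nChan, nSamp);       % channels x samples, as EEG is often stored
C_ok  = cov(X');               % 64 x 64 matrix  -> ~32 KB, fine
% C_bad = cov(X);              % 143000 x 143000 -> ~152 GB, Out of memory
fprintf('Wrong orientation would need %.0f GB.\n', nSamp^2*8/2^30);

The data itself fits easily in memory; it is the wrong-orientation
covariance (or any similar intermediate the analysis code builds) that
explodes, which would explain an out-of-memory error on a file far
smaller than the free memory reported.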