From: Matlabuser on 17 Mar 2010 09:03

Thank you both for your answers. I know it's bad programming practice.

The reason behind it is that I am running MATLAB functions from Excel (I compiled the functions as an Excel add-in). Excel (and the Excel add-in compiler) is 32-bit. When I run the MATLAB "memory" function from Excel, I see only 250 MB for the "maximum possible array", although Windows shows I have a massive amount of free RAM (I get 2047 MB when I run the same function directly in MATLAB).

So I am experimenting with some ways of "recovering" RAM and wanted to try the "clear all" command by running it from Excel. You can't run scripts from Excel (or at least I don't know how to), so I needed to compile it into a function.

All of this because I am often facing "out of memory" errors, Excel being 32-bit. And unfortunately for me, even though Excel 2010 (64-bit) will be released in a couple of months, it doesn't seem like MathWorks will release the Excel compiler in 64-bit any time soon.

And I don't know why MATLAB sees so little free RAM when I run it through Excel. I have Excel 2007, and in theory Excel 2007 can use up to 2 GB of RAM (I have a total of 6 GB and I am running XP 64-bit).
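The wrapper in question might look like the sketch below (hypothetical function name; MATLAB Compiler deploys functions, not scripts, which is why the command has to be wrapped at all). One caveat worth knowing before relying on it:

```matlab
function clearAllForExcel()
%CLEARALLFOREXCEL Hypothetical wrapper so "clear all" can be compiled
% into an Excel add-in. MATLAB Compiler only deploys functions, not
% scripts, hence the wrapper.
%
% Caveat: inside a function, CLEAR operates on that function's own
% workspace, so this may reclaim far less memory than an interactive
% "clear all" typed at the MATLAB prompt would.
clear all;
end
```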
From: Steven Lord on 17 Mar 2010 09:41

"Matlabuser " <b.enis(a)laposte.net> wrote in message news:hnqju6$960$1(a)fred.mathworks.com...
> Thank you both for your answers. I know it's bad programming practice.
> The reason behind it is that I am running MATLAB functions from Excel (I
> compiled the functions as an Excel add-in). Excel (and the Excel add-in
> compiler) is 32-bit. When I run the MATLAB "memory" function from Excel,
> I see only 250 MB for the "maximum possible array" (although Windows
> shows I have a massive amount of free RAM), and I get 2047 MB when I run
> this function directly in MATLAB.

One possible reason is that you're running Excel in addition to MATLAB -- depending on where Excel and any libraries it depends upon are loaded into memory, your largest contiguous block of memory could be reduced quite a bit. Let's say (for the sake of demonstration) you had 5 bytes of memory:

OOOOO

You currently have 5 bytes as your "maximum possible array". Now you load Excel (which takes 1 byte of memory) and a library upon which Excel depends (also 1 byte). In the best-case scenario:

OOOEL   % E = Excel, L = library

you have 3 bytes as your max array size. In the worst-case scenario:

OEOLO

you have _1_ byte as your max array size, even though you have 3 bytes of memory unused.

> So I am experimenting with some ways of "recovering" RAM and wanted to
> try the "clear all" command by running it from Excel. You can't run
> scripts from Excel (or at least I don't know how to), so I needed to
> compile it into a function.
>
> All of this because I am often facing "out of memory" errors, Excel
> being 32-bit. And unfortunately for me, even though Excel 2010 (64-bit)
> will be released in a couple of months, it doesn't seem like MathWorks
> will release the Excel compiler in 64-bit any time soon.
>
> And I don't know why MATLAB sees so little free RAM when I run it
> through Excel.
> I have Excel 2007, and in theory Excel 2007 can use up to 2 GB of RAM (I
> have a total of 6 GB and I am running XP 64-bit).

Take a look at the Memory Management guide:

http://www.mathworks.com/support/tech-notes/1100/1107.html

The OS reserves a large chunk of your memory. MATLAB uses additional memory just to run (not to store data in, but to hold the program itself). Excel uses additional memory. Any other application (Internet Explorer? Firefox? Your favorite music player? Your email program?) uses additional memory.

--
Steve Lord
slord(a)mathworks.com
comp.soft-sys.matlab (CSSM) FAQ: http://matlabwiki.mathworks.com/MATLAB_FAQ
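The gap Steve describes between total free memory and the largest contiguous block shows up directly in the struct that MEMORY returns (Windows only; field names are from its documented single-output form). A quick diagnostic sketch:

```matlab
% Compare total free memory with the largest contiguous block.
% MEMORY with one output returns a struct of user-view statistics
% (Windows-only function).
user = memory;
fprintf('Free for all arrays:  %.0f MB\n', ...
        user.MemAvailableAllArrays / 2^20);
fprintf('Largest single array: %.0f MB\n', ...
        user.MaxPossibleArrayBytes / 2^20);
% A large difference between the two numbers means the 32-bit address
% space is fragmented, as in the OOOEL / OEOLO pictures above.
```

Running this from inside the Excel add-in versus at the MATLAB prompt would show how much the address-space layout differs between the two hosts.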