From: Roland Kruse on 6 May 2010 04:58
Well, of course I meant to say that the problem is more serious on x32 than on x64.
From: Bruno Luong on 6 May 2010 05:53
"Roland Kruse" <roland.kruse(a)uni-oldenburg.de> wrote in message <hru03c$dh9$1(a)fred.mathworks.com>...
> The problem seems to be even stranger.
> I use R2010 and Win 7 x64.
> If I run the first example
>
> A = sprand(10000,10000,0.1);
> C = cell(1,100);
> for k=1:100; C{k} = 0*A; end
> clear
>
> I lose no memory, even when I use mtimesx.
> On the other hand, if I make A more sparse, e.g.
>
> A = sprand(10000,100000,0.001);

But that matrix is 10 times larger: it has 100 thousand columns, not 10 thousand. The buffer a sparse matrix allocates for its column pointers is proportional to the number of columns, regardless of how many nonzero elements it holds.

Bruno
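A quick way to see the effect Bruno describes (a sketch; the variable names are only illustrative and the exact byte counts depend on MATLAB version and platform):

% Two all-zero sparse matrices that differ only in column count.
Z1 = sparse(10000,10000);    % 10 thousand columns, no nonzeros
Z2 = sparse(10000,100000);   % 100 thousand columns, no nonzeros
whos Z1 Z2                   % Z2 reports roughly 10x the bytes of Z1

Even with zero nonzero elements, each sparse matrix keeps a column-pointer array of length (number of columns)+1, which is what shows up in the byte counts.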
From: Roland Kruse on 6 May 2010 08:33
You may have noticed that I used "clear" to delete all variables. Even so, each run of the program loses ~70 MB until I am out of memory. This is certainly not the correct behaviour. Strangely, if I set A = sprand(100000,10000,0.001) instead of sprand(10000,100000,0.001), A still uses the same amount of memory, but I "lose" only about 8 MB. Interestingly, that is about the size of C, which is 80 MB in the first case and 8 MB in the second, even though A has the same number of elements in both. And C has been cleared, but its memory is still allocated.
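A sketch of how to check that the leaked amount tracks the size of C rather than of A (my own check, assuming R2010-era sparse storage; the printed wording is mine):

% Each 0*A copy is all zeros but still carries a column-pointer array,
% so C's footprint follows the number of columns in A, not nnz(A).
A = sprand(10000,100000,0.001);           % 100 thousand columns
C = cell(1,100);
for k=1:100; C{k} = 0*A; end
s = whos('C'); fprintf('C: %.1f MB\n', s.bytes/2^20);   % roughly 80 MB

A = sprand(100000,10000,0.001);           % same nnz, 10 thousand columns
C = cell(1,100);
for k=1:100; C{k} = 0*A; end
s = whos('C'); fprintf('C: %.1f MB\n', s.bytes/2^20);   % roughly 8 MB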