From: us on 19 Jul 2010 08:28

"fabio freschi" <fabio.freschi(a)remove.gmail.com> wrote in message <i21fl0$r88$1(a)fred.mathworks.com>...
> Unfortunately it is not easy to reproduce the matrix. The problem comes from a finite element analysis of an electromagnetic structure. I can only provide the matrices for different meshes, but the complete code to assemble the stiffness matrix comes from many years of research and implementation. It is not easy to extract a self-contained subset.
>
> I can guarantee that the matrix is relatively "good" from the conditioning point of view, since it is explicitly gauged. A 51000x51000 matrix has an estimated condition number (via condest) of 1.7971e+09, not too bad for this kind of simulation.
> Fabio

the problem is:
- CSSMers would like to help by repeating/simulating your results...
- this, however, is only possible if they have an input that ~resembles what you have...

us
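For reference, the kind of conditioning estimate Fabio quotes can be reproduced in spirit on a small sparse test matrix; a minimal sketch, using an arbitrary mesh size rather than the actual 51000x51000 problem:

% Estimate the 1-norm condition number of a sparse finite-element-like
% test matrix; gallery('wathen',nx,ny) returns a random sparse matrix,
% and the 50x50 element grid here is an arbitrary illustrative choice.
A = gallery('wathen',50,50);
c = condest(A);                       % 1-norm condition number estimate
fprintf('condest estimate: %g\n', c);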
From: fabio freschi on 19 Jul 2010 08:47

You are right, but the problem is not that my code is closed (I am going to publish it on the FEX as soon as the manual is ready), it is being able to extract a subset of functions that re-creates the problem. I will try to produce a matrix with similar properties by repeatedly using matrices from 'gallery'. You will hear from me soon.

Fabio
From: John D'Errico on 19 Jul 2010 09:17

"fabio freschi" <fabio.freschi(a)remove.gmail.com> wrote in message <i21fl0$r88$1(a)fred.mathworks.com>...
> Unfortunately it is not easy to reproduce the matrix. The problem comes from a finite element analysis of an electromagnetic structure. I can only provide the matrices for different meshes, but the complete code to assemble the stiffness matrix comes from many years of research and implementation. It is not easy to extract a self-contained subset.
>
> I can guarantee that the matrix is relatively "good" from the conditioning point of view, since it is explicitly gauged. A 51000x51000 matrix has an estimated condition number (via condest) of 1.7971e+09, not too bad for this kind of simulation.
> Fabio

You have told us only enough to guess that it is probably an issue of returning several quite large arrays, which in sparse form will still be darn full. A lower triangular matrix that is 51000x51000 requires around 10 gigabytes of RAM to store, so just moving these things around will take some serious time.

John
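A quick back-of-the-envelope check of the 10 GB figure, assuming the lower triangle is stored as dense real doubles (a complex factor would roughly double this):

% Storage needed for a dense 51000x51000 lower triangular matrix.
n     = 51000;
nnzL  = n*(n+1)/2;                    % entries in a full lower triangle
bytes = nnzL*8;                       % 8 bytes per real double entry
fprintf('~%.1f GB for a dense lower triangle\n', bytes/2^30)   % about 9.7 GB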
From: fabio freschi on 19 Jul 2010 11:01

Dear John,
your reply agrees with my initial guess. The point is that I can reproduce the "problem" with a simple matrix built from 'gallery':

>> N = 120;
>> A = gallery('wathen',N,N)+1i*gallery('wathen',N,N);
>> b = ones(size(A,1),1);
>> tic, x = A\b; toc
Elapsed time is 1.158922 seconds.
>> tic, [L,U,p,q,R] = lu(A,'vector'); toc
Elapsed time is 1.668238 seconds.

In this last case the memory allocated for L is 53 MB. The computational cost of returning the outputs is extremely large, at least in my opinion, given that only a pointer to an already existing variable needs to be passed from the mex-file to MATLAB.

Fabio

BTW: for the original problem (51000x51000) the size of the triangular factor is 512 MB.
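A sketch building on the snippet above: it reads back how much memory the factors occupy (via whos) and reuses them for the solve, assuming the relationship R(:,p)\A(:,q) = L*U that the five-output 'vector' form of lu satisfies; exact timings will of course vary with machine and MATLAB version.

% Compare the direct sparse solve with an explicit sparse LU factorization.
N = 120;
A = gallery('wathen',N,N) + 1i*gallery('wathen',N,N);    % sparse complex test matrix
b = ones(size(A,1),1);

tic, x1 = A\b; t_solve = toc;                            % direct sparse solve
tic, [L,U,p,q,R] = lu(A,'vector'); t_lu = toc;           % LU with permutation vectors and row scaling

% Reuse the factors: since R(:,p)\A(:,q) = L*U, solving A*x = b amounts to
x2 = zeros(size(b));
x2(q) = U \ (L \ (R(:,p) \ b));

s = whos('L','U');
fprintf('solve %.2f s, lu %.2f s, factors %.0f MB, diff from backslash %g\n', ...
    t_solve, t_lu, sum([s.bytes])/2^20, norm(x1 - x2))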
From: us on 19 Jul 2010 11:51

"fabio freschi" <fabio.freschi(a)remove.gmail.com> wrote in message <i21pbl$n1e$1(a)fred.mathworks.com>...
> Dear John,
> your reply agrees with my initial guess. The point is that I can reproduce the "problem" with a simple matrix built from 'gallery':

hmm... that's what i was asking for all along...

us