From: Carl on 27 May 2010 13:05

Hi,

I am working on a MATLAB script which uses a database of ~300,000 2D matrices, one matrix for each position in a 3D space (x,y,z). Each 2D matrix is accessed multiple times during the script. At present I read these matrices from disk, and each has a unique filename indicating its x, y and z position. To speed up the script I want to hold the entire database in memory.

I first tried generating a unique variable name for each point (using eval()), but the script crashed with an error because the maximum number of variables was exceeded. I could use a 5D array, but one dimension of the 2D matrices varies in size (from 50 to 750), so declaring a full 5D array would use much more memory than the sum of the individual matrices. My next idea was to create a structure for each z position containing a field for each (x,y) combination, but now the pre-load script crashes the whole MATLAB environment about half way through the run.

Two questions: What is the most efficient way to organise such a dataset? Why does my latest attempt crash MATLAB?

Any help would be very much appreciated. Running MATLAB 2010 on a 64-bit Linux system with 4GB RAM.

Thanks,
Carl
From: David Young on 27 May 2010 13:24

I can't say why the script crashes, but assuming that the data will actually fit into the available memory, you should probably investigate cell arrays next. Look at Getting Started > Programming > Other Data Structures > Cell Arrays in the help navigator. A cell array is what I would expect to use to store a lot of matrices of varying size.
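For what it's worth, a minimal sketch of how that might look, assuming example grid dimensions and a hypothetical load_matrix helper that stands in for however each file is currently read:

% Preallocate a 3D cell array indexed by grid position (ix, iy, iz);
% each cell holds one 2D matrix, and the matrices can differ in size.
nx = 100; ny = 60; nz = 50;      % example grid dimensions (assumption)
data = cell(nx, ny, nz);

for ix = 1:nx
    for iy = 1:ny
        for iz = 1:nz
            % load_matrix is a placeholder for reading the file whose
            % name is built from the (ix, iy, iz) position
            data{ix, iy, iz} = load_matrix(ix, iy, iz);
        end
    end
end

% Later accesses come from memory, not disk:
M = data{10, 20, 5};

Because each cell only stores the matrix it actually contains, you avoid the padding a full 5D numeric array would need for the dimension that varies from 50 to 750.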