From: Ken on
OK-- I am getting very close.

By moving my "H5D.get_space" call up one line so that it occurs prior to extending the dimensions, I am able to append to the dataset without the "dummy arrays" requirement.



Here is my latest code:

%----------------------------------------------------------------------------------------
% Initialize Data
%----------------------------------------------------------------------------------------

testdata = single(ones(48,100,2));
testdata2 = testdata.*2;

data_initialize = zeros(size(testdata2));
data_combined = [data_initialize; testdata2];
data_combined = permute(data_combined, [3 2 1]);
data2_perm = permute(testdata2, [3 2 1]);

filename = 'test3dim_newmod.h5';
dsetname = 'my_dataset';

dims(1) = 48;
dims(2) = 100;
dims(3) = 2;

newdims(1) = 96;
newdims(2) = 100;
newdims(3) = 2;

chunk(1) = 48;
chunk(2) = 100;
chunk(3) = 2;

%----------------------------------------------------------------------------------------
% Create Initial HDF5 File
%----------------------------------------------------------------------------------------


%
% Create a new file using the default properties.
%
fileID = H5F.create(filename, 'H5F_ACC_TRUNC', 'H5P_DEFAULT', 'H5P_DEFAULT');
%
% Create dataspace with unlimited dimensions.
%
maxdims = {'H5S_UNLIMITED', 'H5S_UNLIMITED', 'H5S_UNLIMITED'};
space = H5S.create_simple (3, dims, maxdims);
%
% Create the dataset creation property list, add the gzip
% compression filter and set the chunk size.
%
dcpl = H5P.create('H5P_DATASET_CREATE');
H5P.set_deflate(dcpl, 9);

H5P.set_chunk(dcpl, chunk);

%
% Create the compressed unlimited dataset.
%
datasetID = H5D.create(fileID, dsetname, 'H5T_NATIVE_FLOAT', space, dcpl);
%
% Write the data to the dataset.
%
datatypeID = H5T.copy('H5T_NATIVE_FLOAT');
H5D.write(datasetID, datatypeID,'H5S_ALL', 'H5S_ALL','H5P_DEFAULT', testdata);


%
% Close and release resources.
%
H5P.close(dcpl);
H5D.close(datasetID);
H5S.close(space);
H5F.close(fileID);

% ===============New HDF5 File Created ====================




%----------------------------------------------------------------------------------------
% Open Existing HDF5 File and Append Data to Dataset
%----------------------------------------------------------------------------------------

% Open Existing HDF5 File
fileID = H5F.open(filename, 'H5F_ACC_RDWR', 'H5P_DEFAULT');
% Open Existing Dataset
datasetID = H5D.open(fileID, dsetname);

% Get Data Space and Extend Existing Dataset Dimensions
space = H5D.get_space(datasetID);
H5D.extend(datasetID, newdims);

%
% Setup Hyperslab ---- See: http://www.hdfgroup.org/HDF5/doc1.6/RM_H5S.html#Dataspace-SelectHyperslab for more info.
start = [47 0 0];
stride = [1 1 1];
count = [1 1 1];
block = [1 100 2];

H5S.select_hyperslab(space, 'H5S_SELECT_AND', start, stride, count, block);

% Write Data to newly extended dimensions
H5D.write(datasetID, 'H5T_NATIVE_FLOAT', 'H5S_ALL', space,'H5P_DEFAULT', data2_perm);

H5D.close(datasetID);
H5S.close(space);
H5F.close(fileID);
% ===================New HDF5 File Appended =================





The only problem I am having now is that I am one array away from the exact result I want. Note that start = [47 0 0]. I need this to be start = [48 0 0], but when I try that I get the following error:
___________________________________________________________________
??? Error using ==> H5ML.hdf5 at 25
The HDF5 library encountered an error:

Error in ==> H5D.write at 24
H5ML.hdf5('H5Dwrite', dataset_id, mem_type_id, mem_space_id, file_space_id, plist_id, buf);

Error in ==> test18 at 98
H5D.write(datasetID, 'H5T_NATIVE_FLOAT', 'H5S_ALL', space,'H5P_DEFAULT', data2_perm);
____________________________________________________________________

Any suggestions would be greatly appreciated.

Thanks!
From: Dinesh Iyer on
Hello Ken,
I have written some simple code that shows how you can append data to a dataset after extending its dimensions. You can modify this code to suit your needs.

Dinesh

%%
clc
clear all

% The objective of this exercise is to write data into a dataset that has
% unlimited dimensions. We will create a dataset with an initial size
% of 10-by-20. It will then be extended to 10-by-25 by appending a
% 10-by-5 hyperslab of data.

% Create the HDF5 file
fileName = 'myFile.h5';
fcpl_id = H5P.create('H5P_FILE_CREATE');
fapl_id = H5P.create('H5P_FILE_ACCESS');

fid = H5F.create(fileName, 'H5F_ACC_TRUNC', fcpl_id, fapl_id);

% Create the Space for the Dataset
initDims = [10 20];
h5_initDims = fliplr(initDims);
maxDims = [10 -1];
h5_maxDims = fliplr(maxDims);
space_id = H5S.create_simple(2, h5_initDims, h5_maxDims);

% Create the Dataset
dsetName = 'myDataset';
dcpl_id = H5P.create('H5P_DATASET_CREATE');
chunkSize = [10 1];
h5_chunkSize = fliplr(chunkSize);
H5P.set_chunk(dcpl_id, h5_chunkSize);

dsetType_id = H5T.copy('H5T_NATIVE_DOUBLE');

dset_id = H5D.create(fid, dsetName, dsetType_id, space_id, dcpl_id);

% Initial Data to Write
rowDim = initDims(1); colDim = initDims(2);
initDataToWrite = reshape( (0:rowDim*colDim-1), rowDim, colDim );

% Write the initial data
H5D.write(dset_id, 'H5ML_DEFAULT', 'H5S_ALL', 'H5S_ALL', 'H5P_DEFAULT', initDataToWrite);

% Close the open Identifiers
H5S.close(space_id);
H5D.close(dset_id);
H5F.close(fid);

%%
% Open the Dataset and append data to the unlimited dimension, which in
% this case is the second dimension as seen from MATLAB.
fid = H5F.open(fileName, 'H5F_ACC_RDWR', 'H5P_DEFAULT');
dset_id = H5D.open(fid, dsetName);

% Create the data to be appended
dimsOfData = [ 10 5 ];
h5_dimsOfData = fliplr(dimsOfData);

% Get the Dataspace of the Dataset to be appended
space_id = H5D.get_space(dset_id);

[~, h5_currDims] = H5S.get_simple_extent_dims(space_id);
currDims = fliplr(h5_currDims);

% Update the extent of the dataset to accommodate the data to be appended
newDims = currDims;
newDims(2) = currDims(2) + dimsOfData(2);
h5_newDims = fliplr(newDims);

H5D.set_extent(dset_id, h5_newDims);

% Data to append
rowDim = dimsOfData(1); colDim = dimsOfData(2);
dataToWrite = 10*reshape( (0:rowDim*colDim-1), rowDim, colDim );

% Update the File Space ID such that only the appended data is written.
H5S.close(space_id);
space_id = H5D.get_space(dset_id);

% Define the hyperslab selection
start = [0 currDims(2)]; h5_start = fliplr(start);
stride = [1 1]; h5_stride = fliplr(stride);
count = [1 1]; h5_count = fliplr(count);
block = dimsOfData; h5_block = fliplr(block);

H5S.select_hyperslab(space_id, 'H5S_SELECT_SET', h5_start, h5_stride, h5_count, h5_block);

% Write the Data
memSpace_id = H5S.create_simple(2, h5_dimsOfData, []);
H5D.write(dset_id, 'H5ML_DEFAULT', memSpace_id, space_id, 'H5P_DEFAULT', dataToWrite);

% Close the open identifiers
H5S.close(memSpace_id);
H5S.close(space_id);
H5D.close(dset_id);
H5F.close(fid);
From: Ken on
Thank you Dinesh! You helped me solve my dilemma!

The memSpace_id did the trick, along with extending the dataset before fetching its dataspace and using 'H5S_SELECT_SET' for the hyperslab selection.
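For anyone skimming the thread, the append pattern can be sketched in isolation like this (the variable names are illustrative placeholders, not tied to a specific file; it assumes a dataset created with unlimited maximum dimensions and a chunked layout):

```matlab
% Sketch: appending a block to an extendible HDF5 dataset via MATLAB's
% low-level interface.
fileID    = H5F.open(filename, 'H5F_ACC_RDWR', 'H5P_DEFAULT');
datasetID = H5D.open(fileID, dsetname);

% Extend FIRST, then fetch the dataspace, so the selection below is made
% against the new, larger extent. (Fetching the dataspace before
% extending leaves the old extent, so a selection starting at the old
% boundary, e.g. row 48 of a 48-row extent, is out of bounds.)
H5D.extend(datasetID, newdims);
space = H5D.get_space(datasetID);

% Select only the newly added region.
H5S.select_hyperslab(space, 'H5S_SELECT_SET', start, stride, count, block);

% Describe the in-memory buffer with its own dataspace. With 'H5S_ALL'
% the library would interpret the selection against the full extended
% dataset, which no longer matches the shape of the buffer being written.
memspaceID = H5S.create_simple(3, block, []);
H5D.write(datasetID, 'H5T_NATIVE_FLOAT', memspaceID, space, ...
          'H5P_DEFAULT', newblock);

H5S.close(memspaceID);
H5S.close(space);
H5D.close(datasetID);
H5F.close(fileID);
```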

For those interested, here is the final code that writes the file and dataset with the correct dimensions that I was after:



%----------------------------------------------------------------------------------------
% Initialize Data
%----------------------------------------------------------------------------------------

testdata = single(ones(48,100,2));
testdata2 = testdata.*2;

data_initialize = zeros(size(testdata2));
data_combined = [data_initialize; testdata2];
data_combined = permute(data_combined, [3 2 1]);
data2_perm = permute(testdata2, [3 2 1]);

filename = 'test3dim_works.h5';
dsetname = 'my_dataset';

dims(1) = 48;
dims(2) = 100;
dims(3) = 2;

newdims(1) = 96;
newdims(2) = 100;
newdims(3) = 2;

chunk(1) = 48;
chunk(2) = 100;
chunk(3) = 2;

%----------------------------------------------------------------------------------------
% Create Initial HDF5 File
%----------------------------------------------------------------------------------------


%
% Create a new file using the default properties.
%
fileID = H5F.create(filename, 'H5F_ACC_TRUNC', 'H5P_DEFAULT', 'H5P_DEFAULT');
%
% Create dataspace with unlimited dimensions.
%
maxdims = {'H5S_UNLIMITED', 'H5S_UNLIMITED', 'H5S_UNLIMITED'};
space = H5S.create_simple (3, dims, maxdims);
%
% Create the dataset creation property list, add the gzip
% compression filter and set the chunk size.
%
dcpl = H5P.create('H5P_DATASET_CREATE');
H5P.set_deflate(dcpl, 9);

H5P.set_chunk(dcpl, chunk);

%
% Create the compressed unlimited dataset.
%
datasetID = H5D.create(fileID, dsetname, 'H5T_NATIVE_FLOAT', space, dcpl);
%
% Write the data to the dataset.
%
datatypeID = H5T.copy('H5T_NATIVE_FLOAT');
H5D.write(datasetID, datatypeID,'H5S_ALL', 'H5S_ALL','H5P_DEFAULT', testdata);


%
% Close and release resources.
%
H5P.close(dcpl);
H5D.close(datasetID);
H5S.close(space);
H5F.close(fileID);

% =============New HDF5 File Created ===================




%----------------------------------------------------------------------------------------
% Open Existing HDF5 File and Append Data to Dataset
%----------------------------------------------------------------------------------------

% Open Existing HDF5 File
fileID = H5F.open(filename, 'H5F_ACC_RDWR', 'H5P_DEFAULT');
% Open Existing Dataset
datasetID = H5D.open(fileID, dsetname);

% Get Data Space and Extend Existing Dataset Dimensions
H5D.extend(datasetID, newdims);
space = H5D.get_space(datasetID);

%
% Setup Hyperslab ----
% See:
% http://www.hdfgroup.org/HDF5/doc1.6/RM_H5S.html#Dataspace-SelectHyperslab
% for more info.
%
start = [48 0 0];
stride = [1 1 1];
count = [1 1 1];
block = [48 100 2];

H5S.select_hyperslab(space, 'H5S_SELECT_SET', start, stride, count, block);

% Write Data to newly extended dimensions
memspaceID = H5S.create_simple(3, block, []);
H5D.write(datasetID, 'H5T_NATIVE_FLOAT', memspaceID, space,'H5P_DEFAULT', data2_perm);

H5S.close(memspaceID);
H5S.close(space);
H5D.close(datasetID);
H5F.close(fileID);

% ===============New HDF5 File Appended ====================


Thanks again!