An extendible dataset is one whose dimensions can grow. HDF5 allows you to define a dataset to have certain initial dimensions, then to later increase the size of those dimensions, up to the maximum extents declared when the dataset was created.
HDF5 requires you to use chunking to define extendible datasets. This makes it possible to extend datasets efficiently without having to excessively reorganize storage. (To use chunking efficiently, be sure to see the advanced topic, Chunking in HDF5.)
The following operations are required in order to extend a dataset: create the dataspace with one or more unlimited maximum dimensions, enable chunking in the dataset creation property list, create the dataset, and then call H5Dset_extent to grow it.
This example shows how to create a 3 x 3 extendible dataset, write to that dataset, extend the dataset to 10 x 3, and write to the dataset again:
For details on compiling an HDF5 application, see Compiling HDF5 Applications.
A dataspace with an unlimited dimension is specified with the H5Screate_simple call, by passing in H5S_UNLIMITED as the corresponding element of the maxdims array.
The H5Pcreate call creates a new property list as an instance of a property list class. For creating an extendible array dataset, pass in H5P_DATASET_CREATE for the property list class.
The H5Pset_chunk call modifies a Dataset Creation Property List instance to use a chunked layout and sets the size of the chunks.
To extend an unlimited dimension dataset, use the H5Dset_extent call. Please be aware that after this call, the dataset's dataspace must be refreshed with H5Dget_space before more data can be accessed.
The H5Pget_chunk call retrieves the chunk dimensions for the raw data of a chunked layout dataset.
Once there is no longer a need for a property list instance, it should be closed with the H5Pclose call.