
Chunking and Extendible Datasets


Creating an Extendible Dataset

An extendible dataset is one whose dimensions can grow. HDF5 allows you to define a dataset with certain initial dimensions and later increase the size of any dimension that was declared extendible.

HDF5 requires you to use chunked storage to define extendible datasets. Because chunks are allocated and indexed individually, a dataset can be extended efficiently, without excessive reorganization of the existing storage.

The following operations are required to create and write an extendible dataset:

  1. Declare the dataspace of the dataset to have unlimited dimensions for all dimensions that might eventually be extended.
  2. Set dataset creation properties to enable chunking.
  3. Create the dataset.
  4. Extend the size of the dataset.

Programming Example


This example shows how to create a 3 x 3 extendible dataset, write to that dataset, extend the dataset to 10 x 3, and write to the dataset again.
NOTE: To download a tar file of the examples, including a Makefile, please go to the References page.


The National Center for Supercomputing Applications

University of Illinois at Urbana-Champaign


Last Modified: November 21, 2001