
jHDF Change Log




  • Adds support for reading in-memory files from byte[] or ByteBuffers
  • Breaking API change: to support in-memory files, HdfFile#getHdfChannel is replaced by HdfFile#getHdfBackingStorage, which now returns a HdfBackingStorage. Internally, the new interface replaces the use of HdfFileChannel
  • Fix reading empty arrays in variable length datasets
  • Dependency updates
  • Update Gradle
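
The new in-memory entry points might be used like this. A minimal sketch: the factory method names HdfFile.fromBytes and HdfFile.fromByteBuffer are assumed here and may differ in your version, and example.hdf5 is a placeholder file.

```java
import io.jhdf.HdfFile;
import java.nio.ByteBuffer;
import java.nio.file.Files;
import java.nio.file.Paths;

public class InMemoryExample {
    public static void main(String[] args) throws Exception {
        // Read an entire HDF5 file into memory, e.g. as received over the network
        byte[] bytes = Files.readAllBytes(Paths.get("example.hdf5"));

        // Open directly from the byte array - no file on disk needed
        try (HdfFile hdfFile = HdfFile.fromBytes(bytes)) {
            System.out.println(hdfFile.getChildren().keySet());
        }

        // Or wrap an existing ByteBuffer
        try (HdfFile hdfFile = HdfFile.fromByteBuffer(ByteBuffer.wrap(bytes))) {
            System.out.println(hdfFile.getChildren().keySet());
        }
    }
}
```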



  • Add checksum validation using the "Jenkins Lookup 3" hash. This will help to detect file corruption.
  • Add support for opening an HDF5 file from an InputStream. Many Java APIs provide InputStreams, so this improves integration possibilities.
  • Test and coverage improvements
  • Test and build dependency updates
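
The InputStream support might look like the following sketch. The factory method name HdfFile.fromInputStream is assumed, and example.hdf5 is a placeholder file.

```java
import io.jhdf.HdfFile;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Paths;

public class InputStreamExample {
    public static void main(String[] args) throws Exception {
        // Any InputStream works: a classpath resource, a network stream, etc.
        try (InputStream in = Files.newInputStream(Paths.get("example.hdf5"));
             HdfFile hdfFile = HdfFile.fromInputStream(in)) {
            System.out.println(hdfFile.getChildren().keySet());
        }
    }
}
```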


  • Add support for v1 and v2 Data Layout Messages
  • Add support for Old Object Modification Time Message - Improves compatibility with older files
  • Fix issue if compact datasets are read multiple times
  • Improve handling of empty contiguous datasets
  • Test and coverage improvements
  • Test dependency updates


  • Breaking API change: Dataset#getDiskSize is renamed to Dataset#getSizeInBytes, and Attribute#getDiskSize is renamed to Attribute#getSizeInBytes
  • New API method Dataset#getStorageInBytes, which returns the total storage size of the dataset. Comparing it with Dataset#getSizeInBytes allows the compression ratio to be obtained
  • Fixes an issue when reading empty datasets with no allocated storage
  • Code quality improvements and cleanup
  • Dependency updates
  • CI and build improvements



  • Add support for reading half precision (16 bit) floats
  • Add support for getting the ByteBuffer backing contiguous datasets and attributes
  • Memory usage and performance improvements
  • Test coverage improvements
  • CI and build improvements




  • Add support for chunked v4 datasets with b-tree chunk indexing
  • Improve exceptions for unsupported b-tree records
  • Improve test coverage
  • Upgrade to Gradle 6.0.1



  • Fix bug in chunked v4 datasets (added in v0.5.0) where incorrect data was returned if fixed array or extensible array indexing was used and the dataset dimensions were not a multiple of the chunk dimensions.
  • Adds support for enum datasets (which are returned in string form)
  • Adds HdfFile convenience constructors for URI and Path
  • Update dependencies
  • Refactors test files to separate HDF5 files from scripts.
  • Improvements to test coverage.
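
The new convenience constructors might be used as in this sketch; example.hdf5 is a placeholder file.

```java
import io.jhdf.HdfFile;
import java.net.URI;
import java.nio.file.Path;
import java.nio.file.Paths;

public class ConstructorExample {
    public static void main(String[] args) {
        Path path = Paths.get("example.hdf5");

        // Open from a Path
        try (HdfFile fromPath = new HdfFile(path)) {
            System.out.println(fromPath.getChildren().keySet());
        }

        // Open from a URI, e.g. file:///data/example.hdf5
        URI uri = path.toUri();
        try (HdfFile fromUri = new HdfFile(uri)) {
            System.out.println(fromUri.getChildren().keySet());
        }
    }
}
```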





  • Adds support for compound datasets
  • Adds support for array data type
  • Adds support for reading chunked datasets with Fletcher32 checksums. Note: the checksum is not verified.
  • Improved performance of Dataset.isEmpty method
  • Dependency updates
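
Reading a compound dataset might look like the following sketch. It assumes compound data is returned as a map of member name to array of values; the cast, the lookup method getDatasetByPath, and the paths are assumptions for illustration.

```java
import io.jhdf.HdfFile;
import io.jhdf.api.Dataset;
import java.io.File;
import java.util.Map;

public class CompoundExample {
    public static void main(String[] args) {
        try (HdfFile hdfFile = new HdfFile(new File("example.hdf5"))) {
            Dataset compound = hdfFile.getDatasetByPath("/compound_dataset");
            // Assumed shape of the returned data: member name -> array of values
            @SuppressWarnings("unchecked")
            Map<String, Object> data = (Map<String, Object>) compound.getData();
            for (String member : data.keySet()) {
                System.out.println(member);
            }
        }
    }
}
```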


  • Fix - Big (>10x) performance improvement for chunked dataset reads. Chunks are now decompressed in parallel and the resulting data copies are as large as possible.
  • Update Gradle to 5.5
  • Update test dependencies



  • Initial work on #49 (slow chunked dataset reads)
  • Lots of typos cleaned up
  • Add additional build data to MANIFEST.MF


  • Add support for byte shuffle filter
  • Many filter management improvements including support for dynamically loaded filters
  • Add support for reading dataset fill values
  • Checkstyle added to improve code consistency - not full code formatting yet...
  • Update Gradle to 5.4
  • Update commons-lang3 to 3.9 (Java 8)
  • Update mockito-core to 2.27.+


  • Add support for broken links
  • Add support for attribute and link creation order tracking
  • Allow superblock v1 files to be loaded
  • Improve exceptions thrown when lazy loading fails
  • Fix bug to allow non-cached groups to be loaded
  • Improvement to documentation
  • Update Gradle
  • Update test dependencies
  • Code base cleanup
  • Improvements to CI builds and PR validation


  • Add support for accessing attributes
  • Add support for scalar datasets
  • Add support for empty datasets
  • Add support for files with user blocks
  • Fix bug where "old" style groups containing soft links could not be opened
  • Fix bug reading unsigned numbers from "awkward" buffer sizes
  • Lots of minor code cleanup and refactoring
  • Improvements to tests and coverage
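
Accessing an attribute might look like this sketch. The method names getByPath and getAttribute, the attribute name "units", and the paths are assumptions for illustration.

```java
import io.jhdf.HdfFile;
import io.jhdf.api.Attribute;
import io.jhdf.api.Node;
import java.io.File;

public class AttributeExample {
    public static void main(String[] args) {
        try (HdfFile hdfFile = new HdfFile(new File("example.hdf5"))) {
            // Attributes can live on any node: groups or datasets
            Node node = hdfFile.getByPath("/some/dataset");
            Attribute attribute = node.getAttribute("units");
            System.out.println(attribute.getData());
        }
    }
}
```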


  • Fix bug when fixed size string datasets contain strings of exactly that size.
  • Fix bug where >1D fixed size datasets could not be read
  • Add more JavaDoc
  • Minor refactoring


  • Add support for String datasets
  • Remove Dataset.getDataBuffer - Not all datasets can reasonably support accessing the backing buffer
  • Dataset.getMaxSize now always returns a result. Previously it returned an Optional that was empty if no max size was in the file; now it returns the dataset size if no max size is present.
  • Remove dependency on org.slf4j.slf4j-simple, now just depends on slf4j-api
  • Update SLF4J to 1.8.0-beta4
  • Update to Gradle 5.2.1 and Gradle plugins


  • First release to support reading chunked datasets. (note: v1.8 files only)
  • Initial support for compressed datasets, GZIP only at the moment.

Pre 0.3.0

Lots of initial development towards being able to read HDF5 files in pure Java. See the Git history if you're interested.