- Add support for Java 25
- Drop support for Java 8 and 11. Allows for improved performance, dependency updates, and simplifies the testing matrix. #700
- Dependency updates
  - `org.slf4j:slf4j-api` to 2.0.17
  - `org.apache.commons:commons-lang3` to 3.18.0
  - Lots of others for test and build plugins
- Move deployment to Maven Central Portal
- Fix thread-safety of `FileChannelFromSeekableByteChannel` by adding a `ReentrantLock`. Thanks to @thomas-reimonn
- Add `HdfFile(FileChannel channel)` constructor for custom FileChannel providers. Thanks to @thomas-reimonn
- Add support for slicing of chunked datasets. This allows accessing portions of large, chunked datasets using `getData(offset, length)`. Thanks to @thomas-reimonn #52
- Add `HdfFile(URL url)` constructor to support streaming reading of remote HDF5 files. Thanks to @thomas-reimonn
- Add constructor for opening an `HdfFile` from a `SeekableByteChannel` providing the IO access to the underlying resource. Thanks to @thomas-reimonn
- Add an implementation of `HttpSeekableByteChannel` supporting streaming of HDF5 files from remote HTTP(S) sources without a local copy. Thanks to @thomas-reimonn
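The new `SeekableByteChannel` constructor means any seekable byte source can back an `HdfFile`. As an illustration only (plain JDK code, not part of jHDF), a minimal read-only channel over an in-memory `byte[]` could look like this:

```java
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.channels.SeekableByteChannel;

// A read-only SeekableByteChannel over an in-memory byte[]. A channel like
// this could be passed to the HdfFile(SeekableByteChannel) constructor.
public class ByteArrayChannel implements SeekableByteChannel {
    private final byte[] data;
    private long position = 0;
    private boolean open = true;

    public ByteArrayChannel(byte[] data) {
        this.data = data;
    }

    @Override
    public int read(ByteBuffer dst) {
        if (position >= data.length) {
            return -1; // end of stream
        }
        int toRead = (int) Math.min(dst.remaining(), data.length - position);
        dst.put(data, (int) position, toRead);
        position += toRead;
        return toRead;
    }

    @Override
    public int write(ByteBuffer src) throws IOException {
        throw new IOException("read-only channel");
    }

    @Override
    public long position() {
        return position;
    }

    @Override
    public SeekableByteChannel position(long newPosition) {
        position = newPosition;
        return this;
    }

    @Override
    public long size() {
        return data.length;
    }

    @Override
    public SeekableByteChannel truncate(long size) throws IOException {
        throw new IOException("read-only channel");
    }

    @Override
    public boolean isOpen() {
        return open;
    }

    @Override
    public void close() {
        open = false;
    }
}
```

A real implementation (such as `HttpSeekableByteChannel`) fetches bytes lazily from its source instead of holding them all in memory.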
- Fix external links pointing to the root group #689
- Add additional API for custom writing allowing for larger datasets to be written. Thanks to @jshook #692
- Improve Java FileSystem support. Allow use of `FileSystem` implementations that do not support `FileChannel`, and allow wider compatibility when memory-mapped file access is not possible. This improves the ability to use jHDF with file systems like S3. Thanks to @tbrunsch for this contribution.
- Some improvements to test infrastructure. Also thanks to @tbrunsch
- Build updates
- Breaking API change Fix typo: `WritiableDataset` is renamed `WritableDataset`, so you will need to update your code if you are using writing. Thanks to @jshook
- Allow files containing datatype version 0 to be read. Note a warning will be logged as this is out of spec. #524
- Add support for reading the `time` datatype. This is not commonly used as it appears to be poorly specified, but it will be read as `byte[]` to be interpreted. #523
- Build and dependency updates
- Add support for reading implicit index chunked datasets #651 #655
- Test fixes for fixed-array index datasets
- Dependency updates
- Fix issue writing string datasets containing non-ASCII characters #656
- Add support for reading chunked datasets using fixed-array paging. #622
- Allow Maven publish tasks to complete without signing if no signing keys are available. This makes local builds easier and allows building on jitpack.io (https://jitpack.io/#jamesmudd/jhdf), making a rolling jar release available.
- Dependency updates
- Fix incorrectly written string attributes. #641
- Dependency updates
- Add support for accessing decompressed chunks individually. Thanks to @marcobitplane #626
- Fix OSGi headers, and autogenerate them during the build. Thanks to @mailaender #625 #632
- Delete temporary file when closing a file read from an input stream. Thanks to @ivanwick #262 #636
- Build and dependency updates
- Add support for writing `boolean` datasets and attributes as Bitfield
- Add support for writing `String` datasets and attributes
- Add `_jHDF` default attribute to root group
- Build and dependency updates
- Major writing support improvements
- Attributes can now be written #552
- Full support for `byte`, `short`, `int`, `long`, `float`, `double` and wrapper classes as datasets and attributes #587
- Support for scalar datasets and attributes
- Much more complete API on writable objects, allowing introspection of data type and data layout and data space etc.
- Many test improvements for writing support
- Build and dependency updates
- Note: This may be the last release supporting Java 8
- Release adding HDF5 writing support! #354. Special thanks to @thadguidry for sponsoring this work. See WriteHdf5.java for example usage. Supports:
  - Groups
  - n-dimensional `byte`, `int` and `double` datasets
- Fix UTF-8 group names #539
- Java 21 now officially supported
- Adds `h5dump` to CI builds to perform compatibility tests
- Build and dependency updates
- Add initial HDF5 writing support! #354. Special thanks to @thadguidry for sponsoring this work. See WriteHdf5.java for example usage.
- Build and dependency updates
- Add support for files containing superblock extensions #462
- Build and dependency updates
- Add support for LZ4 compressed datasets #415
- Improve BitShuffle performance and reduce memory usage
- Add CI on ARM64 architecture
- Build and dependency updates
- Add support for getting flat data `Dataset#getDataFlat()` #397
- Add support for dereferencing addresses/ids `HdfFile.getNodeByAddress(long address)` #316
- Dependency updates
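For intuition, `getDataFlat()` returns the whole dataset as a single one-dimensional array in row-major (C) order, which is the ordering HDF5 uses on disk. A stdlib-only sketch (not jHDF code) of that flattening for the 2-D case:

```java
// Flatten a 2-D int array in row-major (C) order: row r of a rows x cols
// array starts at flat index r * cols.
public class Flatten {
    public static int[] flatten(int[][] data) {
        int rows = data.length;
        int cols = rows == 0 ? 0 : data[0].length;
        int[] flat = new int[rows * cols];
        for (int r = 0; r < rows; r++) {
            System.arraycopy(data[r], 0, flat, r * cols, cols);
        }
        return flat;
    }
}
```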
- Add support for Bitshuffle filter #366
- Add ability to get filter data `Dataset#getFilters()` #378
- Dependency and CI updates
- Add support for slicing of contiguous datasets. This adds a new method `Dataset#getData(long[] sliceOffset, int[] sliceDimensions)` allowing you to read sections of a dataset that would otherwise be too large to fit in memory. Note: chunked dataset slicing support is still missing. #52 #361
- Fix OSGi `Export-Package` header resulting in API access restriction when running in OSGi. #365 #367
- Dependency and CI updates
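The slice semantics can be illustrated with plain Java (not jHDF code): given the full dataset dimensions, a row-major offset calculation picks each slice row out of the flat buffer:

```java
// Copy a rectangular slice out of a row-major flat buffer, mimicking the
// semantics of Dataset#getData(long[] sliceOffset, int[] sliceDimensions)
// for the 2-D case. dims holds the full dataset dimensions {rows, cols}.
public class Slice2d {
    public static int[][] slice(int[] flat, int[] dims, long[] offset, int[] sliceDims) {
        int[][] out = new int[sliceDims[0]][sliceDims[1]];
        for (int r = 0; r < sliceDims[0]; r++) {
            // flat index of the first element of this slice row
            int srcStart = (int) ((offset[0] + r) * dims[1] + offset[1]);
            System.arraycopy(flat, srcStart, out[r], 0, sliceDims[1]);
        }
        return out;
    }
}
```

The real method reads only the required file regions rather than a pre-loaded flat array, which is what makes slices of very large datasets practical.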
- Add support for array type data in multi-dimensional datasets #341
- Fix issue reading compound type attributes #338
- Dependency updates
- Fix issue with byte shuffle filter when data length is not a multiple of element length. #318
- Improve testing of byte shuffle and deflate filters
- Add validation running on Java 17
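For background, the byte shuffle filter regroups element bytes (all first bytes together, then all second bytes, and so on) so deflate sees long runs of similar bytes; the fix above concerns the trailing bytes when the data length is not a multiple of the element size. A stdlib-only sketch (not jHDF's implementation) of un-shuffling:

```java
// Undo the HDF5 byte shuffle filter. For n whole elements of elementSize
// bytes, shuffled data stores the 1st byte of every element, then the 2nd
// byte of every element, etc. Any trailing remainder bytes are stored
// unchanged (the case fixed in #318).
public class ByteShuffle {
    public static byte[] unshuffle(byte[] shuffled, int elementSize) {
        int n = shuffled.length / elementSize; // whole elements
        int body = n * elementSize;
        byte[] out = new byte[shuffled.length];
        for (int i = 0; i < n; i++) {
            for (int b = 0; b < elementSize; b++) {
                out[i * elementSize + b] = shuffled[b * n + i];
            }
        }
        // copy any trailing remainder through unchanged
        System.arraycopy(shuffled, body, out, body, shuffled.length - body);
        return out;
    }
}
```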
- Improve support for NIO Path. Allows jHDF to open files on non-default file systems such as zip files or remote storage systems. Thanks, @tbrunsch for this contribution #304
- Fix accessing a missing fill value could cause an exception #307
- Dependency updates
- CI and release process improvements
- Breaking API change `Dataset#getMaxSize` now returns `long[]` allowing files with max sizes larger than `int` max to be opened. #283
- Add support for opaque datatype #264
- Improve chunked dataset read performance with default logging #267
- Dependency updates
- Add GitHub Actions CI
- Switch away from Bintray #250
- Add support for committed datatypes #255
- Add support for attributes with shared datatype
- Switch dependencies repository to Maven Central #250
- Code cleanup
- Adds support for reading in-memory files from `byte[]` or `ByteBuffer`s #245
- Breaking API change To support in-memory files, `HdfFile#getHdfChannel` is replaced by `HdfFile#getHdfBackingStorage`, which now returns a `HdfBackingStorage`. Internally the new interface replaces the use of `HdfFileChannel`
- Fix #247 reading empty arrays in variable length datasets
- Dependency updates
- Update Gradle
- Add LZF compression support allowing LZF datasets to be read. #239
- Test dependency updates
- Add checksum validation, with "Jenkins Lookup 3 Hash". Will help to detect file corruption.
- Add support for opening an HDF5 file from an InputStream. Many Java APIs provide InputStreams, so this improves integration possibilities.
- Test and coverage improvements
- Test and build dependency updates
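The InputStream support above necessarily buffers the stream, since HDF5 reading needs random access while streams are sequential. The general pattern (plain JDK code, not jHDF's internal implementation) is to spool the stream to a temporary file and open that:

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.UncheckedIOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

// Copy an InputStream to a temporary file so the data can be accessed
// randomly. A seekable HDF5 reader can then open the resulting path.
public class StreamToTempFile {
    public static Path toTempFile(InputStream in) {
        try {
            Path tmp = Files.createTempFile("hdf5-", ".hdf5");
            tmp.toFile().deleteOnExit(); // clean up when the JVM exits
            Files.copy(in, tmp, StandardCopyOption.REPLACE_EXISTING);
            return tmp;
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }
}
```

jHDF additionally deletes its temporary file when the `HdfFile` is closed (see the #636 fix above) rather than only at JVM exit.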
- Add support for v1 and v2 Data Layout Messages. Fix #216
- Add support for Old Object Modification Time Message - Improves compatibility with older files
- Fix issue if compact datasets are read multiple times
- Improve handling of empty contiguous datasets
- Test and coverage improvements
- Test dependency updates
- Breaking API change `Dataset#getDiskSize` is renamed `Dataset#getSizeInBytes` and `Attribute#getDiskSize` is renamed `Attribute#getSizeInBytes`
- New API method `Dataset#getStorageInBytes` which returns the total storage size of the dataset. Comparison with `Dataset#getSizeInBytes` allows the compression ratio to be obtained
- Fixes an issue when reading empty datasets with no allocated storage #162
- Code quality improvements and cleanup
- Dependency updates
- CI and build improvements
- Fix #177 Reading null or padded strings of zero length
- Fix #182 Typo in `Dataset.isVariableLength`. This is a breaking API change: replace calls to `isVariableLentgh()` with `isVariableLength()`
- Add initial support for reading large attributes #183
- Dependency updates
- CI and build improvements
- Add support for reading half precision (16 bit) floats
- Add support for getting the ByteBuffer backing contiguous datasets and attributes
- Memory usage and performance improvements
- Test coverage improvements
- CI and build improvements
- Add support for bitfield datasets #84
- Fix #157 support nested compound datasets
- Fix #159 reading null terminated strings filling their buffer
- Add support for raw chunk access. See https://github.com/jamesmudd/jhdf/blob/master/jhdf/src/main/java/io/jhdf/examples/RawChunkAccess.java
- Fix issues running on systems where default charset is not ASCII/UTF8
- Upgrade to Gradle 6.1.1
- Some CI improvements
- Add support for variable length datasets #123
- Add support for Compound datatype v3 messages allowing more compound datasets to be read
- Fix #139 bug accessing chunked v4 string datasets
- Fix #143 bug traversing links
- Code cleanup
- Upgrade to Gradle 6.1
- Update dependencies
- Add support for chunked v4 datasets with b-tree chunk indexing
- Improve exceptions for unsupported b-tree records
- Improve test coverage
- Upgrade to Gradle 6.0.1
- Fix #124 String padding not handled correctly.
- Fix #132 Multi dimensional fixed length string datasets read incorrectly.
- Fix bug in chunked v4 datasets (added in v0.5.0) where incorrect data was returned if fixed array or extensible array indexing was used and the dataset dimensions were not a multiple of the chunk dimensions.
- Adds support for enum datasets (which are returned in string form) #121
- Adds `HdfFile` convenience constructors for `URI` and `Path`
- Fix #125
- Update dependencies
- Refactors test files to separate HDF5 files from scripts.
- Improvements to test coverage.
- Adds support for some types (the most common) of chunked v4 datasets:
- Single chunk
- Fixed array
- Extensible array
- Fix #113 fixed length UTF8 datasets can now be read correctly.
- Fix #112 multiple accesses to a global heap object now behave correctly.
- Lots of code cleanup and minor improvements
- Updates dependencies
- Add support for reference data type. Thanks to Gisa Meier and JCzogalla #106 #91
- Creation order tracking is skipped allowing these files to be read
- `FileChannel` can now be accessed allowing more low-level access to datasets
- Add version logging when the library is used
- Fix #101
- Add additional testing of attributes
- Add attribute example
- Adds support for compound datasets
- Adds support for array data type
- Adds support for reading chunked datasets with Fletcher32 checksums. Note: the checksum is not verified.
- Improved performance of `Dataset.isEmpty` method
- Dependency updates
- Fix #49 - Big (>10x) performance improvement for chunked dataset reads. Chunks are now decompressed in parallel and the resulting data copies are as large as possible.
- Update Gradle to 5.5
- Update test dependencies
- Fix #88 error when running on Java 8
- Improvements to IDE support
- Improvements to exceptions in currently unsupported cases
- Initial work for #49 slow chunked dataset reads
- Lots of typos cleaned up
- Add additional build data to MANIFEST.MF
- Add support for byte shuffle filter
- Many filter management improvements including support for dynamically loaded filters
- Add support for reading dataset fill values #74
- Checkstyle added to improve code consistency - not full code formatting yet...
- Update Gradle to 5.4
- Update `commons-lang3` to 3.9 (Java 8)
- Update `mockito-core` to 2.27.+
- Add support for broken links
- Add support for attribute and link creation order tracking #70
- Allow superblock v1 files to be loaded
- Improve exceptions thrown when lazy loading fails
- Fix bug to allow non-cached groups to be loaded
- Improvement to documentation
- Update Gradle
- Update test dependencies
- Code base cleanup
- Improvements to CI builds and PR validation
- Add support for accessing attributes (see Attribute.java)
- Add support for scalar datasets
- Add support for empty datasets
- Add support for files with user blocks
- Fix bug where "old" style groups containing soft links could not be opened
- Fix bug reading unsigned numbers from "awkward" buffer sizes
- Lots of minor code cleanup and refactoring
- Improvements to tests and coverage
- Fix bug when fixed size string datasets contain strings of exactly that size.
- Fix bug where >1D fixed size datasets could not be read
- Add more JavaDoc
- Minor refactoring
- Add support for String datasets
- Remove `Dataset.getDataBuffer` - Not all datasets can reasonably support accessing the backing buffer
- `Dataset.getMaxSize` now always returns a result. Previously it returned `Optional`; if no max size is present in the file, it now returns the dataset size.
- Remove dependency on `org.slf4j:slf4j-simple`, now just depends on `slf4j-api`
- Update SLF4J to 1.8.0-beta4
- Update to Gradle 5.2.1 and Gradle plugins
- First release to support reading chunked datasets. (note: v1.8 files only)
- Initial support for compressed datasets, GZIP only at the moment.
Lots of initial development towards being able to read HDF5 files in pure Java. See the Git history if you're interested.