V7.0.0
Introducing version 7.0.0 with major changes, additional features, additional hash algorithms, and bug fixes.
The first major change to mention is that the HMAC version of each wrapped algorithm is no longer a separate instance; it is now selected through the original algorithm via the exposed Key config property. For example, if you use SHA1 and leave the key null, you get plain SHA1; if you assign a key, you get an HMACSHA1 result, and the implementation picks this automatically for you. This makes the API less prone to bugs and easier to use, and it reduces complexity.
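To make the keyed/unkeyed selection concrete, here is the equivalent choice expressed with the BCL's System.Security.Cryptography types. This is only an illustration of the behavior described above, not the library's code; the library makes this switch for you based on whether the Key config property is set.

```csharp
using System;
using System.Security.Cryptography;
using System.Text;

byte[] data = Encoding.UTF8.GetBytes("hello world");
byte[]? key = null; // conceptually, the value you would assign to the algorithm's Key config property

// The library now makes this choice for you behind a single algorithm instance:
byte[] digest = key is null
    ? SHA1.HashData(data)            // no key assigned: plain SHA-1
    : HMACSHA1.HashData(key, data);  // key assigned: HMAC-SHA1

Console.WriteLine(Convert.ToHexString(digest));
```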
The second major change to note, before we dig into the full list of changes, is that ArraySegment<byte> has been superseded by ReadOnlySpan<byte>. Even though ArraySegment<byte> can still be passed implicitly to functions taking ReadOnlySpan<byte>, we still count this as a major change. ArraySegment<byte> is a thing of the past and should be avoided where possible, since a better alternative, ReadOnlySpan<byte>, exists.
With this release, you can compute hashes without extra copying overhead, and the entire process can even be zero-copy when possible. We have also rewritten the internal implementations to use ReadOnlySpan<byte> instead of ArraySegment<byte>, so the entire library has been refurbished and optimized around it.
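As an illustration of the zero-copy pattern that ReadOnlySpan<byte> enables, the snippet below hashes a slice of an existing buffer without copying it and writes the digest into stack memory. It uses the BCL's SHA256 so it is self-contained and runnable; the library's own span-accepting overloads follow the same idea.

```csharp
using System;
using System.Security.Cryptography;

byte[] buffer = new byte[1024];
new Random(42).NextBytes(buffer); // stand-in for data read from a file, socket, etc.

// Hash only bytes 128..511 of the buffer. AsSpan creates a view, not a copy,
// and the digest is written into stack-allocated memory, so nothing is copied
// or allocated on the heap along the way.
ReadOnlySpan<byte> slice = buffer.AsSpan(128, 384);
Span<byte> digest = stackalloc byte[32]; // SHA-256 digest size
SHA256.HashData(slice, digest);

Console.WriteLine(Convert.ToHexString(digest));
```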
Introducing the ValueEndianness enum. From now on, all of the implementations explicitly tell you the endianness of the result they return inside the HashValue class. We believe this transparency opens the door to further additions to the library; wherever a specific byte order is needed, you can now apply it.
The HashValue class also provides helper methods such as AsLittleEndian, AsBigEndian, ReverseEndianness, and WithEndianness that can be used to enforce a specific endianness on the hash result. There is also a global config for enforcing a fixed endianness across all computed hashes, via the HashValueOptions class. For example, if you set the FixedEndianness property of HashValueOptions to LittleEndian, all computed hashes will return little-endian results regardless of their spec. Some implementations produce big-endian results and others follow the spec of the underlying algorithm; you can now target whichever you need.
Please keep in mind that ValueEndianness does not always have to be little-endian or big-endian; it can also be NotApplicable in cases where the underlying value is not backed by a primitive to which endianness applies. Any conversion applied to NotApplicable values is a no-op, so even if you call AsLittleEndian, AsBigEndian, ReverseEndianness, or WithEndianness, the value will be returned as-is, since endianness cannot be determined, swapped, or applied to it.
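To make concrete what a byte-order conversion does to a hash result, here is the same 32-bit CRC value serialized both ways with the BCL's BinaryPrimitives. The AsLittleEndian, AsBigEndian, ReverseEndianness, and WithEndianness helpers apply this kind of transformation to the bytes stored in a HashValue; this snippet is only an illustration, not the library's implementation.

```csharp
using System;
using System.Buffers.Binary;

// The well-known CRC-32 check value for the ASCII input "123456789":
uint crc = 0xCBF43926;

Span<byte> bigEndian = stackalloc byte[4];
Span<byte> littleEndian = stackalloc byte[4];
BinaryPrimitives.WriteUInt32BigEndian(bigEndian, crc);       // CB F4 39 26
BinaryPrimitives.WriteUInt32LittleEndian(littleEndian, crc); // 26 39 F4 CB

// Reversing endianness is simply reversing the byte order of the stored value:
Console.WriteLine(Convert.ToHexString(bigEndian));    // "CBF43926"
Console.WriteLine(Convert.ToHexString(littleEndian)); // "2639F4CB"
```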
TLDR: Here's a brief list of all the changes we've made:
- Introduced the `ValueEndianness` enum to provide transparent endianness of the resulting hash.
- Replaced all `ArraySegment<byte>` usages with `ReadOnlySpan<byte>`, resulting in more optimized usage and computation in the background.
- Avoided `Array.Copy` where possible and used `Buffer.BlockCopy` instead.
- The `Endianness` helper class is now even more advanced, covering 99% of all the needs for conversions between primitive types and byte arrays, with full support for `ReadOnlySpan<byte>`.
- The CRC implementation now follows each profile's spec for endianness and returns the value in the correct byte order. Previously, it was fixed at little-endian; however, this is not the case for all CRC profiles. If your code always expects little-endian CRC results, make sure to call `AsLittleEndian` before working on the result. Keep in mind that the `AsLittleEndian` call only masks the difference, and your values will still be off-spec for some CRC profiles, so consider revising your code to respect the endianness of the values at some point, if possible.
- Calls to `Coerce` now return the same instance if the requested bit length equals the current length, as a no-op shortcut.
- Added the `ByteLength` property to `HashValue`.
- Fixed the Base85 variant list in the `HashValue` class: previously, the defined `Ascii85` was returning `AdobeAscii85`; now both are defined, and the default has been set to `Ascii85`.
- Added a Base58 variant list to the `HashValue` class, similar to Base85.
- Added a Base32 variant list to the `HashValue` class, similar to Base85.
- Added `AsSpan` helper methods to the `HashValue` class.
- Added `Slice` helper methods to the `HashValue` class.
- Added the `AsLittleEndian` helper method to the `HashValue` class.
- Added the `AsBigEndian` helper method to the `HashValue` class.
- Added the `WithEndianness` helper method to the `HashValue` class.
- Added the `ReverseEndianness` helper method to the `HashValue` class.
- Added `AsStream` helper methods to the `HashValue` class.
- Added the `AsMemory` helper method to the `HashValue` class.
- Added `CopyTo` helper methods to the `HashValue` class.
- Added the `CalculateEntropy` helper method to the `HashValue` class. You can now determine how unique a result is for the input you have.
- Added the `CalculateEntropyPercentage` helper method to the `HashValue` class. This calls `CalculateEntropy` in the background, converts the 0-7 result to the range of the bit length of the underlying hash value, and then converts that into a percentage between 0 and 100 that is easier to interpret (see the sketch after this list).
- `IHashValue` now derives from `IEnumerable<byte>`, so you can directly enumerate the resulting hash with foreach, or use LINQ on it, without having to convert it to a byte array.
- Added a `DebuggerDisplay` for the `HashValue` class for better viewing of values while debugging.
- Added equality operators to the `HashValue` class; these use `FixedTimeEquals` in the background, so you can safely compare two `HashValue` instances.
- Added T1HA0 hash algorithm support.
- Added T1HA1 hash algorithm support.
- Added T1HA2 hash algorithm support.
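As a rough illustration of the entropy helpers listed above, the sketch below computes Shannon entropy over the bytes of a hash and scales it to a percentage. The exact formula used by `CalculateEntropy` and `CalculateEntropyPercentage`, including the bit-length scaling described above, may differ; treat this only as an approximation of the idea.

```csharp
using System;

// Shannon entropy of a byte sequence, in bits per byte:
// 0 when every byte is identical, 8 when all 256 values are equally likely.
static double ShannonEntropy(ReadOnlySpan<byte> data)
{
    var counts = new int[256];
    foreach (byte b in data)
        counts[b]++;

    double entropy = 0;
    foreach (int count in counts)
    {
        if (count == 0) continue;
        double p = (double)count / data.Length;
        entropy -= p * Math.Log2(p);
    }
    return entropy;
}

// Example input: the MD5 digest of an empty message.
byte[] hashBytes = Convert.FromHexString("D41D8CD98F00B204E9800998ECF8427E");
double bitsPerByte = ShannonEntropy(hashBytes);
double percentage = bitsPerByte / 8.0 * 100.0; // assumed scaling to 0-100
Console.WriteLine($"{bitsPerByte:F2} bits/byte, roughly {percentage:F0}% of the maximum");
```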
For the full list of changes, please check our change log.
© 2025, Deskasoft International. All rights reserved.