Hi,
I've implemented my own full ASTC decoder, following the Khronos spec as closely as possible:
https://registry.khronos.org/DataFormat/specs/1.1/dataformat.1.1.pdf
I've found that when decoding to ASTCENC_TYPE_U8 with the LDR_SRGB profile enabled, the alpha channel differs (typically by 1) from a decoder that follows the ASTC spec. The problem boils down to which 16-bit value is composed before interpolation for the alpha channel (component index 3): is the alpha channel's low byte set to 0x80, or bit-replicated from the high byte? The spec is clear:
"If sRGB conversion is not enabled, or for the alpha channel in any case, C0 and C1 are first expanded to 16 bits by bit replication:"
One of Google's Android decoders matches your decoder's output, but it doesn't match the spec either. The relevant code in their decoder:
https://chromium.googlesource.com/external/deqp/+/refs/heads/master/framework/common/tcuAstcUtil.cpp#1453
According to the ASTC specification, this code (which matches the output of ARM's decoder) is incorrect on channel 3 (alpha).
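As I read that code, it applies the sRGB expansion to all four components. Paraphrased (this is my reading, not the literal deqp code), the behavior amounts to:

```cpp
// Paraphrase of my reading of the deqp/astcenc behavior: the 0x80 low
// byte is applied to every component when sRGB is enabled, including
// alpha, which is the point of divergence from the spec.
uint16_t expand_endpoint_arm(uint8_t c, bool srgb)
{
    return srgb ? ((uint16_t(c) << 8) | 0x80)
                : ((uint16_t(c) << 8) | c);
}
```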
See section 18.19 (Weight Application) here:
https://registry.khronos.org/DataFormat/specs/1.1/dataformat.1.1.html#astc_weight_application
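For completeness, the weight application step itself (again a sketch with names of my own choosing): the low-byte difference between 0x80 and the replicated byte can move the rounded result across a byte boundary, which is where the off-by-one in the decoded alpha comes from.

```cpp
// Weight application per section 18.19: w is an unquantized weight in
// [0, 64]; for U8/sRGB output the top 8 bits of the result are kept.
uint8_t interpolate_u8(uint16_t C0, uint16_t C1, uint32_t w)
{
    uint32_t C = (C0 * (64 - w) + C1 * w + 32) >> 6;
    return uint8_t(C >> 8);
}
```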
I'm now wondering which is correct: the spec, or the ARM/Google software reference decoders?
Thanks.
