Some Vanillaware titles use subtractive blending that underflows on integer framebuffers. On PC the values are clamped to zero, but on PS3 the conversion appears to be unsigned, so a subtract result below 0 wraps around toward 1.
Obviously this behavior cannot be emulated with fixed-function blending, and programmable blending has questionable support on PC, especially on AMD under Windows (Linux is fine).
Another complication is that the PS3 supports two blending sign modes: normal blending takes the absolute values of source and destination, while signed blending preserves the sign of the source.
So far we've been able to ignore this, but it causes some annoying bugs, especially in 2D games where minor errors are very obvious.
The logical portions have been implemented in #15065.
The only thing that remains is deciding where to source the data from. There are no good options here, but replacing MRT writes with image stores could work around the nasty issue of framebuffer feedback loops.
For the record, I already implemented this locally and it works fine, bar some distracting flicker that requires fragment-shader-interlock to fix.
We also need to emulate several "signed" blend modes that aren't supported on PC, such as "signed" variants that do not clamp the input to [0..1].
These are some of the hardware tests that need to be validated on PS3 first:
ARGB FBO with blend reverse-subtract underflow (it seems the RGB components clamp, but not A?)
ARGB FBO with blend reverse-subtract-signed underflow