@nobuyukinyuu Can you (or anyone else) still reproduce this bug in Godot 3.2.2 or any later release?
If yes, please ensure that an up-to-date Minimal Reproduction Project (MRP) is included in this report (an MRP is a zipped Godot project with the minimal elements necessary to reliably trigger the bug). You can upload ZIP files in an issue comment by drag and drop.
Godot version:
3.0.6
Issue description:
See video.
https://cdn.discordapp.com/attachments/477544613511692358/530275849467985920/2019-01-03_00-46-04.mp4
While helping someone who was trying to compute an "average" of a few pixels from a viewport texture (by resizing the raw image data down to 1x4), we noticed that the output values were not what they expected. I wrote a test program to investigate and found that as an image is scaled down, the interpolated result shows an increasing bias toward the leading-edge pixels. This appears to happen regardless of the interpolation method used.
Note that this also happens with "uneven" (non-power-of-2) image sizes. I tested with an image that has a single alpha-0 pixel in the corner, at the first row and column. As the target size approaches 1 pixel, the leading edge on both axes increasingly dominates the interpolated average. In the alpha-0 test, resizing the image to 2x2 left 3/4 of its area blank.
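For anyone trying to reason about the symptom without opening the project: the behavior is consistent with the resizer point-sampling the source (bilinear only blends the two nearest source pixels per destination sample) instead of box-averaging every covered pixel. This is a minimal Python sketch of that failure mode, not Godot's actual resize code; the function names and the corner-aligned mapping are assumptions for illustration only.

```python
import math

def bilinear_sample(row, x):
    """Linearly interpolate a 1-D row at fractional position x."""
    x = max(0.0, min(x, len(row) - 1))
    i = math.floor(x)
    j = min(i + 1, len(row) - 1)
    t = x - i
    return row[i] * (1 - t) + row[j] * t

def point_resize(row, new_w):
    """Corner-aligned point sampling: each destination pixel reads
    only the two source pixels nearest its sample position, so with a
    large downscale factor most source pixels are never consulted."""
    scale = (len(row) - 1) / (new_w - 1) if new_w > 1 else 0.0
    return [bilinear_sample(row, x * scale) for x in range(new_w)]

def box_resize(row, new_w):
    """True area average: every source pixel contributes to exactly
    one destination bucket (assumes new_w divides the width evenly)."""
    n = len(row) // new_w
    return [sum(row[k * n:(k + 1) * n]) / n for k in range(new_w)]

# A 16-pixel row with one "blank" (0.0) pixel on the leading edge.
row = [0.0] + [1.0] * 15
print(point_resize(row, 2))  # [0.0, 1.0]   -> the edge pixel supplies the whole result
print(box_resize(row, 2))    # [0.875, 1.0] -> every pixel contributes fairly
```

Under this model, the single alpha-0 pixel ends up as the entire first destination sample, which matches the "leading edge goes blank" behavior in the video; in 2-D the same thing happening on both axes would blank the first row and first column of a 2x2 result.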
Attached is the demo project.
Minimal reproduction project:
MinimalResizeBug.zip