Okay, two feature requests which are related to each other:

1. Add a "Texture filtering of display" option to the menu (it should probably be renamed to "Scaling to display filter" or something like that).
2. Allow the core to pass the real resolution of the content to RA.
From what I understand, currently, with the internal resolution set to 1x, the core outputs everything at 640x480 even when the actual resolution is lower, most commonly 512x448. With "Texture filtering of display" enabled, 512x448 gets scaled to 640x480 with a bilinear filter, which is quite blurry. With the option disabled, 512x448 gets scaled to 640x480 with nearest neighbor, which is sharp but causes heavy aliasing.
Sending "proper" resolution would avoid scaling image twice (first by core, then by RA), and would be extremely useful for CRT shaders, particularly for the games like Final Fantasy X which changes resolutions between gameplay (512x416) and CG cut-scenes (640x416).
The first request, "Texture filtering of display" (bilinear filtering), corresponds to a built-in option of the same name in PCSX2's GSdx. It is enabled by PCSX2's default settings, but it can be unchecked in the GUI settings (the real PS2 console doesn't do this, so it can be disabled for accuracy).

The RA core inherited this function from the port, but no option to switch it on/off has been exposed yet. So on the core side we are currently stuck with bilinear filtering applied globally, which means #63 is still not actually fixed.
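If it helps, exposing such a toggle would presumably go through the standard libretro core option mechanism. The sketch below is only an illustration under that assumption: the option key pcsx2_texture_filtering_of_display and the g_bilinear_upscale flag are made-up names, while RETRO_ENVIRONMENT_SET_VARIABLES / RETRO_ENVIRONMENT_GET_VARIABLE are real libretro calls.

```c
/* Minimal sketch of a core option toggle (assumed key and flag names). */
#include "libretro.h"
#include <string.h>

static retro_environment_t environ_cb;
static bool g_bilinear_upscale = true;   /* matches PCSX2's default */

void retro_set_environment(retro_environment_t cb)
{
   static const struct retro_variable vars[] = {
      { "pcsx2_texture_filtering_of_display",
        "Texture filtering of display (bilinear); enabled|disabled" },
      { NULL, NULL },
   };
   environ_cb = cb;
   environ_cb(RETRO_ENVIRONMENT_SET_VARIABLES, (void *)vars);
}

/* Call from retro_run() when the frontend signals that variables changed. */
static void check_variables(void)
{
   struct retro_variable var = { .key = "pcsx2_texture_filtering_of_display" };
   if (environ_cb(RETRO_ENVIRONMENT_GET_VARIABLE, &var) && var.value)
      g_bilinear_upscale = (strcmp(var.value, "enabled") == 0);
}
```

The upscale path in the GSdx display code would then choose bilinear or nearest-neighbor based on that flag.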