rsx: Broken lighting on GoldenEye 007: Reloaded #8966
Comments
Same issue as DeS. The root cause is known; the fix is just not implemented, for performance reasons. The game uses the number of pixels covered by the light flare in screenspace to determine how much strength to give it. We fake this information and return a very large number, because accurately counting the exact number of pixels is usually slower than just checking whether something is visible at all. The framework for emitting pixel-accurate counts is there but currently unused; it could be added as an accuracy/advanced option.
Is the existing framework utilising a fragment shader and an atomic counter to do this pixel counting?
No, that is very bad for performance. The hardware has registers for it, but it is not so simple. To simplify: it has an internal array of registers; think of it like a grid overlaid over your FBO. Each tile gets some counters, and you add them up to get the total. This is done by the driver, so you need not worry about it. TL;DR: you really don't ever want to use that precise mode. There is certainly a better way to combine both, and I have some ideas, but this has a simple workaround: disabling ZCULL queries for this game disables the light corona strength scaling, so its priority is not so high right now.
Would it be feasible to have a low-accuracy option? For example, if we know where the effect is in worldspace and any of the occlusion queries succeeded, we could calculate a simple bounding box covering the affected area and assume a fixed portion (e.g. 50%) was drawn to, without regard to how much was actually drawn.
We're not in control of the graphics pipeline; the PS3 application is. This adds a lot of complications, but yes, I have considered that idea. It's not straightforward to do, and we cannot do too much extra work, since emulating the RSX shader pipes is already performance-heavy. Injecting even more work per draw call in a game using an API (GCM) that can comfortably handle tens of thousands of draw calls per frame is just not feasible.
Honestly, it makes more sense to just add an "accurate ZCULL queries" toggle in advanced settings; anyone who wants it can enable it where it matters. For some games it won't make a big difference if occlusion is only used for effects (like GoldenEye), but others like DeS use queries to determine visibility of level geometry, and there it will have more of an impact. This solution is much easier to implement and shifts the burden of maintenance to the driver vendor, which is what we want. That's pretty much what I plan to set up some time this week or next, once I'm done with some more pressing issues.
Applies to any version of GoldenEye 007: Reloaded, demo or full game.
RPCS3
![image](https://user-images.githubusercontent.com/10283761/94040657-1d481480-fdc1-11ea-9ebb-24d6eb2287c7.png)
PS3
![image](https://user-images.githubusercontent.com/10283761/94040691-26d17c80-fdc1-11ea-9b45-66483dac99db.png)
The lighting on the window is broken in this initial scene of the demo, for example.
As noted on the forum report, disabling ZCull queries disables the problematic lighting, hiding the issue.
RSX Capture:
https://drive.google.com/file/d/1X8tE8QVojSp7ZirApjKePdTraOQNDI6o/view?usp=sharing
Log file for NPEB90389:
NPEB90389.log.gz