VSync / screen buffering (feature request) #179
Earlier versions ran with Vsync on by default (at least on my Windows machine). But for 1.18 I tweaked the timer code (to improve accuracy) and ended up having to disable it. It was difficult for me to work with Vsync on the main thread (especially for timing) because it can cause thread execution to halt for a long time while waiting for the next refresh. Hm, maybe if we want to allow Vsync again, we could run a second thread that does nothing but wait for screen refreshes, acquire a mutex to guarantee that the latest frame has at least been fully drawn, and present the screen.
I cannot say what a good solution would be without knowing how the draw routines work. If every routine works asynchronously, it is hard to catch a moment when the frame buffer is complete, with no partial sprites. If they are sequential, you could signal a condition variable after each draw routine completes and blit the frame buffer to the screen from a separate thread, perhaps via an intermediary frame buffer so the draw routines can keep working.
Here is my attempt: Falcury@1fc3d8d. I added an option for it, though it may still be unstable. Calling SDL's video APIs from another thread is, strictly speaking, not supported: https://wiki.libsdl.org/FAQDevelopment#Can_I_call_SDL_video_functions_from_multiple_threads.3F I tried to circumvent the limitation by only allowing the Vsync thread to run at specific times during the loop, when it is reasonably 'safe'. On my system, it no longer crashes. Hm, maybe an alternative way to keep the timing of screen refreshes separate from the game logic would be to run all of the game logic in the second thread, instead of the other way around. I guess that would be harder, though.
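As a thought experiment, here is a minimal sketch of that "inverted" arrangement: all SDL video calls stay on the main thread (as SDL expects), the vsync wait blocks only that thread, and the game logic runs on a worker thread. The names (game_logic_thread, shared_frame, frame_lock) are illustrative and are not SDLPoP code.

```c
/* Hypothetical sketch, not SDLPoP code: game logic on a worker thread,
 * presentation (and the vsync wait) on the main thread, which is the
 * only thread SDL officially allows to call video functions from. */
#include <SDL2/SDL.h>

static SDL_mutex *frame_lock;
static SDL_Surface *shared_frame;   /* 320x200 buffer drawn by the logic thread */
static SDL_atomic_t quit;

static int game_logic_thread(void *unused) {
    (void)unused;
    while (!SDL_AtomicGet(&quit)) {
        SDL_LockMutex(frame_lock);
        /* ... run one game tick and draw sprites into shared_frame ... */
        SDL_UnlockMutex(frame_lock);
        SDL_Delay(1000 / 60);       /* crude pacing for the game tick */
    }
    return 0;
}

int main(int argc, char **argv) {
    (void)argc; (void)argv;
    SDL_Init(SDL_INIT_VIDEO);
    SDL_Window *window = SDL_CreateWindow("vsync sketch", SDL_WINDOWPOS_CENTERED,
                                          SDL_WINDOWPOS_CENTERED, 640, 400, 0);
    /* Vsync requested here; SDL_RenderPresent blocks on the main thread only. */
    SDL_Renderer *renderer =
        SDL_CreateRenderer(window, -1, SDL_RENDERER_PRESENTVSYNC);
    frame_lock = SDL_CreateMutex();
    shared_frame = SDL_CreateRGBSurface(0, 320, 200, 32, 0, 0, 0, 0);
    SDL_Thread *logic = SDL_CreateThread(game_logic_thread, "logic", NULL);

    while (!SDL_AtomicGet(&quit)) {
        SDL_Event e;
        while (SDL_PollEvent(&e))
            if (e.type == SDL_QUIT) SDL_AtomicSet(&quit, 1);
        SDL_LockMutex(frame_lock);
        SDL_Texture *tex = SDL_CreateTextureFromSurface(renderer, shared_frame);
        SDL_UnlockMutex(frame_lock);
        SDL_RenderCopy(renderer, tex, NULL, NULL);
        SDL_RenderPresent(renderer);    /* blocks until the next refresh */
        SDL_DestroyTexture(tex);
    }
    SDL_WaitThread(logic, NULL);
    SDL_Quit();
    return 0;
}
```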
@Falcury On my machine it doesn't work well: it creates a lot of ghost frames with flickering. Maybe it would be possible to avoid threading by using some event library like libuv? I have no idea how the original Prince performed the drawing, but there were no threads under DOS.
@mbunkin Right, then using threads is off the table, I guess. Maybe all we would need to do is take a fresh look at the timing code, to make sure that switching Vsync on does not make the timing inaccurate, and to test with various monitor refresh rates.
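One way to take that fresh look might be a small standalone probe that requests vsync and measures how long each presented frame actually takes against a fixed 60 Hz game tick, run on monitors with different refresh rates. This is only a sketch under those assumptions, not SDLPoP's timer code:

```c
/* Timing probe sketch: request vsync, measure each frame against a 60 Hz
 * game tick, and sleep only for whatever time is left of the tick. */
#include <SDL2/SDL.h>
#include <stdio.h>

int main(void) {
    SDL_Init(SDL_INIT_VIDEO);
    SDL_Window *win = SDL_CreateWindow("timing probe", SDL_WINDOWPOS_CENTERED,
                                       SDL_WINDOWPOS_CENTERED, 640, 400, 0);
    SDL_Renderer *ren = SDL_CreateRenderer(win, -1, SDL_RENDERER_PRESENTVSYNC);

    const double target_ms = 1000.0 / 60.0;   /* desired game tick length */
    const double freq = (double)SDL_GetPerformanceFrequency();
    Uint64 tick_start = SDL_GetPerformanceCounter();

    for (int frame = 0; frame < 600; ++frame) {
        /* ... game logic and drawing would go here ... */
        SDL_RenderClear(ren);
        SDL_RenderPresent(ren);               /* may block until vblank */

        Uint64 now = SDL_GetPerformanceCounter();
        double elapsed_ms = (now - tick_start) * 1000.0 / freq;
        if (elapsed_ms < target_ms) {
            /* Present returned early (e.g. a fast monitor): sleep the rest. */
            SDL_Delay((Uint32)(target_ms - elapsed_ms));
        } else if (elapsed_ms > target_ms * 1.5) {
            printf("frame %d overshot the tick: %.2f ms\n", frame, elapsed_ms);
        }
        tick_start = SDL_GetPerformanceCounter();
    }
    SDL_Quit();
    return 0;
}
```

Sleeping only for the remainder of the tick keeps the logic rate fixed while letting the vsync wait absorb most of the delay; a consistently drifting or overshooting pattern in the output would point at the timing code rather than at vsync itself.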
@Falcury I gave this a bit more thought, and I think the original DOS environment did have "threads" of a sort: the video card was independent from the CPU and, just like a separate thread, it would take the video memory contents and send them to the display while the CPU was doing something else. If it is technically possible, I would suggest the following logic (see the sketch after this comment):

1. The main game thread generates the 320x200 screen buffer (drawing sprites).
2. Once a draw operation is complete, and if the special "video memory" mutex is free, the main thread copies the frame buffer to a second "video memory" buffer and signals a condition variable.
3. The second, "video card" thread wakes up on the condition variable, locks the "video memory" mutex, and starts scaling and displaying the buffer contents via SDL. Once its work is done, it unlocks the mutex and sleeps until it is signalled again.
4. If the "video memory" mutex is locked, the copy operation returns without waiting for it, and the drawing cycle continues until the frame is ready for display again.

This way the threads are isolated from each other, no incomplete buffer can ever be displayed, and it should not hurt the responsiveness of the controls as they are in the game now, as long as it runs on a machine with two or more cores so the game and video threads run on separate cores. I have no free time at this point, so I cannot test this theory in practice. It might not be applicable to the way the Prince engine works. I hope my idea of a "video card" thread will at least amuse the respectable developers :)
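For what it's worth, a minimal sketch of that handoff might look like the following. It assumes a condition variable plus a "video memory" mutex exactly as described; present_frame() is a hypothetical helper standing in for the scaling and SDL presentation step, none of these names exist in SDLPoP, and (as noted earlier in the thread) presenting from a second thread is not something SDL officially supports.

```c
/* Illustrative sketch of the "video card" thread handoff described above. */
#include <SDL2/SDL.h>
#include <string.h>

#define FRAME_W 320
#define FRAME_H 200

static Uint8 game_buffer[FRAME_W * FRAME_H]; /* drawn by the main game thread  */
static Uint8 video_mem[FRAME_W * FRAME_H];   /* handed off to the video thread */
static SDL_mutex *video_mutex;
static SDL_cond  *frame_ready;
static SDL_bool   have_frame = SDL_FALSE;

extern void present_frame(const Uint8 *pixels); /* hypothetical: scale + display */

/* Main thread: called only after a complete frame has been drawn into
 * game_buffer, so an incomplete frame can never be handed over. */
void submit_frame(void) {
    /* If the "video card" thread still owns video_mem, skip this frame
     * and keep drawing; never block the game loop. */
    if (SDL_TryLockMutex(video_mutex) != 0)
        return;
    memcpy(video_mem, game_buffer, sizeof(video_mem));
    have_frame = SDL_TRUE;
    SDL_CondSignal(frame_ready);
    SDL_UnlockMutex(video_mutex);
}

/* The "video card" thread: sleeps until a frame is ready, then presents it
 * while holding the mutex, so the main thread skips (rather than waits for)
 * the copy while presentation is in progress. */
int video_card_thread(void *unused) {
    (void)unused;
    for (;;) {
        SDL_LockMutex(video_mutex);
        while (!have_frame)
            SDL_CondWait(frame_ready, video_mutex);
        have_frame = SDL_FALSE;
        present_frame(video_mem);
        SDL_UnlockMutex(video_mutex);
    }
    return 0;
}
```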
@mbunkin If I remember correctly, what you describe is very similar to what I tried to do ;) DOS did not have threads; every program had exclusive control over the system's resources. Multithreading is not a problem nowadays, even on a single-core system: the OS will periodically switch context to let both threads run. SDL also often uses threads internally, for example for audio callbacks and timers. I tried to find out how other games solve this problem, and I found this blog post interesting:
This might be interesting to try as a better solution than relying on threads.
Nor do I, unfortunately.
@Falcury I see! I should really give it a go sometime. It's very interesting. One day I should have time to play around with SDL.
Indeed, but the PC "did": some controllers (like video or sound cards, modems, etc.) operated independently from the CPU and used interrupts, shared memory, and DMA channels for control and data exchange. I just thought such a parallel processing environment could be implemented or emulated with threads.
When, for example, the prince finds the sword and the background flashes yellow, I can see screen tearing. It doesn't happen in the original DOS game when played on a real PC; I checked that today, as I still have a 286 PC AT. If it is possible to add vsync / frame buffering to the SDL port, I'd ask you to consider this as a future feature. Thank you!