
Timing issues on Linux (OpenGL app) #218

Closed
SDLBugzilla opened this issue Feb 10, 2021 · 0 comments

This bug report was migrated from our old Bugzilla tracker.

Reported in version: 1.2.11
Reported for operating system, platform: Linux, x86

Comments on the original bug report:

On 2006-09-01 06:52:27 +0000, William wrote:

I've got an OpenGL engine I've been playing with for a few years; up
until tonight it used GLUT for keyboard input and requesting the window.
I wanted to have proper key-up/key-down events so I ported to SDL.

The GL part worked perfectly: the window comes up, the key events work
properly, and all is happy with the world, except that it's very, very
jerky.

I've narrowed it down to timer issues: stuff in my engine moves a
distance proportional to the most recent frame time so that it's
consistent with varying frame rates. If the measured frame time is
wrong, things move the wrong distance. My source of time information
is ftime() for the first second and then rdtsc once it's confident of
the clock rate. It's not an SMP system, so rdtsc should be consistent;
disabling rdtsc doesn't help my problem anyway.

When I use SDL to create my windows, the time information returned by
both ftime and (this is the bit I REALLY don't understand) rdtsc
occasionally jumps ahead about 60ms, so the motion keeps jumping ahead.

Typical framerate is 150fps, so the frame times returned are ~6.5ms.
Occasionally, one will be returned saying 65ms has elapsed, and this
causes the jerking. Note that things jerk ahead of their proper
position rather than lagging behind, and that if I ignore the clock
output and force the time intervals to be constant for simulation
purposes, the rendering is perfectly smooth. The system isn't actually
pausing or dropping frames, but somehow the clock is jumping ahead.

If I go back to using GLUT instead of SDL to open the GL window, timing
goes back to normal. Turning off optimisation didn't help at all, and
using SDL as my source of time by calling SDL_GetTicks() didn't help
either.

If I put in a print to stdout and stdout is a terminal, the timing
becomes a little more consistent but it's slow because the terminal is
updating. Redirecting stdout to a file brings the timing back to
inconsistent; here's an example where I print out the measured frame
times derived from SDL, problem values in the middle of each sequence:
fdt 0.007
fdt 0.007
fdt 0.062
fdt 0.005
fdt 0.006

Or derived from rdtsc:
fdt 0.00679732
fdt 0.00671102
fdt 0.06334
fdt 0.00524602
fdt 0.00523732

As a practical matter, I've solved the jerkiness by taking the sim time
to be the average measured frame time from the previous 100ms; the fact
that this makes it smooth proves, I think, that it's a time-measurement
issue, not merely lag and frame-dropping.

It's compiled with gcc 3.4.6 on an Athlon 1800+, and ntpd is running, though I can't see ntpd changing the output of rdtsc.

On 2006-09-23 21:01:24 +0000, Sam Lantinga wrote:

The only thing that should change rdtsc is switching processors. Is your system a multi-CPU system?

On 2006-09-23 21:53:24 +0000, Ryan C. Gordon wrote:

Alternately, is one frame taking a really long time? That sounds like something happening at the driver/kernel level that has nothing specific to do with SDL, unless we ended up choosing a different GL library than GLUT.

--ryan.

On 2006-09-23 22:49:04 +0000, Sam Lantinga wrote:

Make sure you do glFinish() (or is it glFlush()?) before swapping buffers. Some drivers get things queued up too long and lock the bus transferring everything down to the video card at the last minute if you don't properly finish your scene. I believe GLUT actually called that for you, but I could be wrong...

On 2006-09-23 23:19:19 +0000, William wrote:

I had glFlush() at the end of my frame and changed that to glFinish() which has fixed the problem. Thanks!

On 2006-09-23 23:24:53 +0000, Sam Lantinga wrote:

You're welcome!
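The fix the thread converges on is a one-line change at the end of the frame. A sketch of that end-of-frame sequence, assuming the SDL 1.2 and OpenGL APIs the report uses (not code from the original engine):

```c
#include <SDL/SDL.h>
#include <GL/gl.h>

/* End-of-frame sequence after the fix. glFinish() blocks until the
 * driver has actually executed all queued GL commands, so the cost of
 * rendering is paid inside the frame that issued it. glFlush() only
 * submits the commands: some drivers then let work queue up across
 * many frames and stall the bus all at once, which showed up here as
 * an occasional spurious ~65ms frame time. */
static void end_frame(void)
{
    glFinish();            /* wait for the scene to actually complete */
    SDL_GL_SwapBuffers();  /* then present it (SDL 1.2 windowed-GL API) */
}
```

glFinish() can cost a little throughput, since the CPU waits for the GPU, but it keeps the measured frame time aligned with the work actually done in that frame, which is what a dt-scaled simulation needs.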
