WebCodecs Time Stamps #104

Closed

sandersdan opened this issue Jan 15, 2021 · 3 comments

@sandersdan

Hello!

We're trying to work out (w3c/webcodecs#52, w3c/webcodecs#122) what kind of timestamps to use for frames in WebCodecs. In general we need ~microsecond resolution, measured either from time 0 at the start of the media or in calendar time (typically relative to the Unix epoch).

Usually media times are measured as an integer fraction of seconds (a timestamp and a timebase), but there doesn't seem to be a convenient Rational type in JS, so it's not very ergonomic.
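For illustration, a minimal sketch of the rational representation described above, in TypeScript (the `MediaTime` shape is hypothetical, not from any spec):

```ts
// Hypothetical rational timestamp: `value` ticks, each lasting 1/timebase seconds.
// e.g. { value: 3003, timebase: 90000 } is one 29.97fps frame on the common MPEG 90 kHz clock.
interface MediaTime {
  value: number;    // integer tick count
  timebase: number; // ticks per second
}

// Collapsing to a double loses the exact representation, which is part of the
// ergonomics problem being discussed.
function toSeconds(t: MediaTime): number {
  return t.value / t.timebase;
}
```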

The <video> element has currentTime in seconds (as a double). RTC uses DOMHighResTimeStamp for some things. Internally Chrome uses int64 microseconds.

It seems like DOMHighResTimeStamp is the most reasonable choice for us, but it also seems to have been defined relative to a specific epoch. Is there any concern with WebCodecs using it with no explicit epoch?
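For context, a rough sketch of how the representations mentioned above relate to each other (illustrative helpers only, not a proposed API):

```ts
// <video>.currentTime: seconds as a double.
// DOMHighResTimeStamp: milliseconds as a double.
// Chrome internal:     int64 microseconds.
const secondsToMicros = (s: number): number => Math.round(s * 1e6);
const microsToMillis = (us: number): number => us / 1e3; // keeps sub-ms precision in the fraction
const millisToMicros = (ms: number): number => Math.round(ms * 1e3);
```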

@npm1
Contributor

npm1 commented Jan 15, 2021

You can use DOMHighResTimeStamp, which is just a typedef for double. From the spec: “The DOMHighResTimeStamp type is used to store a time value in milliseconds, measured relative from the time origin, global monotonic clock, or a time value that represents a duration between two DOMHighResTimeStamps.” So it should be in milliseconds (not seconds), and it can represent any time duration (a duration is not relative to anything).
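To illustrate (my example, not from the spec): a millisecond double can carry microsecond resolution in its fractional part, and a duration implies no epoch at all:

```ts
// A frame 5 minutes into the media, at microsecond precision, expressed as a
// DOMHighResTimeStamp-style double in milliseconds. Interpreted as a duration
// from the start of the media, no epoch is involved.
const frameTimestampMs = 300_000.042; // 300 s + 42 µs

// A double carries ~15-16 significant decimal digits, so microsecond precision
// survives for durations far longer than any realistic media timeline.
const frameTimestampUs = Math.round(frameTimestampMs * 1000); // 300000042
```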

@sandersdan
Author

Thanks!

@mdrejhon

mdrejhon commented Jan 20, 2021

> In general we need ~microsecond resolution

Good move.

About milliseconds and microseconds: I agree with microsecond timestamps, not integer milliseconds (floating-point milliseconds are OK).

For future-proofing tests, we tried 240fps and 360fps HFR videos on 240Hz and 360Hz gaming monitors. ASUS has also roadmapped a 1000Hz gaming monitor. In some browsers the ultra-high-framerate videos work fine, but in others there are issues such as poor synchronization and stutter.

Some useful reading may be warranted: Ultra HFR: Real-Time 240fps, 480fps and 1000fps Videos

We successfully tested Firefox and Google Chrome at 480fps on an experimental 480Hz monitor: www.blurbusters.com/480hz

For those who want to read more about recent scientific research on refresh-rate physics, there is an Area51 section at Blur Busters being developed with display engineers: www.blurbusters.com/area51

To future-proof adequately, we strongly believe in microsecond timestamps, considering that differences on the order of a millisecond can be human-visible (discussed in this forum thread: https://forums.blurbusters.com/viewtopic.php?f=7&t=6808 ...)

For those of you with 120Hz iPads, 120Hz TVs, or 120Hz gaming monitors, check out Example of 120fps embedded video in HTML5. We have already tested 240fps and 360fps embeds.

120fps is becoming more mainstream (it is arriving on more phones, most new 4K HDTVs support it, Xbox and PlayStation support it, and Dell has roadmapped 120Hz office monitors for 2025+), and 240fps “mainstreaming” may follow in the 2030s, since doubling the refresh rate halves browser-scrolling motion blur on sample-and-hold LCD/OLED displays.

On the bleeding-edge, non-mainstream front, current 360Hz gaming monitors (I have one here too) have 1/6th the browser-scrolling motion blur of a 60Hz display, and 1/3rd that of a 120Hz iPad. So high refresh rates are not just for games; they also reduce LCD/OLED motion blur in browser scrolling, browser videos, and browser animations.

Because of these odd refresh rates, frame durations do not divide evenly into whole milliseconds. Microsecond or floating-point timestamps provide more breathing room for future-proofing.

Best practice: if using floating-point millisecond timestamps, don't zero out the fractional part; that data can reduce or eliminate stutters.
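To make the divisibility point concrete, here is a small sketch (illustrative numbers only) of the drift that appears when 240fps frame timestamps are rounded to whole milliseconds:

```ts
// Frame durations at "odd" refresh rates are not whole milliseconds:
// 240 fps -> 1000 / 240 = 4.1666... ms, 360 fps -> 2.7777... ms.
const fps = 240;
const frameMs = 1000 / fps;

// Accumulate one second of frame times two ways.
let exact = 0;
let roundedToWholeMs = 0;
for (let i = 0; i < fps; i++) {
  exact += frameMs;
  roundedToWholeMs += Math.round(frameMs); // zeroing out the decimal: 4 ms per frame
}
console.log(exact);            // ~1000 ms
console.log(roundedToWholeMs); // 960 ms, i.e. 40 ms of drift after just one second
```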
