The spec for RTCCodecStats.clockRate says it is the "media sampling rate". However, the parameter it is populated from, RTCRtpCodec.clockRate, is defined as "the codec clock rate expressed in Hertz", which MDN documents as "the rate at which the codec's RTP timestamp advances".
I'm largely ignorant of this, but looking around, these appear to be different things, so I am wondering whether the RTCCodecStats.clockRate definition is correct and intended.
This is for an MDN docs update: https://github.com/mdn/content/pull/32452/files#r1503650664
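For context, here is a minimal sketch (browser TypeScript; the function name dumpClockRates and the assumption of an already-negotiated RTCPeerConnection are mine) of where the same value surfaces through the two APIs being compared:

```ts
// Sketch: print codec clock rates as seen through capabilities
// (RTCRtpCodec.clockRate) and through the stats API (RTCCodecStats.clockRate).
// Assumes `pc` is an RTCPeerConnection that has already negotiated codecs;
// "codec" stats entries only appear for codecs that are actually in use.
async function dumpClockRates(pc: RTCPeerConnection): Promise<void> {
  for (const kind of ["video", "audio"] as const) {
    const caps = RTCRtpSender.getCapabilities(kind);
    for (const codec of caps?.codecs ?? []) {
      // e.g. "video/VP8 clockRate = 90000", "audio/opus clockRate = 48000"
      console.log(`capability ${codec.mimeType} clockRate = ${codec.clockRate}`);
    }
  }

  const report = await pc.getStats();
  report.forEach((entry) => {
    if (entry.type === "codec") {
      // This clockRate is set from the corresponding RTCRtpCodec value
      // being discussed above.
      console.log(`stats ${entry.mimeType} clockRate = ${entry.clockRate}`);
    }
  });
}
```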
The value should be the same across SDP and RTP -- see the IANA entries for the different codecs. WebRTC only supports a handful anyway: for video H.264 and VP8 (also VP9 and AV1), and for audio PCM and Opus (some implementations also support AMR and G.722).
The "sampling rate" of 90000 for video was chosen in order to ensure that the common frame rates of 20, 30, 50, 60 and 29.95 fps could all be represented as integers.
"Media sampling rate" is usually the same number for audio, but is an irrelevant concept for video.