Allow updating audioBitsPerSecond and videoBitsPerSecond as done for mimeType #205
I agree with the above statement. It also seems to me that there is a slightly different behaviour with regard to obtaining the "actual" bitrate values used by an active recorder, as opposed to obtaining the media type, which, in my understanding, may change from what was passed to the constructor to whatever actual value the recorder decides to use when recording starts. In other words, the specification is either deliberately or accidentally vague on this matter, and I think it could benefit from standardizing it: there should be no harm in being able to obtain the actual bitrate values. For the record (not that I claim either works according to the spec here), Chrome reports zero for all bitrate values for an active recorder when no bitrates were explicitly specified to the recorder constructor. Firefox, in contrast, reports non-zero values, which I assume to be the actual [constant] bitrates, even when no values were specified for the constructor.
My intention when writing the text referred to here was to have the UA choose bitrates as a fallback in case the application didn't specify any. If bitrates are important to the application's use case it should specify them. Having the UA be too smart about choosing bitrate (e.g., "This might usually be platform specific and may depend on the actual encoder being used") opens up a fingerprinting surface. I can see how a particular encoder's default is specific to that encoder, but what this is about in my mind is a UA default that the encoder gets configured with, whether it agrees that is the default or not.
If needed I'm sure that could be worked around in the spirit of w3ctag/design-reviews#439
Note that mimeType is only set asynchronously because of the changes for passthrough codecs, i.e., the codec for data received from a peer connection is only known once data has been received. The track, however, exists and can be recorded before that.
If by "actual" you mean what the recorder happens to be producing at any given time, then that is out of scope for these attributes. They denote the bitrates the encoders were configured with.
The start() method is pretty clear on these attributes in the spec. To get actual bitrate values one could approximate this by looking at the size of the produced blobs. Or if done when the encoding is complete, it would be exact.
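The blob-size approach mentioned above can be sketched as follows. This is an illustrative snippet, not spec text; the helper name `estimateBitsPerSecond` is made up for this example:

```javascript
// Sketch: approximate the effective bitrate from the total size of the
// blobs a MediaRecorder has produced over a known elapsed time.
function estimateBitsPerSecond(totalBytes, elapsedMs) {
  if (elapsedMs <= 0) return 0;
  return Math.round((totalBytes * 8 * 1000) / elapsedMs);
}

// In a browser one would accumulate blob sizes via the dataavailable event:
//   let bytes = 0;
//   const t0 = performance.now();
//   recorder.ondataavailable = (e) => { bytes += e.data.size; };
//   recorder.onstop = () =>
//     console.log(estimateBitsPerSecond(bytes, performance.now() - t0));
```

Computed over the whole recording (at `onstop`), this is exact up to container overhead; computed over a sliding window of recent blobs, it approximates the instantaneous bitrate.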
The spec was updated. Chrome was not (yet). See wpt.fyi for more details.
I do not really see how autoplay sync vs. async relates to bit rate values.
Then the intent of this part of the spec is not clear to me at all and is worth clarifying if that is really the goal. Also, is there anybody implementing passthrough codecs?
In the case of sync vs. async, the W3C seems to prefer sync if at all feasible. I am not sure I see how deciding a default bitrate amounts to an async getter attached to an OS resource. Are there hardware encoders whose bitrate cannot be configured?
See #190
The fact that RTCRtpSender.getCapabilities provides a synchronous API is an identified issue; see w3c/webrtc-extensions#49. If it is difficult to know which encoders (or which profiles) are supported, it is not easy to select a good default bit rate. In the case of passthrough codecs, it seems that we would like the bit rate to be set to 0 (?).
For some definition of "good", sure. I do believe a decent default bit rate could be found either way.
The bitrate attributes are not really covered well in the spec for passthrough, but the spec does mention that the UA will have to be prepared to fall back to re-encoding, should the source used for passthrough change codec on the fly. After it falls back, and the bitrate attributes were 0 because of passthrough, what are the bitrate attributes supposed to be now? Perhaps it's better to say that with passthrough the bitrate attributes will be ignored. Or that if the configured bitrate is lower than the bitrate of the passthrough source, the UA may re-encode anyway, to match.
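One of the options floated above (re-encode when the passthrough source exceeds the configured bitrate) could be expressed as a simple predicate. This is purely illustrative of the discussion, not spec text, and the function name is invented:

```javascript
// Sketch: decide whether a UA should fall back from passthrough to
// re-encoding, based on the configured bitrate attribute. A configured
// value of 0 here stands for "passthrough / no bitrate configured".
function shouldReencode(configuredBps, passthroughSourceBps) {
  // Re-encode only when a bitrate was actually configured and the
  // passthrough source exceeds it; otherwise keep the data as-is.
  return configuredBps > 0 && passthroughSourceBps > configuredBps;
}
```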
Why do we need to know that?
Ultimately I have to concede. What actual bitrates apply can be inferred from the size and rate of the blobs that the recorder generates.
Well, if we go by the spec as you explained it to me, then zero should be reported for the bitrate when no bitrate was passed to the constructor?
If no bitrate parameters were passed to the constructor, the constructor assigns the bitrate attributes per steps 10 and 11 in the spec:
If audioBitsPerSecond and videoBitsPerSecond are not specified, it is up to the User-Agent to specify these values.
This might usually be platform specific and may depend on the actual encoder being used.
These values may only be available asynchronously, as encoders may live in a different thread/process.
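The fallback described in those steps can be sketched as follows. The UA default values here are entirely hypothetical placeholders, and `resolveBitrates` is an invented name; real UAs choose their own defaults (and the real constructor also handles the combined `bitsPerSecond` option, omitted here for brevity):

```javascript
// Sketch of the constructor's bitrate fallback (steps 10 and 11):
// use application-supplied values when present, otherwise fall back
// to UA-chosen defaults. Default numbers below are illustrative only.
function resolveBitrates(options, uaDefaults = { audio: 128000, video: 2500000 }) {
  return {
    audioBitsPerSecond: options.audioBitsPerSecond ?? uaDefaults.audio,
    videoBitsPerSecond: options.videoBitsPerSecond ?? uaDefaults.video,
  };
}
```

The point raised earlier in the thread is that exposing the UA-chosen defaults as fixed, readable values keeps the attributes synchronous and avoids them depending on which encoder the platform eventually picks.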
It seems it would be useful to be able to update audioBitsPerSecond and videoBitsPerSecond, as is done for mimeType, before MediaRecorder's start event has fired.
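To make the proposal concrete, here is a minimal sketch of the suggested behaviour as a standalone class. This is NOT the current spec or any shipping API; the class and method names are invented, and the real attributes are read-only today:

```javascript
// Hypothetical sketch of the proposed behaviour: bitrate attributes
// remain updatable while the recorder is inactive, and become fixed
// once recording starts (mirroring how mimeType is finalised late).
class UpdatableRecorderSketch {
  constructor({ audioBitsPerSecond = 0, videoBitsPerSecond = 0 } = {}) {
    this.audioBitsPerSecond = audioBitsPerSecond;
    this.videoBitsPerSecond = videoBitsPerSecond;
    this.state = "inactive";
  }

  // Invented method: would be rejected once recording is underway.
  setBitrates({ audio, video } = {}) {
    if (this.state !== "inactive") {
      throw new Error("InvalidStateError: recorder is not inactive");
    }
    if (audio !== undefined) this.audioBitsPerSecond = audio;
    if (video !== undefined) this.videoBitsPerSecond = video;
  }

  start() {
    this.state = "recording";
  }
}
```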