Limit sendEncodings to video. #1814
Conversation
@@ -5159,7 +5159,7 @@ <h2>Methods</h2>
   non-null value, as defined in <span data-jsep=
   "processing-a-local-desc processing-a-remote-desc">[[!JSEP]]</span>.</p>
   <p>The <code>sendEncodings</code> argument can be used to
-  specify the number of offered simulcast encodings, and
+  specify the number of offered simulcast encodings for video, and
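To illustrate the spec text being amended, here is a minimal sketch of offering multiple simulcast encodings for video via `addTransceiver`. The `rid` values, bitrates, and the `pc`/`track` names are illustrative assumptions, not part of the PR itself.

```javascript
// Hypothetical sketch: three simulcast encodings for a video track.
// The rid values ("f", "h", "q") and bitrate/scale numbers are examples only.
const videoSendEncodings = [
  { rid: "f", maxBitrate: 900000 },                             // full resolution
  { rid: "h", maxBitrate: 300000, scaleResolutionDownBy: 2.0 }, // half resolution
  { rid: "q", maxBitrate: 100000, scaleResolutionDownBy: 4.0 }, // quarter resolution
];

function offerSimulcast(pc, track) {
  // Assumes a live RTCPeerConnection `pc` and a camera `track`.
  // Under this PR's wording, the simulcast interpretation of
  // sendEncodings applies to video transceivers only.
  return pc.addTransceiver(track, {
    direction: "sendonly",
    sendEncodings: videoSendEncodings,
  });
}
```

The encodings are listed once at transceiver creation; per JSEP, the offered simulcast layers are then negotiated in the SDP.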
So outside of WebRTC, we do simulcast for audio too. Why would we limit it to video?
@fluffy Because of costs of implementing, testing and supporting any feature? Not to mention that features without strong use-cases driving them tend to end up broken anyway. I think the question must be flipped around: Why wouldn't simulcast be limited to video?
I'm fine with it being optional for browsers to implement, but I don't think it should be forbidden. Many people in Africa are working across satellite links and often can't use anything except very low bit rate audio, while people on thick pipes want high bandwidth. If you want to do this in a way where the media is encrypted, some clients need to send a high- and a low-bandwidth version of the audio. The cloud can then forward the appropriate one to the people listening.
Sure, it's a trade-off. It seems like a good thing for the WG to reach consensus on, not just something the editors change. Perhaps that happened and I just don't recall. There is also a big difference between requiring browsers to do this and forbidding them to do it. I would be against saying this is forbidden, as it turns out to be really easy to do and often useful.
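The high/low-bandwidth audio scenario described above could be sketched as follows. This is a hypothetical illustration of the use case, not standardized behavior: under this PR's wording, multiple `sendEncodings` would offer simulcast for video only, so a browser would not be required to honor a second audio encoding. All names and bitrates are assumed for the example.

```javascript
// Sketch of the encrypted high/low-bandwidth audio scenario:
// send two versions of the same audio so a forwarding server can
// relay the appropriate one without decrypting the media.
// Hypothetical values; not guaranteed to produce simulcast audio.
const audioSendEncodings = [
  { rid: "hi", maxBitrate: 128000 }, // high quality for well-connected listeners
  { rid: "lo", maxBitrate: 8000 },   // very low bit rate for satellite links
];

function offerAudio(pc, micTrack) {
  // Assumes a live RTCPeerConnection `pc` and a microphone track.
  return pc.addTransceiver(micTrack, {
    direction: "sendonly",
    sendEncodings: audioSendEncodings,
  });
}
```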
I'm closing this PR until discussion concludes in #1813.
Potential fix for #1813.