Question about DRED #326
I just tried out DRED in a live audio application I am working on and it's working great.
However, a few questions came up:
Opus is trying to optimize for whatever packet loss rate you set with OPUS_SET_PACKET_LOSS_PERC. If you tell it to optimize for the no-loss (0%) case, then there's no point in wasting bits for DRED. The exact loss you specify will influence the amount of DRED used. If in doubt, you can always try with 20% and see from there.
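For reference, a minimal sketch of setting the expected loss rate on the encoder; the 48 kHz mono VoIP configuration here is just an assumed example, not something specified in this thread:

```c
#include <opus.h>

int err;
OpusEncoder *enc = opus_encoder_create(48000, 1, OPUS_APPLICATION_VOIP, &err);

/* Tell the encoder to expect roughly 20% packet loss.  With the default
 * of 0% the encoder assumes a lossless channel and will not spend any
 * bits on redundancy (LBRR or DRED). */
opus_encoder_ctl(enc, OPUS_SET_PACKET_LOSS_PERC(20));
```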
Thanks a lot, that explains it. From my previous understanding, OPUS_SET_PACKET_LOSS_PERC was only relevant for LBRR, so that hadn't been obvious to me from the documentation.

Regarding my third question, the docs state: "OPUS_SET_DRED_DURATION: If non-zero, enables Deep Redundancy (DRED) and use the specified maximum number of 10-ms redundant frames." Do I interpret it correctly that if I set this value to 50, I get 500 ms of DRED data per frame, even if I choose to encode frames of 20 ms or 5 ms length? I also noticed from experimenting that the amount of DRED data (in milliseconds) available is limited by the bitrate, is that correct?
OPUS_SET_DRED_DURATION is always in units of 10 ms, but things are more complicated underneath anyway. And it specifies the maximum the application allows. There may be less than that due to available bitrate.
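As a sketch of how these two controls interact, reusing the encoder from the sketch above and assuming a libopus build with DRED support enabled (1.5 or later):

```c
/* Allow up to 50 x 10 ms = 500 ms of DRED per packet, independent of the
 * 5/10/20 ms frame size used for the regular encoding.  This is only an
 * upper bound the application grants; the encoder may include less
 * redundancy if the bitrate cannot afford it. */
opus_encoder_ctl(enc, OPUS_SET_DRED_DURATION(50));

/* DRED is only produced when the encoder expects packet loss. */
opus_encoder_ctl(enc, OPUS_SET_PACKET_LOSS_PERC(20));
```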
Thanks, that's all I needed to know!