BOLA's Utility Issue #4301

Closed
GreenLv opened this issue Oct 20, 2023 · 2 comments

GreenLv commented Oct 20, 2023

I am studying the logic of BOLA in dash.js (latest version) and I have found a potential issue with the parameters used in BOLA's decision logic. Please correct me if I am mistaken.

Parameter Calculation

I have noticed that BOLA's actual implementation differs slightly from [the original paper (INFOCOM '16)](https://ieeexplore.ieee.org/document/7524428) regarding its parameters Vp and gp. Specifically, these parameters are now calculated by the following code (lines 102-108 in [BolaRule.js](https://github.com/Dash-Industry-Forum/dash.js/blob/development/src/streaming/rules/abr/BolaRule.js)):

// If using Math.log utilities, we can choose Vp and gp to always prefer bitrates[0] at minimumBufferS and bitrates[max] at bufferTarget.
// (Vp * (utility + gp) - bufferLevel) / bitrate has the maxima described when:
// Vp * (utilities[0] + gp - 1) === minimumBufferS and Vp * (utilities[max] + gp - 1) === bufferTarget
// giving:
const gp = (utilities[highestUtilityIndex] - 1) / (bufferTime / MINIMUM_BUFFER_S - 1);
const Vp = MINIMUM_BUFFER_S / gp;
// note that expressions for gp and Vp assume utilities[0] === 1, which is true because of normalization
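
To make these two expressions concrete, here is a small, self-contained sketch (not the dash.js source) that computes gp and Vp. The bitrate ladder, MINIMUM_BUFFER_S = 10 and bufferTime = 30 are assumed example values I picked for illustration, and the Math.log utilities follow the hint in the comment quoted above:

// Standalone sketch: compute BOLA's Vp and gp from assumed example values.
const MINIMUM_BUFFER_S = 10;                             // assumed, in seconds
const bufferTime = 30;                                   // assumed buffer target, in seconds
const bitrates = [1000, 2500, 5000, 8000];               // assumed ladder, in kbps

let utilities = bitrates.map(b => Math.log(b));          // log utilities, as in the paper
utilities = utilities.map(u => u - utilities[0] + 1);    // normalize so utilities[0] === 1

const highestUtilityIndex = utilities.length - 1;
const gp = (utilities[highestUtilityIndex] - 1) / (bufferTime / MINIMUM_BUFFER_S - 1);
const Vp = MINIMUM_BUFFER_S / gp;

console.log(gp.toFixed(3), Vp.toFixed(3)); // 1.040 9.618
// utilities[max] - 1 = ln(8000/1000) ≈ 2.079, so gp ≈ 2.079 / 2 ≈ 1.040 and
// Vp ≈ 10 / 1.040 ≈ 9.618; Vp * gp equals MINIMUM_BUFFER_S by construction
// (up to floating-point rounding).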

Parameter Setting Principle

To my understanding, this implementation aims to ensure that BOLA selects the lowest bitrate level (bitrates[0]) at the buffer level minimumBufferS, and the highest bitrate level (bitrates[max]) at the buffer level bufferTarget, which is similar to the logic of [BBA (SIGCOMM '14)](https://dl.acm.org/doi/10.1145/2619239.2626296).
Let $M$ denote the highest bitrate level (equal to max), $B_{min}$ denote minimumBufferS (corresponding to $Q_{min} = B_{min}/p$ chunks, where $p$ is the chunk duration), and $B_{max}$ denote bufferTarget (corresponding to $Q_{max} = B_{max}/p$ chunks).
Following the notation of the BOLA paper, let $v_m$ and $S_m$ denote the original utility value and chunk size of bitrate level $m \in \{0, 1, \dots, M\}$, respectively. The core idea of BOLA is to maximize $(V(v_m + gp) - Q(t_k)) / S_m$, where $gp$ corresponds to $\gamma p$ in the paper and $Q(t_k) \cdot p$ is the current buffer level.
To select bitrates[0] at $B_{min}$, the following equation should hold: $(V(v_0 + gp) - Q_{min}) / S_0 = 0$, i.e., $V(v_0 + gp) = Q_{min} = B_{min}/p$. Multiplying both sides by $p$, we further have $Vp(v_0 + gp) = B_{min}$. Note that $v_0 = 0$.
In the dash.js implementation, however, the utility values are normalized so that the lowest one equals one, as follows (line 118 in [BolaRule.js](https://github.com/Dash-Industry-Forum/dash.js/blob/development/src/streaming/rules/abr/BolaRule.js)):

utilities = utilities.map(u => u - utilities[0] + 1); // normalize

Denote the normalized utility value at bitrate level $m$ as $u_m=v_m+1$. Thus, $Vp(u_0 - 1 + gp) = B_{min}$. Similarly, selecting bitrates[max] at $B_{max}$ requires: $Vp(u_M-1+gp)=B_{max}$. Given that $u_0 - 1 = 0$, we have: $Vp = B_{min} / gp$ and $gp = (u_M -1) / (B_{max} / B_{min}-1)$. These equations are consistent with BOLA's code above.
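
For completeness, here is the algebra behind that last step spelled out (same symbols as above; this just restates the two boundary conditions):

$$
\begin{aligned}
Vp\,(u_0 - 1 + gp) = B_{min} \;&\Rightarrow\; Vp \cdot gp = B_{min} \quad (\text{since } u_0 = 1),\\
Vp\,(u_M - 1 + gp) = B_{max} \;&\Rightarrow\; Vp\,(u_M - 1) = B_{max} - B_{min},
\end{aligned}
$$

so $Vp = (B_{max} - B_{min}) / (u_M - 1)$ and $gp = B_{min} / Vp = (u_M - 1) / (B_{max}/B_{min} - 1)$, which matches the two lines of code quoted in the first section.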

Decision Logic Issue

The real issue is this: while BOLA accounts for the normalization when calculating its parameters (Vp and gp), it plugs the normalized utilities directly into the paper's original selection formula when choosing a bitrate. BOLA's decision logic is implemented in the following code (lines 188-201 in [BolaRule.js](https://github.com/Dash-Industry-Forum/dash.js/blob/development/src/streaming/rules/abr/BolaRule.js)):

// The core idea of BOLA.
function getQualityFromBufferLevel(bolaState, bufferLevel) {
    const bitrateCount = bolaState.bitrates.length;
    let quality = NaN;
    let score = NaN;
    for (let i = 0; i < bitrateCount; ++i) {
        let s = (bolaState.Vp * (bolaState.utilities[i] + bolaState.gp) - bufferLevel) / bolaState.bitrates[i];
        if (isNaN(score) || s >= score) {
            score = s;
            quality = i;
        }
    }
    return quality;
}

Note that BOLA's code uses exactly the same equation (based on $v_m$) as its paper to calculate s. However, since bolaState.utilities[i] is actually the normalized value (i.e., $u_m = v_m + 1$), shouldn't one be subtracted from it to stay consistent with the original equation? Specifically, the decision logic should maximize $(Vp(u_m - 1 + gp) - B(t_k)) / (S_m / p)$ instead of $(Vp(u_m + gp) - B(t_k)) / (S_m / p)$, where $S_m / p$ is the bitrate of level $m$ and $B(t_k) = Q(t_k) \cdot p$ is the current buffer level in seconds. As a result, the code should be:

        let s = (bolaState.Vp * (bolaState.utilities[i] - 1 + bolaState.gp) - bufferLevel) / bolaState.bitrates[i];
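
To illustrate numerically that the missing "- 1" changes actual decisions (not just a constant offset shared by all levels), here is a minimal standalone sketch. The bitrate ladder and the two buffer parameters are assumed example values, and the selection function is re-implemented here with a flag rather than taken from dash.js:

// Standalone sketch: compare the current and the proposed score formulas.
// All numbers below are assumed example values, not taken from this issue.
const MINIMUM_BUFFER_S = 10;                             // seconds
const bufferTarget = 30;                                 // seconds
const bitrates = [1000, 2500, 5000, 8000];               // kbps, assumed ladder
let utilities = bitrates.map(b => Math.log(b));          // log utilities
utilities = utilities.map(u => u - utilities[0] + 1);    // normalize, utilities[0] === 1

const gp = (utilities[utilities.length - 1] - 1) / (bufferTarget / MINIMUM_BUFFER_S - 1);
const Vp = MINIMUM_BUFFER_S / gp;

// subtractOne === true corresponds to the proposed "- 1" fix above.
function qualityFromBuffer(bufferLevel, subtractOne) {
    let quality = NaN;
    let score = NaN;
    for (let i = 0; i < bitrates.length; ++i) {
        const u = subtractOne ? utilities[i] - 1 : utilities[i];
        const s = (Vp * (u + gp) - bufferLevel) / bitrates[i];
        if (isNaN(score) || s >= score) {
            score = s;
            quality = i;
        }
    }
    return quality;
}

// With this assumed ladder the two formulas disagree, e.g. at a 15 s buffer:
console.log(qualityFromBuffer(15, false)); // 1 (current code)
console.log(qualityFromBuffer(15, true));  // 2 (with the proposed "- 1")
// The "- 1" removes the same constant Vp from every numerator, but the
// denominators (bitrates) differ, so the buffer levels at which BOLA switches
// up are shifted and the fix changes real decisions, not just a common offset.
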
GreenLv added the Bug label on Oct 20, 2023

GreenLv commented Oct 25, 2023

@spiterikevin


dsilhavy commented Aug 1, 2024

It sounds reasonable to me to subtract the value that was used to normalize the utility values. I added the proposed fix to #4536. However, I am not an author of the BOLA algorithm. It would still be nice to get feedback from the original authors and developers.

dsilhavy closed this as completed on Aug 1, 2024