
Better Handling for Failed/Timed Out Requests from Tidal #29

Open
cleonyc opened this issue Jan 19, 2023 · 7 comments
Labels: enhancement (New feature or request)


cleonyc commented Jan 19, 2023

When a "Getting Track Details" task times out or fails, it just gets stuck and produces no feedback for the user. This results in a confusing failure that only becomes apparent once you look at the files being downloaded. It might be a good idea to implement a timeout and retry for this step; if that's not feasible, then at least showing some feedback and (maybe, let me know if this is a good idea) stopping the download entirely might work best.
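As a sketch of the kind of timeout-and-retry wrapper suggested above (a hypothetical helper with made-up names, not the project's actual code; a real HTTP client would also need per-request timeouts):

```rust
use std::thread;
use std::time::Duration;

/// Hypothetical retry helper: call `op` up to `max_attempts` times,
/// sleeping `delay` between failed attempts. Purely illustrative.
fn retry<T, E>(
    max_attempts: u32,
    delay: Duration,
    mut op: impl FnMut() -> Result<T, E>,
) -> Result<T, E> {
    let mut attempt = 1;
    loop {
        match op() {
            Ok(value) => return Ok(value),
            // Out of attempts: surface the last error to the caller
            // instead of silently hanging.
            Err(err) if attempt >= max_attempts => return Err(err),
            Err(_) => {
                attempt += 1;
                thread::sleep(delay);
            }
        }
    }
}
```

A real implementation would also report each failed attempt to the UI rather than retrying silently, which is the feedback gap this issue describes.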

@cleonyc cleonyc changed the title Better Handling for Timed Out Requests from Tidal Better Handling for Failed/Timed Out Requests from Tidal Jan 19, 2023
MinisculeGirraffe (Owner) commented Jan 20, 2023

Internally, at least, there's already a retry with exponential backoff for API requests. When it's stuck on "Getting Track Details", that's this retry process at work. But to your point, it doesn't display any user feedback, and it probably should.

I think the ideal solution is that we need to track the number of API requests we can make before hitting the rate limit, and only initiate a call if it wouldn't trigger one.

If making a new request would cause a rate limit, we sleep that specific task until it would succeed, and display a message explaining what's going on and suggesting that the user lower the concurrency if it keeps happening.

This weekend I should have some free time, and I'll try to implement this and a few other issues I've been putting off.
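The gating described above could look something like a sliding-window limiter. This is a minimal sketch with made-up names and structure, not the project's actual code:

```rust
use std::time::{Duration, Instant};

/// Hypothetical sliding-window limiter: at most `capacity` requests
/// per `window`. Names and structure are illustrative only.
struct RateLimiter {
    capacity: usize,
    window: Duration,
    timestamps: Vec<Instant>, // start times of requests inside the window
}

impl RateLimiter {
    fn new(capacity: usize, window: Duration) -> Self {
        Self { capacity, window, timestamps: Vec::new() }
    }

    /// How long the calling task should sleep before its request is safe.
    /// Returns Duration::ZERO (and records the request) if it can go now.
    fn delay_until_allowed(&mut self, now: Instant) -> Duration {
        let window = self.window;
        // Drop request timestamps that have aged out of the window.
        self.timestamps.retain(|&t| now.duration_since(t) < window);
        if self.timestamps.len() < self.capacity {
            self.timestamps.push(now);
            Duration::ZERO
        } else {
            // The oldest in-window request is the first to expire.
            self.window - now.duration_since(self.timestamps[0])
        }
    }
}
```

A task that receives a nonzero delay would sleep for that long while the UI shows something like "waiting for rate limit; lower concurrency if this keeps happening", instead of silently spinning on retries.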

MinisculeGirraffe (Owner) commented Jan 20, 2023

The retry values for the API requests are found here.

They aren't user-configurable, though, and they probably should be. Another issue with using the middleware system is that it's entirely transparent to the caller: when getting the details of a track, the caller has no idea whether one request was made or five. It just resolves to an error once the retry policy has been exhausted.

For a quick and dirty fix, you can modify the `max_n_retries` value to be significantly higher and set `backoff_exponent` to 1.

You'll send more HTTP requests overall, but given a high enough value, they'll all eventually go through without failure.

@MinisculeGirraffe MinisculeGirraffe self-assigned this Jan 20, 2023
@MinisculeGirraffe MinisculeGirraffe added the enhancement New feature or request label Jan 20, 2023
cleonyc (Author) commented Jan 21, 2023

Went ahead and looked at the debug logs; it seems like almost all of the errors coming from Tidal when downloading from https://listen.tidal.com/artist/3565245 are 429s (Too Many Requests). These settings fixed the failed downloads, but as you said, it's not ideal, since there is little feedback for the user.

    let retry_policy = ExponentialBackoff {
        max_n_retries: 100,
        max_retry_interval: std::time::Duration::from_millis(10000), // doesn't really matter
        min_retry_interval: std::time::Duration::from_millis(1000),
        backoff_exponent: 1,
    };

MinisculeGirraffe (Owner) commented Jan 21, 2023

Yeah, it needs a rework. Tonight and tomorrow I'm going to work on implementing:

  • User-configurable retry settings with better default values.
  • Visible feedback when a retry occurs.
  • Displaying a report once all the download tasks have completed, showing any failures and other useful information like total size, number of downloads, etc.
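The end-of-run report from the last bullet could be as simple as an accumulator along these lines (a hypothetical sketch; the type and field names are not from the project):

```rust
/// Hypothetical summary of a finished download session; names are
/// illustrative, not taken from the project's actual code.
#[derive(Default)]
struct DownloadReport {
    succeeded: u32,
    failed: Vec<String>, // tracks that exhausted their retries
    total_bytes: u64,
}

impl DownloadReport {
    fn record_success(&mut self, bytes: u64) {
        self.succeeded += 1;
        self.total_bytes += bytes;
    }

    fn record_failure(&mut self, track: &str) {
        self.failed.push(track.to_string());
    }

    /// One-line summary printed once all download tasks finish.
    fn summary(&self) -> String {
        format!(
            "{} downloaded ({} bytes), {} failed",
            self.succeeded,
            self.total_bytes,
            self.failed.len()
        )
    }
}
```

Each worker task would call `record_success`/`record_failure` as it finishes, and the summary (plus the list of failed tracks) would be printed at the end instead of the failures only being discoverable by inspecting the downloaded files.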

popeye2468 commented

How do I implement these settings, please?

cleonyc (Author) commented Sep 20, 2023

Hey, do you want any help implementing this? I've been really busy recently, but next weekend and the weekend after I might be able to write something. Is there anything you've figured out, or anything I should keep in mind?

cleonyc (Author) commented Nov 5, 2023

@MinisculeGirraffe Just in case you missed it, see above.

No branches or pull requests

3 participants