Download entire shows in a Zip file #347
Possible implementations:
Playlists/shows as zipfiles were a feature in the initial release, but I dropped it because it was inefficient and slow (it was synchronous) and used infrequently. It could be re-added as an async job, but that would require adding Sidekiq or other background task processing, which would in turn require Redis. Given these requirements, I don't think it belongs on the primary roadmap at this time.

Downloading tracks in batches could easily be done via the API, in parallel, obviating the need for zipping/unzipping. If someone wants to take a crack at a browser-based track download feature, I'd be open to that, but at this point I'm hesitant to add it as a backend feature. A frontend implementation would likely be easier using React, which is started at #325.

It's worth noting that CloudFlare does a ton of heavy lifting as far as caching MP3s at the edge, and for free. This is basically what allows phish.in to keep running; otherwise it would get crushed by traffic, or someone would have to pay a big S3/CloudFront bill for file transfers. This edge cache (accessing MP3s directly by their URL) should be utilized as much as possible by all client requests, whether the client is interested in a single file or many.
A "download all" button which effectively just triggered downloads of all individual tracks separately would be just fine.
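The browser-based approach suggested above could be sketched roughly as follows. This is a hypothetical illustration, not phish.in's actual code: the shape of the show object (`tracks` array with an `mp3` URL field) is an assumption, and the real API response may differ. Each track is fetched directly by its MP3 URL, so requests hit the CloudFlare edge cache rather than the backend:

```javascript
// Extract the per-track MP3 URLs from a show object.
// ASSUMPTION: the show looks like { tracks: [{ title, mp3 }, ...] };
// adjust to match the actual API response shape.
function trackUrls(show) {
  return show.tracks.map((track) => track.mp3);
}

// Trigger a separate browser download for each track by creating
// temporary anchor elements with the `download` attribute set.
// No zipping is needed; the browser saves each file individually.
function downloadAll(show) {
  for (const url of trackUrls(show)) {
    const a = document.createElement("a");
    a.href = url;
    a.download = ""; // hint to save the file rather than navigate
    document.body.appendChild(a);
    a.click();
    a.remove();
  }
}
```

Note that browsers may prompt before allowing multiple automatic downloads from one click, and cross-origin URLs can ignore the `download` attribute, so a production version might instead fetch each file and save it via a Blob URL.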
I prefer to download locally instead of streaming.
It's possible to do this track by track.
It'd be more convenient to be able to download an entire show or set as a Zip file with one click.