What's the problem this feature will solve?
When browsing a directory, for example one with a large number of photos or with source code for an application, I would like to download all the files in the directory. This is currently either a slow manual process of clicking each file in turn in a web browser, or requires a tool or script.
Describe the solution you'd like
On directory listings, I would like a link to download an archive of the contents of the directory. The archive could, for example, be a zip file or a (plain or gzipped) tar file. The link could function similarly to the Download ZIP link on GitHub repositories, which builds the archive gradually and streams it to the client so the download starts immediately.
Alternative Solutions
I can create an archive before using http-server and download that, but it requires me to know upfront which directories I would like to download the contents of, and for large directories, it's slow to have to package all of the contents before starting the download.
Client-side tools that can solve this are, for example, wget with the --recursive flag, but that's not always available, especially not on mobile devices. I have not found an easy way to do this in normal desktop and mobile browsers.
Additional context
I believe this can be implemented quite easily using a package like archiver (streaming file creation and support for zip and tar formats) or more directly with support for just zip files using its dependency zip-stream. I'm happy to give this a go and see if I can whip up a draft pull request, assuming the extra dependency and resultant package size increase is acceptable.
I would suggest that the zip archive for a folder /path/to/folder/ be made available at /path/to/folder.zip. If a file with that path already exists, the functionality could be disabled for that directory, so as not to change existing behavior. Alternatively, the archive could be available at /path/to/folder/.zip, which would reduce the risk of collisions.
If the functionality is implemented in such a non-disruptive way, I would also suggest that it be enabled by default when directory listings are enabled, and that it can be disabled with a command-line option. Alternatively, it could be disabled by default.