
Writing the same zip file from 0 bytes multiple times #64

@sumtec

Description


When a large number of files from different directories is passed as the path argument, Compress-Archive appears to rewrite the destination file over and over within a single execution.

For example, there are lots of directories under C:\Logs, like:
C:\Logs\1
C:\Logs\1\This
C:\Logs\1\That
C:\Logs\2
C:\Logs\2\This
C:\Logs\2\That
C:\Logs\3
C:\Logs\3\This
C:\Logs\3\That
...

Each of these directories contains many files. With roughly 10k files totaling 1 GB, the output zip file's reported size fluctuates like this while the cmdlet runs:

...
103MB
15MB
75MB
121MB
0MB
30MB
147MB
26MB
...

It seems obvious to me that it is compressing in the following way:

  1. Compress some of the files from one directory and save them into a zip file.
  2. Open the zip file, read it back out, delete or truncate the old file, and repeat step 1.

This takes an unacceptably long time to finish. Here is the script I used:
$files = ls "C:\Logs*" -Recurse | where { $_.Name -notlike '*.cab'} | select -ExpandProperty FullName

Compress-Archive $files -DestinationPath "C:\Compressed\Logs.zip" -CompressionLevel Optimal
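The rewrite pattern described in steps 1 and 2 can be sketched with Python's zipfile module. This is only an illustration of the suspected behavior, not the cmdlet's actual implementation: rebuilding the archive from zero bytes for every batch of files makes the total work grow quadratically with the number of files, whereas opening the archive in append mode touches each file only once.

```python
import zipfile

def rewrite_each_batch(batches, dest):
    # Suspected pattern: recreate the archive from zero bytes for every
    # batch, re-compressing everything seen so far each time (quadratic work).
    seen = []
    for batch in batches:
        seen.extend(batch)
        with zipfile.ZipFile(dest, "w") as zf:  # "w" truncates the file
            for name, data in seen:
                zf.writestr(name, data)

def append_each_batch(batches, dest):
    # Incremental alternative: append new entries to the existing archive,
    # so each file is compressed exactly once (linear work).
    for batch in batches:
        with zipfile.ZipFile(dest, "a") as zf:  # "a" appends entries
            for name, data in batch:
                zf.writestr(name, data)
```

Both functions produce an archive with the same entries; the difference is only in how much compression work is repeated along the way.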
