Parallel Decompression #14
Comments
Yes, it's possible; the code is already there for compression: https://github.com/tritonas00/system-tar-and-restore/blob/master/star.sh#L529. I will look into it when I have my laptop back.
I did a quick patch to enable this feature for pbzip2.
No need to add extra statements in read_archive. Also, I think you need -d: https://linux.die.net/man/1/pbzip2
I took your suggestion. As for the -d: from what I read, when passing a program via -I, the program itself simply has to support the -d option.
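The point above can be demonstrated with a small, self-contained sketch (not the project's actual code): GNU tar's -I / --use-compress-program runs the given program as a filter and appends -d on extraction, so any bzip2-compatible tool works. The script below prefers pbzip2 when installed and falls back to plain bzip2, since both accept -d:

```shell
#!/bin/sh
# Sketch: route tar through pbzip2 (parallel) or bzip2 via -I.
# On extraction, tar invokes the program with -d automatically.
set -e
BZ=$(command -v pbzip2 || command -v bzip2)   # prefer the parallel tool

mkdir -p demo && printf 'hello\n' > demo/file.txt
tar -cf archive.tar.bz2 -I "$BZ" demo         # compress through $BZ
rm -r demo
tar -xf archive.tar.bz2 -I "$BZ"              # tar runs "$BZ -d" here
cat demo/file.txt
```

The same -I invocation therefore covers both directions, which is why no extra decompression branch is needed in read_archive.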
Yes, like that, but the statement must be before https://github.com/tritonas00/system-tar-and-restore/blob/master/star.sh#L764, not after. When I have my laptop I will extend it to all compressors and add threads as well.
The diff shows that after line 763 of the first file (the original) it adds the lines from the second file (the one I modified), which puts them above line 764. Other than that, I think it paves the way for other compressors. (I have another modification that adds pixz support.)
You are right, I didn't look carefully.
Exactly, but please check that the non-parallel path still works. Fork, add your code, and create a pull request. Thank you for your contribution!
I can confirm that without the -M option bzip2 is used, and with the -M option pbzip2 is used.
#16 merged; pbzip2 decompression is now done, and you can use -z as well to define the maximum number of threads. Any idea of parallel decompressors for xz and gzip as well?
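On the question of other compressors: pigz is the usual parallel replacement for gzip (though its decompression is only partly parallel), and pixz handles xz in parallel. A hypothetical sketch of how a script like star.sh might pick a decompressor by extension and cap threads — the `archive`, `prog`, and `THREADS` names here are illustrative, not the project's actual variables:

```shell
#!/bin/sh
# Hypothetical dispatch: choose a parallel decompressor per archive
# extension and pass a thread limit where the tool supports one.
THREADS=${THREADS:-4}
archive="backup.tar.bz2"          # illustrative archive name

case "$archive" in
  *.bz2) prog="pbzip2 -p${THREADS}" ;;  # pbzip2 takes -p<n>
  *.gz)  prog="pigz -p ${THREADS}" ;;   # pigz takes -p <n>
  *.xz)  prog="pixz" ;;                 # pixz parallelizes by default
  *)     prog="cat" ;;                  # uncompressed tar
esac

echo "$prog"   # this string would be handed to tar via -I
```

The resulting string would be passed to tar as `-I "$prog"`, the same mechanism already used for pbzip2.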
I noticed when restoring a system that a significant amount of time was spent checking the archive,
and wondered whether it could be quicker with parallel decompression; I know that pbzip2 supports it.
tar supports a -I option to use other programs such as pbzip2,
so possibly the -M --multi-core option could be reused for decompression as well (Restore mode).