Tar directory depth #54
To follow up on this after some more experimentation: it's not (directly) related to the directory depth, but to the full file path length exceeding 100 characters. The Wikipedia page describes the UStar format, which addresses the file-name limit (http://en.wikipedia.org/wiki/Tar_(computing)#Format_details). I guess that isn't supported in archiver?
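For context, the UStar format gets past 100 characters because the header carries a 100-byte `name` field plus a 155-byte `prefix` field, and readers rejoin them as `prefix + '/' + name`. A minimal sketch of how a writer could split a long path, assuming Node; `splitUstarPath` is a hypothetical helper, not archiver's API:

```javascript
// Split a full path into the USTAR header's prefix/name fields.
// name must fit in 100 bytes, prefix in 155, and the split must
// land on a '/' (the slash itself is not stored).
function splitUstarPath(fullPath) {
  if (fullPath.length <= 100) {
    return { prefix: '', name: fullPath };
  }
  // A slash at index i gives name = fullPath.slice(i + 1), so we need
  // i >= fullPath.length - 101 (name <= 100) and i <= 155 (prefix <= 155).
  const lo = Math.max(0, fullPath.length - 101);
  const hi = Math.min(155, fullPath.length - 1);
  for (let i = lo; i <= hi; i++) {
    if (fullPath[i] === '/') {
      return { prefix: fullPath.slice(0, i), name: fullPath.slice(i + 1) };
    }
  }
  throw new Error('Path cannot be stored in a USTAR header: ' + fullPath);
}
```

Paths with no suitable slash (e.g. a single 120-character file name) cannot be represented even in UStar, which is why later formats (GNU long names, pax) exist.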
@ewen i will be adding prefix support to tar in the next version, or possibly a patch release. lots of work on matching header specs lately to support parsing archives into archiver.
Cool, thanks a lot Chris.
@ewen archiver
Tested and working! Thanks. |
Hi,
I'm trying to create a tarball of a src tree using this grunt plugin.
My config looks something like:
```js
compress: {
  main: {
    options: {
      archive: 'archive.tar',
      mode: 'tar'
    },
    files: [
      {expand: true, cwd: 'build/tar/', src: ['**'], dest: '/project/'}
    ]
  }
}
```
And the tar is successfully created, but when I list the files with tar -tf archive.tar I get the following errors that look like:
:Archive contains
0' where numeric mode_t value expected tar: Archive contains
0' where numeric mode_t value expectedtar: Exiting with failure status due to previous errors
After some experimentation it seems to be related to the maximum depth of the tar contents: if entries are more than 12 levels deep I get the errors; with fewer levels I don't.
The tar file does seem to be extractable despite the errors, although I haven't exhaustively checked that all the files are present.
Any ideas on the cause would be much appreciated. Perhaps this is better asked in the node-archiver project?
Thanks!