32 bit git-lfs fails silently for large files #3570
Can you provide the version of Git LFS that you're using and the OS you're using it on? (And ideally the full output of …)

To my knowledge, we fixed the only known large-file issue for 32-bit systems in Git LFS 2.7.0 with #3426. I'd definitely be interested in knowing more about what's going on if you're using that version or newer and still seeing problems.

It is known that 32-bit Git versions may have issues with checking out large files, because they're checked out entirely into memory. On Windows, there's also an additional issue with Git itself, which is documented thoroughly in #2434 along with workarounds. These issues aren't things that we as the Git LFS project can control.

If you're not seeing any progress, even from the command line, can you report that as a new issue? We'd like to track that independently of this one.
Sure, here we are:
I'm aware of the problems with Git checking out files in memory; I posted a separate issue for that (#3559). So we are running with the --skip-smudge filter setting.

Looking at the issue from the outside, it looks as though the "clean" operation fails. Ordinarily it ought to recognize a pointer file and pass it through unchanged, but probably because it cannot parse the 'size' entry (it may be trying to fit the value into 32 bits), it doesn't recognize the contents as a pointer file and creates new output instead.

A separate thing is that git-lfs doesn't even attempt to download the large files, but that could be related to the above, if it is scanning the repository and parsing pointer files to get at the LFS object hashes. I'll file the progress issue separately.
Yeah, your assessment is spot on. That's exactly the issue I fixed in #3426, and using 2.7.0 or newer should automatically fix your problems, whether the build is 32-bit or 64-bit: we were parsing the integer as a native-size integer instead of a 64-bit integer. If you want to upgrade, I recommend 2.7.1 over 2.7.0, as it includes a fix for error handling during network failures.
OK, thanks for the explanation. Closing this then, since it's a known issue.
Tools such as Atlassian Sourcetree bundle an embedded git/git-lfs combo, and the bundled version is 32-bit.
When it encounters files larger than 4 GB (or perhaps even 2 GB), it won't download them from the server.
Also, git status will report the files to be modified, since (from what I am able to deduce) the 'clean' filter doesn't recognise the pointers, and instead interprets them as new, short, files.
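For context, a Git LFS pointer file is just three short text lines per the LFS pointer spec; the oid and size below are made-up example values for a ~5 GB object, not a real hash:

```
version https://git-lfs.github.com/spec/v1
oid sha256:e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855
size 5368709120
```

If the clean filter cannot parse the size line as a 64-bit value, it fails to recognize content like this as a pointer and instead treats it as a new, short file.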
This probably wouldn't be so bad if it weren't completely silent: neither git lfs fetch nor git lfs checkout reports any errors.
I finally resolved this issue by installing a proper 64 bit build of git/git-lfs.
(Also, git-lfs fetch doesn't show any progress while it is downloading our huge files from the server, which is a bit of a drag.)