-
@ysebyy have you tried using a filter to limit the size of the blobs, just like `--filter=blob:limit=` would?
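As a sketch of what a blob-size filter does at the CLI level (whether go-git itself exposes partial-clone filters is exactly what is in question here; the fixture repo and paths below are purely illustrative):

```shell
set -e
tmp=$(mktemp -d)
# Fixture repo: one tiny file and one ~100 kB file
git init -q "$tmp/src"
printf 'small' > "$tmp/src/small.txt"
head -c 100000 /dev/zero > "$tmp/src/big.bin"
git -C "$tmp/src" add .
git -C "$tmp/src" -c user.email=a@b -c user.name=a commit -q -m files
# The serving side must opt in to partial-clone filters
git -C "$tmp/src" config uploadpack.allowFilter true
# Shallow, blob-size-limited clone; --no-checkout avoids the lazy
# fetch that populating the worktree would otherwise trigger
git clone -q --no-checkout --depth 1 --filter=blob:limit=1k \
  "file://$tmp/src" "$tmp/dst"
# Count the blobs actually transferred: the 100 kB blob should have
# been omitted, leaving only small.txt's blob
git -C "$tmp/dst" cat-file --batch-all-objects --batch-check | grep -c blob
```

Filtered-out blobs are fetched on demand the first time something needs them, so this only helps when the large files are never checked out or read.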
-
Hey!
We have been using go-git in our projects for a while (one of the projects is available here: https://github.com/vinted/sbomsftw).
However, we ran into an issue. We run git clone across our organization, and a few repositories contain Jupyter notebook applications. These have pretty hefty file sizes, and during the clone we see a sizeable spike in memory/CPU usage (on Kubernetes).
We can't seem to find an option to clone only the root directory without sub-directories, or to skip specific files, or anything of that sort. Is there anything we could use, or is making exceptions for these repos the only way?
We have currently limited the clone depth to 1.
I have also looked for an LFS option, but it seems it is not available due to licensing.
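For reference, the effect of the depth limit mentioned above can be reproduced with stock git; a minimal sketch against a throwaway fixture repo (the paths and commit messages are illustrative):

```shell
set -e
tmp=$(mktemp -d)
# Fixture repo with two commits
git init -q "$tmp/src"
git -C "$tmp/src" -c user.email=a@b -c user.name=a commit -q --allow-empty -m one
git -C "$tmp/src" -c user.email=a@b -c user.name=a commit -q --allow-empty -m two
# Shallow clone: only the most recent commit per ref is fetched
git clone -q --depth 1 "file://$tmp/src" "$tmp/dst"
git -C "$tmp/dst" rev-list --count HEAD   # prints 1
```

Note that a depth limit truncates history but still transfers every file in the latest commit, which is why large notebooks continue to hurt even at depth 1.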