Over time, the documents inside the taxonomy repo might grow quite large. Cloning a massive repository would be an impediment to first-time contributors, so having all the documents download as part of the clone might not be a good idea.

There was a decision to require contributors submitting knowledge to submit the documents via git-lfs.

How does requiring LFS impact the contributor experience, on both the command line and the GitHub UI? For both:

- contributors using git clients
- contributors using the GitHub UI
Git LFS storage on GitHub is a paid feature once you exceed 1 GiB of storage or 1 GiB per month of bandwidth. Notably, forks count against the parent repository's quota for both storage and bandwidth. Additional data packs cost $5 per month each for 50 GiB of bandwidth and 50 GiB of storage.
Contributors will not need to install LFS to clone the repo, but without it they will not be able to see or add files with the extensions configured for LFS.
I think it makes sense to use LFS if we expect individual files to exceed 50 MiB in size. But it may not make sense to use LFS just to handle a large amount of total data in the repository.
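For context, tracking files with LFS amounts to a few patterns in the repo's `.gitattributes`; a minimal sketch, assuming hypothetical document extensions such as `.pdf` and `.docx` (the issue does not name which types would be tracked):

```
# .gitattributes -- the patterns below are illustrative only; this issue
# does not decide which document extensions the taxonomy repo would track.
*.pdf  filter=lfs diff=lfs merge=lfs -text
*.docx filter=lfs diff=lfs merge=lfs -text
```

Contributors with LFS installed could generate equivalent entries with `git lfs track "*.pdf"`; contributors without it would see only small pointer files in place of the tracked documents.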
Related to: instructlab/instructlab#62 (comment)