rlskoeser changed the title from *deal large file system error / repository over quota* to *deal with large file system error / repository over quota* on Feb 12, 2024
@kayiwa @acozine I found the over-quota information in the billing for my CDH GitHub org and did some investigating to figure out what's going on and whether there's an easy solution.
I think the short term fix is that I should just pay for the extra LFS storage, but would love your input on a better long-term solution.
I have apparently enabled LFS on two different repositories. I briefly used it in the simulating-risk repo for a large data file, but decided keeping large CSVs in git wasn't a good plan. I removed the file, but must not have removed it from the repository history. I used git-filter-repo to remove all large files from that repository's history and pushed the changes to GitHub, but I don't know how soon that will affect the quota (maybe not until next month? ... or maybe not unless I delete the repo?!? I will contact GitHub about this).
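For reference, a minimal sketch of the kind of history rewrite described above, assuming a local clone and a placeholder remote URL (git-filter-repo removes the remote as a safety measure, so it has to be re-added before force-pushing):

```shell
# Strip every blob over GitHub's 100MB limit from all history.
# git-filter-repo refuses to run on a non-fresh clone unless --force is given.
git filter-repo --strip-blobs-bigger-than 100M

# filter-repo deletes the origin remote; re-add it (URL is a placeholder)
git remote add origin git@github.com:ORG/REPO.git

# Force-push the rewritten history for all branches and tags
git push --force --all origin
git push --force --tags origin
```

Note that even after a force-push, GitHub may retain unreachable objects (and LFS objects) server-side for some time, which is consistent with the uncertainty above about when the quota usage actually drops.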
In cdh-ansible I have a vaulted file that is 288MB (over the 100MB threshold that requires LFS); it's a bzipped tar of three files. I did some investigating, and it looks like it's actually the vault encryption that is making the file size so large.
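That overhead is plausible given how Ansible Vault serializes its output: as I understand it, the ciphertext is hex-encoded, and the resulting envelope is hex-encoded again, so the vaulted file ends up roughly 4x the size of the plaintext. A small sketch of just the encoding overhead (ignoring the encryption itself, which does not significantly change the size):

```python
import binascii
import os

# Stand-in for the tarball contents; AES-CTR ciphertext would be
# about the same length as the plaintext.
data = os.urandom(1024)

# First hex pass: the ciphertext is hexlified inside the vault envelope...
inner = binascii.hexlify(data)

# ...then the whole envelope body is hexlified again in the on-disk format.
outer = binascii.hexlify(inner)

print(len(data), len(inner), len(outer))  # 1024 2048 4096
```

If that holds, the underlying tarball would be well under the 100MB threshold, and storing it unencrypted elsewhere (or compressing after rather than before vaulting) might avoid LFS entirely.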
The files are needed for the prosody project: a set of purchased binary MARC records that we have access to through PUL but can't redistribute. Putting a vaulted file in the ansible repo is similar to what we did with purchased font files.
This is causing an LFS "smudge" error, which prevents Ansible Tower from checking out the cdh-ansible git repo.
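As a possible stopgap for the checkout failure (not a fix for the quota), Git LFS supports cloning without running the smudge filter, so the repo checks out with LFS pointer files instead of failing on the oversized object:

```shell
# Clone without invoking the LFS smudge filter; LFS-tracked files
# are left as small pointer files instead of being downloaded.
GIT_LFS_SKIP_SMUDGE=1 git clone git@github.com:ORG/cdh-ansible.git

# Equivalently, configure it persistently for a machine:
git lfs install --skip-smudge
```

Whether Ansible Tower's project sync can be configured to set this environment variable is an open question; this only helps if the vaulted file isn't needed by the playbooks being run.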