So the disk space warning is indeed back in: https://github.com/EinsteinToolkit/tests/actions/runs/4585746865 and that run actually died. One wonders which repository has the largest impact.
If it is gh-pages, then there are in principle ways to reduce its on-disk size significantly, though I rather suspect the culprit is at least partially the compiled ET code.
If I do a:
git clone -b master https://github.com/einsteintoolkit/tests master
cd master
git submodule update --init --recursive --remote --jobs 4
cd ../
git clone -b gh-pages --depth 1 https://github.com/einsteintoolkit/tests gh-pages
git clone -b scripts --depth 1 https://github.com/einsteintoolkit/tests scripts
then the checkouts end up with the following sizes (du -hs *):
19G gh-pages
2.1G master
356K scripts
So gh-pages is indeed sizable, and the majority of that is in the checked-out files rather than in the .git object directory, which is only 486 MB.
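The split between the working tree and the object store can be checked directly with something like:
du -hs gh-pages gh-pages/.git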
Since we only modify a very few files (other than adding new ones), we should be able to start with a "checkout" that has nothing actually checked out (using git clone's --no-checkout option) and operate on that, after manually checking out the couple of files we do modify (e.g. version.txt). This means one has to manually add the files one wants to add or modify, and we need to check that GitHub's checkout action does not "helpfully" run a git commit -a at the end of the workflow (which it may well do), since that would record all the never-checked-out files as deleted.
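Roughly, the gh-pages side of such a workflow could look like the sketch below; the result file path and the commit message are placeholders, not the actual workflow. One detail to watch out for: a --no-checkout clone typically leaves the index empty, so even a plain git commit would record every file as deleted unless the index is first repopulated from HEAD with git reset.
git clone -b gh-pages --no-checkout --depth 1 https://github.com/einsteintoolkit/tests gh-pages
cd gh-pages
# repopulate the index from HEAD without touching the (empty) working tree;
# otherwise the empty index registers every file as a staged deletion
git reset
# check out only the few files we actually modify
git checkout HEAD -- version.txt
# ... edit version.txt and write any new result files ...
git add version.txt new-results-dir/     # new-results-dir/ is a placeholder
git commit -m "record new test results"  # plain commit, NOT 'git commit -a'
git push origin gh-pages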