Describe the feature

Imagine this scenario:

1. One day, we deploy Gogs with the configuration BACKUP_INTERVAL=1d BACKUP_RETENTION=7d.
2. Some time later, for some reason Gogs becomes unable to provide service (for example, the server is shut down), and the backup job stops running.
3. One week later, the administrator fixes the issue and restarts Gogs.

At this point the problem occurs: all backup files are deleted by the backup-rotator.sh script, because the script's logic is to find files with an mtime older than seven days and delete them all.

In fact, although this situation is relatively unlikely in production environments (people generally do not let Gogs stay out of service for more than a week), it can still happen in personal usage scenarios.
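For context, the retention pass described above presumably boils down to something like the following (a sketch only; the exact contents of backup-rotator.sh, the /backup path, and the file naming are assumptions):

```shell
#!/bin/sh
# Sketch of a pure mtime-based rotation (the behavior described above;
# the real script may differ). With a seven-day retention this deletes
# every backup older than seven days -- including ALL of them if no new
# backup has been written for a week.
find /backup -name 'gogs-backup-*.zip' -mtime +7 -exec rm -f {} \;
```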
Describe the solution you'd like
So I think we need a way to keep at least a few of the latest backups regardless of the time interval, e.g. by changing the logic in backup-rotator.sh and adding a new config option for the number of backup files to keep, such as BACKUP_GC_KEEP_NUM=5.
Assuming all files in the /backup directory match gogs-backup-*.zip, we can manually list all files sorted by mtime (e.g. with ls -lthr), keep some of them (e.g. use tail in a pipeline to select the files other than the latest 5), and delete the selected files. The original command in the script would then end up looking something like this (not tested):
Pushed a new PR, trying to make it forward compatible with the existing configuration manual; tested with my own docker-compose.yaml (not modified throughout the upgrade process).
Describe alternatives you've considered
I will find some free time to test whether the solution above actually runs; then either convert this issue to a pull request, or think of other approaches.
Additional context
No response
Code of Conduct