A tool to dump a git repository from a website.
    usage: git-dumper.py [options] URL DIR

    Dump a git repository from a website.

    positional arguments:
      URL                   url
      DIR                   output directory

    optional arguments:
      -h, --help            show this help message and exit
      --proxy PROXY         use the specified proxy
      -j JOBS, --jobs JOBS  number of simultaneous requests
      -r RETRY, --retry RETRY
                            number of request attempts before giving up
      -t TIMEOUT, --timeout TIMEOUT
                            maximum time in seconds before giving up
Example:

    ./git-dumper.py http://website.com/.git ~/website
Install the dependencies:

    pip install -r requirements.txt
How does it work?
The tool will first check if directory listing is available. If it is, it will simply download the `.git` directory recursively (what you would do with a recursive `wget`, for example).
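The directory-listing check can be sketched as follows (a minimal illustration using only the standard library, not git-dumper's actual code; the function names are hypothetical). Common autoindex pages served by Apache or nginx carry a recognizable "Index of" title:

```python
import re
import urllib.request

def looks_like_listing(html):
    """Heuristic: autoindex pages typically have a title like 'Index of /.git'."""
    return bool(re.search(r"<title>\s*Index of", html, re.IGNORECASE))

def has_directory_listing(git_url):
    """Fetch <git_url>/ and guess whether directory listing is enabled.

    `git_url` is assumed to end with `/.git` (hypothetical helper).
    """
    try:
        with urllib.request.urlopen(git_url + "/") as resp:
            body = resp.read(4096).decode("utf-8", errors="replace")
    except OSError:
        return False
    return looks_like_listing(body)
```

If the heuristic matches, every linked file can be fetched by walking the index pages recursively.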
If directory listing is not available, it will use several methods to find as many files as possible. Step by step, git-dumper will:
- Fetch all common files (`.gitignore`, `.git/HEAD`, `.git/index`, etc.);
- Find as many refs as possible (such as `refs/heads/master`, `refs/remotes/origin/HEAD`, etc.) by analyzing `.git/HEAD`, `.git/logs/HEAD`, `.git/packed-refs`, and so on;
- Find as many objects (sha1) as possible by analyzing the files fetched so far (`.git/packed-refs`, `.git/index`, etc.);
- Fetch all objects recursively, analyzing each commit to find its parents;
- Run `git checkout .` to recover the current working tree.
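The ref-discovery step can be illustrated with a small parser for `.git/packed-refs` (a sketch under the standard file format, not git-dumper's actual code). Each data line is a 40-character sha1 followed by a ref name; comment lines start with `#` and peeled-tag lines with `^`:

```python
import re

def parse_packed_refs(text):
    """Return (ref_names, sha1s) found in packed-refs content."""
    refs, hashes = [], []
    for line in text.splitlines():
        # Data lines look like "<sha1> refs/heads/master"
        m = re.match(r"^([0-9a-f]{40}) (refs/\S+)$", line)
        if m:
            hashes.append(m.group(1))
            refs.append(m.group(2))
    return refs, hashes
```

Every sha1 recovered this way points at an object that can then be requested directly from the server.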
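The recursive object walk relies on the loose-object format: a zlib-compressed body with a `commit <size>\0` header, whose text contains `tree` and `parent` lines naming further objects. A sketch of that extraction step (an illustration of the format, not git-dumper's actual code):

```python
import re

def referenced_hashes(commit_bytes):
    """Return the sha1 hashes a raw (decompressed) commit object references.

    `commit_bytes` is the zlib-decompressed object, e.g.
    b"commit 180\x00tree <sha1>\nparent <sha1>\n..."
    """
    # Strip the "commit <size>\x00" header, keep the text body
    body = commit_bytes.split(b"\x00", 1)[-1].decode("utf-8", errors="replace")
    return re.findall(r"^(?:tree|parent) ([0-9a-f]{40})$", body, re.MULTILINE)
```

Each returned hash maps to a URL like `.git/objects/<first 2 chars>/<remaining 38 chars>`, which is how the tool keeps fetching parents until the whole history is recovered.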