chore: complete refactor to use argparse and classes #6

merged 6 commits into from Oct 20, 2021
7 changes: 2 additions & 5 deletions .coveragerc
@@ -1,5 +1,2 @@
[report]
exclude_lines =
if __name__ == '__main__':
main()
Cli()
[run]
omit = forks_sync/cli.py
35 changes: 25 additions & 10 deletions README.md
@@ -2,7 +2,7 @@

# Forks Sync

Keep all your git forks up to date with the remote main branch.
Keep all your git forks up to date with the remote default branch.

[![Build Status](https://github.com/Justintime50/forks-sync/workflows/build/badge.svg)](https://github.com/Justintime50/forks-sync/actions)
[![Coverage Status](https://coveralls.io/repos/github/Justintime50/forks-sync/badge.svg?branch=main)](https://coveralls.io/github/Justintime50/forks-sync?branch=main)
@@ -13,11 +13,11 @@

</div>

If you manage more than a couple git forks, keeping them up to date with the remote default branch can be a pain. Forks Sync lets you avoid all the fuss by concurrently cloning each of your projects locally, adding the remote upstream, fetching new changes, rebasing your local default branch against the remote default branch, and `force pushing` to your origin repo's default branch - keeping all your forks up to date with the original repo.
If you manage more than a couple git forks, keeping them up to date with their remote default branch can be a pain. Forks Sync lets you avoid all the fuss by concurrently cloning each of your projects locally, adding the remote upstream, fetching new changes, rebasing your local default branch against the remote default branch, and `force pushing` to your repo's origin default branch - keeping all your forks up to date with the original repo.

By default, Forks Sync will save all your forks to `~/forks-sync` where you can also find logs for this tool.

**Note:** Before proceeding, know that this tool will forcefully update the default branch of your fork to match the upstream default branch.
**NOTE:** Before proceeding, know that this tool will forcefully update the default branch of your fork to match the upstream default branch.

## Install

@@ -31,15 +31,30 @@ make install

## Usage

```bash
# Setup your ssh agent to ensure the script runs continually
ssh-add
```
Usage:
forks-sync --token 123...

Options:
-h, --help show this help message and exit
-t TOKEN, --token TOKEN
Provide your GitHub token to authenticate with the GitHub API.
-f, --force Pass this flag to force push changes to forked repos, otherwise the tool will run in "dry mode".
-th THREADS, --threads THREADS
The number of threads to run.
-to TIMEOUT, --timeout TIMEOUT
The number of seconds before a git operation times out.
-l LOCATION, --location LOCATION
The location where you want your forks and logs to be stored.
```

# Pass your GitHub API key/token here:
GITHUB_TOKEN=123... forks-sync
### Automating SSH Passphrase Prompt (Recommended)

# Optional params:
# FORKS_SYNC_LOCATION="~/my-folder"
To allow the script to run continuously without requiring your SSH passphrase, you'll need to add your passphrase to the SSH agent. **NOTE:** Your SSH passphrase will be unloaded upon logout.

```bash
# This assumes you've saved your SSH keys to the default location
ssh-add
```

## Development
70 changes: 70 additions & 0 deletions forks_sync/cli.py
@@ -0,0 +1,70 @@
import argparse

from forks_sync import ForksSync
from forks_sync.constants import DEFAULT_LOCATION, DEFAULT_NUM_THREADS, DEFAULT_TIMEOUT


class ForksSyncCli:
    def __init__(self):
        parser = argparse.ArgumentParser(
            description='Keep all your git forks up to date with the remote default branch.'
        )
        parser.add_argument(
            '-t',
            '--token',
            type=str,
            required=True,
            help='Provide your GitHub token to authenticate with the GitHub API.',
        )
        parser.add_argument(
            '-f',
            '--force',
            action='store_true',
            required=False,
            default=False,
            help='Pass this flag to force push changes to forked repos, otherwise the tool will run in "dry mode".',
        )
        parser.add_argument(
            '-th',
            '--threads',
            type=int,
            required=False,
            default=DEFAULT_NUM_THREADS,
            help='The number of threads to run.',
        )
        parser.add_argument(
            '-to',
            '--timeout',
            type=int,
            required=False,
            default=DEFAULT_TIMEOUT,
            help='The number of seconds before a git operation times out.',
        )
        parser.add_argument(
            '-l',
            '--location',
            type=str,
            required=False,
            default=DEFAULT_LOCATION,
            help='The location where you want your forks and logs to be stored.',
        )
        parser.parse_args(namespace=self)

    def run(self):
        forks_sync = ForksSync(
            token=self.token,
            force=self.force,
            threads=self.threads,
            timeout=self.timeout,
            location=self.location,
        )
        forks_sync.run()


def main():
    ForksSyncCli().run()


if __name__ == '__main__':
    main()
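The `parse_args(namespace=self)` call above stores each parsed option directly as an attribute on the CLI instance, which is why `run` can read `self.token`, `self.force`, and so on. A minimal sketch of that pattern, using a hypothetical `--name` flag and an explicit argument list so it is self-contained:

```python
import argparse


class GreeterCli:
    def __init__(self):
        parser = argparse.ArgumentParser(description='Demo of parsing into an instance.')
        parser.add_argument('-n', '--name', type=str, default='world')
        # namespace=self makes the parsed option available as self.name
        parser.parse_args(args=['--name', 'forks'], namespace=self)

    def run(self):
        return f'Hello, {self.name}!'


print(GreeterCli().run())
```

Here `args=['--name', 'forks']` is supplied explicitly for the sketch; in the real CLI, `parse_args` reads from `sys.argv`.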
5 changes: 5 additions & 0 deletions forks_sync/constants.py
@@ -0,0 +1,5 @@
import os

DEFAULT_LOCATION = os.path.expanduser('~/forks-sync')
DEFAULT_NUM_THREADS = 10
DEFAULT_TIMEOUT = 300
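Note that `DEFAULT_LOCATION` is expanded at import time, so the `~` resolves to the invoking user's home directory before any CLI default is applied:

```python
import os

# Mirrors the constant above: '~' is expanded as soon as the module is imported,
# producing a concrete path rather than a literal '~/forks-sync'.
DEFAULT_LOCATION = os.path.expanduser('~/forks-sync')
```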
29 changes: 29 additions & 0 deletions forks_sync/logger.py
@@ -0,0 +1,29 @@
import logging
import logging.handlers
import os

# 200kb * 5 files = 1mb of logs
LOG_MAX_BYTES = 200000 # 200kb
LOG_BACKUP_COUNT = 5


class Logger:
    @staticmethod
    def setup_logging(logger, location):
        """Set up project logging (to the console and a rotating log file)."""
        log_path = os.path.join(location, 'logs')
        log_file = os.path.join(log_path, 'forks-sync.log')

        os.makedirs(log_path, exist_ok=True)

        logger.setLevel(logging.INFO)
        handler = logging.handlers.RotatingFileHandler(
            log_file,
            maxBytes=LOG_MAX_BYTES,
            backupCount=LOG_BACKUP_COUNT,
        )
        formatter = logging.Formatter("%(asctime)s - %(levelname)s - %(message)s")
        handler.setFormatter(formatter)
        logger.addHandler(logging.StreamHandler())
        logger.addHandler(handler)
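The rotating-file setup above caps disk use at roughly 1 MB (5 files of 200 KB). A standalone sketch of the same configuration, writing to a temporary directory instead of `~/forks-sync` so it can run anywhere:

```python
import logging
import logging.handlers
import os
import tempfile

# Hypothetical standalone use of the same rotating-file setup;
# 'location' stands in for the tool's configured storage directory.
location = tempfile.mkdtemp()
log_path = os.path.join(location, 'logs')
os.makedirs(log_path, exist_ok=True)

logger = logging.getLogger('forks-sync-demo')
logger.setLevel(logging.INFO)
handler = logging.handlers.RotatingFileHandler(
    os.path.join(log_path, 'forks-sync.log'),
    maxBytes=200000,  # rotate after ~200kb
    backupCount=5,    # keep 5 rotated files
)
handler.setFormatter(logging.Formatter('%(asctime)s - %(levelname)s - %(message)s'))
logger.addHandler(handler)

logger.info('forks synced')
handler.close()
```

Once `maxBytes` is exceeded, older entries roll into `forks-sync.log.1` through `forks-sync.log.5` before being discarded.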