14 changes: 9 additions & 5 deletions README.md
@@ -65,20 +65,20 @@ There is a docker build file if you want to run the scanner from a container:
### Command-line options

```
-usage: gcp-scanner -o /folder_to_save_results/ -g -
+usage: python3 scanner.py -o folder_to_save_results -g -

GCP Scanner

options:
-h, --help show this help message and exit
-ls, --light-scan Return only the most important GCP resource fields in the output.
-k KEY_PATH, --sa-key-path KEY_PATH
Path to directory with SA keys in json format
-g GCLOUD_PROFILE_PATH, --gcloud-profile-path GCLOUD_PROFILE_PATH
Path to directory with gcloud profile. Specify - to search for credentials in default gcloud config path
-m, --use-metadata Extract credentials from GCE instance metadata
-at ACCESS_TOKEN_FILES, --access-token-files ACCESS_TOKEN_FILES
-                        A list of comma separated files with access token and OAuth scopes.TTL limited. A token and scopes should be stored in JSON
-                        format.
+                        A list of comma separated files with access token and OAuth scopes.TTL limited. A token and scopes should be stored in JSON format.
-rt REFRESH_TOKEN_FILES, --refresh-token-files REFRESH_TOKEN_FILES
A list of comma separated files with refresh_token, client_id,token_uri and client_secret stored in JSON format.
-s KEY_NAME, --service-account KEY_NAME
@@ -89,10 +89,14 @@ options:
Comma separated list of project names to include in the scan
-c CONFIG_PATH, --config CONFIG_PATH
A path to config file with a set of specific resources to scan.
-  -l {INFO,WARNING,ERROR}, --logging {INFO,WARNING,ERROR}
+  -l {DEBUG,INFO,WARNING,ERROR,CRITICAL}, --logging {DEBUG,INFO,WARNING,ERROR,CRITICAL}
Set logging level (INFO, WARNING, ERROR)
-  -lf LOG_DIRECTORY, --log-file LOG_DIRECTORY
+  -lf LOG_FILE, --log-file LOG_FILE
Save logs to the path specified rather than displaying in console
+  -pwc PROJECT_WORKER_COUNT, --project-worker-count PROJECT_WORKER_COUNT
+                        Set limit for project crawlers run in parallel.
+  -rwc RESOURCE_WORKER_COUNT, --resource-worker-count RESOURCE_WORKER_COUNT
+                        Set limit for resource crawlers run in parallel.

Required parameters:
-o OUTPUT, --output-dir OUTPUT
14 changes: 10 additions & 4 deletions src/gcp_scanner/arguments.py
@@ -129,11 +129,17 @@ def arg_parser():
help='Save logs to the path specified rather than displaying in\
console')
parser.add_argument(
-      '-wc',
-      '--worker-count',
+      '-pwc',
+      '--project-worker-count',
       default=1,
-      dest='worker_count',
-      help='Set limit for workers run in parallel.')
+      dest='project_worker_count',
+      help='Set limit for project crawlers run in parallel.')
+  parser.add_argument(
+      '-rwc',
+      '--resource-worker-count',
+      default=1,
+      dest='resource_worker_count',
+      help='Set limit for resource crawlers run in parallel.')

args: argparse.Namespace = parser.parse_args()

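Taken together, the renamed flag and the new one give the scanner two independent parallelism knobs: one bounding concurrent project crawlers, one bounding concurrent resource crawlers within a project. A minimal sketch of how such nested limits could be consumed — the crawl functions and `scan` driver here are hypothetical illustrations, not the scanner's actual code, and `type=int` is added beyond what the diff shows (without it, CLI-supplied values arrive as strings):

```python
import argparse
from concurrent.futures import ThreadPoolExecutor


def build_parser():
    # Mirrors the two new flags from the diff; type=int is an addition.
    parser = argparse.ArgumentParser(description='GCP Scanner')
    parser.add_argument('-pwc', '--project-worker-count', default=1, type=int,
                        dest='project_worker_count',
                        help='Set limit for project crawlers run in parallel.')
    parser.add_argument('-rwc', '--resource-worker-count', default=1, type=int,
                        dest='resource_worker_count',
                        help='Set limit for resource crawlers run in parallel.')
    return parser


def crawl_resource(project, resource):
    # Placeholder for a real resource crawler.
    return f'{project}/{resource}'


def crawl_project(project, resources, resource_worker_count):
    # Each project crawler fans out over its resources with its own
    # bounded pool; order of results follows the input order.
    with ThreadPoolExecutor(max_workers=resource_worker_count) as pool:
        return list(pool.map(lambda r: crawl_resource(project, r), resources))


def scan(projects, resources, project_worker_count, resource_worker_count):
    # The top-level pool bounds how many projects are crawled at once;
    # each submitted project gets its own resource-level pool above.
    with ThreadPoolExecutor(max_workers=project_worker_count) as pool:
        futures = [pool.submit(crawl_project, p, resources,
                               resource_worker_count)
                   for p in projects]
        return [f.result() for f in futures]
```

With both values at their default of 1, the scan degrades to fully sequential behavior, which matches the defaults in the diff.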
4 changes: 2 additions & 2 deletions src/gcp_scanner/models.py
@@ -58,7 +58,7 @@ def __init__(
sa_name,
credentials,
chain_so_far,
-      worker_count
+      resource_worker_count
):
self.project = project
self.sa_results = sa_results
@@ -70,4 +70,4 @@ def __init__(
self.sa_name = sa_name
self.credentials = credentials
self.chain_so_far = chain_so_far
-    self.worker_count = worker_count
+    self.resource_worker_count = resource_worker_count
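The rename in models.py makes the context object's attribute match the new flag, so crawlers reading it see an unambiguous per-resource limit. A reduced sketch of such a context — the class name is hypothetical and fields other than those visible in the diff are omitted:

```python
class CrawlContext:
    """Reduced sketch of a per-project crawl context (name hypothetical)."""

    def __init__(self, project, credentials, chain_so_far,
                 resource_worker_count):
        self.project = project
        self.credentials = credentials
        self.chain_so_far = chain_so_far
        # Renamed from worker_count: this limit applies to resource
        # crawlers within one project, not to project-level workers.
        self.resource_worker_count = resource_worker_count
```

Keeping the attribute name aligned with `--resource-worker-count` avoids the ambiguity the old `worker_count` name had once a second, project-level limit exists.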