Commit

Merge pull request #728 from payaljindal/update/target-server-validator

Updated target server validator tool

danistrebel committed Mar 25, 2024
2 parents c6e4c3e + 87e2d55 commit 983e391
Showing 13 changed files with 889 additions and 253 deletions.
2 changes: 2 additions & 0 deletions tools/target-server-validator/.gitignore
@@ -1,2 +1,4 @@
callout/target/
export/
scan_output.json
input.csv
147 changes: 122 additions & 25 deletions tools/target-server-validator/README.md
@@ -10,7 +10,10 @@ Validation is done by deploying a sample proxy which checks if HOST & PORT is open
## Pre-Requisites
* Python3.x
* Java
* Maven
* Maven >= 3.9.6

* If you are pushing data to GCP metrics, you require the `roles/monitoring.editor` role.

* Please install the required Python dependencies
```
python3 -m pip install -r requirements.txt
@@ -25,32 +28,55 @@ bash callout/build_java_callout.sh

```
[source]
baseurl=https://x.x.x.x/v1 # Apigee Base URL. e.g http://management-api.apigee-opdk.corp:8080
org=xxx-xxxx-xxx-xxxxx # Apigee Org ID
auth_type=basic # API Auth type basic | oauth
baseurl=https://x.x.x.x/v1 # Apigee Base URL. e.g http://management-api.apigee-opdk.corp:8080
org=xxx-xxxx-xxx-xxxxx # Apigee Org ID
auth_type=basic # API Auth type basic | oauth
[target]
baseurl=https://apigee.googleapis.com/v1 # Apigee Base URL
org=xxx-xxxx-xxx-xxxxx # Apigee Org ID
auth_type=oauth # API Auth type basic | oauth
baseurl=https://apigee.googleapis.com/v1 # Apigee Base URL
org=xxx-xxxx-xxx-xxxxx # Apigee Org ID
auth_type=oauth # API Auth type basic | oauth
[csv]
file=input.csv # Path to input CSV. Note: CSV needs HOST & PORT columns
default_port=443 # default port if port is not provided in CSV
file=input.csv # Path to input CSV. Note: CSV needs HOST & PORT columns
default_port=443 # default port if port is not provided in CSV
[validation]
check_csv=true # 'true' to validate Targets in input csv
check_proxies=true # 'true' to validate Proxy Targets else 'false'
skip_proxy_list=mock1,stream # Comma sperated list of proxies to skip validation;
proxy_export_dir=export # Export directory needed when check_proxies='true'
api_env=dev # Target Environment to deploy Validation API Proxy
api_name=target_server_validator # Target API Name of Validation API Proxy
api_force_redeploy=false # set 'true' to Re-deploy Target API Proxy
api_hostname=example.apigee.com # Target VirtualHost or EnvGroup Domain Name
api_ip=<IP> # IP address corresponding to api_hostname. Use if DNS record doesnt exist
report_format=csv # Report Format. Choose csv or md (defaults to md)
check_csv=true # 'true' to validate Targets in input csv
check_proxies=true # 'true' to validate Proxy Targets else 'false'
skip_proxy_list=mock1,stream # Comma separated list of proxies to skip validation;
proxy_export_dir=export # Export directory needed when check_proxies='true'
api_env=dev # Target Environment to deploy Validation API Proxy
api_name=target_server_validator # Target API Name of Validation API Proxy
api_force_redeploy=false # set 'true' to Re-deploy Target API Proxy
api_hostname=example.apigee.com # Target VirtualHost or EnvGroup Domain Name
api_ip=<IP> # IP address corresponding to api_hostname. Use if DNS record doesn't exist
report_format=csv # Report Format. Choose csv or md (defaults to md)
[gcp_metrics]
enable_gcp_metrics=true # set 'true' to push target server's host and status to GCP metrics
project_id=xxx-xxx-xxx # Project id of GCP project where the data will be pushed
metric_name=custom.googleapis.com/<metric_name> # Replace <metric_name> with custom metric name
enable_dashboard=true # set 'true' to create the dashboard with alerting policy
dashboard_title=Apigee Target Server Monitoring Dashboard # Monitoring Dashboard Title
alert_policy_name=Apigee Target Server Validator Policy # Alerting Policy Name
notification_channel_ids=xxxxxxxx # Comma separated list of Notification Channel ids
[target_server_state_file]
state_file=gs://bucket_name/path/to/file/scan_output.json # GCS Bucket path to store --scan output
# state_file=file://scan_output.json # File path to store --scan output (only one can be used either GCS or file)
gcs_project_id=xxx-xxxx-xxx-xxxxx # GCS bucket project id
```
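Only one of the two `state_file` forms can be active at a time. A minimal sketch of dispatching on the scheme (the helper name is hypothetical, not part of the tool):

```python
from urllib.parse import urlparse

def parse_state_file(state_file: str):
    """Split a state_file setting into (backend, bucket, path)."""
    parsed = urlparse(state_file)
    if parsed.scheme == "gs":
        # gs://bucket_name/path/to/file -> bucket in netloc, object key in path
        return "gcs", parsed.netloc, parsed.path.lstrip("/")
    if parsed.scheme == "file":
        # file://scan_output.json -> local file path
        return "file", None, parsed.netloc + parsed.path
    raise ValueError(f"state_file must use gs:// or file://, got {state_file!r}")
```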

To get the notification channel ID, use the following command:

```
gcloud beta monitoring channels list --project=<project_id>
```

This command will display all available notification channels within your project. You can select the appropriate one based on your requirements. Locate the notification channel ID under the `name` field in the format `projects/<project_id>/notificationChannels/<notification_channel_id>`, and insert it into the input.properties file.
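For illustration, the ID can be split out of that `name` field like this (hypothetical helper):

```python
def channel_id_from_name(name: str) -> str:
    """Extract <notification_channel_id> from
    projects/<project_id>/notificationChannels/<notification_channel_id>."""
    prefix = "notificationChannels/"
    if prefix not in name:
        raise ValueError(f"Unexpected channel name: {name!r}")
    return name.split(prefix, 1)[1]
```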


* Sample input CSV with target servers
> **NOTE:** You need to set `check_csv=true` in the `validation` section of `input.properties`
@@ -64,7 +90,14 @@ smtp.gmail.com,465
```


* Please run below commands to authenticate, based on the Apigee flavours you are using.
* Please run the below command to authenticate:

```
gcloud auth application-default set-quota-project <project_id>
```
You can skip setting the quota project if you want.

Another way to authenticate is to use the environment variables based on the Apigee flavour you are using.

```
export APIGEE_OPDK_ACCESS_TOKEN=$(echo -n "<user>:<password>" | base64) # Access token for Apigee OPDK
@@ -76,16 +109,42 @@ export APIGEE_ACCESS_TOKEN=$(gcloud auth print-access-token) # Access
* Export Proxy Bundle
* Parse Each Proxy Bundle for Target
* Run Validate API against each Target (optional)
* Generate csv/md Report
* Generate csv/md Report or push data to GCP Monitoring Dashboard

## Usage

Run the script as below
The script supports the following arguments:

* `--onboard` option to create validator proxy, custom metric descriptors, alerting policy and dashboard
* `--scan` option to fetch target servers from Environment target servers, api proxies & csv file
* `--monitor` option to check the status of target servers and generate report or push to GCP metrics
* `--offboard` option to delete validator proxy, custom metric descriptors, alerting policy and dashboard
* `--input` Path to input properties file
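The argument surface above can be sketched with `argparse` (illustrative; `main.py` defines the real parser):

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    parser = argparse.ArgumentParser(description="Apigee Target Server Validator")
    parser.add_argument("--input", required=True,
                        help="Path to input properties file")
    # Action flags; several can be combined in one run
    parser.add_argument("--onboard", action="store_true",
                        help="Create validator proxy, metric descriptors, alerting policy, dashboard")
    parser.add_argument("--scan", action="store_true",
                        help="Fetch target servers from environment, proxies and CSV")
    parser.add_argument("--monitor", action="store_true",
                        help="Check target server status; report or push GCP metrics")
    parser.add_argument("--offboard", action="store_true",
                        help="Delete the resources created by --onboard")
    return parser
```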

To onboard, run
```
python3 main.py --input path/to/input_file --onboard
```
Make sure you have built the Java callout JAR before running `--onboard`.

To scan, run
```
python3 main.py
python3 main.py --input path/to/input_file --scan
```

This script deploys an API proxy to validate if the target servers are reachable or not. To use the API proxy, make sure your payloads adhere to the following format:
To monitor, run
```
python3 main.py --input path/to/input_file --monitor
```

To offboard, run
```
python3 main.py --input path/to/input_file --offboard
```

You can also pass multiple arguments at the same time.

`--onboard` deploys an API proxy to validate whether the target servers are reachable. To use the API proxy, make sure your payloads adhere to the following format:

```json
[
@@ -112,7 +171,7 @@ The response will look like this -
{
"host": "example2.com",
"port": 443,
"status" : "UNKNOWN_HOST"
"status" : "UNKNOWN_HOST"
},
// and so on
]
@@ -122,3 +181,41 @@ The response will look like this -
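A request carrying that payload can be built with the standard library as below; the `/validate-target-servers` path is an assumption, so substitute the path your deployed proxy actually exposes:

```python
import json
import urllib.request

def build_validation_request(api_hostname: str, token: str,
                             targets: list) -> urllib.request.Request:
    """Build a POST to the validator proxy; send with urllib.request.urlopen().

    The URL path below is hypothetical -- use your proxy's base path.
    """
    return urllib.request.Request(
        f"https://{api_hostname}/validate-target-servers",
        data=json.dumps(targets).encode("utf-8"),
        headers={"Content-Type": "application/json",
                 "Authorization": f"Bearer {token}"},
        method="POST",
    )
```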
Validation Report: `report.md` OR `report.csv` can be found in the same directory as the script.

Please check a [Sample report](report.md)

## GCP Monitoring Dashboard
The script can also create a GCP Monitoring Dashboard with an alerting widget, as shown below:

![GCP Monitoring Dashboard](images/dashboard.png)

This script creates a custom metric with hostname and status as labels. The possible statuses, namely REACHABLE, NOT_REACHABLE, and UNKNOWN_HOST, are determined by calling the validator proxy. These statuses are then assigned values of 1, 0.5, and 0, respectively.

Then, an alerting policy is created with a threshold of 0.75. Entries below this threshold trigger alerts sent to designated notification channels. Finally, this policy is added as a widget on the GCP dashboard.
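Putting the numbers above together (values taken from the text; the helper name is hypothetical):

```python
STATUS_VALUES = {"REACHABLE": 1.0, "NOT_REACHABLE": 0.5, "UNKNOWN_HOST": 0.0}
ALERT_THRESHOLD = 0.75

def triggers_alert(status: str) -> bool:
    """True when the metric value falls below the alerting threshold."""
    return STATUS_VALUES[status] < ALERT_THRESHOLD
```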

# Running the Pipeline

To run the pipeline script (`pipeline.sh`), follow these steps:

## Prerequisites

Before running the pipeline script, ensure you have the following prerequisites configured:

- **Environment Variables**: Set up the necessary environment variables required by the script. These variables should include:
- `APIGEE_X_ORG`: Your Apigee organization ID.
- `APIGEE_X_ENV`: The Apigee environment to deploy to.
- `APIGEE_X_HOSTNAME`: The hostname for your Apigee instance.

*NOTE*: This pipeline will create a test notification channel of type email with the email address `no-reply@google.com`.

- **IAM Roles**: To set up the monitoring dashboard and alerts, make sure that you have the `roles/monitoring.editor` role.

- **Input Properties Template**: This script requires an `input.properties` file for the necessary configuration parameters and will create a corresponding `generated.properties` file by replacing the environment variables with their values. Ensure that the values are set properly in this file before running the script.

## Running the Pipeline

### Command

To execute the pipeline, use the following command:

```
./pipeline.sh
```
66 changes: 61 additions & 5 deletions tools/target-server-validator/apigee_utils.py
@@ -20,6 +20,8 @@
import requests
import shutil
from time import sleep
import google.auth
import google.auth.transport.requests
from utilities import ( # pylint: disable=import-error
run_validator_proxy,
unzip_file,
@@ -57,6 +59,18 @@ def is_token_valid(self, token):
return False

def get_access_token(self):
try:
credentials, project_id = google.auth.default()
request = google.auth.transport.requests.Request()
credentials.refresh(request)
access_token = credentials.token
if self.is_token_valid(access_token):
return access_token
logger.error('please run "export APIGEE_ACCESS_TOKEN=$(gcloud auth print-access-token)" first or set the Application Default Credentials using "gcloud auth application-default login" !! ') # noqa
except Exception as e:
logger.debug(f"Couldn't find the default credentials. ERROR-INFO :{e}") # noqa

logger.debug("Checking env variable value.")
token = os.getenv(
"APIGEE_ACCESS_TOKEN"
if self.apigee_type == "x"
@@ -67,15 +81,15 @@ def get_access_token(self):
if self.is_token_valid(token):
return token
else:
logger.error('please run "export APIGEE_ACCESS_TOKEN=$(gcloud auth print-access-token)" first !! ') # noqa
logger.error('please run "export APIGEE_ACCESS_TOKEN=$(gcloud auth print-access-token)" first or set the Application Default Credentials using "gcloud auth application-default login" !! ') # noqa
sys.exit(1)
else:
return token
else:
if self.apigee_type == "x":
logger.error('please run "export APIGEE_ACCESS_TOKEN=$(gcloud auth print-access-token)" first !! ') # noqa
logger.error('please run "export APIGEE_ACCESS_TOKEN=$(gcloud auth print-access-token)" first or set the Application Default Credentials using "gcloud auth application-default login" !! ') # noqa
else:
logger.error('please export APIGEE_OPDK_ACCESS_TOKEN')
logger.error('please export APIGEE_OPDK_ACCESS_TOKEN or set the Application Default Credentials using "gcloud auth application-default login"') # noqa
sys.exit(1)

def set_auth_header(self):
@@ -142,6 +156,48 @@ def create_api(self, api_name, proxy_bundle_path):
logger.debug(response.text)
return False, None

def get_api_deployments(self, api_name):
headers = self.auth_header.copy()

deployed_revision_url = f"{self.baseurl}/apis/{api_name}/deployments"
deployed_revision_get_response = requests.request(
"GET", deployed_revision_url, headers=headers, data={}
)
deployments = deployed_revision_get_response.json()
revision_deployements = deployments.get('deployments')
return revision_deployements

def delete_api(self, api_name):
headers = self.auth_header.copy()
revision_deployements = self.get_api_deployments(api_name)

if revision_deployements:
for revision_deployement in revision_deployements:
deployed_env = revision_deployement.get('environment')
rev = revision_deployement.get('revision')

# delete api deployment
revision_delete_url = f"{self.baseurl}/environments/{deployed_env}/apis/{api_name}/revisions/{rev}/deployments" # noqa
revision_response = requests.request(
"DELETE",
revision_delete_url, headers=headers, data={}
)
if revision_response.status_code == 200:
logger.info(f"Successfully deleted {api_name} api proxy revision {rev} in env {deployed_env}") # noqa

# proxy deletion
url = f"{self.baseurl}/apis/{api_name}"
try:
response = requests.request(
"DELETE", url, headers=headers, data={}
)
if response.status_code == 200:
logger.info(f"Api proxy {api_name} deleted successfully.")
else:
logger.error(f"Error deleting Api proxy {api_name}. ERROR-INFO - {response.json()}") # noqa
except Exception as e:
logger.error(f"Couldn't delete api proxy {api_name}. ERROR-INFO- {e}") # noqa

def get_api_revisions_deployment(self, env, api_name, api_rev): # noqa
url = (
url
@@ -207,12 +263,12 @@ def deploy_api_bundle(self, env, api_name, proxy_bundle_path, api_force_redeploy
return True
else:
if self.deploy_api(env, api_name, api_rev):
logger.info(f"Proxy with name {api_name} has been deployed to {env} in Apigee Org {self.org}") # noqa
logger.info(f"Deploying proxy with name {api_name} to {env} in Apigee Org {self.org}") # noqa
while api_deployment_retry_count < api_deployment_retry:
if self.get_api_revisions_deployment(
env, api_name, api_rev
):
logger.debug(f"Proxy {api_name} active in runtime after {api_deployment_retry_count*api_deployment_sleep} seconds ") # noqa
logger.info(f"Proxy {api_name} active in runtime after {api_deployment_retry_count*api_deployment_sleep} seconds ") # noqa
return True
else:
logger.debug(f"Checking API deployment status in {api_deployment_sleep} seconds") # noqa
40 changes: 40 additions & 0 deletions tools/target-server-validator/generated.properties
@@ -0,0 +1,40 @@
[source]
baseurl=https://apigee.googleapis.com/v1
org=xxx-xxx-xxx
auth_type=oauth

[target]
baseurl=https://apigee.googleapis.com/v1
org=xxx-xxx-xxx
auth_type=oauth

[csv]
file=input.csv
default_port=443

[validation]
check_csv=true
check_proxies=true
proxy_export_dir=export
skip_proxy_list=mock1,stream
api_env=dev
api_name=target-server-validator
api_force_redeploy=true
api_hostname=example.apigee.com
api_ip=
report_format=md
allow_insecure=false

[gcp_metrics]
enable_gcp_metrics=true
project_id=xx-xxx-xxx
metric_name=custom.googleapis.com/host_status
enable_dashboard=true
dashboard_title=Apigee Target Server Health Monitoring Dashboard
alert_policy_name=Apigee Target Server Validator Policy
notification_channel_ids=xxxxx

[target_server_state_file]
state_file=gs://bucket_name/path/to/file/scan_output.json
# state_file=file://scan_output.json
gcs_project_id=xx-xxx-xxx