In this section we detail how to deploy the function: setting up the code, adding its configuration, and deploying it into GCP.
The code is composed of three repos (one is optional):
- This repo, which contains the main fetching logic
- secops_common, which contains common reusable logic
- internal, which is optional and is meant for you to add internal logic (see transformation)
$ git clone https://github.com/airwallex/splunk-pipeline-oss
$ cd splunk-pipeline-oss
$ git clone https://github.com/airwallex/secops_common
# optionally if you have internal logic you want to include
$ git clone <internal repo> internal
The function expects two main configuration files to be present and set:
$ cat .env
function_service_account='function service account in GCP'
function_name='splunk-pipeline'
function_topic='splunk-pipeline'
main_topic='main-splunk-pipeline'
function_memory='8192MB'
connector_name='optional network connector used by the function'
And
$ cat pipeline.yml
dataset: "bigquery data set the function will use"
company: "your company name"
project: "GCP project the function will be running in"
project_id: "GCP project id that the function will be running in"
subject: "Used for Workspace logs, see pipeline/workspace.py"
lastpass_org_id: "LastPass org id"
gcp_org_id: "GCP org id"
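Before deploying, it can be worth sanity-checking that pipeline.yml is valid YAML. A minimal check, assuming Python with PyYAML is available (the function itself is Python):

# raises an error if pipeline.yml does not parse
$ python -c 'import yaml; yaml.safe_load(open("pipeline.yml"))'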
secops_common contains the required script to deploy the function:
./secops_common/bin/deploy_function
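For reference, deploying with the .env values above is roughly equivalent to the following gcloud invocation. This is a sketch only, not the actual contents of the script, which may set additional flags:

# load the configuration values from .env
$ source .env
# deploy a Pub/Sub triggered function using those values
$ gcloud functions deploy "${function_name}" \
    --runtime=python38 \
    --trigger-topic="${function_topic}" \
    --memory="${function_memory}" \
    --service-account="${function_service_account}"

If connector_name is set in .env, a --vpc-connector flag would be passed as well.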
The following is an example of using GCP Cloud Build to deploy the function from a CI process:
steps:
  - name: 'gcr.io/cloud-builders/git'
    entrypoint: git
    args: ['clone', 'https://github.com/airwallex/secops_common', 'secops_common']
  - name: 'gcr.io/cloud-builders/git'
    entrypoint: git
    args: ['clone', 'https://github.com/airwallex/splunk-pipeline-oss', 'splunk-pipeline-oss']
  - name: 'gcr.io/cloud-builders/git'
    entrypoint: git
    args: ['clone', '<internal repo>', 'internal']
  - name: 'gcr.io/cloud-builders/gcloud'
    entrypoint: 'bash'
    args: ['./bin/copy_pipeline.sh']
  - name: python:3.8
    entrypoint: pip
    args: ['install', '-q', '-r', 'requirements.txt', '--user']
  - name: python:3.8
    entrypoint: python
    args: ['-m', 'yapf', '-r', 'pipeline', '-q']
  - name: 'gcr.io/cloud-builders/gcloud'
    entrypoint: 'bash'
    args: ['./secops_common/bin/deploy_function']
Where:
$ cat copy_pipeline.sh
#!/usr/bin/env bash
/usr/bin/mv splunk-pipeline-oss/* .
rm -rf splunk-pipeline-oss
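Assuming the build definition above is saved as cloudbuild.yaml (the filename is an assumption), it can be submitted manually with:

$ gcloud builds submit --config cloudbuild.yaml .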
Next we set up the topic that will be used to trigger the function:
$ gcloud pubsub topics create splunk-pipeline
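The .env file above also references a main_topic (main-splunk-pipeline); if your configuration uses it, create it the same way:

$ gcloud pubsub topics create main-splunk-pipeline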
Set up the required secrets (as specified in secrets) for each fetcher type being used, and enable the required permissions.
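For example, assuming the secrets are stored in GCP Secret Manager (the secret name jira-token below is hypothetical), creating a secret and granting the function's service account read access could look like:

# create the secret from stdin (jira-token is an illustrative name)
$ echo -n 'the token value' | gcloud secrets create jira-token --data-file=-
# allow the function's service account to read it
$ gcloud secrets add-iam-policy-binding jira-token \
    --member="serviceAccount:${function_service_account}" \
    --role='roles/secretmanager.secretAccessor'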
And trigger the function manually:
# Pulling the latest information into BigQuery
$ gcloud pubsub topics publish --message='{"service":"confluence"}' splunk-pipeline
$ gcloud pubsub topics publish --message='{"service":"jira"}' splunk-pipeline
$ gcloud pubsub topics publish --message='{"service":"spreadsheet", "id":"spreadsheet id", "range":"Example!A:G"}' splunk-pipeline
# Reviewing the logs
$ gcloud functions logs read splunk-pipeline
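To confirm that data landed, you can list the tables in the configured dataset (replace <dataset> with the dataset value from pipeline.yml):

$ bq ls <dataset>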
Follow the scheduling guide in order to trigger the function continuously.
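As a sketch, a Cloud Scheduler job that publishes the confluence message every hour (the job name and schedule below are illustrative) could look like:

$ gcloud scheduler jobs create pubsub splunk-pipeline-confluence \
    --schedule='0 * * * *' \
    --topic=splunk-pipeline \
    --message-body='{"service":"confluence"}'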