This connector sends security alerts from Google Cloud Security Command Center (SCC) to a Microsoft Azure Sentinel Log Analytics Workspace or to DataDog in near real time. If you have created a better version of this integration, please contribute by creating a pull request! If you have any feedback, or need help setting this up, please reach out to amiacs@gmail.com.
- 2023-06-16 Support DataDog integration
- 2023-06-15 Fix for UnicodeEncodeError
- 2023-06-14 Support Deployment via Terraform
- 2023-06-07 Logging enhancements
- 2023-03-19 Support GCP Secret Manager to store secrets, in addition to using environment variables
- 2023-02-06 First version with support for Azure Log Analytics
In the above diagram:
- SCC streams security alerts to a Pub/Sub topic
- The SCC notification and the Cloud Function are set up in Google Cloud (via Terraform or manually)
- The Cloud Function subscribes to the Pub/Sub topic and is triggered by Eventarc
- The Cloud Function calls the Azure Sentinel data collection API with an HMAC-SHA256 authorization header (a signing sketch follows this list)
- On the Azure Sentinel side, the security alert is routed to a custom table in the Log Analytics Workspace
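For reference, the Azure HTTP Data Collector endpoint authenticates each POST with a `SharedKey` header derived from the workspace key. Below is a minimal, illustrative Python sketch of that call; the function and variable names are assumptions, not necessarily what `src/main.py` uses verbatim.

```python
import base64
import hashlib
import hmac
import json
from datetime import datetime, timezone

import requests  # assumed to be available, e.g. via src/requirements.txt

def post_to_log_analytics(workspace_id: str, shared_key: str,
                          custom_table: str, finding: dict) -> None:
    """Send one record to the Azure HTTP Data Collector API using a
    SharedKey (HMAC-SHA256) Authorization header. Illustrative sketch."""
    body = json.dumps(finding).encode("utf-8")
    rfc1123_date = datetime.now(timezone.utc).strftime("%a, %d %b %Y %H:%M:%S GMT")
    # The documented string-to-sign for the Data Collector API:
    string_to_sign = (
        f"POST\n{len(body)}\napplication/json\n"
        f"x-ms-date:{rfc1123_date}\n/api/logs"
    )
    digest = hmac.new(base64.b64decode(shared_key),
                      string_to_sign.encode("utf-8"), hashlib.sha256).digest()
    signature = base64.b64encode(digest).decode("utf-8")
    resp = requests.post(
        f"https://{workspace_id}.ods.opinsights.azure.com/api/logs"
        "?api-version=2016-04-01",
        data=body,
        headers={
            "Authorization": f"SharedKey {workspace_id}:{signature}",
            "Log-Type": custom_table,  # appears as <custom_table>_CL in Sentinel
            "x-ms-date": rfc1123_date,
            "Content-Type": "application/json",
        },
        timeout=10,
    )
    resp.raise_for_status()
```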
End-to-end latency, from when an alert is triggered to when it appears in Sentinel, is within a couple of seconds. The first SCC alert, however, takes 10-15 minutes to show up in Sentinel while the custom table is initialized.
- Set up the target destination for the security alerts
- DataDog (create an API key and note your endpoint URL)
- Azure Sentinel (Create a Log Analytics Workspace)
- Create a continuous Pub/Sub export of SCC alerts
- Create a Cloud Function in Google Cloud
- Download the Python source code from this GitHub repository (you don’t need to modify the code)
- Create a .env file and provide credentials, OR put the credentials in GCP Secret Manager
- Set up the Eventarc trigger and deploy the function
- Trigger an SCC Alert and run a query on the Sentinel Log Table to view the finding
- Go to Azure Console -> Log Analytics Workspaces -> Create
- Create a new Workspace or use an existing one
- After creation, the Workspace will look like the screenshot
- Take note of the Workspace ID
- Go to Azure Console -> Log Analytics Workspaces -> Agents
- Expand the Log Analytics agent instructions
- Take note of the Primary or Secondary key (either works). It is required to construct the HMAC-SHA256 authorization header for calls to the Sentinel data collection API
- Go to DataDog -> Organization Settings -> API Keys
- Determine your endpoint URL, e.g. datadoghq.eu or datadoghq.com; this value and the API key are used in the call sketched below
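As an illustration, here is a minimal sketch of posting a finding to the DataDog v1 Events API using those two values. The payload shape is an assumption for demonstration; the connector's actual payload may differ.

```python
import json

import requests  # assumed to be available

def send_datadog_event(dd_site: str, dd_api_key: str, finding: dict) -> None:
    """Post an SCC finding to the DataDog v1 Events API. Illustrative sketch."""
    payload = {
        "title": f"SCC finding: {finding.get('category', 'unknown')}",
        "text": json.dumps(finding),
        "alert_type": "error",
        "tags": ["source:gcp-scc"],
    }
    resp = requests.post(
        f"https://api.{dd_site}/api/v1/events",  # dd_site e.g. "datadoghq.eu"
        headers={"DD-API-KEY": dd_api_key, "Content-Type": "application/json"},
        json=payload,
        timeout=10,
    )
    resp.raise_for_status()
```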
- Launch a Cloud Shell in the GCP console and clone this GitHub repo
```sh
git clone https://github.com/EuroAlphabets/integration-scc-sentinel.git
```
- Edit the main.tf file to add your Azure credentials and GCP project ID as shown below. One or more target destinations will be activated based on the values you provide here.
```hcl
locals {
  gcp_organization = "YOUR_ORG_ID"
  gcp_project      = "YOUR_PROJ_ID"

  # provide these to activate the Azure connector
  azure_log_analytics_workspace_id       = "YOUR_WORKSPACE_ID"
  azure_log_analytics_authentication_key = "YOUR_KEY"
  azure_log_analytics_custom_table       = "scc_alerts_table"

  # provide these to activate the Datadog connector
  dd_site    = "YOUR_DATADOG_URL, e.g. datadoghq.eu"
  dd_api_key = "YOUR_DATADOG_API_KEY"
}
```
- Let Terraform do the magic. After executing the commands below, you will have the SCC connector running as a Cloud Function.
```sh
terraform init
terraform validate
terraform apply
```
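`terraform apply` prints a plan of the resources it is about to create and waits for confirmation; review the plan before typing `yes`.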
- Go to GCP Console -> Security Command Center -> Settings -> Continuous Exports
- Create a new Pub/Sub export as shown in the screenshot. This also requires you to create a Pub/Sub topic.
- You can also use a filter so that only certain findings (e.g. critical/high severity) are exported to Pub/Sub; an example filter is shown below.
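For instance, a finding filter along these lines (standard SCC filter syntax; adjust to your needs) restricts the export to active critical and high severity findings:

```
state = "ACTIVE" AND (severity = "CRITICAL" OR severity = "HIGH")
```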
- Go to GCP Console -> Cloud Functions -> Create Function
- Select the 2nd gen environment
- Add an Eventarc trigger with Cloud Pub/Sub as the event provider and the scc-pubsub topic you created earlier. The UI might ask you to enable certain APIs during this step if they were not enabled before. Also make sure you grant the necessary IAM role while creating the Eventarc trigger; the UI prompts for it, and it is hard to miss.
- Click Next
Please note that the service account used to run the Cloud Function requires the following IAM roles (a gcloud sketch of these bindings follows the list):
- Artifact Registry Repository Administrator
- Cloud Run Invoker
- Eventarc Event Receiver
- Logs Writer
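If you prefer to grant the roles from the command line, something along these lines should work; the service account name is illustrative, so substitute the one your function actually runs as:

```sh
SA="scc-connector@YOUR_PROJ_ID.iam.gserviceaccount.com"  # hypothetical name
for ROLE in roles/artifactregistry.repoAdmin roles/run.invoker \
            roles/eventarc.eventReceiver roles/logging.logWriter; do
  gcloud projects add-iam-policy-binding YOUR_PROJ_ID \
    --member="serviceAccount:${SA}" --role="${ROLE}"
done
```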
- Download the code from GitHub (src/main.py, src/requirements.txt, src/.env.example) and put it in the Cloud Function
- Select Python 3 as the runtime
- Rename .env.example to .env and fill in the following credentials that you noted from Azure earlier
- Set the entry point as entry_point_function
- Deploy the function (a CLI alternative is sketched after the .env example below)
```
AZURE_LOG_ANALTYTICS_WORKSPACE_ID=YOUR_LOG_ANALYTICS_WORKSPACE_ID
AZURE_LOG_ANALYTICS_AUTHENTICATION_KEY=YOUR_PRIMARY_OR_SECONDARY_CLIENT_AUTHENTICATION_KEY
AZURE_LOG_ANALYTICS_CUSTOM_TABLE=YOUR_CUSTOM_LOG_TABLE_NAME
PROJECT_ID=YOUR_GCP_PROJECT_ID
```
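If you would rather deploy from the command line than through the console, a 2nd gen deployment along these lines should be equivalent; the function name and region are illustrative:

```sh
gcloud functions deploy scc-connector \
  --gen2 --runtime=python311 --region=europe-west1 \
  --source=. --entry-point=entry_point_function \
  --trigger-topic=scc-pubsub
```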
Please note that the custom Log Analytics table is created automatically in Azure if it does not exist; relying on this automatic creation is the recommended way to deploy this connector.
Also, as a best practice, you can store the Log Analytics Workspace ID and key in GCP Secret Manager. The connector first tries to read the .env file; if it does not find the values there, it falls back to GCP Secret Manager using the PROJECT_ID from the .env file. Make sure the Cloud Function's service account has permission to read secrets (e.g. the Secret Manager Secret Accessor role).
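For reference, a secret lookup with the google-cloud-secret-manager client follows the pattern below; the secret id shown is hypothetical, as the names the connector actually expects are defined in src/main.py.

```python
from google.cloud import secretmanager  # pip install google-cloud-secret-manager

def read_secret(project_id: str, secret_id: str) -> str:
    """Fetch the latest version of a secret from GCP Secret Manager."""
    client = secretmanager.SecretManagerServiceClient()
    name = f"projects/{project_id}/secrets/{secret_id}/versions/latest"
    response = client.access_secret_version(request={"name": name})
    return response.payload.data.decode("UTF-8")

# e.g. read_secret("YOUR_GCP_PROJECT_ID", "azure-log-analytics-key")  # hypothetical id
```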
- Trigger an SCC Alert in GCP Console (e.g. by opening a firewall port)
- Go to Azure -> Log Analytics Workspace -> Logs
- Create a new query, enter the custom table name with ‘_CL’ appended, and hit Run (see the example query below)
- You will see the SCC findings listed as in the screenshot
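For example, with the table name from the Terraform example above, this KQL query lists the most recent findings:

```
scc_alerts_table_CL
| sort by TimeGenerated desc
| take 20
```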
- Trigger an SCC Alert in GCP Console (e.g. by opening a firewall port)
- Go to DataDog -> Event Explorer
If the connector does not work, please inspect the logs in Google Cloud Logging, as shown in the screenshot below.
Congrats on successfully deploying this connector! If you have any feedback, or need any help in setting this up, please reach out to amiacs@gmail.com.