diff --git a/docs/integrations/google/cloud-vpc.md b/docs/integrations/google/cloud-vpc.md
index f59abfd98c..fd09486423 100644
--- a/docs/integrations/google/cloud-vpc.md
+++ b/docs/integrations/google/cloud-vpc.md
@@ -9,7 +9,7 @@ import useBaseUrl from '@docusaurus/useBaseUrl';
-The Google Cloud Platform (GCP) [Virtual Private Cloud](https://cloud.google.com/vpc/docs/)(VPC) provides networking functionality to [Compute Engine](https://cloud.google.com/compute/docs/) virtual machine (VM) instances, [Kubernetes Engine](https://cloud.google.com/kubernetes-engine/docs/) containers and [App Engine Flex](https://cloud.google.com/appengine/docs/flexible/). The Sumo Logic App for Google Cloud VPC provides visibility into the activities, traffic, and VPC flow in your GCP. The preconfigured dashboards provide you details on the VPC flows, source and destination IP addresses, ports, protocols, and messages.
+The Google Cloud Platform (GCP) [Virtual Private Cloud](https://cloud.google.com/vpc/docs/) (VPC) provides networking functionality to [Compute Engine](https://cloud.google.com/compute/docs/) virtual machine (VM) instances, [Kubernetes Engine](https://cloud.google.com/kubernetes-engine/docs/) containers, and [App Engine Flex](https://cloud.google.com/appengine/docs/flexible/). The Sumo Logic app for Google Cloud VPC provides visibility into the activities, traffic, and VPC flows in your GCP environment. The preconfigured dashboards provide details on VPC flows, source and destination IP addresses, ports, protocols, threat intel, traffic direction, and messages.
## Log types
@@ -110,20 +110,20 @@ This page describes the Sumo pipeline for ingesting logs from Google Cloud Platf
The key components in the collection process for GCP services are Google Logs Export, Google Cloud Pub/Sub, and Sumo’s Google Cloud Platform (GCP) source running on a hosted collector.
-The GCP service generates logs which are exported and published to a Google Pub/Sub topic through Stackdriver. You will then set up a Sumo Logic Google Cloud Platform source that subscribes to this topic and receives the exported log data.
+The GCP service generates logs that are exported and published to a Google Pub/Sub topic through Google Cloud Logging's [Log Router](https://cloud.google.com/logging/docs/routing/overview). You then set up a Sumo Logic Google Cloud Platform source that subscribes to this topic and receives the exported log data.
### Configuring collection for GCP uses the following process:
1. Configure a GCP source on a hosted collector. You'll obtain the **HTTP URL for the source**.
-2. Create a topic in Google Pub/Sub and subscribe the GCP source URL to that topic.
-3. Create an export of GCP logs from Google Stackdriver Logging. Exporting involves writing a filter that selects the log entries you want to export, and choosing a Pub/Sub as the destination. The filter and destination are held in an object called a sink.
+1. Create a topic in Google Pub/Sub and subscribe the GCP source URL to that topic.
+1. Create an export of GCP logs from the Google Cloud Log Router. Exporting involves writing a filter that selects the log entries you want to export, and choosing a Pub/Sub topic as the destination. The filter and destination are held in an object called a sink.
See the following sections for configuration instructions.
:::note
-Logs from GCP services can be [exported](https://cloud.google.com/logging/docs/export/configure_export_v2) to any destination including Stackdriver. It is not required to push the GCP logs into Stackdriver for the Sumo Logic Apps to work. Any GCP logs can be [excluded](https://cloud.google.com/logging/docs/exclusions) from Stackdriver logging and still can be [exported](https://cloud.google.com/logging/docs/export/) to Sumo logic.
+Logs from GCP services can be [exported](https://cloud.google.com/logging/docs/export/configure_export_v2) to any destination. Any GCP logs can be [excluded](https://cloud.google.com/logging/docs/exclusions) from Cloud Logging storage through the Log Router without affecting the export to Sumo Logic.
:::
### Configure a Google Cloud Platform Source
@@ -139,28 +139,28 @@ However, this is not recommended since you cannot define specific Source Categor
This Source will be a Google Pub/Sub-only Source, which means that it will only be usable for log data formatted as data coming from Google Pub/Sub.
1. [**Classic UI**](/docs/get-started/sumo-logic-ui-classic). In the main Sumo Logic menu, select **Manage Data > Collection > Collection**.
[**New UI**](/docs/get-started/sumo-logic-ui). In the Sumo Logic top menu select **Configuration**, and then under **Data Collection** select **Collection**. You can also click the **Go To...** menu at the top of the screen and select **Collection**.
-2. Select an existing Hosted Collector upon which to add the Source. If you do not already have a Collector you'd like to use, create one, using the instructions on [Configure a Hosted Collector](/docs/send-data/hosted-collectors/configure-hosted-collector).
-3. Click **Add Source** next to the Hosted Collector and click **Google Cloud Platform**.
-4. Enter a **Name** to display for the Source. A **Description** is optional.
-5. **Source Host** (Optional). The Source Host value is tagged to each log and stored in a searchable [metadata](/docs/search/get-started-with-search/search-basics/built-in-metadata) field called _sourceHost. Avoid using spaces so you do not have to quote them in [keyword search expressions](/docs/search/get-started-with-search/build-search/keyword-search-expressions.md). This can be a maximum of 128 characters.
-6. **Source Category** (Optional). The Source Category value is tagged to each log and stored in a searchable [metadata](/docs/search/get-started-with-search/search-basics/built-in-metadata) field called `_sourceCategory`. See our [Best Practices: Good Source Category, Bad Source Category](/docs/send-data/best-practices). Avoid using spaces so you do not have to quote them in [keyword search expressions](/docs/search/get-started-with-search/build-search/keyword-search-expressions.md). This can be a maximum of 1,024 characters.
-7. **Fields**. Click the **+Add Field** link to add custom log metadata [Fields](/docs/manage/fields), then define the fields you want to associate. Each field needs a name (key) and value. Look for one of the following icons and act accordingly:
+1. Select an existing Hosted Collector to add the Source to. If you do not already have a Collector you'd like to use, create one using the instructions on [Configure a Hosted Collector and Source](/docs/send-data/hosted-collectors/configure-hosted-collector).
+1. Click **Add Source** next to the Hosted Collector and click **Google Cloud Platform**.
+1. Enter a **Name** to display for the Source. A **Description** is optional.
+1. **Source Host** (Optional). The Source Host value is tagged to each log and stored in a searchable [metadata](/docs/search/get-started-with-search/search-basics/built-in-metadata) field called `_sourceHost`. Avoid using spaces so you do not have to quote them in [keyword search expressions](/docs/search/get-started-with-search/build-search/keyword-search-expressions.md). This can be a maximum of 128 characters.
+1. **Source Category** (Optional). The Source Category value is tagged to each log and stored in a searchable [metadata](/docs/search/get-started-with-search/search-basics/built-in-metadata) field called `_sourceCategory`. See our [Best Practices: Good Source Category, Bad Source Category](/docs/send-data/best-practices). Avoid using spaces so you do not have to quote them in [keyword search expressions](/docs/search/get-started-with-search/build-search/keyword-search-expressions.md). This can be a maximum of 1,024 characters.
+1. **Fields**. Click the **+Add Field** link to add custom log metadata [Fields](/docs/manage/fields), then define the fields you want to associate. Each field needs a name (key) and value. Look for one of the following icons and act accordingly:
*  If an orange triangle with an exclamation point is shown, use the option to automatically add or enable the nonexistent fields before proceeding to the next step. The orange icon indicates that the field doesn't exist, or is disabled, in the Fields table schema. If a field is sent to Sumo that does not exist in the Fields schema or is disabled it is ignored, known as dropped.
*  If a green circle with a checkmark is shown, the field exists and is already enabled in the Fields table schema. Proceed to the next step.
-8. **Advanced Options for Logs**.
+1. **Advanced Options for Logs**.
* **Timestamp Parsing**. This option is selected by default. If it's deselected, no timestamp information is parsed at all.
* **Time Zone**. There are two options for Time Zone. You can use the time zone present in your log files, and then choose an option in case time zone information is missing from a log message. Or, you can have Sumo Logic completely disregard any time zone information present in logs by forcing a time zone. It's very important to have the proper time zone set, no matter which option you choose. If the time zone of logs cannot be determined, Sumo Logic assigns logs UTC; if the rest of your logs are from another time zone your search results will be affected.
* **Timestamp Format**. By default, Sumo Logic will automatically detect the timestamp format of your logs. However, you can manually specify a timestamp format for a Source. See [Timestamps, Time Zones, Time Ranges, and Date Formats](/docs/send-data/reference-information/time-reference) for more information.
-9. **Processing Rules**. Configure any desired filters, such as allowlist, denylist, hash, or mask, as described in [Create a Processing Rule](/docs/send-data/collection/processing-rules/create-processing-rule).
-10. When you are finished configuring the Source, click **Save**.
+1. **Processing Rules**. Configure any desired filters, such as allowlist, denylist, hash, or mask, as described in [Create a Processing Rule](/docs/send-data/collection/processing-rules/create-processing-rule).
+1. When you are finished configuring the Source, click **Save**.
### Configure a Pub/Sub Topic for GCP
You need to configure a Pub/Sub Topic in GCP and add a subscription to the Source URL that belongs to the Sumo Logic Google Cloud Platform Source you created. Once you configure the Pub/Sub, you can export data from Google Logging to the Pub/Sub. For example, you can export Google App Engine logs, as described on [Collect Logs for Google App Engine](/docs/integrations/google/app-engine#collecting-logs-for-the-google-app-engine-app).
-1. Create a Pub/Sub Topic in GCP. See [Google Cloud documentation](https://cloud.google.com/pubsub/docs/admin#creating_a_topic) for the latest configuration steps.
-2. Create a Pub/Sub subscription to the Source URL that belongs to the Sumo Logic Google Cloud Platform Source you created. See [Google Cloud documentation](https://cloud.google.com/pubsub/docs/admin#creating_subscriptions) for the latest configuration steps.
+1. Create a Pub/Sub Topic in GCP. Refer to the [Google Cloud documentation](https://cloud.google.com/pubsub/docs/admin#creating_a_topic) for the latest configuration steps.
+1. Create a Pub/Sub subscription to the Source URL that belongs to the Sumo Logic Google Cloud Platform Source you created. Refer to the [Google Cloud documentation](https://cloud.google.com/pubsub/docs/admin#creating_subscriptions) for the latest configuration steps. A gcloud CLI sketch of both steps follows this list.
* Use a **Push Delivery Method** to the Sumo Logic Source URL. To determine the URL, navigate to the Source on the **Collection** page in Sumo Logic and click **Show URL**.
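+
+If you prefer the gcloud CLI over the console, the following is a minimal sketch of the two steps above. The topic name is only an example, and the push endpoint is a placeholder for the URL shown by **Show URL** for your GCP Source.
+
+```bash
+# Create the Pub/Sub topic that will receive the exported log entries.
+gcloud pubsub topics create pub-sub-logs
+
+# Create a push subscription that delivers messages from the topic to the
+# Sumo Logic GCP Source URL (replace the placeholder with your Source URL).
+gcloud pubsub subscriptions create sumo-logic-gcp-push \
+  --topic=pub-sub-logs \
+  --push-endpoint="<YOUR_SUMO_GCP_SOURCE_URL>"
+```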
@@ -186,14 +186,16 @@ We recommend the following:
In this step you export logs to the Pub/Sub topic you created in the previous step.
1. Go to **Logging** and click **Logs Router**.
-2. Click **Create Sink**.
-3. As part of **Create logs routing sink**, add the following information.
- 1. Enter a Sink Name. For example, "gce-vm-instance".
- 2. Select "Cloud Pub/Sub" as the **Sink Service**.
- 3. Set **Sink Destination** to the Pub/Sub topic you created in the Google Cloud Platform Source procedure. For example, "pub-sub-logs".
- 4. In **Choose logs to include in sink** section for `resource_type`, replace `""` with `"gce_subnetwork"`.
- 5. Click **Create Sync**.
-
+1. Click **Create Sink**.
+1. As part of **Create logs routing sink**, add the following information.
+ 1. Enter a **Sink Name**. For example, `gce-vm-instance`.
+ 1. Select **Cloud Pub/Sub** as the **Sink Service**.
+ 1. Set **Sink Destination** to the Pub/Sub topic you created in the Google Cloud Platform Source procedure. For example, `pub-sub-logs`.
+ 1. In the **Choose logs to include in sink** section, for `resource_type`, replace `""` with `"gce_subnetwork"`.
+ 1. Click **Create Sink**.
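+
+The sink described in the steps above can also be created with the gcloud CLI. A minimal sketch, assuming the `pub-sub-logs` topic from the previous section and a placeholder project ID:
+
+```bash
+# Route VPC flow logs (resource type gce_subnetwork) to the Pub/Sub topic.
+gcloud logging sinks create gce-vm-instance \
+  pubsub.googleapis.com/projects/<YOUR_PROJECT_ID>/topics/pub-sub-logs \
+  --log-filter='resource.type="gce_subnetwork"'
+```
+
+Unlike the console flow, gcloud may not grant permissions automatically: the command prints a writer identity (a service account), which typically needs the Pub/Sub Publisher role on the destination topic before the sink can publish.
+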
+:::note
+By default, GCP logs are stored in Cloud Logging, but you can configure the Log Router to [exclude them](https://cloud.google.com/logging/docs/exclusions#overview) from storage without affecting the export to Sumo Logic described above.
+:::
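+
+If you also want these flow logs excluded from Cloud Logging storage (the Sumo Logic export above is unaffected), one option is an exclusion filter on the `_Default` sink. A sketch, assuming a gcloud release that supports the `--add-exclusion` flag:
+
+```bash
+# Stop storing VPC flow logs in the _Default log bucket; export sinks still receive them.
+gcloud logging sinks update _Default \
+  --add-exclusion=name=exclude-vpc-flow-logs,filter='resource.type="gce_subnetwork"'
+```
+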
## Installing the Google Cloud VPC app
@@ -212,102 +214,29 @@ import ViewDashboards from '../../reuse/apps/view-dashboards.md';
### Overview
-See an overview of the top 10 IPs, ports, and VMs; along with trends in traffic and VPC flows in your Google Cloud VPC.
+This dashboard provides insights on incoming and outgoing source IP addresses, top 10 external IPs by traffic, VPC flows, source and destination VMs, and traffic trends by subnetwork, project, and VPC, giving a concise overview of network performance and data flow patterns.
-**Inbound Source Address Locations**. Shows number of incoming source IP addresses in the last 24 hours and their location on a world map.
-
-**Outbound Source Address Locations**. Shows number of outgoing source IP addresses in the last 24 hours and their location on a world map.
-
-**Top 10 External IPs by Traffic (MiB).** Shows the top 10 external IP addresses and total bytes sent (in blue) or received (in green) last 24 hours on a horizontal bar graph.
-
-**Top 10 External IPs by VPC Flows**. Shows the top 10 external IP addresses by VPC flows in the last 24 hours on a horizontal bar graph.
-
-**Top 10 External Destination Ports by VPC Flows**. Shows the top 10 external destination ports by VPC flows in the last 24 hours on a horizontal bar graph.
-
-**Top 10 Internal Destination Ports by VPC Flows**. Shows the top 10 internal destination ports by VPC flows in the last 24 hours on a horizontal bar graph.
-
-**Top 10 Source VMs by Traffic (MiB).** Shows the top 10 source VMs by Traffic (MiB) in the last 24 hours on a horizontal bar graph.
-
-**Top 10 Destination VMs by Traffic (MiB).** Shows the top 10 Destination VMs by Traffic (MiB) in the last 24 hours on a horizontal bar graph.
-
-**Traffic (MiB) by Subnetwork.** Shows trends in traffic in the last 24 hours by Subnetwork on a line graph with two lines.
-
-**Traffic (MiB) by Project.** Shows trends in traffic in the last 24 hours by Project on a line graph.
-
-**Traffic (MiB) by VPC.** Shows trends in traffic in the last 24 hours by VPC on a line graph.
-
-**Top 10 Source VMs per VPC by Traffic (MiB).** Shows the top 10 Source VMs per VPC by Traffic (MiB) in the last 24 hours on a column graph.
-
-**Top 10 Destination VMs per VPC by Traffic (MiB).** Show the top 10 Destination VMs per VPC by Traffic (MiB) in the last 24 hours on a column graph.
-
-**VPC Flows per Protocol by Hour**. See the VPC flows per protocol by hour in the last 24 hours on a column graph.
-
### VPC Activity
-See the details of your Google Cloud VPC activity including the trends of traffic, packets, and average latency; along with totals of VPC flows.
-
+This dashboard tracks network metrics such as traffic, packets, and latency over the last hour. It shows trends in traffic and packets by Subnetwork ID and Source/Destination VM, average latency by Subnetwork ID and Source/Destination VPC, and VPC flows by source/destination IP address and port, helping you monitor network activity.
-**Traffic (MiB) by Subnetwork ID.** Shows trends in traffic by Subnetwork ID in the last hour on a line graph.
-
-**Traffic (MiB) by Source VM Over Time.** Shows trends in traffic by Source VM over time in the last hour on a column graph.
-
-**Traffic (MiB) by Destination VM Over Time.** Shows trends in traffic by Destination VM over time in the last hour on a column graph.
-
-**Packets by Subnetwork ID.** Shows trends in packets by Subnetwork ID in the last hour on a line graph.
-
-**Packets by Source VM Over Time.** Shows packets by Source VM over time in last hour on a column graph.
-
-**Packets by Destination VM Over Time.** Shows packets by Destination VM over time in last hour on a column graph.
-
-**Average Latency (ms) by Subnetwork ID.** Shows the average latency by Subnetwork ID in the last hour on a line graph.
-
-**Average Latency (ms) by Source VPC.** Shows the average latency by Source VPC in the last hour on a line graph.
-
-**Average Latency (ms) by Destination VPC.** Shows the average latency by Destination VPC in the last hour on a line graph.
-
-**VPC Flows by Source Address**. Shows the count of VPC flows by source IP address in the last hour on a pie chart.
-
-**VPC Flows by Source Port**. Shows the count of VPC flows by source port in the last hour on a pie chart.
-
-**VPC Flows by Destination Address**. Shows the count of VPC flows by destination IP address in the last hour on a pie chart.
-
-**VPC Flows by Destination Port**. Shows the count of VPC flows by destination port in the last hour on a pie chart.
-
### Advanced metrics
-See the details of your Google Cloud VPC traffic including the trend and outlier in messages, total bytes, packets and latency.
+This dashboard provides insights on message frequency, data transfer rates, packet transmission, and latency trends over the last hour. It includes visualizations such as outliers in messages per minute, trends in total bytes per minute, and packets sent per minute. Additionally, it displays box plots for total bytes, packets, and latency, offering a comprehensive view of the data distribution within the specified timeframe.
-**Messages per Minute - Outlier**. Shows outliers in messages per minute over the last hour on a line graph with threshold.
-
-**Messages per Minute - Trend**. Shows trends in messages per minute over the last hour on a scatter plot graph along with the predicted progression.
-
-**Messages per Minute - Project**. Shows the count in messages per minute over the last hour on a column graph.
-
-**MBs per Minute - Outlier**. Shows outliers in the total bytes per minute sent over the last hour on a line graph with threshold.
+### Security and Direction
-**MBs per Minute - Trend**. Shows trends in total bytes per minute sent over the last hour on a scatter plot graph along with the predicted values.
+This dashboard lists malicious source and destination IPs along with their threat level and location. Use it to view the traffic direction distribution, traffic activity by subnetwork ID and project, and a time comparison of VPC flow logs.
-**MBs Box Plot.** Shows a box plot of the MBs sent in the last hour with the maximum, upper quartile, median, lower quartile, and minimum values.
-
-**Packets per Minute - Outlier**. Shows outliers in the total packets sent per minute over the last hour on a line graph showing threshold.
-
-**Packets per Minute - Trend**. Shows trends in the total packets sent per minute over the last hour on a scatter plot graph along with the predicted progression.
-
-**Packets Box Plot**. Shows a box plot of the packets sent in the last hour with the maximum, upper quartile, median, lower quartile, and minimum values.
-
-**Average Latency (ms) per Minute - Outlier.** Shows the outliers in average latency per minute in the last hour on a line graph with threshold.
-
-**Average Latency (ms) per Minute - Trend.** Shows the trend in average latency per minute in the last hour on a scatter plot graph.
-
-**Latency (ms) Box Plot.** Shows a box plot of the latency in the last hour with the maximum, upper quartile, median, lower quartile, and minimum values.
+
## Upgrade/Downgrade the Google Cloud VPC app (Optional)
@@ -319,4 +248,19 @@ import AppUpdate from '../../reuse/apps/app-update.md';
import AppUninstall from '../../reuse/apps/app-uninstall.md';
-
\ No newline at end of file
+
+
+## Create monitors for Google Cloud VPC app
+
+import CreateMonitors from '../../reuse/apps/create-monitors.md';
+
+
+
+### GCP VPC alerts
+These alerts are based on VPC flow logs.
+
+| Alert Name | Alert Description and Conditions | Default Alert Condition | Default Recover Condition |
+|:--|:--|:--|:--|
+| `Google Cloud VPC - Access from Highly Malicious Sources Alert` | This alert is triggered when a VPC is accessed from highly malicious IP addresses. | Count >= 1 | Count < 1 |
+| `Google Cloud VPC - Latency Alert` | This alert is triggered when the latency of any subnetwork exceeds the threshold value (default is 1000 ms). | Count >= 1 | Count < 1 |
+| `Google Cloud VPC - Total traffic Alert` | This alert is triggered when the total traffic (in MB) for a subnetwork is greater than the threshold value (default is 2 GB). | Count >= 1 | Value < 1 |
\ No newline at end of file
diff --git a/static/img/integrations/google/Google-Cloud-VPC-Security-and-Direction.png b/static/img/integrations/google/Google-Cloud-VPC-Security-and-Direction.png
new file mode 100644
index 0000000000..f269584a29
Binary files /dev/null and b/static/img/integrations/google/Google-Cloud-VPC-Security-and-Direction.png differ