feat: Modularize logging components (#781)
* Initial commit

* New inline Centralized Logging module

* New logbucket destination module path

* - Use the new inline centralized logging module for logs
- Add logbucket as a new logging destination

* Fix missing logbucket name in doc

* Add support for Cloud KMS CryptoKey

* Fix typos

* Reviewed module documentation

* Fix readme log sink filter

* Fix variable description and improve module documentation

* Remove project ID from the Log Bucket name because, unlike storage bucket names, it does not need to be globally unique

* Added information about the Log Bucket's cost-free default retention period

* Added link with additional information

Co-authored-by: Daniel Andrade <dandrade@ciandt.com>

* Added links with additional information about sink destinations

Co-authored-by: Daniel Andrade <dandrade@ciandt.com>

* Improve documentation clarity

Co-authored-by: Daniel Andrade <dandrade@ciandt.com>

* Added link with additional info

* Clean up unused locals

* Fix example code

* - Improve auto-generated names for sinks and targets
- Improve code readability using maps and lookup

* Fix var description

Co-authored-by: Bharath KKB <bharathkrishnakb@gmail.com>

* Refactor all destinations into one module call

* Remove duplicated validation

* Fix handling of the retention_policy object

* Add a default location for the logbucket

* Fix test output values so they do not break the module

* Address PR review comments

* Fix outputs and remote state vars

Co-authored-by: Daniel Andrade <dandrade@ciandt.com>
Co-authored-by: Bharath KKB <bharathkrishnakb@gmail.com>
3 people committed Sep 7, 2022
1 parent 0019b00 commit a1d636e
Showing 11 changed files with 620 additions and 79 deletions.
4 changes: 2 additions & 2 deletions 1-org/README.md
@@ -78,8 +78,8 @@ Enabling Data Access logs might result in your project being charged for the add
For details on costs you might incur, go to [Pricing](https://cloud.google.com/stackdriver/pricing).
You can choose not to enable the Data Access logs by setting variable `data_access_logs_enabled` to false.

**Note:** This module creates a sink to export all logs to Google Storage. It also creates sinks to export a subset of security-related logs
to Bigquery and Pub/Sub. This will result in additional charges for those copies of logs.
**Note:** This module creates a sink to export all logs to Google Cloud Storage and a Log Bucket. It also creates sinks to export a subset of security-related logs
to BigQuery and Pub/Sub. This will result in additional charges for those copies of logs. For the Log Bucket destination, logs retained for the default retention period (30 days) [don't incur a storage cost](https://cloud.google.com/stackdriver/pricing#:~:text=Logs%20retained%20for%20the%20default%20retention%20period%20don%27t%20incur%20a%20storage%20cost.).
You can change the filters & sinks by modifying the configuration in `envs/shared/log_sinks.tf`.

**Note:** Currently, this module does not enable [bucket policy retention](https://cloud.google.com/storage/docs/bucket-lock) for organization logs; enable it if needed.
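
A minimal sketch of enabling it through the existing `log_export_storage_retention_policy` variable follows; the field names come from the `retention_policy` object consumed in `envs/shared/log_sinks.tf`, and the values are illustrative:

```hcl
# Illustrative terraform.tfvars entry. Locking a retention policy is
# irreversible, so keep is_locked = false until the policy is final.
log_export_storage_retention_policy = {
  is_locked             = false
  retention_period_days = 365
}
```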
2 changes: 2 additions & 0 deletions 1-org/envs/shared/README.md
@@ -64,6 +64,8 @@
| dns\_hub\_project\_id | The DNS hub project ID |
| domains\_to\_allow | The list of domains to allow users from in IAM. |
| interconnect\_project\_id | The Dedicated Interconnect project ID |
| logs\_export\_bigquery\_dataset\_name | The BigQuery dataset for destination of log exports |
| logs\_export\_logbucket\_name | The log bucket for destination of log exports. See https://cloud.google.com/logging/docs/routing/overview#buckets |
| logs\_export\_pubsub\_topic | The Pub/Sub topic for destination of log exports |
| logs\_export\_storage\_bucket\_name | The storage bucket for destination of log exports |
| org\_audit\_logs\_project\_id | The org audit logs project ID |
120 changes: 46 additions & 74 deletions 1-org/envs/shared/log_sinks.tf
@@ -17,6 +17,7 @@
locals {
parent_resource_id = local.parent_folder != "" ? local.parent_folder : local.org_id
parent_resource_type = local.parent_folder != "" ? "folder" : "organization"
parent_resources = { resource = local.parent_resource_id }
main_logs_filter = <<EOF
logName: /logs/cloudaudit.googleapis.com%2Factivity OR
logName: /logs/cloudaudit.googleapis.com%2Fsystem_event OR
@@ -34,88 +34,59 @@ resource "random_string" "suffix" {
special = false
}

/******************************************
Send logs to BigQuery
*****************************************/
module "logs_export" {
source = "../../modules/centralized-logging"

resources = local.parent_resources
resource_type = local.parent_resource_type
logging_destination_project_id = module.org_audit_logs.project_id

module "log_export_to_biqquery" {
source = "terraform-google-modules/log-export/google"
version = "~> 7.3.0"
destination_uri = module.bigquery_destination.destination_uri
filter = local.main_logs_filter
log_sink_name = "sk-c-logging-bq"
parent_resource_id = local.parent_resource_id
parent_resource_type = local.parent_resource_type
include_children = true
unique_writer_identity = true
/******************************************
Send logs to BigQuery
*****************************************/
bigquery_options = {
use_partitioned_tables = true
logging_sink_name = "sk-c-logging-bq"
logging_sink_filter = local.main_logs_filter
dataset_name = "audit_logs"
expiration_days = var.audit_logs_table_expiration_days
delete_contents_on_destroy = var.audit_logs_table_delete_contents_on_destroy
}
}

module "bigquery_destination" {
source = "terraform-google-modules/log-export/google//modules/bigquery"
version = "~> 7.3.0"
project_id = module.org_audit_logs.project_id
dataset_name = "audit_logs"
log_sink_writer_identity = module.log_export_to_biqquery.writer_identity
expiration_days = var.audit_logs_table_expiration_days
delete_contents_on_destroy = var.audit_logs_table_delete_contents_on_destroy
}

/******************************************
Send logs to Storage
*****************************************/

module "log_export_to_storage" {
source = "terraform-google-modules/log-export/google"
version = "~> 7.3.0"
destination_uri = module.storage_destination.destination_uri
filter = local.all_logs_filter
log_sink_name = "sk-c-logging-bkt"
parent_resource_id = local.parent_resource_id
parent_resource_type = local.parent_resource_type
include_children = true
unique_writer_identity = true
}

module "storage_destination" {
source = "terraform-google-modules/log-export/google//modules/storage"
version = "~> 7.3.0"
project_id = module.org_audit_logs.project_id
storage_bucket_name = "bkt-${module.org_audit_logs.project_id}-org-logs-${random_string.suffix.result}"
log_sink_writer_identity = module.log_export_to_storage.writer_identity
uniform_bucket_level_access = true
location = var.log_export_storage_location
retention_policy = var.log_export_storage_retention_policy
force_destroy = var.log_export_storage_force_destroy
versioning = var.log_export_storage_versioning
}
/******************************************
Send logs to Storage
*****************************************/
storage_options = {
logging_sink_filter = local.all_logs_filter
logging_sink_name = "sk-c-logging-bkt"
storage_bucket_name = "bkt-${module.org_audit_logs.project_id}-org-logs-${random_string.suffix.result}"
location = var.log_export_storage_location
retention_policy_is_locked = var.log_export_storage_retention_policy == null ? null : var.log_export_storage_retention_policy.is_locked
retention_policy_period_days = var.log_export_storage_retention_policy == null ? null : var.log_export_storage_retention_policy.retention_period_days
force_destroy = var.log_export_storage_force_destroy
versioning = var.log_export_storage_versioning
}

/******************************************
Send logs to Pub/Sub
*****************************************/
/******************************************
Send logs to Pub/Sub
*****************************************/
pubsub_options = {
logging_sink_filter = local.main_logs_filter
logging_sink_name = "sk-c-logging-pub"
topic_name = "tp-org-logs-${random_string.suffix.result}"
create_subscriber = true
}

module "log_export_to_pubsub" {
source = "terraform-google-modules/log-export/google"
version = "~> 7.3.0"
destination_uri = module.pubsub_destination.destination_uri
filter = local.main_logs_filter
log_sink_name = "sk-c-logging-pub"
parent_resource_id = local.parent_resource_id
parent_resource_type = local.parent_resource_type
include_children = true
unique_writer_identity = true
/******************************************
Send logs to Logbucket
*****************************************/
logbucket_options = {
logging_sink_name = "sk-c-logging-logbkt"
logging_sink_filter = local.all_logs_filter
name = "logbkt-org-logs-${random_string.suffix.result}"
location = local.default_region
}
}

module "pubsub_destination" {
source = "terraform-google-modules/log-export/google//modules/pubsub"
version = "~> 7.3.0"
project_id = module.org_audit_logs.project_id
topic_name = "tp-org-logs-${random_string.suffix.result}"
log_sink_writer_identity = module.log_export_to_pubsub.writer_identity
create_subscriber = true
}

/******************************************
Billing logs (Export configured manually)
14 changes: 12 additions & 2 deletions 1-org/envs/shared/outputs.tf
@@ -90,11 +90,21 @@ output "domains_to_allow" {
}

output "logs_export_pubsub_topic" {
value = module.pubsub_destination.resource_name
value = module.logs_export.pubsub_destination_name
description = "The Pub/Sub topic for destination of log exports"
}

output "logs_export_storage_bucket_name" {
value = module.storage_destination.resource_name
value = module.logs_export.storage_destination_name
description = "The storage bucket for destination of log exports"
}

output "logs_export_logbucket_name" {
value = module.logs_export.logbucket_destination_name
description = "The log bucket for destination of log exports. See https://cloud.google.com/logging/docs/routing/overview#buckets"
}

output "logs_export_bigquery_dataset_name" {
value = module.logs_export.bigquery_destination_name
description = "The log bucket for destination of log exports. See https://cloud.google.com/logging/docs/routing/overview#buckets"
}
93 changes: 93 additions & 0 deletions 1-org/modules/centralized-logging/README.md
@@ -0,0 +1,93 @@
# Centralized Logging Module

This module handles logging configuration, enabling one or more resources such as organizations, folders, or projects to send logs to multiple destinations: [GCS bucket](https://cloud.google.com/logging/docs/export/using_exported_logs#gcs-overview), [BigQuery](https://cloud.google.com/logging/docs/export/bigquery), [Pub/Sub](https://cloud.google.com/logging/docs/export/using_exported_logs#pubsub-overview), and [Log Buckets](https://cloud.google.com/logging/docs/routing/overview#buckets).

## Usage

Before using this module, familiarize yourself with the [log-export](https://registry.terraform.io/modules/terraform-google-modules/log-export/google/latest) module, which this module builds on.

The following example exports audit logs from two folders to the same storage destination:

```hcl
module "logs_export" {
source = "terraform-google-modules/terraform-example-foundation/google//1-org/modules/centralized-logging"
resources = {
fldr1 = "<folder1_id>"
fldr2 = "<folder2_id>"
}
resource_type = "folder"
logging_destination_project_id = "<log_destination_project_id>"
storage_options = {
logging_sink_filter = ""
logging_sink_name = "sk-c-logging-bkt"
storage_bucket_name = "bkt-logs"
location = "us-central1"
}
bigquery_options = {
dataset_name = "ds_logs"
logging_sink_name = "sk-c-logging-bq"
logging_sink_filter = <<EOF
logName: /logs/cloudaudit.googleapis.com%2Factivity OR
logName: /logs/cloudaudit.googleapis.com%2Fsystem_event OR
logName: /logs/cloudaudit.googleapis.com%2Fdata_access OR
logName: /logs/compute.googleapis.com%2Fvpc_flows OR
logName: /logs/compute.googleapis.com%2Ffirewall OR
logName: /logs/cloudaudit.googleapis.com%2Faccess_transparency
EOF
}
}
```

**Note:** When the destination is a Log Bucket and a sink is being created in the same project, set the variable `logging_project_key` to the **key** used to map the Log Bucket project in the `resources` map.
For more details, see [Configure and manage sinks](https://cloud.google.com/logging/docs/export/configure_export_v2#dest-auth:~:text=If%20you%27re%20using%20a%20sink%20to%20route%20logs%20between%20Logging%20buckets%20in%20the%20same%20Cloud%20project%2C%20no%20new%20service%20account%20is%20created%3B%20the%20sink%20works%20without%20the%20unique%20writer%20identity.).

The following example exports all logs from three projects, including the logging destination project, to a Log Bucket destination. Because it exports all logs, be aware of the additional charges for this volume of logs:

```hcl
module "logging_logbucket" {
source = "terraform-google-modules/terraform-example-foundation/google//1-org/modules/centralized-logging"
resources = {
prj1 = "<log_destination_project_id>"
prj2 = "<prj2_id>"
prjx = "<prjx_id>"
}
resource_type = "project"
logging_destination_project_id = "<log_destination_project_id>"
logging_project_key = "prj1"
logbucket_options = {
logging_sink_name = "sk-c-logging-logbkt"
logging_sink_filter = ""
name = "logbkt-logs"
}
}
```
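
The module's outputs (see the Outputs table below) can be wired into downstream configuration. A minimal sketch, assuming the `logging_logbucket` module call above; the enclosing output name is illustrative, while `logbucket_destination_name` is one of this module's documented outputs:

```hcl
# Illustrative: expose the created Log Bucket name to callers of the
# root configuration that instantiates this module.
output "org_logbucket_name" {
  value       = module.logging_logbucket.logbucket_destination_name
  description = "The Log Bucket that receives the exported logs."
}
```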

<!-- BEGINNING OF PRE-COMMIT-TERRAFORM DOCS HOOK -->
## Inputs

| Name | Description | Type | Default | Required |
|------|-------------|------|---------|:--------:|
| bigquery\_options | Destination BigQuery options:<br>- logging\_sink\_name: The name of the log sink to be created.<br>- logging\_sink\_filter: The filter to apply when exporting logs. Only log entries that match the filter are exported. Default is '' which exports all logs.<br>- dataset\_name: The name of the bigquery dataset to be created and used for log entries.<br>- expiration\_days: (Optional) Table expiration time. If null logs will never be deleted.<br>- partitioned\_tables: (Optional) Options that affect sinks exporting data to BigQuery. use\_partitioned\_tables - (Required) Whether to use BigQuery's partition tables.<br>- delete\_contents\_on\_destroy: (Optional) If set to true, delete all contained objects in the logging destination.<br><br>Destination BigQuery options example:<pre>bigquery_options = {<br> logging_sink_name = "sk-c-logging-bq"<br> dataset_name = "audit_logs"<br> partitioned_tables = "true"<br> expiration_days = 30<br> delete_contents_on_destroy = false<br> logging_sink_filter = <<EOF<br> logName: /logs/cloudaudit.googleapis.com%2Factivity OR<br> logName: /logs/cloudaudit.googleapis.com%2Fsystem_event OR<br> logName: /logs/cloudaudit.googleapis.com%2Fdata_access OR<br> logName: /logs/compute.googleapis.com%2Fvpc_flows OR<br> logName: /logs/compute.googleapis.com%2Ffirewall OR<br> logName: /logs/cloudaudit.googleapis.com%2Faccess_transparency<br>EOF<br>}</pre> | `map(string)` | `null` | no |
| logbucket\_options | Destination LogBucket options:<br>- logging\_sink\_name: The name of the log sink to be created.<br>- logging\_sink\_filter: The filter to apply when exporting logs. Only log entries that match the filter are exported. Default is '' which exports all logs.<br>- name: The name of the log bucket to be created and used for log entries matching the filter.<br>- location: The location of the log bucket. Default: global.<br>- retention\_days: (Optional) The number of days data should be retained for the log bucket. Default 30.<br><br>Destination LogBucket options example:<pre>logbucket_options = {<br> logging_sink_name = "sk-c-logging-logbkt"<br> logging_sink_filter = ""<br> name = "logbkt-org-logs"<br> retention_days = "30"<br> location = "global"<br>}</pre> | `map(any)` | `null` | no |
| logging\_destination\_project\_id | The ID of the project that will have the resources where the logs will be created. | `string` | n/a | yes |
| logging\_project\_key | (Optional) The key of the logging destination project if it is inside the resources map. It is mandatory when resource\_type = project and logging\_target\_type = logbucket. | `string` | `""` | no |
| pubsub\_options | Destination Pubsub options:<br>- logging\_sink\_name: The name of the log sink to be created.<br>- logging\_sink\_filter: The filter to apply when exporting logs. Only log entries that match the filter are exported. Default is '' which exports all logs.<br>- topic\_name: The name of the pubsub topic to be created and used for log entries matching the filter.<br>- create\_subscriber: (Optional) Whether to create a subscription to the topic that was created and used for log entries matching the filter. If 'true', a pull subscription is created along with a service account that is granted roles/pubsub.subscriber and roles/pubsub.viewer to the topic.<br><br>Destination Pubsub options example:<pre>pubsub_options = {<br> logging_sink_name = "sk-c-logging-pub"<br> topic_name = "tp-org-logs"<br> create_subscriber = true<br> logging_sink_filter = <<EOF<br> logName: /logs/cloudaudit.googleapis.com%2Factivity OR<br> logName: /logs/cloudaudit.googleapis.com%2Fsystem_event OR<br> logName: /logs/cloudaudit.googleapis.com%2Fdata_access OR<br> logName: /logs/compute.googleapis.com%2Fvpc_flows OR<br> logName: /logs/compute.googleapis.com%2Ffirewall OR<br> logName: /logs/cloudaudit.googleapis.com%2Faccess_transparency<br>EOF<br>}</pre> | `map(any)` | `null` | no |
| resource\_type | Resource type of the resource that will export logs to destination. Must be: project, organization, or folder. | `string` | n/a | yes |
| resources | Export logs from the specified resources. | `map(string)` | n/a | yes |
| storage\_options | Destination Storage options:<br>- logging\_sink\_name: The name of the log sink to be created.<br>- logging\_sink\_filter: The filter to apply when exporting logs. Only log entries that match the filter are exported. Default is '' which exports all logs.<br>- storage\_bucket\_name: The name of the storage bucket to be created and used for log entries matching the filter.<br>- location: (Optional) The location of the logging destination. Default: US.<br>- Retention Policy variables: (Optional) Configuration of the bucket's data retention policy for how long objects in the bucket should be retained.<br> - retention\_policy\_is\_locked: Set if policy is locked.<br> - retention\_policy\_period\_days: Set the period of days for log retention. Default: 30.<br>- versioning: (Optional) Toggles bucket versioning, ability to retain a non-current object version when the live object version gets replaced or deleted.<br>- force\_destroy: When deleting a bucket, this boolean option will delete all contained objects.<br><br>Destination Storage options example:<pre>storage_options = {<br> logging_sink_name = "sk-c-logging-bkt"<br> logging_sink_filter = ""<br> storage_bucket_name = "bkt-org-logs"<br> location = "US"<br> force_destroy = false<br> versioning = false<br>}</pre> | `map(any)` | `null` | no |
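
For reference, a `storage_options` sketch that also sets the optional retention-policy fields described in the table above; all values are illustrative:

```hcl
storage_options = {
  logging_sink_name            = "sk-c-logging-bkt"
  logging_sink_filter          = "" # an empty filter exports all logs
  storage_bucket_name          = "bkt-org-logs"
  location                     = "US"
  retention_policy_is_locked   = false # locking is irreversible
  retention_policy_period_days = 30
  force_destroy                = false
  versioning                   = false
}
```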

## Outputs

| Name | Description |
|------|-------------|
| bigquery\_destination\_name | The resource name for the destination BigQuery. |
| logbucket\_destination\_name | The resource name for the destination Log Bucket. |
| pubsub\_destination\_name | The resource name for the destination Pub/Sub. |
| storage\_destination\_name | The resource name for the destination Storage. |

<!-- END OF PRE-COMMIT-TERRAFORM DOCS HOOK -->
