From 821b2a891d0cb6dd6b186c08566230eda1f736ee Mon Sep 17 00:00:00 2001
From: Claire Waters
Date: Thu, 6 May 2021 11:41:20 -0500
Subject: [PATCH 1/2] adding draft bigquery content

---
 .../logs/src/content/logpush/bigquery/index.md | 14 ++++++++++++++
 1 file changed, 14 insertions(+)
 create mode 100644 products/logs/src/content/logpush/bigquery/index.md

diff --git a/products/logs/src/content/logpush/bigquery/index.md b/products/logs/src/content/logpush/bigquery/index.md
new file mode 100644
index 000000000000000..262481d085ef58f
--- /dev/null
+++ b/products/logs/src/content/logpush/bigquery/index.md
@@ -0,0 +1,14 @@
+---
+title: Enable BigQuery
+order: 61
+---
+
+# Enable BigQuery
+
+Configure Logpush to send batches of Cloudflare logs to BigQuery.
+
+BigQuery supports up to 1,500 load jobs per table per day (including failures), with up to 10 million files in each load. That means you can load into BigQuery once per minute and include up to 10 million files in a load. See BigQuery's quotas for load jobs for more information.
+
+Logpush delivers batches of logs as soon as possible, which means you could receive more than one batch of files per minute. Ensure your BigQuery job is configured to ingest files on a given time interval, like every minute, as opposed to as files are received. Ingesting files into BigQuery as each Logpush file is received could exhaust your BigQuery quota quickly.
+
+For an example of how to set up a scheduled load job with BigQuery, see the [Cloudflare + Google Cloud | Integrations repository](https://github.com/cloudflare/cloudflare-gcp/tree/master/logpush-to-bigquery).
\ No newline at end of file

From 5d8701231e1f16398e3874df23c08f00f5b0d167 Mon Sep 17 00:00:00 2001
From: Claire Waters
Date: Thu, 20 May 2021 10:10:41 -0500
Subject: [PATCH 2/2] kody edits

---
 products/logs/src/content/logpush/bigquery/index.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/products/logs/src/content/logpush/bigquery/index.md b/products/logs/src/content/logpush/bigquery/index.md
index 262481d085ef58f..83e56c5724ce7a0 100644
--- a/products/logs/src/content/logpush/bigquery/index.md
+++ b/products/logs/src/content/logpush/bigquery/index.md
@@ -1,6 +1,6 @@
 ---
-title: Enable BigQuery
 order: 61
+pcx-content-type: concept
 ---
 
 # Enable BigQuery
@@ -9,6 +9,6 @@ Configure Logpush to send batches of Cloudflare logs to BigQuery.
 
 BigQuery supports up to 1,500 load jobs per table per day (including failures), with up to 10 million files in each load. That means you can load into BigQuery once per minute and include up to 10 million files in a load. See BigQuery's quotas for load jobs for more information.
 
-Logpush delivers batches of logs as soon as possible, which means you could receive more than one batch of files per minute. Ensure your BigQuery job is configured to ingest files on a given time interval, like every minute, as opposed to as files are received. Ingesting files into BigQuery as each Logpush file is received could exhaust your BigQuery quota quickly.
+Logpush delivers batches of logs as soon as possible, which means you could receive more than one batch of files per minute. Ensure your BigQuery job is configured to ingest files on a given time interval, like every minute, as opposed to when files are received. Ingesting files into BigQuery as each Logpush file is received could exhaust your BigQuery quota quickly.
 
 For an example of how to set up a scheduled load job with BigQuery, see the [Cloudflare + Google Cloud | Integrations repository](https://github.com/cloudflare/cloudflare-gcp/tree/master/logpush-to-bigquery).
\ No newline at end of file
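
The interval-ingest advice in the patched docs can be sketched as follows. This is a minimal illustration, not Cloudflare's or Google's implementation: the filenames and the grouping function are hypothetical, and a real pipeline would trigger actual BigQuery load jobs (for example via the scheduled Cloud Function in the linked `logpush-to-bigquery` repository) rather than grouping in memory.

```python
# Sketch: batching incoming Logpush files into per-minute load windows so
# each window becomes one BigQuery load job. All names here are hypothetical.
from collections import defaultdict
from datetime import datetime, timezone

# Quota referenced in the docs above: 1,500 load jobs per table per day.
LOAD_JOBS_PER_TABLE_PER_DAY = 1500
MINUTES_PER_DAY = 24 * 60  # 1,440 one-minute windows

def group_into_minute_windows(file_arrivals):
    """Group (filename, arrival_time) pairs into one-minute load windows.

    Loading each window as a single job caps the daily job count at 1,440,
    inside the 1,500/table/day quota. Loading per file could exceed it,
    since Logpush may deliver several batches within the same minute.
    """
    windows = defaultdict(list)
    for filename, arrived in file_arrivals:
        # Truncate the arrival time to the minute to pick the load window.
        windows[arrived.replace(second=0, microsecond=0)].append(filename)
    return dict(windows)

# One load per minute stays under quota even across a full day:
assert MINUTES_PER_DAY <= LOAD_JOBS_PER_TABLE_PER_DAY

arrivals = [
    ("logs_001.log.gz", datetime(2021, 5, 6, 12, 0, 10, tzinfo=timezone.utc)),
    ("logs_002.log.gz", datetime(2021, 5, 6, 12, 0, 40, tzinfo=timezone.utc)),
    ("logs_003.log.gz", datetime(2021, 5, 6, 12, 1, 5, tzinfo=timezone.utc)),
]
windows = group_into_minute_windows(arrivals)
# Three files arriving across two minutes yield two load jobs, not three.
```

The design point is the one the docs make: the number of load jobs should track elapsed time (bounded at 1,440 per day), not file arrivals (unbounded).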