From 67a57b5fe462698a6b8b42295301a54071fd6a25 Mon Sep 17 00:00:00 2001
From: zhenyami
Date: Mon, 4 Oct 2021 17:12:22 +0300
Subject: [PATCH] out_bigquery: add `skipInvalidRows` and `ignoreUnknownValues`
 params

Added BigQuery request body parameters `skipInvalidRows` and
`ignoreUnknownValues` that are used for the streaming inserts.

Updated these docs in sync with the changes in the main Fluent Bit repo.

Signed-off-by: zhenyami
---
 pipeline/outputs/bigquery.md | 4 ++++
 1 file changed, 4 insertions(+)

diff --git a/pipeline/outputs/bigquery.md b/pipeline/outputs/bigquery.md
index 82faa4d0a..2acd6e056 100644
--- a/pipeline/outputs/bigquery.md
+++ b/pipeline/outputs/bigquery.md
@@ -40,6 +40,10 @@ Fluent Bit BigQuery output plugin uses a JSON credentials file for authenticatio
 | project\_id | The project id containing the BigQuery dataset to stream into. | The value of the `project_id` in the credentials file |
 | dataset\_id | The dataset id of the BigQuery dataset to write into. This dataset must exist in your project. | |
 | table\_id | The table id of the BigQuery table to write into. This table must exist in the specified dataset and the schema must match the output. | |
+| skip_invalid_rows | Insert all valid rows of a request, even if invalid rows exist. The default value is false, which causes the entire request to fail if any invalid rows exist. | Off |
+| ignore_unknown_values | Accept rows that contain values that do not match the schema. The unknown values are ignored. Default is false, which treats unknown values as errors. | Off |
+
+See Google's [official documentation](https://cloud.google.com/bigquery/docs/reference/rest/v2/tabledata/insertAll) for further details.
 
 ## Configuration File