Provide documentation for using a Service Account to connect to BigQuery (#8462)

* Provide documentation for using a Service Account to connect to BigQuery

* Alter line wrapping for shorter lines

* Whitespace commit to trigger another build (flake)

* Another meaningless whitespace change to trigger another build
Will Barrett authored and mistercrunch committed Oct 29, 2019
1 parent 8b74745 commit 1adf742
Showing 1 changed file with 32 additions and 0 deletions.
32 changes: 32 additions & 0 deletions docs/installation.rst
@@ -434,8 +434,40 @@ The connection string for BigQuery looks like this ::

bigquery://{project_id}

Additionally, you will need to configure authentication via a
Service Account. Create your Service Account via the Google
Cloud Platform control panel, provide it access to the appropriate
BigQuery datasets, and download the JSON configuration file
for the service account. In Superset, add a JSON blob to
the "Secure Extra" field in the database configuration page
with the following format ::


{
    "credentials_info": <contents of credentials JSON file>
}

The resulting blob should have this structure ::

{
    "credentials_info": {
        "type": "service_account",
        "project_id": "...",
        "private_key_id": "...",
        "private_key": "...",
        "client_email": "...",
        "client_id": "...",
        "auth_uri": "...",
        "token_uri": "...",
        "auth_provider_x509_cert_url": "...",
        "client_x509_cert_url": "..."
    }
}

You should then be able to connect to your BigQuery datasets.
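
To sanity-check the credentials outside of Superset, a minimal sketch
(assuming the ``pybigquery`` SQLAlchemy dialect is installed and accepts a
``credentials_info`` engine argument; the project id and file path below
are placeholders) might look like this ::

import json

from sqlalchemy import create_engine, text

# Load the service account JSON file downloaded from Google Cloud Platform
# (placeholder path -- substitute your own)
with open("/path/to/service-account.json") as f:
    credentials_info = json.load(f)

# Hand the parsed credentials to the BigQuery dialect
engine = create_engine(
    "bigquery://my-project-id",  # placeholder project id
    credentials_info=credentials_info,
)

# Run a trivial query to confirm the connection works
with engine.connect() as connection:
    print(connection.execute(text("SELECT 1")).scalar())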

To be able to upload data, e.g. sample data, the Python library ``pandas_gbq`` is required.
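
If it is not already available in your Python environment, it can
typically be installed from PyPI ::

pip install pandas_gbq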


Elasticsearch
-------------

