Import files (data) from Intercom, FTP(S), SFTP, MySQL, etc. servers into BigQuery.

This documentation is also available in Russian (RU).

What is BigQuery-integrations

BigQuery-integrations is a set of Python scripts that let you automate data import to Google BigQuery using Google Cloud Functions.

The current version of BigQuery-integrations features scripts for importing from:

  • amoCRM;
  • ExpertSender;
  • FTP;
  • FTPS;
  • HTTPS;
  • Intercom;
  • MySQL;
  • SFTP.

How it works

An HTTP POST request invokes a Cloud Function that fetches the file from the source server and uploads it to a BigQuery table. If the table already exists in the selected dataset, it will be overwritten.
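As a rough sketch of that flow, the snippet below builds the HTTP POST request that would trigger such a function. The function URL and the payload fields shown here are hypothetical placeholders — the readme of each function describes the actual configuration it expects:

```python
import json
import urllib.request

# Hypothetical trigger URL of a deployed Cloud Function; replace with your own.
FUNCTION_URL = "https://us-central1-example-project.cloudfunctions.net/ftp-bq-integration"

# Hypothetical configuration payload: where to read the source file from,
# and which BigQuery table to write (the table is overwritten if it exists).
payload = {
    "source": {"path_to_file": "ftp://host/dir/file.csv"},
    "bq": {
        "project_id": "example-project",
        "dataset_id": "my_dataset",
        "table_id": "my_table",
    },
}

def build_request(url, body):
    """Build an HTTP POST request with a JSON body for the Cloud Function."""
    data = json.dumps(body).encode("utf-8")
    return urllib.request.Request(
        url,
        data=data,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_request(FUNCTION_URL, payload)
# urllib.request.urlopen(req)  # uncomment to actually invoke the function
```

Any HTTP client that can send a POST request with a JSON body (curl, Postman, a scheduler job) works the same way.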


To launch any of these scripts, you need:

  • A Google Cloud Platform project with an activated billing account;
  • Read access to the data source;
  • WRITER access to the dataset and the BigQuery Job User role for the Cloud Functions service account in the project to which you are going to upload the table;
  • An HTTP client for POST requests invoking the Cloud function.

Setup and usage

To launch the scripts, follow the steps described in the readme file in each function's folder. You don't need to edit the functions' code. Detailed documentation for each function is available via the links below:


If you have any questions about the BigQuery-integrations scripts, write to us at
