
Typeform to Parse'erator: Typeform Exporter to Parse Loader.

Exports data using the Typeform Responses API and loads it into a Parse DB using the Parse API.

  • Typeform: A great, very compatible, accessible, and friendly web form SaaS product.
  • Parse: A great framework for running a MBaaS that you can host yourself on a variety of platforms.

Overview Diagram

overview diagram

Parse Table (Dashboard)

parse entries

What does it do?

It exports data from a Typeform form to a Parse DB on a scheduled interval.

Where does this run?

This is built to run on the Google App Engine Standard Environment as a Scheduled Task.

How does it work?

It queries the Typeform Responses API (doc) and loads the data into a hosted Parse DB with the Parse API (doc).
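The core of that loop is a transform from one response item to one Parse object. A minimal sketch, assuming illustrative field names (these are not the actual Typeform payload or Parse schema used by this project):

```python
# Sketch: flatten one Typeform-style response item into a Parse object
# payload. Field names are illustrative assumptions, not the project's
# actual schema.

def response_to_parse_object(response):
    """Map a Typeform-style response dict to a dict suitable for a
    Parse REST API create-object call."""
    obj = {
        "responseId": response["response_id"],
        "submittedAt": response["submitted_at"],
    }
    # Each answer becomes a column keyed by its question/field id.
    for answer in response.get("answers", []):
        obj["field_" + answer["field"]["id"]] = answer.get("text")
    return obj

sample = {
    "response_id": "abc123",
    "submitted_at": "2018-01-01T00:00:00Z",
    "answers": [{"field": {"id": "q1"}, "text": "hello"}],
}
print(response_to_parse_object(sample))
```

Each resulting dict would then be POSTed to the Parse REST API as a new object.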


See the requirements.txt for the list of Python package dependencies.

This relies on successful responses from the Typeform Responses API and the hosted Parse API.

This is built to operate on Google App Engine and thus has dependencies on all of the relevant underlying infrastructure on Google Cloud Platform.

Google Cloud Platform Service Dependencies:

  1. App Engine (Standard Environment)
  2. Memcache (Shared)
  3. Key Management Service
  4. Cloud Storage



Prerequisites

  1. Google Cloud Platform Account.
  2. Parse API Credentials.
  3. Typeform API Credentials.


  1. Python 2.7.
  2. Working pip installation.
  3. Installation of the gcloud SDK, with dev_appserver.py loaded into your PATH (doc).


Cron Schedule

See cron.yaml.
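As a sketch, a cron.yaml for this kind of job might look like the following; the URL path and schedule here are assumptions, so check the repository's actual cron.yaml for the real values:

```yaml
cron:
- description: export typeform responses into parse
  url: /cron
  schedule: every 30 minutes
```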

Secure Key Storage

To securely store the Parse API Credentials and Typeform API Credentials for access by the service from Google App Engine, I have chosen to use Google's Key Management Service. Two initial one-time steps need to be completed for this to work.

  1. Encrypt and upload the secrets to Google's Key Management Service.
  2. Grant the appropriate Service Account access to decrypt the secrets.

Fetch your Parse API Credentials to be able to proceed.

  1. Encrypt Secrets

We will create a Service Account in Google IAM that can encrypt and decrypt our secrets (you could create separate accounts with encrypt-only and decrypt-only permissions if you would like).

To create a Service Account:

$ gcloud --project PROJECT_ID iam service-accounts create SERVICE_ACCOUNT_NAME
$ gcloud --project PROJECT_ID iam service-accounts keys create key.json \
    --iam-account SERVICE_ACCOUNT_NAME@PROJECT_ID.iam.gserviceaccount.com

This creates a Service Account and a JSON file with the credentials which we can use to encrypt / decrypt our secrets outside of KMS.
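For reference, key.json is a standard service account key file. Its shape looks roughly like this (values elided; the client_email follows the SERVICE_ACCOUNT_NAME you chose):

```json
{
  "type": "service_account",
  "project_id": "PROJECT_ID",
  "private_key_id": "...",
  "private_key": "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n",
  "client_email": "SERVICE_ACCOUNT_NAME@PROJECT_ID.iam.gserviceaccount.com"
}
```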

One of the easiest ways to interact with Google KMS is to start with the KMS samples in Google's GCP samples repository.

A quick note about compatibility: if you already have this repository cloned, a change was pushed that updates how blobs are encoded/decoded. It is not backwards compatible with previous code and will fail at runtime with an error like the SO post mentioned in the commit. The code now supports only the new encoding/decoding, so if you have trouble because you previously cloned this repository, git pull, re-encrypt your secrets, and you should be good to go.
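The encoding in question concerns how binary ciphertext travels through the KMS JSON API: it must be carried as a base64 string. A minimal sketch of just that encode/decode step (the KMS call itself is elided):

```python
import base64

# The KMS JSON API transports binary ciphertext as base64 strings.
# Sketch of encoding before sending and decoding after receiving.

def encode_for_api(raw_bytes):
    """Encode raw bytes as the base64 string the JSON API expects."""
    return base64.b64encode(raw_bytes).decode("ascii")

def decode_from_api(b64_string):
    """Decode a base64 string from the JSON API back into bytes."""
    return base64.b64decode(b64_string)

ciphertext = b"\x00\x01binary-blob\xff"
wire = encode_for_api(ciphertext)
assert decode_from_api(wire) == ciphertext
print(wire)
```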

Once you have this repository cloned, you will create a keyring and cryptokey:

$ gcloud --project PROJECT_ID kms keyrings create KEYRING_NAME --location global

$ gcloud --project PROJECT_ID kms keys create parse --location global --keyring KEYRING_NAME --purpose encryption
$ gcloud --project PROJECT_ID kms keys create typeform --location global --keyring KEYRING_NAME --purpose encryption

$ gcloud --project PROJECT_ID kms keys add-iam-policy-binding parse --location global \
    --keyring KEYRING_NAME --member serviceAccount:SERVICE_ACCOUNT_NAME@PROJECT_ID.iam.gserviceaccount.com \
    --role roles/cloudkms.cryptoKeyEncrypterDecrypter
$ gcloud --project PROJECT_ID kms keys add-iam-policy-binding typeform --location global \
    --keyring KEYRING_NAME --member serviceAccount:SERVICE_ACCOUNT_NAME@PROJECT_ID.iam.gserviceaccount.com \
    --role roles/cloudkms.cryptoKeyEncrypterDecrypter

You will also need to grant the project's App Engine service account access to decrypt the keys for this implementation. You could use a more locked-down setup if you would like.

gcloud --project PROJECT_ID kms keys add-iam-policy-binding parse --location global \
    --keyring KEYRING_NAME --member serviceAccount:PROJECT_ID@appspot.gserviceaccount.com \
    --role roles/cloudkms.cryptoKeyDecrypter
gcloud --project PROJECT_ID kms keys add-iam-policy-binding typeform --location global \
    --keyring KEYRING_NAME --member serviceAccount:PROJECT_ID@appspot.gserviceaccount.com \
    --role roles/cloudkms.cryptoKeyDecrypter
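These bindings attach to a fully-qualified cryptoKey resource name, which is also what the KMS encrypt/decrypt API calls take. A small helper to build it, following the Cloud KMS v1 resource naming:

```python
def crypto_key_name(project_id, keyring, key, location="global"):
    """Build the fully-qualified Cloud KMS cryptoKey resource name used
    both in IAM bindings and in encrypt/decrypt API calls."""
    return ("projects/{}/locations/{}/keyRings/{}/cryptoKeys/{}"
            .format(project_id, location, keyring, key))

print(crypto_key_name("my-project", "my-keyring", "parse"))
# → projects/my-project/locations/global/keyRings/my-keyring/cryptoKeys/parse
```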

If you haven't used the KMS service before, the SDK will error with a URL to visit to enable it:

$ gcloud --project PROJECT_ID kms keyrings create KEYRING_NAME --location global
ERROR: (gcloud.kms.keyrings.create) FAILED_PRECONDITION: Google Cloud KMS API has not been used in this project before, or it is disabled. Enable it by visiting then retry. If you enabled this API recently, wait a few minutes for the action to propagate to our systems and retry.

Once that is completed, navigate to kms > api-client in the GCP samples repository and create a script with the following content:

echo 'THE_SECRET' > /tmp/test_file
python snippets.py encrypt PROJECT_ID global KEYRING_NAME KEY_NAME \
  /tmp/test_file /tmp/test_file.encrypted
python snippets.py decrypt PROJECT_ID global KEYRING_NAME KEY_NAME \
  /tmp/test_file.encrypted /tmp/test_file.decrypted
cat /tmp/test_file.decrypted

Fill in the PROJECT_ID from Google, the KEYRING_NAME you chose above, the key name (parse or typeform), and THE_SECRET to encrypt.

The expected form for THE_SECRET for Parse:

app_id: ...
rest_key: ...
master_key: ...

The expected form for THE_SECRET for Typeform:

typeform_api_key: ...
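After decryption, the service needs to turn these key: value lines back into a lookup structure. A minimal parser sketch, assuming the simple one-key-per-line format shown above (this is illustrative, not the repository's actual parsing code):

```python
def parse_secret(text):
    """Parse a decrypted 'key: value' secret file into a dict."""
    secrets = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or ":" not in line:
            continue  # skip blank or malformed lines
        key, _, value = line.partition(":")
        secrets[key.strip()] = value.strip()
    return secrets

example = "app_id: abc\nrest_key: def\nmaster_key: ghi\n"
print(parse_secret(example))
```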

Before you run the script you need to set the environment variable GOOGLE_APPLICATION_CREDENTIALS to the path of key.json that you generated previously.

This will look something like:

$ export GOOGLE_APPLICATION_CREDENTIALS=/path/to/key.json

If you now run the script, it should print the Typeform API Key, and the encrypted version will be stored in /tmp/test_file.encrypted. Copy this file somewhere else for temporary storage before we upload, then run the same script with the Parse API Secret. In the example below I have renamed the file to parse.encrypted.

  2. Upload Secrets

Once you have the encrypted secret file you will need to upload it to Google Cloud Storage for fetching in App Engine (and eventual decryption). Assuming the file is called parse.encrypted, you would run something like the following:

$ gsutil mb -p PROJECT_ID gs://BUCKET_NAME
Creating gs://BUCKET_NAME/...

$ gsutil cp parse.encrypted gs://BUCKET_NAME/keys/

$ gsutil ls gs://BUCKET_NAME/keys

And do the same for the Typeform secret file; call it typeform.encrypted:

$ gsutil cp typeform.encrypted gs://BUCKET_NAME/keys/

$ gsutil ls gs://BUCKET_NAME/keys


Installing Dependencies

Initially, you will need to install the dependencies into a lib directory with the following command:

pip install -t lib -r requirements.txt

This lib directory is excluded from git.

Local Development

The included dev_appserver.py (installed with the gcloud SDK and loaded into your PATH) is the best/easiest way to test before deployment (doc).

It can easily be launched with:

$ dev_appserver.py app.yaml

Then view http://localhost:8000/cron to run the cron locally. For this to work you will need to mock the KMS/GCS fetches, otherwise you will get a 403 on the call to the GCS bucket. I have not found a way around this at this point.
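One way to mock those fetches locally is to short-circuit the secret lookup behind an environment variable. A sketch; the function and variable names here are assumptions, not the repository's actual code:

```python
import os

def get_secret(fetch_and_decrypt):
    """Return secrets from an env var when developing locally, falling
    back to the real GCS download + KMS decrypt in production.

    fetch_and_decrypt: callable performing the real fetch (hypothetical;
    stands in for the app's actual GCS/KMS code).
    """
    fake = os.environ.get("FAKE_SECRET")
    if fake is not None:
        return fake  # local dev: skip GCS/KMS entirely
    return fetch_and_decrypt()

os.environ["FAKE_SECRET"] = "typeform_api_key: test123"
print(get_secret(lambda: "should not be called"))
```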


Deployment

This might be the easiest thing you own / operate, as is the case with many things built to run on GCP.


$ gcloud --project PROJECT_ID app deploy
$ gcloud --project PROJECT_ID app deploy cron.yaml

On your first deploy, if this is the project's first App Engine application, you will be prompted to choose a region.


Testing

No unit tests at this time.

Once deployed, you can hit the /run path on the URL.


Logging

Google's Stackdriver service is sufficient for the logging needs of this service.

To view logs, you can use the gcloud CLI:

$ gcloud --project PROJECT_ID app logs read --service=default --limit 10

If you are not using the default service, you will need to change that parameter.

If you want to view the full content of the logs you can use the beta logging command:

$ gcloud beta logging read "resource.type=gae_app AND logName=projects/[PROJECT_ID]/logs/" --limit 10 --format json

Filling in the appropriate [PROJECT_ID] from GCP.

You can also see all available logs with the following command:

gcloud beta logging logs list


Costs

Most of the pieces here cost money, so doing some quick math to make sure you are comfortable with the costs likely makes sense.

Parse: Depending on how you are running Parse and which plan you purchase, there may be costs for serving these requests; refer to their website for details. This purposely does not run with a Cloud Function using callbacks because that would require the Typeform PRO+ plan at this time, which may be too expensive for some folks.

Google Cloud Platform: There are three costs associated with running this project.

  1. Compute: Per-instance hour cost (here).
  2. Network: Outgoing network traffic (here).
  3. Key Management Service: Key versions + Key use operations (here).

The Memcache being used is the Shared Memcache (doc) which is Free at this time.


Rate Limits

  • Parse API: Consult the documentation for your Parse deployment for any limits.
  • Typeform Responses API: Maximum of 2 requests per second.
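Staying under that 2 requests/second ceiling between paginated fetches only needs a small throttle. A sketch (not the repository's actual code):

```python
import time

class Throttle(object):
    """Ensure successive .wait() calls are at least 1/max_per_second
    seconds apart by sleeping off any remaining interval."""

    def __init__(self, max_per_second):
        self.min_interval = 1.0 / max_per_second
        self._last = None

    def wait(self):
        now = time.time()
        if self._last is not None:
            remaining = self.min_interval - (now - self._last)
            if remaining > 0:
                time.sleep(remaining)
        self._last = time.time()

throttle = Throttle(2)  # Typeform Responses API: max 2 req/s
start = time.time()
for _ in range(3):
    throttle.wait()  # make the API request here
elapsed = time.time() - start
print("3 throttled calls took %.2fs" % elapsed)
```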

Pull Requests

Sure, but please give me some time.


License

Apache 2.0.
