Triggers GCP Cloud SQL export backups to GCS.
Solution documentation:
https://cloud.google.com/solutions/scheduling-cloud-sql-database-exports-using-cloud-scheduler
- main.py - the code
- requirements.txt - the pip modules to bootstrap
- deploy.sh - uploads the code and dependencies
Upload the function to GCF in the current GCP project - this script will call gcloud functions deploy
with the required switches:
./deploy.sh
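For reference, the deployment boils down to a gcloud functions deploy call along these lines - the function name, entry point, runtime, topic, and region below are illustrative assumptions, not necessarily the exact switches the script passes:

# hypothetical values - check deploy.sh for the real switches
gcloud functions deploy cloud-sql-backup-exports \
    --entry-point main \
    --runtime python37 \
    --trigger-topic cloud-sql-backups \
    --region europe-west1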
You will also need the following (see the command sketch after this list):

- a Cloud PubSub topic
- Cloud Scheduler jobs to trigger backups - see gcp_cloud_schedule_sql_exports.sh in the DevOps Bash tools repo
- a service account with permissions to access Cloud SQL - see gcp_sql_create_readonly_service_account.sh in the DevOps Bash tools repo
- each Cloud SQL instance to be backed up needs objectCreator permissions on the GCS bucket - see gcp_sql_grant_instances_gcs_object_creator.sh in the DevOps Bash tools repo
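As a rough sketch of what those helper scripts automate - all names below (topic, job, instance, bucket) are hypothetical, and the message-body fields are an assumption about the function's expected payload:

# create the PubSub topic that the scheduler jobs publish to
gcloud pubsub topics create cloud-sql-backups

# schedule a nightly export trigger (payload format is assumed -
# see gcp_cloud_schedule_sql_exports.sh for the real one)
gcloud scheduler jobs create pubsub backup-my-instance \
    --schedule '0 2 * * *' \
    --topic cloud-sql-backups \
    --message-body '{"project": "my-project", "instance": "my-instance", "database": "mydb", "gs": "gs://my-sql-backups"}'

# grant the instance's service account objectCreator on the backup bucket
SQL_SA="$(gcloud sql instances describe my-instance --format='value(serviceAccountEmailAddress)')"
gsutil iam ch "serviceAccount:${SQL_SA}:objectCreator" gs://my-sql-backups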
Instead of deploy.sh, you can alternatively use the Serverless framework, for which a serverless.yml config is provided:

serverless deploy
If this is your first time using Serverless then you'll need to install the GCP plugin:
serverless plugin install --name serverless-google-cloudfunctions
The serverless.yml config expects to find the $GOOGLE_PROJECT_ID and $GOOGLE_REGION environment variables.
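For example (project ID and region below are hypothetical):

export GOOGLE_PROJECT_ID="my-project"   # your GCP project ID
export GOOGLE_REGION="europe-west1"     # region to deploy the function to
serverless deploy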
Serverless requires additional permissions for the service account - Deployment Manager Editor and Storage Admin - in order to create deployments and staging buckets.
You can also build a serverless artifact to .serverless/ without deploying it - this generates Google Deployment Manager templates and a zip file, which is useful to check what would be uploaded or ignored:

serverless package
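To double-check the result, you can list the contents of the generated zip (assuming Serverless's default artifact location under .serverless/):

unzip -l .serverless/*.zip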