Get a BigQuery OAuth client ID from https://console.cloud.google.com/apis/credentials . Add https://yourdomain.appspot-preview.com/oauth2callback as the OAuth2 callback URL. If you want to test locally, also add the corresponding localhost:8080 callback URL.
Generate a secure cookie hash key and encryption key:

```shell
openssl rand -hex 64   # cookie hash key
openssl rand -hex 32   # cookie encryption key
```
Save these values as constants in `credentials.go`:

```go
package main

const googleOAuthClientID = "..."
const googleOAuthClientSecret = "..."

var cookieHashKey = mustDecodeHex("...")
var cookieEncryptionKey = mustDecodeHex("...")
```
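The `mustDecodeHex` helper referenced above is not shown in the source; a minimal sketch of what it might look like, so that malformed keys are caught at startup rather than at request time:

```go
package main

import (
	"encoding/hex"
	"fmt"
)

// mustDecodeHex decodes a hex-encoded key and panics if the string is
// not valid hex, failing fast at program startup. (Hypothetical helper,
// assumed from its use in the constants above.)
func mustDecodeHex(s string) []byte {
	b, err := hex.DecodeString(s)
	if err != nil {
		panic(fmt.Sprintf("invalid hex key: %v", err))
	}
	return b
}

func main() {
	key := mustDecodeHex("00ff")
	fmt.Println(len(key)) // 2
}
```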
Create a Cloud SQL instance. Edit bqcost.go and bqcost.yaml to reference the correct name.
Edit bqcost.go to reference the correct host name that will serve your app.
Deploy with:

```shell
aedeploy gcloud app deploy --project=(PROJECT) bqcost.yaml
```
You can run a local copy against Cloud SQL with:

```shell
go run bqcost.go credentials.go --cloudSQLProxy=true
```
You can also run a local copy using SQLite, but I need to figure out a way to make this work without breaking deploys to App Engine Flexible.
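One possible shape for the SQLite option (a sketch, not the project's actual code: the `--useSQLite` flag, the `dsnFor` helper, and the DSN strings below are all assumptions): pick the driver and DSN at runtime, so local runs use SQLite while App Engine deploys keep using Cloud SQL through the proxy.

```go
package main

import (
	"flag"
	"fmt"
)

// dsnFor picks a database/sql driver name and DSN. When useSQLite is
// set, it returns a local SQLite file; otherwise it returns a Cloud SQL
// DSN over the proxy's unix socket. (Hypothetical helper; the DSN
// formats are illustrative.)
func dsnFor(useSQLite bool, cloudSQLInstance string) (driver, dsn string) {
	if useSQLite {
		return "sqlite3", "file:bqcost.db"
	}
	return "mysql", fmt.Sprintf("root@unix(/cloudsql/%s)/bqcost", cloudSQLInstance)
}

func main() {
	useSQLite := flag.Bool("useSQLite", false, "use a local SQLite database")
	flag.Parse()
	driver, dsn := dsnFor(*useSQLite, "my-project:us-central1:bqcost")
	fmt.Println(driver, dsn)
}
```

Keeping the SQLite driver behind a flag like this (or behind a build tag) is one way to avoid pulling cgo-dependent packages into the App Engine Flexible build.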
- Downloading the state of all the tables is incredibly slow. It needs to be parallelized.
- The "get table data from BigQuery" job is just a goroutine. If the instance restarts, the job is stuck forever and you will never be able to access the project. This should be split into smaller chunks that get retried.
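The table-download step above could be parallelized with a bounded worker pool; a sketch, with a hypothetical `fetchTable` function standing in for the real per-table BigQuery call:

```go
package main

import (
	"fmt"
	"sync"
)

// fetchTable is a stand-in for the real "get table state from BigQuery"
// call; here it just tags the table name.
func fetchTable(name string) string {
	return "fetched:" + name
}

// fetchAll fetches table state with at most `workers` concurrent calls,
// instead of one slow sequential loop.
func fetchAll(tables []string, workers int) []string {
	jobs := make(chan string)
	results := make(chan string, len(tables))
	var wg sync.WaitGroup
	for i := 0; i < workers; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for t := range jobs {
				results <- fetchTable(t)
			}
		}()
	}
	for _, t := range tables {
		jobs <- t
	}
	close(jobs)
	wg.Wait()
	close(results)

	var out []string
	for r := range results {
		out = append(out, r)
	}
	return out
}

func main() {
	out := fetchAll([]string{"a", "b", "c"}, 2)
	fmt.Println(len(out)) // 3
}
```

This only addresses the speed issue; making the job survive instance restarts would additionally require persisting per-chunk progress (e.g. in Cloud SQL) so that failed chunks can be retried.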