Auth Database Migration Scripts #22427

Closed
13 tasks done
bolyachevets opened this issue Jul 24, 2024 · 4 comments
Labels: sbc-connect, SRE (SRE team task)

bolyachevets commented Jul 24, 2024

Dev

  • Create CloudSQL instance
  • Create GCP Cloud Job
  • Test Data Migration

Test

  • Create CloudSQL instance
  • Create GCP Cloud Job
  • Test Data Migration

Prod

  • Create CloudSQL instance
  • Create GCP Cloud Job
  • Test Data Migration
  • Prod to Postgres15 upgrade strategy

Sandbox

  • Create CloudSQL instance (see the gcloud sketch after this list)
  • Test Data Migration
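
For reference, a minimal sketch of what the "Create CloudSQL instance" and "Create GCP Cloud Job" tasks could look like with gcloud; the instance name, job name, image, tier and region below are placeholders for illustration, not the actual project values:

# placeholder names/region; adjust per environment (dev/test/prod/sandbox)
gcloud sql instances create auth-db-dev \
  --database-version=POSTGRES_15 \
  --tier=db-custom-2-8192 \
  --region=northamerica-northeast1

# the migration job image and env vars are assumptions for illustration
gcloud run jobs create auth-db-migration \
  --image=gcr.io/<project>/db-migration:latest \
  --region=northamerica-northeast1 \
  --set-env-vars=DUMP_FILE=dump.sql
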
bolyachevets added the sbc-connect and SRE labels Jul 24, 2024
bolyachevets self-assigned this Jul 24, 2024

bolyachevets commented Jul 30, 2024

For the upgrade to Postgres 15 in prod:

  1. Upgrade the image for the backup containers to postgres15.
  2. Run custom pg_dump commands (as the backup container is not flexible enough):

pg_dump -Fp -h postgresql-prod -p 5432 -U postgres -d "auth-db" --no-owner --no-acl -n public > backup.sql
sed '/^CREATE SCHEMA public;/d' backup.sql > filtered_backup.sql
sed '/^ALTER SCHEMA public OWNER TO/d' filtered_backup.sql > final_backup.sql
sed -i '1i CREATE EXTENSION IF NOT EXISTS "uuid-ossp";' final_backup.sql

  3. Trigger the Cloud Job from the BCOnline Prod project in GCP, updating the relevant parameters (e.g. set DUMP_FILE to final_backup.sql); a sketch follows below.
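
A sketch of that last step, assuming the Cloud Job is a Cloud Run job; the job name and region are placeholders, not the actual BCOnline Prod values:

# placeholder job name/region; point DUMP_FILE at the filtered dump produced above
gcloud run jobs update auth-db-migration \
  --region=northamerica-northeast1 \
  --update-env-vars=DUMP_FILE=final_backup.sql
gcloud run jobs execute auth-db-migration --region=northamerica-northeast1 --wait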

bolyachevets changed the title from Auth Database Migration to Auth Database Migration Scripts Jul 30, 2024
bolyachevets commented

Can manually move data between prod and sandbox:

gcloud sql export sql auth-db-prod gs://auth-db-dump-prod/dump.sql.gz --database=auth-db
gcloud sql databases create auth-db --instance=auth-db-tools
gcloud --quiet sql import sql auth-db-tools gs://auth-db-dump-prod/dump.sql.gz --database=auth-db --user=postgres
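
One caveat worth noting (standard Cloud SQL behaviour, not something verified against this project): the instances' service accounts typically need object access on the dump bucket for the export/import to succeed, e.g.:

# look up the Cloud SQL service account and grant it access to the bucket
gcloud sql instances describe auth-db-prod --format='value(serviceAccountEmailAddress)'
gsutil iam ch serviceAccount:<SA_EMAIL>:objectAdmin gs://auth-db-dump-prod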

thorwolpert commented

Possibly; the DBAs have always asked us to obfuscate/mask data moved to the lower environments.
Auth might be a little different, as we want to use the same IDIM between Prod/Sandbox: like our IDIR users, our BCSC users don't have test accounts.
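
If masking does end up being required, a minimal sketch of a post-import scrub step; the host, table and column names are hypothetical, not the actual auth-db schema:

# hypothetical host/table/columns: scrub contact details after importing into a lower environment
psql -h auth-db-sandbox -U postgres -d auth-db <<'SQL'
UPDATE contacts
SET email = 'user_' || id || '@example.com',
    phone = NULL;
SQL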

thorwolpert commented

or maybe we entertain a registry-id for sandbox work.
