
feat: add S3 import functionality which is supported for MySQL instances #289

Merged

Conversation

bryantbiggs
Member

Description

  • add S3 import functionality, which is supported for MySQL instances
  • add a new example demonstrating and verifying the S3 import functionality
  • update pre-commit hook dependencies and update the required-provider syntax
  • change the snapshot_identifier and replicate_source_db defaults to null due to the error `"s3_import": conflicts with snapshot_identifier|replicate_source_db`
  • add sensitive = true to password outputs due to an error message
  • update .gitignore using the GitHub-provided default standard, plus an ignore for the 0.14 lockfile
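The s3_import feature added here maps onto the `s3_import` block of the underlying `aws_db_instance` resource. A minimal sketch of how a module consumer might use it (module inputs, bucket, and role names below are illustrative assumptions, not taken from this PR):

```hcl
# Hypothetical usage sketch: identifier, bucket, and role names are illustrative.
module "db" {
  source = "terraform-aws-modules/rds/aws"

  identifier     = "demo-mysql"
  engine         = "mysql"
  engine_version = "8.0.20"
  instance_class = "db.t3.small"

  # Restore the instance from a Percona XtraBackup stored in S3.
  # Attribute names follow the aws_db_instance s3_import block.
  s3_import = {
    source_engine         = "mysql"
    source_engine_version = "8.0.20"
    bucket_name           = "my-xtrabackup-bucket" # hypothetical bucket
    bucket_prefix         = "backup"               # hypothetical prefix
    ingestion_role        = aws_iam_role.s3_import.arn
  }
}
```

Because `s3_import` restores from a backup, it conflicts with `snapshot_identifier` and `replicate_source_db`, which is why their defaults were changed to null in this PR.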

Motivation and Context

Breaking Changes

  • No

How Has This Been Tested?

  • Tested using the new example directory s3-import-mysql; directions are included so others can replicate the testing/confirmation


```bash
$ docker run --name percona-xtrabackup-8.0 --mount type=bind,src=/tmp/backup,dst=/backup --volumes-from percona-server-mysql-8.0.20 percona/percona-xtrabackup:8.0 xtrabackup --backup --datadir=/var/lib/mysql --target-dir=/backup --user=root --password=root
$ aws s3 sync /tmp/backup/ s3://s3-import-<UPDATE-NAME>/
```
Member

Can you please make a tiny SQL backup file in this example dir, create an S3 bucket, upload the file to the new bucket using the aws_s3_bucket_object resource, and then reference it in a test?

The goal is that running just "terraform apply" gets the end-to-end example to work.
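The suggested approach could look roughly like the following sketch (bucket and file names are hypothetical); as discussed later in the thread, it was ultimately not used because the backup consists of many files:

```hcl
# Sketch of the suggested single-object upload; names are hypothetical.
resource "aws_s3_bucket" "import" {
  bucket = "s3-import-demo-bucket"
}

resource "aws_s3_bucket_object" "backup" {
  bucket = aws_s3_bucket.import.id
  key    = "backup.sql"
  source = "${path.module}/backup.sql"
  # Re-upload whenever the local file changes.
  etag   = filemd5("${path.module}/backup.sql")
}
```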

Member Author

yep - I can add that

Member Author

FYI - there are quite a few files (you can see them in 6be7666), so I couldn't use the aws_s3_bucket_object resource. I did try fileset(), but that doesn't support recursively adding everything, so I had to go with a local-exec to make it work
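The local-exec workaround described above could be sketched as follows (the bucket reference and backup path are illustrative assumptions):

```hcl
# Sketch of the local-exec workaround; bucket reference and path are hypothetical.
resource "null_resource" "s3_sync" {
  provisioner "local-exec" {
    # Recursively upload all of the unpacked XtraBackup files,
    # which fileset()-driven per-object resources could not do here.
    command = "aws s3 sync ${path.module}/backup s3://${aws_s3_bucket.import.id}/"
  }
}
```

A local-exec provisioner trades Terraform-managed object state for a single recursive `aws s3 sync`, which is why it fits a backup made of many files.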

Member

I see, I didn't know there would be so many files needed for the small demo. I thought there would be just 1-3 :)

Anyway, your solution with unzip and s3 sync looks perfect as the working example.

@bryantbiggs
Member Author

@antonbabenko this one should be ready for review - I'll take a look at the one you linked and see how best to proceed with that one

@antonbabenko antonbabenko merged commit 6523602 into terraform-aws-modules:master Feb 22, 2021
@bryantbiggs bryantbiggs deleted the feature/s3_import branch February 22, 2021 18:59
@antonbabenko
Member

v2.21.0 has just been released.

@github-actions

I'm going to lock this pull request because it has been closed for 30 days ⏳. This helps our maintainers find and focus on the active issues. If you have found a problem that seems related to this change, please open a new issue and complete the issue template so we can capture all the details necessary to investigate further.

@github-actions github-actions bot locked as resolved and limited conversation to collaborators Nov 14, 2022