AWS access keys stored in .tfstate file and cross account access #4376
Comments
Totally agree with #1. I would go further and suggest that the whole "remote" section should be stored in a separate .terraform file and only kept local. The "remote" has nothing to do with the "state". Imagine a dev storing the state file to S3 while your CI system grabs it over HTTP from a server that serves files out of that same S3 bucket. It is the "state" that is important; the location I get the state file from shouldn't matter.
It would be prudent to verify the other providers as well, and also to prevent this from happening in general as we add providers, by marking secret data appropriately. I did just do a test run with the GCE provider and did not notice any credentials stored in the tfstate.
I landed here looking for a solution to this problem. The value part of a …
Hi all! The ability to set credentials directly as arguments is something Terraform offers for pragmatism, but indeed it's best saved only for when it cannot be avoided, because then Terraform will store these settings in the cached backend configuration. (Note that as of Terraform 0.9, that configuration is not part of the state snapshot, even though the file it's stored in is still called terraform.tfstate.)

Since this issue was originally opened, we've documented Multi-account AWS Architecture as the canonical way to use Terraform across multiple AWS accounts, which includes the practice of using a different account for the backend than for the provider. The timeline here isn't totally clear, but we believe that at the time this issue was opened the AWS provider and S3 backend didn't yet have all of the features required to implement what's described in that guide. The "assume role" support in the AWS provider is the key feature that makes that approach possible, allowing Terraform to work from a single root set of AWS credentials and use those to get temporary, controlled access to deploy into other AWS accounts as needed.

This issue was also filed long enough ago that it predates the Terraform 0.9 backend refactoring itself. The specific problem of the backend configuration being included in the state snapshots is no longer present, because Terraform does now (as this issue suggested) store the backend configuration in a local file, separate from the state snapshots.

With all of this said, we think the main asks of this issue are now met via a combination of architectural changes, feature enhancements in the AWS provider, and improved documentation, and so we're going to close this issue now. Thanks for sharing all of these use-cases, and sorry for the long delay in responding to this issue.
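A minimal sketch of the multi-account pattern that guide describes, using modern backend/provider syntax: the bucket name, account ID, and role name below are all placeholders, not values from this issue.

```hcl
# State lives in an S3 bucket in the "master" account; the provider
# assumes a role to deploy into the "product" account. All names and
# IDs below are illustrative placeholders.
terraform {
  backend "s3" {
    bucket = "example-master-tfstate"
    key    = "product/terraform.tfstate"
    region = "us-east-1"
    # Backend credentials come from the environment or shared AWS
    # config, not from arguments that would be cached on disk.
  }
}

provider "aws" {
  region = "us-east-1"

  assume_role {
    role_arn = "arn:aws:iam::111111111111:role/TerraformDeploy"
  }
}
```

This way a single root set of credentials authenticates everywhere, and no static keys need to appear in configuration files or cached state.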
I'm going to lock this issue because it has been closed for 30 days ⏳. This helps our maintainers find and focus on the active issues. If you have found a problem that seems similar to this, please open a new issue and complete the issue template so we can capture all the details necessary to investigate further.
This is sort of covered in #1964, but not exactly, so I thought I'd create a separate issue.
My situation is this:
I have a wrapper script that automatically configures a remote S3 backend if ./.terraform/terraform.tfstate does not exist. The section that sets up the remote config is similar to this:
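The original snippet did not survive extraction; here is a hypothetical sketch of what such a wrapper might have looked like, using the Terraform 0.8-era `terraform remote config` command. Bucket, key, and credential values are placeholders, and `echo` keeps the sketch side-effect free (drop it to actually run the command).

```shell
#!/bin/sh
# Hypothetical wrapper sketch; all names and keys are placeholders.
ensure_remote_config() {
  # Only configure the remote once, when no local state pointer exists.
  if [ ! -f ./.terraform/terraform.tfstate ]; then
    # NOTE: passing access_key/secret_key as -backend-config arguments
    # is exactly what causes them to be written into the .tfstate file.
    echo terraform remote config \
      -backend=s3 \
      -backend-config="bucket=example-master-tfstate" \
      -backend-config="key=product/terraform.tfstate" \
      -backend-config="region=us-east-1" \
      -backend-config="access_key=AKIAEXAMPLE" \
      -backend-config="secret_key=EXAMPLESECRET"
  fi
}
ensure_remote_config
```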
This doc https://www.terraform.io/docs/commands/remote-config.html suggests passing access keys in via environment variables instead, so that they don't get stored in the .tfstate file, like this:
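The snippet that followed is also missing; a sketch of the environment-variable variant the docs suggest might look like this (values are placeholders, and `echo` again keeps it side-effect free). The keys never appear as `-backend-config` arguments, so they never land in the .tfstate file.

```shell
#!/bin/sh
# Hypothetical env-var variant; all names and keys are placeholders.
configure_remote() {
  # Credentials are supplied via the environment for this one command
  # instead of being passed (and cached) as backend arguments.
  AWS_ACCESS_KEY_ID="AKIAEXAMPLE" \
  AWS_SECRET_ACCESS_KEY="EXAMPLESECRET" \
  echo terraform remote config \
    -backend=s3 \
    -backend-config="bucket=example-master-tfstate" \
    -backend-config="key=product/terraform.tfstate" \
    -backend-config="region=us-east-1"
}
configure_remote
```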
But using environment variables doesn't work for me, because the S3 bucket is in one account (I'll call this the Master Account) while the resources that Terraform is going to manage are in a different account (I'll call this the Product Account). Currently I set AWS_PROFILE so that Terraform uses the access keys for a user in the Product Account. I have another set of access keys that are used to set up the remote config so that Terraform can read, write, and update the tfstate files in the bucket in the Master Account. If I set the AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY environment variables, they take precedence over the profile named in AWS_PROFILE, and then the wrapper script that invokes Terraform can't manage resources in the Product Account.
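The precedence conflict can be reproduced in a few lines (profile and key values below are placeholders): once static key variables are exported, AWS clients resolve credentials from them and the profile is ignored.

```shell
#!/bin/sh
# Placeholder values; demonstrates the credential-precedence problem.
export AWS_PROFILE=product              # product-account profile
export AWS_ACCESS_KEY_ID=AKIAMASTER     # master-account static key
export AWS_SECRET_ACCESS_KEY=mastersecret
# With all three set, the static keys win: any AWS client (including
# Terraform's AWS provider) now authenticates as the Master Account,
# so Terraform can no longer manage the Product Account's resources.
# aws sts get-caller-identity   # would report the Master Account
```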
My questions are as follows:
I hope I've clearly explained my situation.
Thanks!