
terraform env command doesn't work with different AWS account ID that needs 2 IAM roles #13700

Closed
jrlonan opened this issue Apr 17, 2017 · 14 comments

Comments

jrlonan commented Apr 17, 2017

Summary

Configuring terraform env with two AWS account IDs (one staging account, one production account) does not work.

Terraform Version

0.9.3

Affected Resource(s)

terraform env

Expected Behavior

terraform env should be usable with two separate AWS accounts

Actual Behavior

terraform env does not work for 2 separate AWS accounts

Steps to Reproduce

  1. The current default environment runs against the staging account.
  2. Create a new, empty production environment:
     terraform env new prod
  3. Switch to the production env.
  4. Run STS assume-role to assume the role for the production account (see the sketch after this list).
  5. I want to pull the remote state, therefore I run:
     terraform init \
         -backend-config="bucket=prod-bucket" \
         -backend-config="key=tfstate/prod.tfstate" \
         -backend-config="region=ap-northeast-1"

     NOTE: the S3 backend configuration is already in a separate file, backend.tf:

     terraform {
       backend "s3" {
         encrypt = "true"
       }
     }

  6. Run terraform plan, but Terraform shows that it wants to recreate existing resources.
  7. I want to switch back to default: access denied.
  8. Run the STS assume-role command to switch back to the staging account; terraform env switch default works.
  9. I want to delete the production env, so I run terraform env delete production: access denied.
  10. Run the STS assume-role command to switch back to the prod account, run terraform env delete production: access denied.
  11. Now I'm stuck with an unusable production Terraform environment, because it seems like every time I want to delete it, Terraform wants to check both environments, and my STS assume-role gives me access tokens for only one environment at a time.
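
For reference, the assume-role step (step 4) is done along these lines; this is only a sketch, and the role ARN and session name below are placeholders rather than the real values:

aws sts assume-role \
    --role-arn "arn:aws:iam::<prod-account-id>:role/<terraform-role>" \
    --role-session-name "terraform-prod"

# export the temporary credentials returned above before running terraform
export AWS_ACCESS_KEY_ID="..."
export AWS_SECRET_ACCESS_KEY="..."
export AWS_SESSION_TOKEN="..."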

Important Factoids

Previously, in Terraform 0.8, I used the remote config command to switch environments:

  1. Run STS assume-role to switch to production environment role
  2. Run the following command
terraform remote config -disable

# rename any backup tfstate
mv ${WORKSPACE}/tf/*.tfstate ${WORKSPACE}/tf/.terraform/sento.tfstate.backup.$(date "+%Y-%m-%d.%H:%M:%S")

# pull and resync tfstate from new env in S3
terraform remote config \
    -state="${TFSTATE}" \
    -backend=S3 \
    -backend-config="bucket=${S3_BUCKET_NAME}" \
    -backend-config="key=tfstate/${TFSTATE}" \
    -backend-config="region=${REGION}" \
    -backend-config="encrypt=true"

I would expect the same to be possible with the new env and init commands, but apparently it's not that simple anymore.

@jrlonan jrlonan changed the title terraform env doesn't work with different AWS account ID that needs 2 IAM roles terraform env command doesn't work with different AWS account ID that needs 2 IAM roles Apr 17, 2017
@jbardin jbardin added enhancement and removed bug labels Apr 17, 2017
jbardin (Member) commented Apr 17, 2017

Hi @jrlonan,

This is working as expected right now, though I think we can handle a similar workflow with some manual intervention. I'll work on an example with the new backend system.

I think we will want to at least have a documented way to access different environments with different credentials, even if it still requires the user to set up the S3 bucket policies properly.

jrlonan (Author) commented Apr 18, 2017

@jbardin
Thank you! Looking forward to the enhancement! :)

By the way, is there any way to fix the terraform production env, apart from adding cross-account bucket permissions to the IAM role?

jbardin (Member) commented Apr 18, 2017

@jrlonan,

Yes, besides being able to write to the corresponding state file, the credentials you're operating with will need to be able to list all keys in the bucket (or at least keys prefixed with env:/; I don't want to specify the implementation details, but that may be required for a multi-user policy).
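
As a rough illustration only (the bucket name here is made up, and the exact actions and prefixes a real policy needs may differ), such a policy could be sketched in Terraform like this:

data "aws_iam_policy_document" "state_access" {
  # allow listing the state bucket; if needed this could be narrowed further
  # with an s3:prefix condition on "env:/*"
  statement {
    actions   = ["s3:ListBucket"]
    resources = ["arn:aws:s3:::example-terraform-state"]
  }

  # allow reading and writing the state objects themselves
  statement {
    actions   = ["s3:GetObject", "s3:PutObject"]
    resources = ["arn:aws:s3:::example-terraform-state/*"]
  }
}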

You can't use environments if you're going to try to reconfigure the backend between switching your logical "environments" locally. The point of the env command is to change to a named state file, so changing the bucket and key defeats that purpose.

When you ran terraform env new prod, that created an environment named "prod" and the associated state file in the backend you already had configured. When you changed the configuration to a "prod" bucket and state file, you reconfigured the backend to use those new settings, possibly leaving behind the old state files (you didn't specify what the migration output was during the init command). The terraform init command doesn't pull anything; the state is always stored and accessed remotely.

dav009 commented Apr 19, 2017

@jbardin, for the sake of this use case, a workaround could be to handle two backends:

  • avoid using env
  • have, for example, a symlinked file backend.tf pointing to the backend file for the current env, i.e. backend.stg
  • switching env would then be:
    • do assume-role
    • symlink to your env, i.e. backend.tf -> backend.prd

Do you see any drawbacks with this approach?

jbardin (Member) commented Apr 19, 2017

@dav009,

Off the top of my head, I think that would work as long as you don't migrate the states when switching. It might be good to just remove .terraform/terraform.tfstate when switching envs before running init again.
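
Roughly, with those caveats, each switch would look something like the following sketch (the file names follow the hypothetical backend.stg / backend.prd layout above, and the assume-role step is assumed to have happened first):

# point backend.tf at the production backend settings
ln -sfn backend.prd backend.tf

# discard the locally cached backend configuration so init doesn't try to migrate state
rm -f .terraform/terraform.tfstate

# re-initialize against the newly selected backend
terraform init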

I don't like the idea of having to recommend users touch the .terraform/ files, since how terraform uses them may change over time.

I mentioned it in an unrelated issue, but I was thinking about a flag like terraform init -reconfigure to allow a user to change the init configuration without checking the previously stored config.

bendavies commented May 4, 2017

I'm doing this as follows:

I have an AWS organisation with multiple accounts. I always authenticate with the AWS credentials of the root account. That account can AssumeRole into the sub-accounts to provision them.

So, given:

variable "account_ids" {
  description = "Terraform state environments mapped to the target account ID"
  type        = "map"

  default = {
    "dev"     = "111111111111"
    "staging" = "222222222222"
    "prod"    = "333333333333"
  }
}
provider "aws" {
  allowed_account_ids = ["${lookup(var.account_ids, terraform.env)}"]
  region              = "${var.region}"

  assume_role {
    role_arn = "arn:aws:iam::${lookup(var.account_ids, terraform.env)}:role/OrganizationAccountAccessRole"
  }
}
...

then

terraform env select dev
terraform plan

This will plan/apply into the correct AWS account by assuming the correct role, and will nicely fail if you have forgotten to run terraform env select or if the selected env does not exist in account_ids.

fewknow commented Dec 8, 2017

Since the release of workspaces for the new backend, has this been addressed? Please provide documentation on the correct way to use workspaces across separate AWS accounts, or the workaround, as I can't seem to find it anywhere. Thank you.

@purkhusid

Are there any plans to support this? An ideal solution would allow the use of multiple accounts and allow the state to be stored in different buckets depending on the account. That way you can keep your dev/stg/prd states in your dev/stg/prd accounts.

@davidgoate

Did anyone find a recommended way to do this with workspaces?

@apparentlymart (Contributor)

I don't think this works back as far as Terraform 0.9, but recent versions have direct support for assuming roles, so you can set role_arn in the backend configuration to choose which role you wish to use.
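
For illustration (the bucket, key, region, and role ARN below are placeholders, not values from this issue), that looks roughly like:

terraform {
  backend "s3" {
    bucket   = "example-terraform-state"   # central state bucket
    key      = "tfstate/terraform.tfstate"
    region   = "ap-northeast-1"
    encrypt  = true
    role_arn = "arn:aws:iam::111111111111:role/TerraformStateAccess"   # role assumed only for state access
  }
}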

The S3 backend docs have a guide on setting up cross-account access which shows one way to do this using AssumeRole for the AWS provider but a centralized S3 bucket. This is the setup that works best with Terraform's workspace workflow.

Workspaces are generally not the best way to separate different environments. They work better for creating temporary separate deployments for development/testing purposes. To fully isolate your environments, it's better to instead have a separate root configuration for each and use modules to define the common elements. The various separate root modules then give you a proper place to keep the various differences between environments, such as different backend configuration.
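
As a sketch of that layout (the directory names, module name, and backend settings here are made up for illustration), each environment gets its own small root configuration that calls a shared module:

# environments/prod/main.tf (hypothetical layout)
terraform {
  backend "s3" {
    bucket  = "prod-terraform-state"   # each root module keeps its own backend settings
    key     = "tfstate/prod.tfstate"
    region  = "ap-northeast-1"
    encrypt = true
  }
}

module "infrastructure" {
  source = "../../modules/infrastructure"   # shared module containing the common resources

  environment = "prod"
}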

@davidgoate

@apparentlymart Thank you for this. I find this concept very interesting and will go and read about modules.

The headline seems relevant in as much as it says:

Modules are used to create reusable components in Terraform as well as for basic code organization.

I presume that in my case the "reuse" part is what I'm after, e.g. it might not be best practice, but I could create a single module called "infrastructure" which I can "reuse" in each environment.

To make sure I understand correctly though: are modules more suitable than workspaces even when the infrastructure being managed is 100% identical? In my specific case it would seem that workspaces fit my needs, apart from being able to "tie" a workspace to a particular AWS account.

I touched on this here, and since you're a contributor I would love your input on that Stack Overflow question.

@apparentlymart (Contributor)

Hi @davidgoate! I wrote a more lengthy comment on a similar topic over in #18632 recently. I think that answers the questions you asked here.

gdavison (Contributor) commented Sep 8, 2023

As the terraform env command is no longer supported, I'm going to close this issue.

@gdavison gdavison closed this as not planned Sep 8, 2023

I'm going to lock this issue because it has been closed for 30 days ⏳. This helps our maintainers find and focus on the active issues.
If you have found a problem that seems similar to this, please open a new issue and complete the issue template so we can capture all the details necessary to investigate further.

@github-actions github-actions bot locked as resolved and limited conversation to collaborators Dec 10, 2023