Environment variables in .tf files #62

Closed
hstove opened this Issue Jul 28, 2014 · 46 comments

@hstove
hstove commented Jul 28, 2014

Hey! This is a very cool project.

I browsed all over the documentation and couldn't find a way to interpolate environment variables in a configuration file. Is this possible?

@bobbytables

@hstove I've been doing terraform apply -var "variable=${ENV_NAME}"

@mitchellh
Member

It isn't possible, but let's make it so. I really like the approach we took with Packer for environment variables: they can only be used as default values for traditional variable entries. I think we'll do the same thing here.

Syntax can be ${env.NAME}.

@hstove
hstove commented Jul 28, 2014

Syntax can be ${env.NAME}.

Wonderful! This was one of the first things I tried.

@deoxxa
Contributor
deoxxa commented Aug 9, 2014

+1 on this, it'd make configs more easily portable.

@sparkprime
Collaborator

Why do you want to look up environment variables? It seems to me that it would make it much harder to understand what a given tf script will do, or why it's failing, if it's internally using some environment variable that you don't know about. Better to make all the inputs explicit, I feel.

@deoxxa
Contributor
deoxxa commented Aug 10, 2014

Basically I just want to keep my AWS tokens out of my config.

@sparkprime
Collaborator

And you don't want to use a commandline parameter?

@sparkprime
Collaborator

I suppose you could also read from a file using the file interpolator.

@sparkprime
Collaborator

Actually I've just discovered terraform.tfvars (can override the filename with -var-file). Using that is probably the best approach at the moment.
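For illustration, a minimal terraform.tfvars might look like this (the variable names are hypothetical and must match `variable` blocks in the config; shown as shell commands so the example is self-contained):

```shell
# Hypothetical terraform.tfvars; keys must match "variable" blocks in the config.
cat > terraform.tfvars <<'EOF'
access_key = "REPLACE_ME"
secret_key = "REPLACE_ME"
EOF

# terraform reads terraform.tfvars automatically; -var-file selects another name:
# terraform apply -var-file=production.tfvars
```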

@sparkprime
Collaborator

Well, except for #153

@wtcross
wtcross commented Aug 21, 2014

I think a nice middle ground, nicely constrained, would be to allow environment variable interpolation only in a tfvars file. This would prevent the problem @sparkprime brought up: not knowing where environment variables are used. One thing that would become cleaner is working with a service like TravisCI; I would no longer need a layer of abstraction.

My team uses environment variables for anything secret. This works well in all scenarios and doesn't assume a tfvars file is present. So far we have been using bash scripts that just interpolate environment variables when executing the terraform command.

That approach isn't platform independent, though, and we needed it to be. Today I threw together a Ruby gem so that Windows users can work with our devops tools. I'd love to no longer need this gem :)

@wtcross
wtcross commented Aug 21, 2014

On a side note, this tool is awesome! I have put it into use on two projects so far. One of them is greatly simplified thanks to Terraform. Thank you for the great work!

@wtcross wtcross referenced this issue in bandwidthcom/genesis Aug 21, 2014
Open

[option] tfvars file #2

@sparkprime
Collaborator

Environment variables are better for security than commandline options, because in /proc the environment is not world-readable (the commandline is). On the other hand, environment variables are worse for understanding the behaviour of a configuration because they are very implicit. They come from a variety of sources (system profile, .bashrc, .xinitrc, sshd, ...). When they are used, they often have a default value provided, so a missing environment variable becomes a silent error. What happens when your ops person is in a coma and you have to run the script yourself, but it behaves slightly differently because your environment isn't quite the same? I've seen this happen. Commandline parameters are a lot more explicit in that regard.

wtcross's compromise does reduce the problem because you can always look in the tfvars file to see what environment vars are being used.

I think just putting secrets in files on disk has the best of both worlds. Their access permissions can be controlled, and their existence is explicit. There is no influence from the system or other software. Just mandate a specific filename / glob for these secrets and add it to .gitignore so that no one commits their secrets by accident.
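A sketch of that file-based approach (the filenames are illustrative, not a Terraform convention):

```shell
# Keep secrets in an untracked, owner-only file and load it explicitly.
cat > secrets.tfvars <<'EOF'
access_key = "REPLACE_ME"
EOF
chmod 600 secrets.tfvars   # readable only by the owner

# Make sure the file can never be committed by accident.
grep -qxF 'secrets.tfvars' .gitignore 2>/dev/null || echo 'secrets.tfvars' >> .gitignore

# terraform apply -var-file=secrets.tfvars
```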

@wtcross
wtcross commented Aug 21, 2014

@sparkprime

When they are used, they often have a default value provided, so a missing environment variable becomes a silent error

If I were able to interpolate environment variables into a tfvars file, I would expect terraform to yell at me for not having one set. I won't have default values for a target environment. That said, I realize it can happen.

I would argue that files are just as easy to cause problems with, and I have seen that happen as well. Also, now I have to ship files around and can't work as easily with TravisCI and other hosted build systems. I feel that environment variables work better for automation.

Thoughts?

@sparkprime
Collaborator

The situation indeed improves if you disallow environment variable defaults. Until someone opens a bug asking for reflection capabilities over the environment :)

Don't most people work with environment variables just by putting them in their profile?

I think shipping around files is no harder than telling people to go to a site and copy paste a key into their .bashrc.

I don't understand what the problem with TravisCI is? Are you running terraform inside Travis? Continuous integration tests of your entire production service? :)

@wtcross
wtcross commented Aug 21, 2014

We are using Terraform to create and tear down AWS instances for integration testing in our CI/CD pipeline. Basically we use Terraform to create aws_instance resources and then map the public_ip attribute of these resources to an Ansible inventory file. We also set up other things in AWS like Route53 (more in the future). From there we configure away.

@wtcross
wtcross commented Aug 21, 2014

Also, this is probably not a normal thing, but I store my environment variables in Pass. Here is what a snippet of my .zshrc looks like:

export AWS_ACCESS_KEY="$(pass aws/access-key)"
export AWS_SECRET_KEY="$(pass aws/secret-key)"
export HEROKU_API_KEY="$(pass dev/heroku-api-key)"
export GITHUB_TOKEN="$(pass dev/github/tokens/app-deployment)"
@sparkprime
Collaborator

I guess the fundamental point is that keys can't be typed interactively, so they must be stored somewhere. That means the configuration language needs to be parameterised on a per-user rather than per-project basis. Here are some storage options:

  • .bashrc (exported environment variables)
  • .bash_history (e.g. you press up in bash until you find your keys; works for cmdline opts and env vars)
  • a specific file in the project dir
  • a specific file in $HOME
  • some remote key store with a credential system that works for you

There are probably more options too. Whatever the option, you want the input to be:

  • secure
  • hermetic (i.e. the configuration doesn't depend on implicit things that trip you up)
  • easy to distribute to coworkers / set up on a new machine or whatever
  • easy to look up within the configuration
@wtcross
wtcross commented Aug 21, 2014

@sparkprime Agreed.

@sparkprime
Collaborator

An easy solution to the hermeticity problem is actually to put the set of allowed commandline variables on the commandline. E.g. terraform -env AWS_ACCESS_KEY -env AWS_SECRET_KEY ...

If you try and get at the environment var inside a config and it wasn't listed on the commandline, you get an error (you don't get empty string or null, as then you'd have a problem somewhere else and have to chase the problem back to its source).

That way, you don't have the actual value on the commandline, and it's clear where the data is coming from. I think this is practically equivalent to only allowing environment variables within a tfvars file, though (as long as tfvars files remain extremely limited in expressive power so that they are easy to read for people of all skill levels), at least if tfvars files remain equivalent to the commandline -var parameters.

@wtcross
wtcross commented Aug 21, 2014

Are you suggesting that if environment variables could only be interpolated in a tfvars file that you would still have to use the -env commandline argument?

@sparkprime
Collaborator

I suppose the cleanest design would be to allow two options:

  1. Access to env vars when specifying arguments to config parameters in the tfvars file (whatever syntax you want)
  2. Access to env vars when specifying arguments to config parameters on the commandline, but not via bash $FOO expansion as that is insecure. Instead, some new syntax that is expanded by terraform.

That way there is no practical difference between using tfvars files and using the commandline, and everything remains secure and predictable.

It would not be possible to access env vars in general .tf files.

@alekstorm
Contributor

I favor @sparkprime's solution; it's quite clever. I assume he meant the syntax to be more like terraform plan -env tf_var_name=ENV_VAR_NAME?

@sparkprime
Collaborator

Sounds good to me. After this discussion I actually went away and implemented something similar in Jsonnet:

-e means execute the code snippet (like sed)

The following yields an error: you can't just access any old environment var:
$ jsonnet -e 'std.env("AWS_ACCESS_KEY")'

The following explicitly forwards it in but will only work if the env var was in .profile:
$ jsonnet -E AWS_ACCESS_KEY -e 'std.env("AWS_ACCESS_KEY")'

The following will always print "foo":
$ AWS_ACCESS_KEY="foo" jsonnet -E AWS_ACCESS_KEY -e 'std.env("AWS_ACCESS_KEY")'

Importing files was always possible in Jsonnet so that covers the tfvars case.

@mitchellh mitchellh added the core label Oct 11, 2014
@delitescere

👍 soon?

@motdotla

Loving terraform. It's been great.

I'd personally be a fan of taking a .env implementation. [1]

Process:

  1. Drop a .env file in the same directory as the terraform.tf file.
  2. Those are automatically loaded into the environment.
  3. You can reference them with "${env.AWS_ACCESS_KEY}".

Make sure NOT to commit your .env file to source control.

[1] https://github.com/bkeepers/dotenv
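The proposed dotenv flow could be sketched like this (nothing here is built into Terraform; the file contents are illustrative, and the last step shows loading the variables into the shell rather than Terraform loading them itself):

```shell
# Sketch of the proposed dotenv-style workflow.
cat > .env <<'EOF'
AWS_ACCESS_KEY=REPLACE_ME
EOF
echo '.env' >> .gitignore   # never commit the secrets file

set -a     # auto-export everything sourced next
. ./.env
set +a
```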

@dhobbs
dhobbs commented Dec 10, 2014

+1 to doing it like packer.

Any solution that requires a secret key to be in a file is a disaster waiting to happen. Just look at all the AWS keys already on GitHub for evidence.

@clstokes
Member

👍 to the ${env.NAME} syntax for this.

@roll
roll commented Dec 16, 2014

👍 to ${env.NAME} in variable definitions

I was quite surprised to see terraform.tfvars after Packer's great workflow with tokens.

@pasviegas

+1 to doing it like packer.

@willejs
willejs commented Jan 9, 2015

👍 for packer like ${env.NAME}

@mzizzi
mzizzi commented Jan 23, 2015

👍 for ${env.NAME} support

@ryancharette

+1 for packer like ${env.NAME}

@joegoggins

+1 for ${env.NAME} ! :shipit: 😄

@mhamrah
mhamrah commented Jan 29, 2015

+1. Also, HCL is very similar to HOCON as used by the Typesafe Config library. It's not a standard, but there are probably some patterns you might find beneficial, like the hierarchy of variable substitutions across files, the command line, and the environment.

@mitchellh
Member

@mhamrah Thanks for pointing that out, I'm going to take a look!

@mbrevoort

+1

@sparkprime
Collaborator

I have done a write-up of a case study of using Jsonnet on top of Terraform (+ Packer and a few other things). In it I solve this credentials problem by importing a file from disk that is not checked into version control; this is discussed on page 2.

Jsonnet is a bit like HOCON by the way, but is more powerful and has a principled design.

@woodhull

+1

@kilhage
kilhage commented Mar 17, 2015

+1

@spesnova

+1

@woodhull

We ended up solving this by wrapping terraform in a Ruby script that stuffs env variables into -var command line arguments and provisions things that terraform does not support yet.


@dentarg
dentarg commented Mar 19, 2015

No-one here having the problem that secrets end up in terraform.tfstate? (#516) Example: the Heroku provider stores every ENV var from Heroku (heroku config) in terraform.tfstate. Could be a lot of secrets...

@mitchellh
Member

@dentarg There is another issue where people have definitely brought this up. We're still discussing it there.

@7heo
Contributor
7heo commented Mar 30, 2015

Since the syntax in terraform is type.name.attr, wouldn't it be more logical to use ${shell.env.NAME} (for $NAME)?

Also, is that already implemented? I've been trying to use it with 0.4.0-dev and it fails so far. Am I missing something?

@mitchellh
Member

@7heo We have precedent for 2-element things: ${path.module}, etc. I think that is much cleaner.

@ggiamarchi ggiamarchi added a commit to ggiamarchi/terraform that referenced this issue Apr 21, 2015
@ggiamarchi ggiamarchi Support environment variables in *.tf config
Fix #62
9ccf2c7
@mitchellh mitchellh closed this in #1621 Apr 22, 2015