Are OTF (env) vars forwarded to OTF agents? #590
Could it be that there is some dedicated action required to forward the `terraform` vars defined in OTF to the agents? From my current experience it seems that no variables are present at all. Probably missing something obvious, since otherwise many people would have reported this way earlier 🤔
I've just tried with a single workspace variable and the variable is coming through okay. (Having said that, I recently refactored variable handling in order to introduce variable sets, so I wouldn't be surprised if there are some regressions.) You'll have to give me some information so I can debug:
Will do! I've just updated both the server and agent to 0.1.10 and ran into another issue:

This happens if either the server or the agent is refreshed and the respective other stays untouched; refreshing the other one as well resolves it. FYI:
I've done the config changes and hope the following helps somehow:
server
agent
In OTF I've tried both env vars and TF vars. In TF the var is then declared as follows:

```hcl
variable "<VAR>" {
  sensitive = true
}
```

and is then further used as shown below. The workspace is using the "cloud" backend. The TF code part which consumes the var looks as follows:

```hcl
provider "vault" {
  address = "<address>"

  auth_login {
    path = "<path>"

    parameters = {
      password = var.<VAR>
    }
  }
}
```

The agents run on k8s using https://github.com/pat-s/otf-agent-helm/. It might also be that there are additional config issues in the agent setup, but the "only" important piece should be the actual connection between server and agent - which exists - via the

The final error then looks as follows:
Ah, sensitive variable values are "scrubbed" before being sent over the wire. I'll need to change this accordingly.
🤖 I have created a release *beep* *boop*

---

## [0.1.11](v0.1.10...v0.1.11) (2023-09-11)

### Features

* update vcs provider token ([#594](#594)) ([29a0be6](29a0be6)), closes [#576](#576)

### Bug Fixes

* dont scrub sensitive variable values for agent ([#591](#591)) ([a333ee6](a333ee6)), closes [#590](#590)
* **integration:** prevent -32000 error ([39318f1](39318f1))
* **integration:** wait for alpinejs to load ([346024e](346024e))
* resubscribe subsystems when their subscription is terminated ([#593](#593)) ([3195e17](3195e17))

---

This PR was generated with [Release Please](https://github.com/googleapis/release-please). See [documentation](https://github.com/googleapis/release-please#release-please).

Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
Thanks for checking! I've just upgraded both server and agent to 0.1.12 and unfortunately I still face the same issues, both for normal and sensitive vars. If there's any way I can help, let me know!
I suggest trying with the simplest configuration that demonstrates the problem, e.g.:
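A minimal sketch along these lines (assumed, not the author's exact snippet; the name `test_var` is illustrative):

```hcl
# Minimal workspace to check whether an OTF workspace variable reaches the agent.
variable "test_var" {
  type = string
}

# Echo the value back so it is visible in the plan/apply output.
output "test_var_value" {
  value = var.test_var
}
```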
And then set a workspace variable for it and check whether the value comes through.
Thanks. I tried that, but this only helps for non-sensitive vars, as the value will never be shown in the output for sensitive ones. Normal vars work just fine. But even after recreating the sensitive vars, these don't seem to arrive within the build:

When trying with non-sensitive vars and some dummy entries, I get a different error stating "invalid password", as expected. So the claim of "missing password" might really be an empty variable definition. How did you check that sensitive vars make it through in #591? Is there a chance they still don't make it through when being sent to remote agents? (v0.1.12)
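One way to distinguish an empty value from a truly missing one without printing the secret is a variable validation that fails when the value arrives empty. This is a sketch under the assumption that a scrubbed value shows up as an empty string; the name `vault_password` is illustrative:

```hcl
variable "vault_password" {
  type      = string
  sensitive = true

  # Fails the run with a clear message if the value reaches the agent empty,
  # instead of surfacing later as a provider-side "missing/invalid password".
  validation {
    condition     = length(var.vault_password) > 0
    error_message = "Variable vault_password arrived empty on the agent."
  }
}
```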
Works now! I had an issue updating my fork and merging in the respective commits that resolved the issue. Totally my fault! 🙏 Thanks for the fix, and sorry for the noise...
When executing a workspace run through either "Remote" or "Agent", I get an error for the latter stating that a specific `terraform` variable could not be found/is missing. Rerunning the very same state with "Remote" works just fine.

I am wondering: is there maybe an issue which prevents the `terraform` var (defined in OTF) from being honored when using the "agent" option? At first glance it seems like it's not being exported to the runner. I also don't have an idea how to debug this properly. Any pointer is welcome :)
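For illustration of the symptom (an assumption, not taken from the report; the name `my_secret` is illustrative): a variable declared without a default must receive its value from the workspace or the environment, otherwise the run aborts before planning.

```hcl
# If neither OTF nor a TF_VAR_my_secret environment variable supplies a value
# on the runner, "terraform plan" fails with "Error: No value for required
# variable", matching the "variable is missing" behaviour seen on agents.
variable "my_secret" {
  type      = string
  sensitive = true
}
```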