
[Feature] Add Support for HCL Output #225

Open
straubt1 opened this issue Jul 21, 2020 · 20 comments
@straubt1 commented Jul 21, 2020

Community Note

  • Please vote on this issue by adding a 👍 reaction to the original issue to help the community and maintainers prioritize this request
  • Please do not leave "+1" or other comments that do not add relevant new information or questions, they generate extra noise for issue followers and do not help prioritize the request
  • If you are interested in working on this issue or have submitted a pull request, please leave a comment

Description

Currently the output is JSON, but it would be great if we could output HCL.
While we can run cd cdktf.out && terraform plan today, I could see real value and interesting solutions emerge if the output were HCL.

@skorfmann (Collaborator) commented Jul 21, 2020

That's interesting, do you have a specific use-case in mind?

@JayDoubleu commented Jul 21, 2020

Programmatically generating TF files and modules for future edits by users is one of the potential use cases.

@skorfmann (Collaborator) commented Jul 21, 2020

Programmatically generating TF files and modules for future edits by users is one of the potential use cases.

I was thinking about generating modules as a use case as well. Sort of like having a Construct package which would work with cdktf and have a compatible module output. For this, JSON should be good though.

When thinking about HCL for actual users to continue to work with, I'm wondering if that would make sense for this project. It's very focused on automatically generating JSON configurations which are not optimised for human consumption. It still could in theory be achieved by using something like this with some sprinkles of jq to clean up the JSON a bit.

@straubt1 (Author) commented Jul 21, 2020

I had similar thoughts to @JayDoubleu here.

json2hcl could be a potential workaround, but I really like the idea of native support.

@anubhavmishra (Member) commented Jul 21, 2020

Hey @straubt1, this is interesting! For now, users can use json2hcl to convert Terraform JSON configuration to HCL. You might run something like cdktf synth --json or cat cdktf.out/cdk.tf.json | json2hcl.

@JayDoubleu commented Jul 21, 2020

I'll even go a step further:

  • Ability to spit out pure HCL
  • Ability to fully programmatically run terraform plan, apply and other commands directly from python/typescript (*not just executing terraform binary)

@skorfmann (Collaborator) commented Jul 22, 2020

  • Ability to fully programmatically run terraform plan, apply and other commands directly from python/typescript (*not just executing terraform binary)

There's the idea of supporting a workflow which is fully backed by the Terraform Cloud, so that you wouldn't need the Terraform CLI locally at all. Technically, that's still a similar concept to what we have right now, though.

@JayDoubleu commented Jul 22, 2020

If I could run the local terraform cli via some sort of API, then IMHO it would be useful in CI like Azure DevOps etc. If I could print to the screen using the terraform cli's default interface but run the commands programmatically behind the scenes...

I like the terraform cli and HCL from a user perspective, it's just a bit of a pain to orchestrate, so I started looking into things like pulumi.

If I could programmatically

  • create actual terraform HCL
  • have an API to interact with terraform cli (not cloud)

This would open a wide range of possibilities, like templating HCL using typescript/python but with the intention for the end users to use HCL.

An API to interact with the terraform cli is IMO a much-needed feature for pipelines and for orchestrating entire environments using terraform together with external data coming from other programming languages.
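No such API exists in cdktf today; the closest approximation is shelling out to the Terraform binary and reading the plan back through terraform show -json. A minimal sketch of that idea, assuming the default cdktf.out output directory (the helper names here are illustrative, not a real cdktf API):

```python
import json
import subprocess


def terraform(subcommand, *flags, cwd="cdktf.out"):
    """Shell out to the Terraform CLI in the synthesized output directory."""
    return subprocess.run(
        ["terraform", subcommand, *flags],
        cwd=cwd, capture_output=True, text=True, check=True,
    )


def plan(cwd="cdktf.out"):
    """Write a plan file, then read it back as JSON via `terraform show -json`."""
    terraform("plan", "-out=tfplan", cwd=cwd)
    shown = terraform("show", "-json", "tfplan", cwd=cwd)
    return json.loads(shown.stdout)


def resource_changes(plan_json):
    """Pull the resource_changes list out of a rendered plan document."""
    return plan_json.get("resource_changes", [])
```

An apply could then be gated on the contents of resource_changes, which is still "just executing the terraform binary" rather than the first-class API being asked for here.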

@skorfmann (Collaborator) commented Jul 22, 2020

I like the terraform cli and HCL from a user perspective, it's just a bit of a pain to orchestrate

If you're just after Terraform orchestration, have you looked at Terragrunt? For cdktf, there will likely be support for multiple stacks as well (see #35).

If I could programmatically

  • create actual terraform HCL
  • have an API to interact with terraform cli (not cloud)

This would open a wide range of possibilities, like templating HCL using typescript/python but with the intention for the end users to use HCL.

This sounds like a one-off code generator, since I wouldn't know how you'd reconcile changes made by the user with the generated configuration. Are you after a generator similar to create-react-app, but for Terraform HCL projects?

@JayDoubleu commented Jul 22, 2020

The ability to produce HCL has many use cases; human-readable template generation is just one of them. And yes, it would look similar to the react app generator you mentioned, but I'm hoping cdktf is already the right tool for that. What would make it even better is the ability to spit out HCL. ;)

Regarding the api itself and terragrunt: I'm aware of terragrunt, however it's just a wrapper around terraform which could be implemented in any language.

What I'm after is something similar to below:

app = App()
MyStack(app, "automate-everything")
synth = app.synth()
plan = app.plan(synth)

if plan.changes is not None:
    if len(plan.changes.module.custom.fruits) > 4:
       app.apply(synth, target='module.custom.fruits')

@skorfmann (Collaborator) commented Jul 23, 2020

Regarding the api itself and terragrunt: I'm aware of terragrunt, however it's just a wrapper around terraform which could be implemented in any language.

What I'm after is something similar to below:

app = App()
MyStack(app, "automate-everything")
synth = app.synth()
plan = app.plan(synth)

if plan.changes is not None:
    if len(plan.changes.module.custom.fruits) > 4:
       app.apply(synth, target='module.custom.fruits')

That's pretty interesting! Could you create a separate issue for this?

The ability to produce HCL has many use cases; human-readable template generation is just one of them. And yes, it would look similar to the react app generator you mentioned, but I'm hoping cdktf is already the right tool for that. What would make it even better is the ability to spit out HCL. ;)

Cool, thanks - I think I understood the use-case you were mentioning. The one-off generator use-case could also be a dedicated issue that references this one.

@marcoferrer (Contributor) commented Jan 19, 2021

@skorfmann Just wanted to chime in on this discussion. Supporting HCL as a target output format brings the benefits of allowing sources generated by this project to be better supported by existing tools from the terraform ecosystem.

For example, these are a few of the limitations we've experienced when trying to adopt this tool.

  • IntelliJ HCL plugin does not support resolving definitions from tf.json sources. This normally wouldn't be a problem if the entire module was defined in json. In our case, we actually plan to distribute common cdk modules to engineers to help them conform to an archetype / interface. They then implement their application-specific tf definitions in HCL within the same directory.
  • The SCA tool checkov also seems to lack support for scanning tf.json sources

Currently, we have a convoluted workaround where we strip out the // metadata keys from the cdk output, run it through json2hcl, and then finally run terraform 0.12upgrade, since the syntax generated by json2hcl is dated and produces errors in our IDE. Although it works, we don't feel like it's a sustainable option long term. Native support for HCL would be preferable.
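For readers wanting to reproduce the first step of that workaround, stripping the metadata keys is a small recursive transform; a minimal sketch in Python (the file path is illustrative):

```python
import json


def strip_metadata(node):
    """Recursively drop cdktf's "//" metadata keys from a tf.json document,
    leaving every other key, list element, and scalar untouched."""
    if isinstance(node, dict):
        return {k: strip_metadata(v) for k, v in node.items() if k != "//"}
    if isinstance(node, list):
        return [strip_metadata(v) for v in node]
    return node


# Usage sketch: clean the synthesized JSON before piping it into json2hcl.
# with open("cdktf.out/cdk.tf.json") as f:
#     cleaned = strip_metadata(json.load(f))
```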

@skorfmann (Collaborator) commented Jan 19, 2021

@marcoferrer thanks for your input! Integration into the broader ecosystem of tooling for Terraform HCL is a very good point indeed. So far I was assuming that the JSON representation of HCL would just work in most cases, though I haven't looked at it in detail.

IntelliJ HCL plugin does not support resolving definitions from tf.json sources. This normally wouldn't be a problem if the entire module was defined in json.

What you're saying is that either pure JSON or pure HCL is supported by the extension, but a mix of both is not?

In our case, we actually plan to distribute common cdk modules to engineers to help them conform to an archetype / interface. They then implement their application-specific tf definitions in HCL within the same directory.

I'm very interested in more details about your intended workflow. By cdk module you mean an actual Terraform module containing the JSON output of cdktf?

The SCA tool checkov also seems to lack support for scanning tf.json sources

Haven't tested it, but judging from the code it should be supported.

Currently, we have a convoluted workaround where we strip out the // metadata keys from the cdk output, run it through json2hcl

json2hcl doesn't deal with // nicely, or what's the reason to strip it out?

and then finally run terraform 0.12upgrade since the syntax generated by json2hcl is dated and produces errors in our IDE. Although it works we don't feel like its a sustainable option long term.

That's essentially your release process for the packages / modules you're planning to distribute?

@marcoferrer (Contributor) commented Jan 19, 2021

What you're saying is that either pure JSON or pure HCL is supported by the extension, but a mix of both is not?

It's actually a little worse than that. Using tf.json sources is completely unsupported by the IDE plugin. Of course, this has no effect on the validity of the sources or the ability to use them with the terraform cli, but it does negatively impact the module development experience.

I'm very interested in more details about your intended workflow. By cdk module you mean an actual Terraform module containing the JSON output of cdktf?

We've defined standards within our organization for terraform modules for specific types of applications. These standards define, at a minimum, which providers / aliases are expected, as well as a predefined set of variables and outputs. This forms a pseudo-interface we expect each module to conform to. Ensuring modules conform to this interface allows us to build org-specific automation and policy governance around our infrastructure lifecycle.

Since terraform isn't exactly friendly to reuse of variables, outputs, and providers, we were planning on defining cdk modules in Java and publishing them as a jar into an internal artifact repository. Then, using maven/gradle, engineers are able to pull semantically versioned modules and instantiate them from within their local cdk module sources. This prevents the need to continually copy/paste specific files throughout all the projects.

I can supply an example project to further demonstrate if you'd like.

Haven't tested it, but judging from the code it should be supported.

That's odd, I tested it locally with one of our projects and ran into issues. I'll give it another shot to see if there was something I missed.

json2hcl doesn't deal with // nicely or what's the reason to strip it out?

Including the // keys when running json2hcl results in invalid HCL definitions. In most cases the // key gets converted into an HCL keyword. Since json2hcl is no longer being actively developed, its output is in HCL1, if I recall correctly. Because of this, the IDE ends up displaying various errors, and once again autocomplete is rendered useless in other, non-generated sources.

It seems that some tools aren't happy with the // keys in the output and complain about them being invalid definitions. If this is truly part of the tf.json spec, then I would assume it's on the individual tool maintainers to fix any issues. If not, there is the option of potentially adding a feature flag to cdktf to omit these keys / metadata from the output.

That's essentially your release process for the packages / modules you're planning to distribute?

In a nutshell, yes. It's purely to maintain the developer experience and allow the IDE to resolve definitions created by cdktf from other files being authored by engineers. The terraform cli also produces errors when trying to read HCL that includes the converted // keys.

@jsteinich (Collaborator) commented Jan 19, 2021

@marcoferrer out of curiosity, do you plan to have other teams eventually transition to writing their configuration using Java rather than hcl?

@marcoferrer (Contributor) commented Jan 19, 2021

@jsteinich Currently, that seems to be the long-term goal, but we do want to be able to support cases where we might have to supplement a module with manually written HCL. The move to Java is to allow our engineers to leverage their existing experience and dependency management tools, so that they are able to focus purely on terraform concepts without too much overhead from learning HCL specifics or from code reuse via copy/paste.

@skorfmann (Collaborator) commented Jan 19, 2021

I can supply an example project to further demonstrate if you'd like.

Yes, an example would be great!

Including the // keys when running json2hcl results in invalid HCL definitions. In most cases the // key gets converted into an HCL keyword. Since json2hcl is no longer being actively developed, its output is in HCL1, if I recall correctly. Because of this, the IDE ends up displaying various errors, and once again autocomplete is rendered useless in other, non-generated sources.

That's certainly not ideal. Perhaps it would make sense to fork and fix json2hcl for now, though I don't know the effort required to do so. I'll check if there's an easy option to do this.

It seems that some tools aren't happy with the // keys in the output and complain about them being invalid definitions. If this is truly part of the tf.json spec, then I would assume it's on the individual tool maintainers to fix any issues. If not, there is the option of potentially adding a feature flag to cdktf to omit these keys / metadata from the output.

The purpose of // is just metadata at the moment. I could imagine adding an option to not render the metadata at all, which would potentially lead to a degraded experience around debugging and error handling once we start to implement those topics (right now there wouldn't be a difference). It should be noted, though, that the // keys are part of the JSON syntax of Terraform. In the long run, fixing the tools to deal with // would certainly be better :)
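For readers following along, the metadata in question looks roughly like this in a synthesized cdk.tf.json; an illustrative fragment only, as the exact metadata fields vary by cdktf version:

```json
{
  "//": {
    "metadata": {
      "stackName": "mystack"
    }
  },
  "resource": {
    "aws_s3_bucket": {
      "example": {
        "//": {
          "metadata": {
            "path": "mystack/example"
          }
        },
        "bucket": "my-bucket"
      }
    }
  }
}
```

Terraform's JSON syntax treats a "//" property as a comment and ignores it, which is why the configuration itself stays valid even though some third-party tools choke on it.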

@marcoferrer (Contributor) commented Jan 20, 2021

Here's a distilled example of using cdk in a Java project with gradle. It also demonstrates sharing common configuration between multiple modules, as well as mixing HCL and JSON sources. #510

That's certainly not ideal. Perhaps it would make sense to fork and fix json2hcl for now, but don't know the effort required to do so. I'll check if there's an easy option to do this.

After looking at the implementation of json2hcl, it looks like it relies on the first-party HCL libraries published by HashiCorp. It seems to be a simple, light wrapper. I can try forking the project and bumping the dependencies to see if that solves the issues I'm seeing, since it's still using the HCL lib from 2016.

The purpose of // is just metadata at the moment. I could imagine adding an option to not render the metadata at all, which would potentially lead to a degraded experience around debugging and error handling once we start to implement those topics (right now there wouldn't be a difference). It should be noted, though, that the // keys are part of the JSON syntax of Terraform. In the long run, fixing the tools to deal with // would certainly be better :)

This is good to know, and I agree. As long as the JSON syntax spec for Terraform is being adhered to, any issues with // should be handled by the tool developers.

@Satak commented Mar 20, 2021

Programmatically generating TF files and modules for future edits by users is one of the potential use cases.

Yes, this is exactly what we would do if this were supported. For example, ServiceNow Cloud Management only supports HCL files, not JSON. Native HCL output would be expected from this tool.

@mark-e-kibbe commented Jul 15, 2021

For those of us that have implemented automation with the Terraform APIs for Cloud & Enterprise, would this issue be relevant to parsing a Java object out to HCL format?

For example, by taking a Map<String, someJavaModel> and doing an hclParse to obtain the string to utilize in the Terraform APIs?

Example Use Case:
A complex type is required to run some Terraform. This type is a map of objects, enabling the Terraform to use that collection of objects in a for_each.

Current Scenario:
Manually serializing Terraform complex types to an HCL string on a case-by-case basis, versus having a general way to produce the HCL string.

I came across this issue in researching better ways to handle this use case currently.
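A hand-rolled serializer along those lines might look like the following sketch (in Python rather than Java, to stay consistent with the earlier snippet in this thread; to_hcl is a hypothetical helper, and a real implementation should lean on an HCL library rather than string assembly):

```python
def to_hcl(value, indent=0):
    """Render a Python value as an HCL expression.

    Minimal sketch only: handles maps, lists, strings, numbers, and
    booleans, assumes map keys are valid HCL identifiers, and performs
    no string escaping.
    """
    pad = "  " * indent
    if isinstance(value, dict):
        body = "\n".join(
            f"{pad}  {key} = {to_hcl(val, indent + 1)}"
            for key, val in value.items()
        )
        return "{\n" + body + "\n" + pad + "}"
    if isinstance(value, list):
        return "[" + ", ".join(to_hcl(item, indent) for item in value) + "]"
    if isinstance(value, bool):  # checked before str/number fallbacks
        return "true" if value else "false"
    if isinstance(value, str):
        return f'"{value}"'
    return str(value)
```

The resulting string could then be passed as a variable value through the Terraform Cloud / Enterprise APIs, which is the use case described above.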
