
Terraform crash with "runtime error: invalid memory address or nil pointer dereferencegoroutine 3126" when removing count attribute #30771

Closed
tomaswalander opened this issue Mar 30, 2022 · 3 comments
Labels: bug, new (new issue not yet triaged), waiting-response (An issue/pull request is waiting for a response from the community)

Comments

@tomaswalander

First-time issue opener here... Please let me know if this isn't what you expect.

Terraform Version

```terraform
Terraform v1.1.0
on darwin_amd64
+ provider registry.terraform.io/hashicorp/azuread v2.15.0
+ provider registry.terraform.io/hashicorp/azurerm v2.89.0
+ provider registry.terraform.io/hashicorp/random v3.1.2
```

Terraform Configuration Files

I was working on a "monitoring" module (to be used in many of our projects - e.g., setting up diagnostic settings to forward application insights log data to an event hub, in turn consumed by an Azure Function that forwards all logs to our DataDog - but probably not relevant for this error.).

While it was work in progress I wanted it to only run in our test environment so I added the count attribute to "target" it only to our test environment.Once ready I removed the "count" to put it in all the environments. Re-adding count solved the issue.

module "monitoring" {
  source      = "./monitoring"
  location    = var.location
  environment = var.environment

  application_insights_instrumentation_id = azurerm_application_insights.common_appi.id

  count = var.environment == "test" ? 1 : 0
}
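For reference, one way to drop `count` from a module that has already been applied, without leaving the existing `module.monitoring[0]` state orphaned, is to record the address change with a `moved` block (Terraform v1.1 or later). A minimal sketch, assuming the module address from the configuration above; this is not taken from the original report:

```terraform
# Sketch only: records that the single instance created under count now lives
# at the plain module address, so Terraform plans a move instead of a
# destroy/create of everything inside the module.
moved {
  from = module.monitoring[0]
  to   = module.monitoring
}
```

The one-off CLI equivalent is `terraform state mv 'module.monitoring[0]' 'module.monitoring'`.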

Debug Output

```text
runtime error: invalid memory address or nil pointer dereference
goroutine 3126 [running]:
runtime/debug.Stack()
	/usr/local/go/src/runtime/debug/stack.go:24 +0x65
runtime/debug.PrintStack()
	/usr/local/go/src/runtime/debug/stack.go:16 +0x19
github.com/hashicorp/terraform/internal/logging.PanicHandler()
	/home/circleci/project/project/internal/logging/panic.go:44 +0xb5
panic({0x221d240, 0x3ed5b10})
	/usr/local/go/src/runtime/panic.go:1038 +0x215
github.com/hashicorp/terraform/internal/instances.(*expanderModule).resourceInstances(0x0, {0xc000931120, 0x1, 0x1}, {{}, 0x4d, {0xc000906ec0, 0x10}, {0xc000906ed0, 0x4}}, ...)
	/home/circleci/project/project/internal/instances/expander.go:376 +0xc3
github.com/hashicorp/terraform/internal/instances.(*expanderModule).resourceInstances(0xc001e26b70, {0xc000931100, 0x2, 0x2}, {{}, 0x4d, {0xc000906ec0, 0x10}, {0xc000906ed0, 0x4}}, ...)
	/home/circleci/project/project/internal/instances/expander.go:385 +0x293
github.com/hashicorp/terraform/internal/instances.(*Expander).ExpandResource(0xc0000989e0, {{}, {0xc000931100, 0x2, 0x2}, {{}, 0x4d, {0xc000906ec0, 0x10}, {0xc000906ed0, ...}}})
	/home/circleci/project/project/internal/instances/expander.go:157 +0x16f
github.com/hashicorp/terraform/internal/terraform.(*NodePlannableResourceInstanceOrphan).deleteActionReason(0xc002261dd0, {0x2aecc00, 0xc0022c49a0})
	/home/circleci/project/project/internal/terraform/node_resource_plan_orphan.go:204 +0x23b
github.com/hashicorp/terraform/internal/terraform.(*NodePlannableResourceInstanceOrphan).managedResourceExecute(0xc002261dd0, {0x2aecc00, 0xc0022c49a0})
	/home/circleci/project/project/internal/terraform/node_resource_plan_orphan.go:140 +0x58c
github.com/hashicorp/terraform/internal/terraform.(*NodePlannableResourceInstanceOrphan).Execute(0x0, {0x2aecc00, 0xc0022c49a0}, 0x20)
	/home/circleci/project/project/internal/terraform/node_resource_plan_orphan.go:49 +0x90
github.com/hashicorp/terraform/internal/terraform.(*ContextGraphWalker).Execute(0xc000ab37a0, {0x2aecc00, 0xc0022c49a0}, {0x7f321d21e370, 0xc002261dd0})
	/home/circleci/project/project/internal/terraform/graph_walk_context.go:133 +0xc2
github.com/hashicorp/terraform/internal/terraform.(*Graph).walk.func1({0x2553ec0, 0xc002261dd0})
	/home/circleci/project/project/internal/terraform/graph.go:74 +0x2f0
github.com/hashicorp/terraform/internal/dag.(*Walker).walkVertex(0xc00136d500, {0x2553ec0, 0xc002261dd0}, 0xc0015ef400)
	/home/circleci/project/project/internal/dag/walk.go:381 +0x2f1
created by github.com/hashicorp/terraform/internal/dag.(*Walker).Update
	/home/circleci/project/project/internal/dag/walk.go:304 +0xf85
##[error]Error: Terraform Plan failed with exit code: 11
```

Expected Behavior

It would be awesome if Terraform could handle such a change - but I'm guessing that's more of a feature request.

Minimum expected behaviour: a clear error message instead of a crash.


Actual Behavior

The plan for our test environment failed (see above) but the plan for other environments succeeded. The other environments use the same config but different tfvars (file + command-line args).


Steps to Reproduce

In short, this is what happened (though spread over a longer period, with intermittent plans and applies in between); a minimal configuration sketch follows the steps below:

  1. Create a module with a count attribute that depends on a "setting" limiting it to 1 or 0 instances
  2. Plan and apply it
  3. Remove the count attribute

Plan now fails for the "setting" that was applied in step 2, while any new "setting" works.

  4. Re-add count

Plan now works again.
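A minimal sketch of the sequence above; the module path `./example` and the `enabled` variable are placeholders and not taken from the original report:

```terraform
# Hypothetical minimal reproduction; "./example" and var.enabled are
# placeholders, not from the original report.
variable "enabled" {
  type    = bool
  default = true
}

module "example" {
  source = "./example"
  count  = var.enabled ? 1 : 0 # steps 1-2: plan and apply with count present
}

# Step 3: delete the count argument above and run a plan against the state
# that still records module.example[0]; step 4: restore it and plan again.
```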


Additional Context

  • Running in Azure DevOps Pipelines using this task: TerraformTaskV2@2

References

At first I commented on #30760 because I was unsure whether it was related.

tomaswalander added the bug and new (new issue not yet triaged) labels on Mar 30, 2022
@jbardin (Member) commented Mar 30, 2022

Hi @tomaswalander,

Thanks for filing the issue. Can you reproduce this in the current release of Terraform v1.1.7? There was a very similar crash fixed in v1.1.1, and it would be good to rule that out before investigating further.

Thanks

jbardin added the waiting-response (An issue/pull request is waiting for a response from the community) label on Mar 30, 2022
@jbardin (Member) commented Jun 14, 2022

Closing as we have not heard back in a while, and this was likely fixed in a subsequent patch release.

jbardin closed this as completed on Jun 14, 2022
@github-actions

I'm going to lock this issue because it has been closed for 30 days ⏳. This helps our maintainers find and focus on the active issues.
If you have found a problem that seems similar to this, please open a new issue and complete the issue template so we can capture all the details necessary to investigate further.

github-actions (bot) locked as resolved and limited conversation to collaborators on Jul 15, 2022