
panic: Error reading level state: strconv.ParseInt: parsing "1591104108820": value out of range (32 bit architecture) #6518

thecodeassassin opened this issue Jun 3, 2020

thecodeassassin commented Jun 3, 2020

Note: The original comment is preserved at #6518 (comment) (and in the edit history); it has been replaced by a maintainer (@rileykarson).

The Terraform binary is available for multiple architectures, including 32-bit systems, as can be seen on https://developer.hashicorp.com/terraform/install. The Terraform Plugin SDK, the framework used to develop Terraform providers, internally uses the runtime architecture-specific int type rather than integers of a specific size, and providers can only declare that a field is an integer via the schema.TypeInt type.

This was not taken into account when developing the Google Cloud provider, and many integer fields within the provider accept 64-bit values. In particular, millisecond timestamps like creation_time and last_modified_time are int64 values represented as schema.TypeInt. While it would be possible to correct these fields by converting them to schema.TypeString, and many of them could be identified with heuristics like specific field names, this would be a breaking change that could only land in a major release and would not be guaranteed to be exhaustive.
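For illustration, here is a minimal, self-contained Go sketch of the failure mode (my own example, not provider code): asking strconv.ParseInt to fit a millisecond timestamp into 32 bits fails in exactly the way the panic message shows, while 64 bits works.

package main

import (
	"fmt"
	"strconv"
)

func main() {
	// A millisecond timestamp such as BigQuery's creation_time.
	v := "1591104108820"

	// On a 32-bit build, Go's int is 32 bits wide (strconv.IntSize == 32),
	// and parsing into it fails: 1591104108820 > math.MaxInt32.
	if _, err := strconv.ParseInt(v, 10, 32); err != nil {
		fmt.Println(err) // strconv.ParseInt: parsing "1591104108820": value out of range
	}

	// The same value parses fine as a 64-bit integer.
	n, _ := strconv.ParseInt(v, 10, 64)
	fmt.Println(n) // 1591104108820
}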

For practical purposes, this means that the Google Cloud provider must be used on 64-bit systems like AMD64 and ARM64. This may change in the future; for example, the Terraform Plugin Framework supports int64 internally, which would obviate this issue.
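As a rough sketch of why the framework helps (illustrative only; the resource type and field name here are hypothetical), the Plugin Framework lets a provider declare an attribute as explicitly 64-bit, independent of the platform's int width:

package example

import (
	"context"

	"github.com/hashicorp/terraform-plugin-framework/resource"
	"github.com/hashicorp/terraform-plugin-framework/resource/schema"
)

// exampleResource is a hypothetical Plugin Framework resource.
type exampleResource struct{}

// Int64Attribute is backed by int64 on every architecture, unlike the
// SDK's schema.TypeInt, which is backed by the platform-sized int.
func (r *exampleResource) Schema(ctx context.Context, req resource.SchemaRequest, resp *resource.SchemaResponse) {
	resp.Schema = schema.Schema{
		Attributes: map[string]schema.Attribute{
			"creation_time": schema.Int64Attribute{
				Computed: true,
			},
		},
	}
}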

This shouldn't be an issue for most (and possibly all) users, who are using Terraform with the Google Cloud provider on 64-bit systems. However, issues of this type come up fairly commonly when users have accidentally installed the 32-bit version of Terraform, which runs on 64-bit systems as well as 32-bit ones. In these cases, you can install the 64-bit version to resolve the issue.

We believe this happens most commonly when folks download binaries off the website, as 32-bit versions are listed first, such as for Linux:

[Screenshot: the Linux downloads list on the Terraform install page, with the 386 build listed before AMD64.]

However, other reports have involved package managers.

Debugging

✔️ To identify if this is the case, run terraform -v and check the architecture of your Terraform version. For most users, you'll want a version string like the following, with an amd64 or arm64 suffix:

Terraform v1.5.4
on darwin_arm64
Terraform v1.5.2
on linux_amd64
Terraform v1.0.2
on windows_amd64

⚠️ On the other hand, you'll want to replace your Terraform binary and reinitialize your providers if you see a version string like the following:

Terraform v1.6.6
on linux_386
Terraform v1.7.2
on windows_386

References

megan07 (Contributor) commented Jun 4, 2020

Hi @thecodeassassin! I'm sorry this stopped working. It looks to be an issue with the SDK, so I have opened an issue there: hashicorp/terraform-plugin-sdk#469

megan07 (Contributor) commented Mar 16, 2022

Hi @thecodeassassin, I wanted to check whether this is still an issue. Thanks!

rileykarson removed their assignment Sep 22, 2022
modular-magician added a commit to modular-magician/terraform-provider-google that referenced this issue Sep 30, 2022
modular-magician added a commit that referenced this issue Sep 30, 2022
radureau commented Nov 5, 2022

Hello, I have the same issue.

│ Error: Plugin did not respond
│
│   with module.main.google_bigquery_dataset.ds_tmp,
│   on ..\bigquery.tf line 95, in resource "google_bigquery_dataset" "ds_tmp":
│   95: resource "google_bigquery_dataset" "ds_tmp" {
│
│ The plugin encountered an error, and failed to respond to the plugin.(*GRPCProvider).PlanResourceChange call. The plugin logs may contain more details.
╵

Stack trace from the terraform-provider-google_v4.25.0_x5.exe plugin:

panic: Error reading level state: strconv.ParseInt: parsing "1655974603933": value out of range

It only happens on Windows.

terraform.exe -chdir=iac/main/rec state show module.main.google_bigquery_dataset.ds_tmp
# module.main.google_bigquery_dataset.ds_tmp:
resource "google_bigquery_dataset" "ds_tmp" {
    creation_time                   = 1655974603933
    dataset_id                      = "DS_TMP"

I think the relevant lines of code are:

func flattenBigQueryDatasetCreationTime(v interface{}, d *schema.ResourceData, config *Config) interface{} {

func stringToFixed64(v string) (int64, error) {

I wonder if strconv.ParseInt(v, 10, 64) is problematic when the provider is compiled for 32-bit Windows. (I have a 64-bit system.)
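For what it's worth, the bitSize of 64 in the provider's own parse isn't the problem; the failure happens when the SDK reads the stored value back as a schema.TypeInt. A simplified sketch of that path (my approximation, not the actual SDK code):

package main

import (
	"fmt"
	"strconv"
)

// Rough approximation of how the SDK turns a state string back into a
// schema.TypeInt value: bitSize 0 means "must fit in Go's int", which
// is only 32 bits wide in a 386 build.
func readStateInt(s string) (int, error) {
	v, err := strconv.ParseInt(s, 0, 0)
	if err != nil {
		return 0, err
	}
	return int(v), nil
}

func main() {
	if _, err := readStateInt("1655974603933"); err != nil {
		// On a 32-bit build this prints:
		// strconv.ParseInt: parsing "1655974603933": value out of range
		fmt.Println(err)
	}
}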

melinath (Collaborator) commented

@edwardmedia @megan07 do we still believe this is upstream-terraform?

slevenick (Collaborator) commented

I created hashicorp/terraform-plugin-sdk#1236 because the original SDK issue was closed

Boardtale commented Jan 2, 2024

This looks like a blocker; is there any workaround?
I have 64-bit Windows and I just can't work with Terraform.

I upgraded the GCP provider and Terraform with no success.
There's no option to choose how to run Terraform (64-bit vs. 32-bit), so why is it trying to run as 32-bit by default?

The problematic entry is:
"module": "module.gcp_bigquery",
"mode": "managed",
"type": "google_bigquery_dataset",
where we have
"creation_time": 1682188988890,

Actually, there are many more such ints, also in google_bigquery_table, and certificate_id in google_compute_managed_ssl_certificate.

rileykarson (Collaborator) commented

You should verify that your Terraform binary was not unintentionally installed as 32-bit; that's the most common scenario we see causing this issue. Notably, i386 appears first on https://developer.hashicorp.com/terraform/install even though effectively all users outside of niche scenarios should be using AMD64.

Boardtale commented Jan 13, 2024

@rileykarson you were right ;) I wonder why this happens, though. Maybe people (me included) associate AMD64 with AMD CPUs and the "i" with Intel, without realizing that's not what it means; I know it's not that, but sometimes I take mental shortcuts ;p
Or it's that i386 comes first on the left when, as you said, it should be niche.
Some UX tweaks could prevent that mistake in the future ;)

Anyway, thanks, that helped me! :)

slevenick (Collaborator) commented

Ah, this is the canonical issue, reopening

slevenick reopened this Jan 16, 2024
rileykarson changed the title from "Value out of range when trying to run plan" to "panic: Error reading level state: strconv.ParseInt: parsing "1591104108820": value out of range (32 bit architecture)" Mar 5, 2024
rileykarson (Collaborator) commented Mar 5, 2024

I'm gonna edit the parent here to cover the general case. Here's the original parent comment:


Community Note

  • Please vote on this issue by adding a 👍 reaction to the original issue to help the community and maintainers prioritize this request.
  • Please do not leave +1 or me too comments, they generate extra noise for issue followers and do not help prioritize the request.
  • If you are interested in working on this issue or have submitted a pull request, please leave a comment.
  • If an issue is assigned to the modular-magician user, it is either in the process of being autogenerated, or is planned to be autogenerated soon. If an issue is assigned to a user, that user is claiming responsibility for the issue. If an issue is assigned to hashibot, a community member has claimed the issue already.


Terraform Version

0.12.25

Affected Resource(s)

google_bigquery_dataset

Terraform Configuration Files

provider "google" {
  version = "3.22.0"
  project = var.gke_project
  region  = var.gke_region
}

provider "google-beta" {
  version = "3.22.0"
  project = var.gke_project
  region  = var.gke_region
}

resource "google_compute_subnetwork" "subnetwork-ip-alias" {
  name          = "${var.cluster_name}-subnet"
  region        = var.gke_region
  network       = var.vpc_self_link
  ip_cidr_range = var.ipv4_main_range

  secondary_ip_range {
    range_name    = var.ipv4_pods_range_name
    ip_cidr_range = var.ipv4_pods_range
  }

  secondary_ip_range {
    range_name    = var.ipv4_services_range_name
    ip_cidr_range = var.ipv4_services_range
  }
}

resource "google_bigquery_dataset" "dataset" {
  dataset_id    = replace("gke_usage_${var.cluster_name}", "-", "_")
  friendly_name = "gke-usage-${var.cluster_name}"
  description   = "GKE usage - ${var.cluster_name}"
  location      = "EU"
  project       = var.gke_project

  labels = {
    env     = var.env
    cluster = var.cluster_name
  }

  access {
    role          = "OWNER"
    special_group = "projectOwners"
  }
  access {
    role          = "READER"
    special_group = "projectReaders"
  }
  access {
    role          = "WRITER"
    special_group = "projectWriters"
  }
}

resource "google_container_cluster" "gke_cluster" {
  name                    = var.cluster_name
  description             = var.cluster_description
  location                = var.gke_region
  min_master_version      = var.gke_version
  node_version            = var.gke_version
  enable_kubernetes_alpha = "false"
  provider                = google-beta

  # Remove default node pool
  remove_default_node_pool = true
  initial_node_count       = 1

  # Network to which the cluster is connected
  network    = var.vpc_self_link
  subnetwork = google_compute_subnetwork.subnetwork-ip-alias.name

  master_auth {
    username = ""
    password = ""

    client_certificate_config {
      issue_client_certificate = false
    }
  }

  cluster_autoscaling {
    enabled = true

    auto_provisioning_defaults {
      oauth_scopes = var.node_autoprovisioning_oath_scopes
    }

    resource_limits {
      resource_type = "cpu"
      minimum       = var.node_autoprovisioning_settings.min_cpu
      maximum       = var.node_autoprovisioning_settings.max_cpu
    }

    resource_limits {
      resource_type = "memory"
      minimum       = var.node_autoprovisioning_settings.min_mem
      maximum       = var.node_autoprovisioning_settings.max_mem
    }
  }


  maintenance_policy {
    recurring_window {
      start_time = var.default_maintenance_policy_recurring_window.start_time
      end_time   = var.default_maintenance_policy_recurring_window.end_time
      recurrence = var.default_maintenance_policy_recurring_window.recurrence
    }
  }

  ip_allocation_policy {
    cluster_secondary_range_name  = var.ipv4_pods_range_name
    services_secondary_range_name = var.ipv4_services_range_name
  }

  resource_labels = {
    team        = "mls"
    type        = "compute"
    environment = var.env
  }

  vertical_pod_autoscaling {
    enabled = false
  }

  addons_config {
    dns_cache_config {
      enabled = true
    }
  }

  resource_usage_export_config {
    enable_network_egress_metering       = false
    enable_resource_consumption_metering = true

    bigquery_destination {
      dataset_id = google_bigquery_dataset.dataset.dataset_id
    }
  }
}

Debug Output

https://gist.github.com/thecodeassassin/f9e0c436100cefb2028eee96ab9faf18

Panic Output

Crash log doesn't exist.

Expected Behavior

Plan should be created successfully.

Actual Behavior

Terraform exits, and we see this error:

module..google_bigquery_dataset.dataset: Refreshing state... [id=projects/***/datasets/gke_usage_europe_west]
2020-06-03T15:46:30.339Z [DEBUG] plugin.terraform-provider-google_v3.24.0_x5: panic: Error reading level state: strconv.ParseInt: parsing "1591104108820": value out of range

resulting in:

Error: rpc error: code = Canceled desc = context canceled

Steps to Reproduce

  1. terraform plan

Important Factoids

Things were working fine before. This suddenly stopped working; none of our code even changed, the entire pipeline just died.

References

  • #0000

b/304968076
