Rego "import future.keywords.in" causes template failure #1827

Closed
asm2git opened this issue Feb 1, 2022 · 66 comments
Labels: bug (Something isn't working), gator cmd

asm2git commented Feb 1, 2022

What steps did you take and what happened:

After using konstraint to create a ConstraintTemplate YAML file and then running "helm install" to install it, I got the following error:

Error: admission webhook "validation.gatekeeper.sh" denied the request: 1 error occurred: templates["admission.k8s.gatekeeper.sh"]["VaultSecretPath"]:4: rego_parse_error: unexpected import path, must begin with one of: {data, input}, got: future
    import future.keywords.in

What did you expect to happen:

I expected no errors. https://www.openpolicyagent.org/docs/latest/policy-language/#membership-and-iteration-in documents that OPA uses the "import future.keywords" notation to enable new language constructs, in this case the "in" keyword. That page also shows a sample Rego file that uses the "in" keyword.

Anything else you would like to add:
In vendor/github.com/open-policy-agent/opa/ast/parser.go I see code for WithFutureKeywords(), WithAllFutureKeywords(), futureParser(), and "var futureKeywords", so maybe this is just missing documentation and there is a Helm value or command-line argument that enables "import future..."? I've searched all the code and the whole Helm chart, and if such a flag, option, or parameter exists, I can't find it (maybe because I don't know exactly what to look for).

Environment:

  • Gatekeeper version: v3.7.0
  • Kubernetes version: (use kubectl version): 1.20.2
asm2git added the bug label Feb 1, 2022
maxsmythe (Contributor) commented:

@srenatus Do you know off the top of your head whether this is due to needing to upgrade G8r's version of OPA? Or is there some bootstrapping that needs doing to enable these imports?

srenatus (Contributor) commented Feb 3, 2022

GK is at OPA 0.35.0 right now, I think. The future keywords machinery was introduced with "in" in 0.34.0 (https://github.com/open-policy-agent/opa/releases/tag/v0.34.0).

Not sure where the problem comes from, I'd need a closer look...

srenatus (Contributor) commented Feb 3, 2022

Generally there are two ways to enable the future keywords: the import alone should be sufficient, and in situations where that isn't available, you can provide the parser options. One or the other is enough.
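For reference, a minimal standalone Go sketch of those two paths (illustrative only, not Gatekeeper code), using OPA's ast package from v0.34.0 or later:

package main

import (
	"fmt"

	"github.com/open-policy-agent/opa/ast"
)

// Illustrative module that relies on the "in" keyword and imports it itself.
const withImport = `package example

import future.keywords.in

allowed { "dev" in {"dev", "prod"} }
`

// Same rule, but without the import in the source.
const withoutImport = `package example

allowed { "dev" in {"dev", "prod"} }
`

func main() {
	// Path 1: the "import future.keywords.in" statement alone is enough.
	_, err := ast.ParseModule("example.rego", withImport)
	fmt.Println("import only:", err) // expect <nil>

	// Path 2: no import in the source, so the keyword is enabled via
	// parser options instead.
	_, err = ast.ParseModuleWithOpts("example.rego", withoutImport,
		ast.ParserOptions{FutureKeywords: []string{"in"}})
	fmt.Println("parser options only:", err) // expect <nil>
}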

srenatus (Contributor) commented Feb 3, 2022

Hmm, looks like GK 3.7.0 and 3.7.1 use a version of OPA that's too old: 0.29.4.

srenatus (Contributor) commented Feb 3, 2022

💭 Perhaps it would be useful to mention the OPA version used in a GK release more prominently? I've thought about using badges before, but those would have to be updated manually, too.

ritazh (Member) commented Feb 3, 2022

The OPA upgrade now comes from frameworks. Can you try the latest commit in master? It references a frameworks version that uses OPA v0.35: https://github.com/open-policy-agent/frameworks/pull/167/files#diff-45830b37d8fe6cfa6317672df498ec856ff48ffd48cd2537d18337e6281d6618R14

+1 on showing an OPA version badge in master, and perhaps in each release doc. Thoughts?

leeming commented Feb 3, 2022

I am trialing gatekeeper currently and I'm super glad this wasn't just me having this issue.

I'm not clear whether the above fixes things. I have pulled master for open-policy-agent/gatekeeper and re-run "helm template charts/gatekeeper/ | kubectl apply -f -", followed by deploying a rule with the import for "in", but I still get "Request is invalid", whereas the same rule works as expected in https://play.openpolicyagent.org/.

asm2git (Author) commented Feb 3, 2022

@ritazh When I try to build Gatekeeper locally it fails. My boss doesn't want me spending time trying to debug the local build.

I'm installing from the published Helm charts. If there were a snapshot or bugfix Helm chart I could use, I could test that, or a bugfix or snapshot Docker image that I could specify in my "helm install".

Thanks,
Adam

maxsmythe (Contributor) commented Feb 5, 2022

IIRC we build an image from HEAD for every commit, tagged with the shortened commit hash:

https://github.com/open-policy-agent/gatekeeper/tree/master/charts/gatekeeper

It looks like you can change the image tag used in the Helm chart via values.image.release:

https://hub.docker.com/r/openpolicyagent/gatekeeper/tags

Mixing Helm chart versions and image versions probably isn't a great long term idea, since the required config for an image can change over time, but should work.

The Helm chart that is built from HEAD for a specific commit lives here:

https://github.com/open-policy-agent/gatekeeper/tree/master/manifest_staging/charts/gatekeeper

@ritazh If we need to upgrade OPA before we're comfortable upgrading the constraint framework, we could cut a new release that overrides the version of OPA used.

maxsmythe (Contributor) commented:

also, @srenatus thanks for digging in!

ritazh (Member) commented Feb 5, 2022

@ritazh If we need to upgrade OPA before we're comfortable upgrading the constraint framework, we could cut a new release that overrides the version of OPA used.

I think that makes sense. But this is technically not a patch since we are upgrading OPA, so we would need to cut v3.8.0. Are we ready for that?

maxsmythe (Contributor) commented:

Do we have specific features we want in 3.8? Sharding?

ritazh (Member) commented Feb 5, 2022

yes, sharding

maxsmythe (Contributor) commented:

Okay, let's see how the timelines go

asm2git (Author) commented Feb 14, 2022

I just tested this again. I used the command "helm install gatekeeper gatekeeper/gatekeeper --set image.release=$1 --set postInstall.labelNamespace.image.tag=$1", where "$1" was every Docker image tag (from https://hub.docker.com/r/openpolicyagent/gatekeeper/tags) from "dev" back to "e47b31c". All of them failed with the error "Error: admission webhook "validation.gatekeeper.sh" denied the request: invalid ConstraintTemplate: invalid import: bad import". That's a different error than I was getting before.

Maybe I'm not getting the right Docker image? Is there a specific image tag you'd like me to test?

Here are the Helm charts I'm using to test: http://cakewalk.menlo.com/gatekeeper/; the raw Rego file is also there.

maxsmythe (Contributor) commented:

Anything tagged with a hash in the past little while (fc374da is the most recent hash-specific build) should have the current version of OPA that we're using in HEAD.

I think the issue might be our locked-down importing that we use to enforce sandboxing:

https://github.com/open-policy-agent/frameworks/blob/7950750c4ec6a56d55a8521b40d85ef35de25170/constraint/pkg/regorewriter/regorewriter.go#L281-L283

It'd need to be updated to allow "future" imports in addition to "data".
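To sketch the shape of that check (a simplified, hypothetical illustration, not the actual regorewriter.go code):

// Hypothetical simplification of the import allow-list described above; the
// real logic lives in frameworks' regorewriter package.
package regocheck

import "strings"

// isAllowedImport reports whether an import path may appear in a
// ConstraintTemplate's Rego. Before the fix only "data" roots passed, which
// rejected "import future.keywords.in"; allowing a "future" root as well
// lets the keyword import through.
func isAllowedImport(path string) bool {
	return path == "data" ||
		strings.HasPrefix(path, "data.") ||
		strings.HasPrefix(path, "future.")
}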

@willbeason, it looks like the config path for passing in allowed data fields has changed as part of sharding. Right now it doesn't look like the client option AllowedDataFields will be honored if set. Is that going to be revamped?

willbeason (Member) commented:

@maxsmythe Yes - I'm fixing this and moving it to be an option on Driver. Client shouldn't care about this, and it makes sense as a Driver configuration. There's never a reason to change this at runtime, so it will be an immutable part of Driver specified via an Arg in the constructor.
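For illustration, the constructor-argument pattern being described looks roughly like this; the names (Driver, Arg, AllowedDataFields, New) are hypothetical and not the actual frameworks API:

// Generic functional-options sketch of "an immutable part of Driver specified
// via an Arg in the constructor". All identifiers here are hypothetical.
package driver

// Driver evaluates templates; its allowed data roots are fixed at construction.
type Driver struct {
	allowedDataFields []string
}

// Arg configures a Driver during construction only; there is no setter, so the
// value cannot change at runtime.
type Arg func(*Driver)

// AllowedDataFields sets which external data roots policies may reference.
func AllowedDataFields(fields ...string) Arg {
	return func(d *Driver) { d.allowedDataFields = fields }
}

// New builds a Driver from the supplied Args.
func New(args ...Arg) *Driver {
	d := &Driver{}
	for _, arg := range args {
		arg(d)
	}
	return d
}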

asm2git (Author) commented Feb 28, 2022

@willbeason, @maxsmythe: Any idea when this change will make it into a release (3.7.x or 3.8)? Please understand that I'm not trying to rush you or put pressure on you; I just need to schedule some other work, so it would help to have an idea of when this might be ready. Thanks.

willbeason (Member) commented:

@willbeason, @maxsmythe: Any idea when this change will make it into a release (3.7.x or 3.8)? Please understand that I'm not trying to rush you or put pressure on you; I just need to schedule some other work, so it would help to have an idea of when this might be ready. Thanks.

Yes - 3.8 is pushed back to March 14th, but this change will be included.

asm2git (Author) commented Apr 11, 2022

I just tested this in v3.8.0-rc.1 by doing the following:

helm install gatekeeper gatekeeper/gatekeeper --namespace gatekeeper-system \
    --set image.release=v3.8.0-rc.1 --set postInstall.labelNamespace.image.tag=v3.8.0-rc.1

followed by "helm install" of my Gatekeeper templates and constraints.

Unfortunately I'm still getting the "bad import" error when I try to use a template that contains "import future.keywords.in".

Do I need to do something more to get the latest release candidate installed?

I thought this issue was going to be fixed in the 3.8 release; was the fix pushed out to a later release?

Thanks.

willbeason (Member) commented:

@asm2git This was added to Frameworks in open-policy-agent/frameworks#207, and updated in Gatekeeper in #1968, which is included in v3.8.0-rc.1. We have tests in place which validate that ConstraintTemplates using "import future.keywords.in" both compile and execute.

Can you provide an example ConstraintTemplate which exhibits this behavior? My best guess is that either there is a problem with the ConstraintTemplate, or we aren't testing this properly. With the ConstraintTemplate I'll be able to narrow this down.

asm2git (Author) commented Apr 11, 2022

@willbeason here's what I'm using; sorry it's so long.

apiVersion: templates.gatekeeper.sh/v1beta1
kind: ConstraintTemplate
metadata:
  creationTimestamp: null
  name: vaultsecretpath
spec:
  crd:
    spec:
      names:
        kind: VaultSecretPath
  targets:
  - libs:
    - |-
      package lib.core


      default is_gatekeeper = false

      default name = "NO_NAME"

      is_gatekeeper {
        has_field(input, "review")
        has_field(input.review, "object")
      }

      resource = input.review.object {
        is_gatekeeper
      }

      resource = input {
        not is_gatekeeper
      }

      format(msg) = gatekeeper_format {
        is_gatekeeper
        gatekeeper_format = {"msg": msg}
      }

      format(msg) = msg {
        not is_gatekeeper
      }

      format_with_id(msg, id) = msg_fmt {
        msg_fmt := {
          "msg": sprintf("%s: %s", [id, msg]),
          "details": {"policyID": id},
        }
      }

      apiVersion = resource.apiVersion

      name = resource.metadata.name

      kind = resource.kind

      labels = resource.metadata.labels

      annotations = resource.metadata.annotations

      gv = split(apiVersion, "/")

      group = gv[0] {
        contains(apiVersion, "/")
      }

      group = "core" {
        not contains(apiVersion, "/")
      }

      version = gv[minus(count(gv), 1)]

      has_field(obj, field) {
        not object.get(obj, field, "N_DEFINED") == "N_DEFINED"
      }

      missing_field(obj, field) {
        obj[field] == ""
      }

      missing_field(obj, field) {
        not has_field(obj, field)
      }

      is_deployment {
        lower(kind) == "deployment"
      }

      is_pod {
        lower(kind) == "pod"
      }
    rego: |-
      package vault_secret_path

      import data.lib.core as core
      import future.keywords.in

      PolicyID := "VSP-0001"

      image_secrets := {
        "dev-app": [
          "db/example-1",
        ],
        "prod-app": [
          "a/b/c/d/example-1",
          "db/example-1",
          "example-1"
        ],
        "com.mathworks.test.my-app": [
          "1/2/3/4/aaa",
          "1/2/3/4/ccc",
          "1/aaa",
          "1/bbb",
          "aaa",
          "aaadev1",
          "aaaprod",
          "db/aaa"
        ],
        "com.mathworks.txapps.foo": [
          "txapps/11/password",
          "txapps/22/password",
          "txapps/33/password"
        ],
        "com.mathworks.txapps.bar": [
          "txapps/44/password",
          "txapps/55/password",
          "txapps/66/password"
        ],
        "com.mathworks.txapps.foo:1.22.333": [
          "txapps/77/password",
          "txapps/88/password",
          "txapps/99/password"
        ],
        "com.mathworks.licensing.foo": [
          "licensing/11/password",
          "licensing/22/password",
          "licensing/33/password"
        ]
      }

      violation[msg] {
        core.is_pod
        keys := pathKeys(core.resource.metadata.annotations)
        count(keys) >= 0
        matches := getMatches(AllowedPathRegex, keys, core.resource.metadata.annotations)
        not all(matches)
        msg := "vault path fail"
      }

      violation[msg] {
        core.is_pod
        images := getImages(core.resource.spec.containers)
        count(images) >= 0

        keys := pathKeys(core.resource.metadata.annotations)
        count(keys) >= 0

        secrets := {
          secret |
          some image in images;
          tmp := getSecrets(image)
          some tx in tmp;
          secret := tx
        }
        count(secrets) == 0
        msg := "permission denied (no entry found)"
      }

      violation[msg] {
        core.is_pod

        images := getImages(core.resource.spec.containers)
        count(images) >= 0

        keys := pathKeys(core.resource.metadata.annotations)
        count(keys) >= 0

        secrets := {
          secret |
          some image in images;
          tmp := getSecrets(image)
          some tx in tmp;
          secret := tx
        }
        count(secrets) > 0

        paths := {
          path |
          some key in keys;
          path := getSecretBase(core.resource.metadata.annotations[key])
        }
        count(paths) > 0

        found := intersection({secrets,paths})

        found != paths

        msg := "vault permission fail"
      }

      Realm = "/supported" {
        core.resource.metadata.namespace == "prod"
      } else = "/(un)?supported" {
        core.resource.metadata.namespace != "prod"
      }

      Variant = "/prod" {
        core.resource.metadata.namespace == "prod"
      } else = sprintf("/(?:%s|NOT_PROD)", [core.resource.metadata.namespace]) {
        core.resource.metadata.namespace != "prod"
      }

      PathRegexStart := "^[^/]+/data"

      Depth := "(?:/.*)?"

      AllowedPathRegex := [ PathRegexStart, Realm, Depth, Variant, "$" ]

      getImages(containers) = images {
        images := [image | some c in containers; image := c.image]
      }

      getMatches(regex, keys, annotations) = matches {
        matches := [m | key = keys[_]; m = re_match(concat("", regex), annotations[key])]
      }

      getSecretBase(path) = base {
        parts := split(path, "/")

        base := concat("/", array.slice(parts, 3, count(parts) - 1))
      }

      getSecrets(image) = secrets {
        name := imageNameWithTag(image)
        tmp := object.get(image_secrets, name, [])
        count(tmp) > 0
        secrets := tmp
      }

      getSecrets(image) = secrets {
        nn := imageNameWithTag(image)
        tt := object.get(image_secrets, nn, [])
        count(tt) == 0

        name := imageName(image)
        tmp := object.get(image_secrets, name, [])
        count(tmp) > 0
        secrets := tmp
      }

      imageName(image) = name {
        slash := indexof(image, "/")
        colon := indexof(image, ":")
        name := substring(image, slash + 1, colon - 1 - slash)
      }

      imageNameWithTag(image) = name {
        slash := indexof(image, "/")
        name := substring(image, slash + 1, count(image) - slash - 1)
      }

      pathKeys(annotations) = keys {
        keys := [key | annotations[key]; startswith(key, "vault.hashicorp.com/agent-inject-secret-")]
      }
    target: admission.k8s.gatekeeper.sh
status: {}

asm2git (Author) commented Apr 11, 2022

@willbeason just in case you need it, here's the constraint that goes along with it

apiVersion: constraints.gatekeeper.sh/v1beta1
kind: VaultSecretPath
metadata:
  name: vaultsecretpath
spec:
  match:
    kinds:
    - apiGroups:
      - ""
      kinds:
      - Pod

Let me know if you need the full Helm Charts I'm using.

willbeason (Member) commented:

That's perfect! I'll try to replicate and check back in the next couple days.

asm2git (Author) commented Apr 18, 2022

@willbeason I just tried my test again with rc2; unfortunately it still fails. I don't know whether this is expected or not.

Just to recap, here's how I'm installing Gatekeeper:

helm install gatekeeper gatekeeper/gatekeeper --namespace gatekeeper-system \
    --set image.release=v3.8.0-rc.2 --set postInstall.labelNamespace.image.tag=v3.8.0-rc.2

Please let me know if there's something I need to do differently.

Thanks.

ritazh (Member) commented Apr 19, 2022

As an aside, @asm2git, you should always upgrade the Helm chart release to make sure you get all the chart/YAML updates (e.g., CRD spec updates) in addition to just updating the image tag. Though I don't think that's the issue here.

asm2git (Author) commented Apr 19, 2022

@ritazh Absolutely, although it looks like there's no chart for rc2, just rc1.

I'll try my tests again tomorrow with "helm install ... --version 3.8.0-rc.2" (and fall back to "rc.1" if that doesn't find a newer chart). I'll post the results.

ZiaUrRehman-GBI commented Apr 19, 2022

admission webhook "validation.gatekeeper.sh" denied the request: invalid ConstraintTemplate: invalid import: bad import

apiVersion: templates.gatekeeper.sh/v1beta1
kind: ConstraintTemplate
metadata:
  name: loadbalancerconstraint
spec:
  crd:
    spec:
      names:
        kind: LoadBalancerConstraint
  targets:
    - target: admission.k8s.gatekeeper.sh
      rego: |
        package constraint
        import future.keywords.in
        violation[{"msg": msg}] {
          input.review.kind.kind = "Service"
          input.review.operation = "CREATE"
          input.review.object.spec.type in {"LoadBalancer", "NodePort"}
          not input.review.userInfo.username in {"test-service-account@PROJECT_ID.iam.gserviceaccount.com"}
          msg := sprintf("review object: %v", [input.review])
        }

Installed OPA Gatekeeper today with:
kubectl apply -f https://raw.githubusercontent.com/open-policy-agent/gatekeeper/master/deploy/gatekeeper.yaml
GKE version v1.22.7-gke.1500

asm2git (Author) commented Apr 27, 2022

@maxsmythe On a freshly-installed Debian 11 droplet at DigitalOcean, wholly disconnected from anything to do with my company, I copied your exact commands and got the same error.

On a new KinD cluster, I tried "helm install gatekeeper gatekeeper/gatekeeper --namespace gatekeeper-system --version 3.8.0"; same error.

On yet another KinD cluster, I tried "helm install gatekeeper gatekeeper/gatekeeper --set 'image.repository=openpolicyagent/gatekeeper' --set 'image.release=v3.8.0'"; same error.

@asahnovskiy-deloitte have you tried it with the v3.8.0 release and if so, did you get the same error I'm getting or does it now work for you?

@maxsmythe It sure seems to me there's something different between your environment and mine. If you look back in this thread at the comments from @srenatus and @ritazh there was some discussion of which version of OPA was being included (pulled? loaded?) in the latest release of Gatekeeper. Is it possible that you already have a new version of OPA somehow "available" that Gatekeeper is using, and that's why it's working for you and not for me?

Anyway, I still can't get this to work. Should we re-open this issue?

Thanks,
Adam

willbeason reopened this Apr 27, 2022
willbeason assigned ritazh and maxsmythe and unassigned willbeason Apr 27, 2022
willbeason (Member) commented:

Adding @ritazh as an assignee since this could be a helm issue

maxsmythe (Contributor) commented:

@asm2git This is weird. Can you post the image sha running on the pod? It should be in the pod status. That will be less ambiguous than the label.

For me, kind should be pulling directly from Docker (but it should also be doing that for you), so I'm not sure why we'd be getting different images.

asm2git (Author) commented May 2, 2022

apply.gatekeeper-controller-manager.pod.yaml.txt

image: docker.io/openpolicyagent/gatekeeper:v3.8.0
imageID: docker.io/openpolicyagent/gatekeeper@sha256:6b5597d1cd5cdfed3f8bd9c63ff2c63312bb640001295ac82bef211841f9d0c1

I confirmed that I get the same image and imageID whether I use "helm install" or "kubectl apply -f https://raw.githubusercontent.com/..."

I also confirmed that I am still getting the same failure as I posted last week.

Please let me know if there's any additional information you need or if you need me to run any other experiments. If necessary I might be able to arrange for you to get access to my external instance.

maxsmythe (Contributor) commented:

This is very weird. I have the exact same image sha. Everything points to an image mismatch, however.

E.g., if I have a "bad import" error, here is what happens for me:

$ kubectl apply -f tmptmpl.yaml
The request is invalid

but if I make it a verbose request, I get this:

...
I0503 23:10:17.068246 2463878 request.go:1181] Response Body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"admission webhook \"validation.gatekeeper.sh\" denied the request: invalid ConstraintTemplate: 1 error occurred: template:3: rego_parse_error: unexpected import path, must begin with one of: {data, future, input}, got: dogs\n\timport dogs.keywords.mom\n\t       ^","code":422}
F0503 23:10:17.068413 2463878 helpers.go:118] The request is invalid
...

It might be worth setting up a video call to see this in real time. Not sure why there is a version mismatch (the Rego version is hard-linked into Gatekeeper), but everything is pointing to that.

asm2git (Author) commented May 4, 2022

I'm happy to do a video call. I'm in US/Eastern, typically working 0830 to 1600 (UTC 1230 to 2000) but I can do a call as late as 2000 (UTC 0000) if that's better for you. Monday, Wednesday, and Friday are better for me.

It might be better if we do this on my personal instance (Debian 11). I'll work on recreating that, then I can snapshot it and just bring it up when we're ready for the call.

maxsmythe (Contributor) commented:

I've started getting the same errors as you. Not sure why I wasn't before. Looking into it, the "import future" stuff doesn't work as of 3.8.0, but does on newer commits. If you wanted to play around with it, this image should work:

openpolicyagent/gatekeeper:ecf6092

Sorry for the confusion; not sure why I wasn't seeing the errors before. Maybe the webhook hadn't spun up before I created the constraint template the first time?

In any case, we'd need to cut a release if you want it on a non-commit-hash-based tag.

asm2git (Author) commented May 5, 2022

@maxsmythe Still no joy :-(

Installed with: helm install gatekeeper gatekeeper/gatekeeper --namespace gatekeeper-system --set "image.release=ecf6092" --set "postInstall.labelNamespace.image.tag=ecf6092"

Error when applying bad ConstraintTemplate: Error from server: error when creating "policy/test-import/template.yaml": admission webhook "validation.gatekeeper.sh" denied the request: invalid ConstraintTemplate: invalid import: bad import

Image check: "kubectl get deployment -n gatekeeper-system gatekeeper-controller-manager -oyaml | grep image" gives "image: openpolicyagent/gatekeeper:ecf6092"

I haven't tried this yet on my outside instance but I have less reason to believe this is due to any of our proxying or caching.

Please let me know what you'd like me to try next, or if you want to go ahead with setting up a call.

maxsmythe (Contributor) commented:

Sigh, my version struggle bus continues. I think the dev tag was impacted by:

#2041 (comment)

This time I've verified that the following image should work. If it doesn't, we should set up a call for Monday (giving us Friday to find a time):

openpolicyagent/gatekeeper:ffea423

asm2git (Author) commented May 6, 2022

@maxsmythe My tests pass with ffea423. Any chance of getting a 3.8.2 release that includes this fix? If so, roughly how long would that take?

Thanks again,
Adam

maxsmythe (Contributor) commented:

Whew, glad we sorted this out!

@sozercan Would we cut this as 3.8.2? What would be the barriers to releasing this?

asm2git (Author) commented May 16, 2022

@sozercan @maxsmythe Any word on a new release?

maxsmythe (Contributor) commented:

@sozercan any barriers to starting a new release chain incorporating everything in main?

ritazh (Member) commented May 18, 2022

Per community call, we have decided to cut v3.9.0-beta.1 to include this change.

maxsmythe (Contributor) commented:

Release v3.9.0-beta.1 cut!

https://github.com/open-policy-agent/gatekeeper/releases/tag/v3.9.0-beta.1

thomasmckay (Contributor) commented:

And 3.8.2 will be created as well?

ritazh (Member) commented May 19, 2022

Looking at v3.8.0 and v3.8.0-rc.1, both include #1968. @maxsmythe, from your tests, do you know why v3.8.0 and v3.8.0-rc.1 are missing the OPA upgrade? Or was the change introduced in a different commit that came after #1968?

maxsmythe (Contributor) commented:

There was a frameworks change to regorewriter that was necessary in order to recognize the new keyword

ritazh (Member) commented May 20, 2022

Do you mind pointing me to the PR/commit where this change was made?

sozercan (Member) commented May 20, 2022

@ritazh open-policy-agent/frameworks#217

GK 3.8.0 pins frameworks to https://github.com/open-policy-agent/frameworks/releases/tag/v0.5.0
while this change was added after that in open-policy-agent/frameworks@370fa37

ritazh (Member) commented Aug 3, 2022

As discussed on the 8/3/2022 community call, this has been addressed by open-policy-agent/frameworks#217 and is part of GK v3.9.0.

ritazh closed this as completed Aug 3, 2022