Conversation

@panpan0000
Contributor

@panpan0000 panpan0000 commented Sep 14, 2022

To address issue #1610

@gclawes
Contributor

gclawes commented Sep 14, 2022

Chart/manifest diff looks fine to me. I'm less familiar with the webhook logic; if someone else can verify this is OK then I think it's good to go.

@fedepaol
Member

@panpan0000 thanks for the PR!
Couple of notes:

  • the commit subject is cut off; please fix it and add a body explaining why we need this (we are trying to adhere to https://www.sangfor.com/)
  • I think you'll have to run `inv generatemanifests` to generate the all-in-one manifests out of the config

@panpan0000
Contributor Author

panpan0000 commented Sep 15, 2022

Thanks @fedepaol. I've updated the commit message as you suggested.

But I'm a newbie with invoke, and I ran into a problem with `inv generatemanifests`:

```
# inv generatemanifests
Error: rawResources failed to read Resources: Load from path ../native failed: '../native' must be a file (got d='/root/metallb/config/native')
```

I guess it may be a tool version compatibility issue.

I will have to manually run the command below, which I think is equivalent to `inv generatemanifests`:

```
controller-gen crd:crdVersions=v1 rbac:roleName=manager-role webhook paths="./api/..." output:crd:artifacts:config=config/crd/bases
```


My toolchain versions:

```
# controller-gen --version
Version: (unknown)   # compiled from latest source: https://github.com/kubernetes-sigs/controller-tools.git
# kustomize version
Version: kustomize/v4.5.7
# go version
go version go1.19.1 linux/amd64
```

But a lot of changes show up; I will take another look.

@panpan0000
Contributor Author

I think I've addressed the second controller-gen problem.
controller-gen should be exactly v0.7.0; otherwise many additional changes show up (I previously used v0.9.2).

I had to re-compile the controller-gen binary to ensure `controller-gen --version` shows v0.7.0.

```
# controller-gen --version
Version: v0.7.0

# controller-gen crd:crdVersions=v1 rbac:roleName=manager-role webhook paths="./api/..." output:crd:artifacts:config=config/crd/bases

# git diff
# nothing more shows up
```

**I guess my PR is all set now?**

This makes contributing a bit hard for newbies; maybe we should note it somewhere like the contributor guide docs.
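For reference, one way to pin the exact controller-gen version without re-compiling from a source checkout is a module-aware `go install` (a sketch, assuming Go 1.16+; not what was done in this thread):

```
# Install the pinned controller-gen release into $GOBIN (or $GOPATH/bin).
go install sigs.k8s.io/controller-tools/cmd/controller-gen@v0.7.0

# Verify before regenerating manifests; should print: Version: v0.7.0
controller-gen --version
```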

@fedepaol
Member

> I think I've addressed the second controller-gen problem. controller-gen should be exactly v0.7.0; otherwise many additional changes show up (I previously used v0.9.2).
>
> I had to re-compile the controller-gen binary to ensure `controller-gen --version` shows v0.7.0.
>
> ```
> # controller-gen --version
> Version: v0.7.0
>
> # controller-gen crd:crdVersions=v1 rbac:roleName=manager-role webhook paths="./api/..." output:crd:artifacts:config=config/crd/bases
>
> # git diff
> # nothing more shows up
> ```
>
> **I guess my PR is all set now?**
>
> This makes contributing a bit hard for newbies; maybe we should note it somewhere like the contributor guide docs.

Yup, or we should update controller-gen to a newer version and provide a handy way to get it. I'll file an issue for that, and thanks for the feedback!

readinessProbe should wait for webhook-server to be ready.

Currently, the readiness of the controller uses `/metrics`,
which becomes ready soon after the pod starts,
but the webhook service, which is also critical, comes up more slowly.
So `helm --wait` should wait for the webhook service to be ready.
Otherwise, if a CR is applied right after helm install finishes,
errors show up:
```
failed to call webhook: Post
"https://metallb-webhook-service.metallb-system.svc:443/validate-metallb-io-v1beta1-ipaddresspool?timeout=10s":
  dial tcp 100.67.81.84:443: connect: connection refused
```
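The change described in the commit message above amounts to pointing the controller's readinessProbe at the webhook port instead of `/metrics`. A minimal sketch of that kind of probe (the path and port come from later comments in this thread; the probe timings are assumptions for illustration, not the exact values in the PR):

```yaml
readinessProbe:
  httpGet:
    # Assumed values: probe a webhook endpoint instead of /metrics so that
    # readiness implies the webhook server is actually serving requests.
    path: /validate-metallb-io-v1beta1-addresspool
    port: 9443
    scheme: HTTPS
  initialDelaySeconds: 10
  periodSeconds: 5
  failureThreshold: 3
```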
@panpan0000
Contributor Author

panpan0000 commented Sep 16, 2022

GitHub Actions still fails `inv generatemanifests`; my previous controller-gen run alone was not enough, kustomize has to run as well.

My local run of `inv generatemanifests` errored with:

```
Error: rawResources failed to read Resources: Load from path ../native failed: '../native' must be a file (got d='/root/metallb/config/native')
```

After digging into tasks.py, I found that the error comes from the command `kubectl kustomize config/frr > config/manifests/metallb-frr.yaml`. Although my kustomize version is up to date, my kubectl was very old, v1.20.2.

After I upgraded kubectl to v1.23.7, the issue was gone!

So my PR has been updated (`git push -f`) with a final successful run of `inv generatemanifests`.
Hi GitHub Actions, please trigger again and bless all green.
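Since kubectl bundles its own copy of kustomize, a standalone up-to-date `kustomize` binary doesn't help `kubectl kustomize`. A quick sanity check before regenerating manifests (a sketch, not part of the PR):

```
# kubectl embeds its own kustomize; the standalone binary's version is separate.
kubectl version --client
kustomize version
```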

@panpan0000
Contributor Author

panpan0000 commented Sep 18, 2022

The way is not easy...
Directly calling the webhook root path (`IP:9443/`) returns HTTP code 404, and Kubernetes treats that as not ready; that's why the previous e2e test failed.
Meanwhile, an API path like `IP:9443/validate-metallb-io-v1beta1-addresspool`, called without data, shows:

```
{"response":{"uid":"","allowed":false,"status":{"metadata":{},"message":"contentType=, expected application/json","code":400}}}
```

but at least it responds with HTTP code 200.

So I added a second commit: 9b24f45
I've tested it; the pod becomes ready after 20s in my cluster.

I know it's not elegant, because the readinessProbe path may change if v1beta1 bumps to v1.

But it seems to be the only solution?
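The behavior described above can be checked by hand against a running webhook pod (a sketch; `<pod-ip>` is a placeholder and `-k` skips TLS verification):

```
# Root path: no handler is registered here, so the server returns 404
# and Kubernetes would mark the pod as not ready.
curl -k -o /dev/null -w '%{http_code}\n' https://<pod-ip>:9443/

# Registered validation path, called without a JSON admission request:
# the handler answers HTTP 200, with an "allowed": false body.
curl -k -o /dev/null -w '%{http_code}\n' https://<pod-ip>:9443/validate-metallb-io-v1beta1-addresspool
```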

The webhook path "/" returns HTTP code 404, and
Kubernetes treats it as not ready,
while an API path like "/validate-metallb-io-v1beta1-addresspool" called
without a body shows
```
{"response":{"uid":"","allowed":false,"status":{"metadata":{},"message":"contentType=, expected application/json","code":400}}}
```
but at least it responds with HTTP code 200.
@panpan0000
Contributor Author

Hi, can someone help me approve the workflow to trigger the tests again? Thanks in advance!

@fedepaol
Member

Thanks!

@fedepaol fedepaol merged commit 6d208de into metallb:main Sep 19, 2022
@panpan0000
Contributor Author

Thanks @fedepaol
