CI: K8sServicesTest Checks service across nodes Tests NodePort BPF Tests with XDP, direct routing and Hybrid #12281

Closed
tklauser opened this issue Jun 25, 2020 · 2 comments · Fixed by #12248
Labels
area/CI Continuous Integration testing issue or flake

Comments

@tklauser
Member

tklauser commented Jun 25, 2020

Seen on net-next tests for #12254

/home/jenkins/workspace/Cilium-PR-K8s-oldest-net-next/src/github.com/cilium/cilium/test/ginkgo-ext/scopes.go:514
Request from k8s1 to service tftp://192.168.36.12:31234/hello failed
Expected command: kubectl exec -n kube-system log-gatherer-smsnq -- /bin/bash -c 'fails=""; id=$RANDOM; for i in $(seq 1 10); do if curl --path-as-is -s -D /dev/stderr --fail --connect-timeout 5 --max-time 20 tftp://192.168.36.12:31234/hello -H "User-Agent: cilium-test-$id/$i"; then echo "Test round $id/$i exit code: $?"; else fails=$fails:$id/$i=$?; fi; done; if [ -n "$fails" ]; then echo "failed: $fails"; fi; cnt="${fails//[^:]}"; if [ ${#cnt} -gt 0 ]; then exit 42; fi' 
To succeed, but it failed:
Exitcode: 42 
Stdout:
 	 
	 Hostname: testds-znvgg
	 
	 Request Information:
	 	client_address=192.168.36.11
	 	client_port=57579
	 	real path=/hello
	 	request_scheme=tftp
	 
	 Test round 32157/1 exit code: 0
	 
	 Hostname: testds-ztdgk
	 
	 Request Information:
	 	client_address=10.10.0.49
	 	client_port=39444
	 	real path=/hello
	 	request_scheme=tftp
	 
	 Test round 32157/2 exit code: 0
	 
	 Hostname: testds-znvgg
	 
	 Request Information:
	 	client_address=192.168.36.11
	 	client_port=32773
	 	real path=/hello
	 	request_scheme=tftp
	 
	 Test round 32157/3 exit code: 0
	 
	 Hostname: testds-znvgg
	 
	 Request Information:
	 	client_address=192.168.36.11
	 	client_port=54477
	 	real path=/hello
	 	request_scheme=tftp
	 
	 Test round 32157/4 exit code: 0
	 
	 Hostname: testds-znvgg
	 
	 Request Information:
	 	client_address=192.168.36.11
	 	client_port=38194
	 	real path=/hello
	 	request_scheme=tftp
	 
	 Test round 32157/5 exit code: 0
	 
	 Hostname: testds-znvgg
	 
	 Request Information:
	 	client_address=192.168.36.11
	 	client_port=50590
	 	real path=/hello
	 	request_scheme=tftp
	 
	 Test round 32157/6 exit code: 0
	 
	 Hostname: testds-ztdgk
	 
	 Request Information:
	 	client_address=10.10.0.49
	 	client_port=51870
	 	real path=/hello
	 	request_scheme=tftp
	 
	 Test round 32157/8 exit code: 0
	 
	 Hostname: testds-znvgg
	 
	 Request Information:
	 	client_address=192.168.36.11
	 	client_port=33737
	 	real path=/hello
	 	request_scheme=tftp
	 
	 Test round 32157/9 exit code: 0
	 
	 Hostname: testds-ztdgk
	 
	 Request Information:
	 	client_address=10.10.0.49
	 	client_port=48269
	 	real path=/hello
	 	request_scheme=tftp
	 
	 Test round 32157/10 exit code: 0
	 failed: :32157/7=28
	 
Stderr:
 	 command terminated with exit code 42
	 

/home/jenkins/workspace/Cilium-PR-K8s-oldest-net-next/src/github.com/cilium/cilium/test/k8sT/Services.go:749

3904f832_K8sServicesTest_Checks_service_across_nodes_Tests_NodePort_BPF_Tests_with_XDP,_direct_routing_and_Hybrid.zip

https://jenkins.cilium.io/job/Cilium-PR-K8s-oldest-net-next/963
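
For reference, the failing one-liner above is the script below, reformatted but otherwise unchanged. Each round records curl's own exit code when it fails, and the wrapper then exits 42 as its own sentinel if any round failed, so the top-level "Exitcode: 42" comes from the wrapper; the failed round's curl code is the number after "=" in "failed: :32157/7=28", i.e. 28 (CURLE_OPERATION_TIMEDOUT).

fails=""
id=$RANDOM
for i in $(seq 1 10); do
  # curl against the NodePort under test, with a 5s connect and 20s total timeout
  if curl --path-as-is -s -D /dev/stderr --fail --connect-timeout 5 --max-time 20 \
       tftp://192.168.36.12:31234/hello -H "User-Agent: cilium-test-$id/$i"; then
    echo "Test round $id/$i exit code: $?"
  else
    # record curl's exit code for this round, e.g. ":32157/7=28"
    fails=$fails:$id/$i=$?
  fi
done
if [ -n "$fails" ]; then
  echo "failed: $fails"
fi
cnt="${fails//[^:]}"   # one ':' per failed round
if [ ${#cnt} -gt 0 ]; then
  exit 42              # wrapper's sentinel, not a curl exit code
fi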

Possibly related to or a recurrence of #11029? /cc @borkmann @brb @joestringer

@tklauser added the area/CI Continuous Integration testing issue or flake label Jun 25, 2020
@brb
Member

brb commented Jul 6, 2020

Possibly related to or a recurrence of #11029?

It has a different symptom - error code 42:

CURLE_ABORTED_BY_CALLBACK (42)

Aborted by callback. A callback returned "abort" to libcurl. 

@jrajahalme
Member

Cilium agent logs have this:

2020-06-24T17:10:22.087649187Z level=debug msg="DNS Proxy bound to address" address="[::]:44189" subsys=fqdn/dnsproxy

and thousands of these:

2020-06-24T17:11:05.227290261Z level=debug msg="dnsproxy: Wrote DNS response (12/12 bytes) from 192.168.36.11:44189 to 10.10.1.14:69" subsys=fqdn/dnsproxy
2020-06-24T17:11:05.240629369Z level=debug msg="dnsproxy: Wrote DNS response (12/12 bytes) from 192.168.36.11:44189 to 10.10.1.14:69" subsys=fqdn/dnsproxy
2020-06-24T17:11:05.24064879Z level=debug msg="dnsproxy: Wrote DNS response (12/12 bytes) from 192.168.36.11:44189 to 10.10.1.14:69" subsys=fqdn/dnsproxy
2020-06-24T17:11:05.240651933Z level=debug msg="dnsproxy: Wrote DNS response (12/12 bytes) from 192.168.36.11:44189 to 10.10.1.14:69" subsys=fqdn/dnsproxy
2020-06-24T17:11:05.24065434Z level=debug msg="dnsproxy: Wrote DNS response (12/12 bytes) from 192.168.36.11:44189 to 10.10.1.14:69" subsys=fqdn/dnsproxy
2020-06-24T17:11:05.2406567Z level=debug msg="dnsproxy: Wrote DNS response (12/12 bytes) from 192.168.36.11:44189 to 10.10.1.14:69" subsys=fqdn/dnsproxy
2020-06-24T17:11:05.240835317Z level=debug msg="dnsproxy: Wrote DNS response (12/12 bytes) from 192.168.36.11:44189 to 10.10.1.14:69" subsys=fqdn/dnsproxy
2020-06-24T17:11:05.240842302Z level=debug msg="dnsproxy: Wrote DNS response (12/12 bytes) from 192.168.36.11:44189 to 10.10.1.14:69" subsys=fqdn/dnsproxy
2020-06-24T17:11:05.240844841Z level=debug msg="dnsproxy: Wrote DNS response (12/12 bytes) from 192.168.36.11:44189 to 10.10.1.14:69" subsys=fqdn/dnsproxy
2020-06-24T17:11:05.240859911Z level=debug msg="dnsproxy: Wrote DNS response (12/12 bytes) from 192.168.36.11:44189 to 10.10.1.14:69" subsys=fqdn/dnsproxy
2020-06-24T17:11:05.240862216Z level=debug msg="dnsproxy: Wrote DNS response (12/12 bytes) from 192.168.36.11:44189 to 10.10.1.14:69" subsys=fqdn/dnsproxy
2020-06-24T17:11:05.240864429Z level=debug msg="dnsproxy: Wrote DNS response (12/12 bytes) from 192.168.36.11:44189 to 10.10.1.14:69" subsys=fqdn/dnsproxy
2020-06-24T17:11:05.240866762Z level=debug msg="dnsproxy: Wrote DNS response (12/12 bytes) from 192.168.36.11:44189 to 10.10.1.14:69" subsys=fqdn/dnsproxy
2020-06-24T17:11:05.240868993Z level=debug msg="dnsproxy: Wrote DNS response (12/12 bytes) from 192.168.36.11:44189 to 10.10.1.14:69" subsys=fqdn/dnsproxy
2020-06-24T17:11:05.240871218Z level=debug msg="dnsproxy: Wrote DNS response (12/12 bytes) from 192.168.36.11:44189 to 10.10.1.14:69" subsys=fqdn/dnsproxy
...
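
Port 69 is the TFTP port, so these entries suggest the DNS proxy ended up answering the test's TFTP traffic. To get a rough count of them in the collected agent log (the log file name below is an assumption about the artifact layout, adjust as needed):

grep -c 'dnsproxy: Wrote DNS response' cilium-agent-k8s1.log
grep -c 'to 10.10.1.14:69' cilium-agent-k8s1.log   # responses sent to the TFTP port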

So this was fixed by #12248, closing.

@borkmann removed this from TODO (untriaged & unsorted) in 1.9 kube-proxy removal & general dp optimization Jul 10, 2020