
kubespawner: pending spawn in loop until urllib3.exceptions.ReadTimeoutError generated #382

Closed
andreitoupinets opened this issue Nov 29, 2018 · 6 comments


@andreitoupinets

I use JupyterHub with kubespawner (Kubernetes on Azure). After JupyterHub has been running for more than 2 hours, it stops creating new pods for logged-in users.
Log:

[I 2018-11-29 08:25:24.998 JupyterHub log:158] 302 GET /user/toupinets/ -> /hub/user/toupinets/ (@10.1.32.66) 0.74ms
[I 2018-11-29 08:25:28.250 JupyterHub base:1012] Pending spawn for toupinets didn't finish in 3.0 seconds
[I 2018-11-29 08:25:28.250 JupyterHub base:1018] toupinets is pending spawn

....

[W 2018-11-29 08:53:27.810 JupyterHub user:471] toupinets's server failed to start in 600 seconds, giving up
[E 2018-11-29 08:53:28.258 JupyterHub gen:974] Exception in Future <Task finished coro=<BaseHandler.spawn_single_user.<locals>.finish_user_spawn() done, defined at /usr/local/lib/python3.6/dist-packages/jupyterhub/handlers/base.py:619> exception=TimeoutError('Timeout',)> after timeout

... after some time it raises an exception:

[E 2018-11-29 08:59:07.068 JupyterHub gen:974] Exception in Future <Future finished exception=ReadTimeoutError("HTTPSConnectionPool(host='main-370a339f.hcp.southeastasia.azmk8s.io', port=443): Read timed out. (read timeout=None)",)> after timeout

Then it starts working again. Is it possible to define a read timeout for the Kubernetes service?

@minrk
Member

minrk commented Dec 4, 2018

There's a very good chance that a read timeout here means there's a network problem between the hub and the Kubernetes service. Increasing the timeout may well just delay the Hub noticing the error, which might be solved by restarting the Hub pod.
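For completeness, later kubespawner releases do expose a configurable timeout for Kubernetes API requests. A hedged sketch for jupyterhub_config.py, assuming a kubespawner version that provides the `k8s_api_request_timeout` trait (check your installed version's documentation for the exact name and default):

```python
# jupyterhub_config.py -- sketch, not a definitive fix: raise the per-request
# timeout for calls from kubespawner to the Kubernetes API server.
# `c` is the config object JupyterHub injects into this file.
c.KubeSpawner.k8s_api_request_timeout = 10  # seconds; default is lower
```

As noted above, raising this value only delays detection if the underlying network path is broken.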

@yvan

yvan commented Mar 20, 2019

I think I'm having a similar problem. I wrote a detailed issue here:

jupyterhub/jupyterhub#2480

Basically, my Azure Kubernetes Service cluster has trouble spawning singleuser pods. It happens before node assignment. It usually goes away after a couple of attempts, but sometimes it persists and makes it hard to get into a server. I'm looking for ways to check the networking to see what's going wrong during this process. JupyterHub doesn't give much feedback: just that the server was requested a few times, that there was a timeout in the spawner, and then JupyterHub logically stops trying to spawn the server.

If anything comes up I'll ref/post it here.

@andreitoupinets
Author

@yvan, this happens because the Azure load balancer drops connections after ~5 minutes of inactivity. Try TCP keepalive; just put this into jupyterhub_config.py:

import socket
from urllib3 import connection

# Enable TCP keepalive on all urllib3 connections, so the Azure load
# balancer's idle timeout doesn't silently drop the connection between
# the hub and the Kubernetes API server.
connection.HTTPConnection.default_socket_options += [
    (socket.SOL_SOCKET, socket.SO_KEEPALIVE, 1),     # turn keepalive on
    (socket.IPPROTO_TCP, socket.TCP_KEEPIDLE, 60),   # idle seconds before first probe (Linux)
    (socket.IPPROTO_TCP, socket.TCP_KEEPINTVL, 60),  # seconds between probes
    (socket.IPPROTO_TCP, socket.TCP_KEEPCNT, 3),     # failed probes before dropping
]
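As a quick sanity check that these socket options are valid on your platform, you can apply them to a throwaway socket and read them back. A standalone sketch (not part of the config; note that `TCP_KEEPIDLE`, `TCP_KEEPINTVL`, and `TCP_KEEPCNT` are Linux-specific constant names):

```python
import socket

# The same keepalive options the jupyterhub_config.py snippet registers
# with urllib3.
keepalive_options = [
    (socket.SOL_SOCKET, socket.SO_KEEPALIVE, 1),
    (socket.IPPROTO_TCP, socket.TCP_KEEPIDLE, 60),
    (socket.IPPROTO_TCP, socket.TCP_KEEPINTVL, 60),
    (socket.IPPROTO_TCP, socket.TCP_KEEPCNT, 3),
]

sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
for level, optname, value in keepalive_options:
    sock.setsockopt(level, optname, value)

# Read the values back to confirm the kernel accepted them.
keepalive_on = sock.getsockopt(socket.SOL_SOCKET, socket.SO_KEEPALIVE)
idle_secs = sock.getsockopt(socket.IPPROTO_TCP, socket.TCP_KEEPIDLE)
sock.close()

print("keepalive on:", keepalive_on != 0, "idle:", idle_secs)
```

If this runs cleanly, the options in the config snippet above will apply without error in the hub's environment.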

@yvan

yvan commented Apr 24, 2019

@andreitoupinets I tested a similar solution and it works.

@meeseeksmachine

This issue has been mentioned on Jupyter Community Forum. There might be relevant details there:

https://discourse.jupyter.org/t/spawn-failed-timeout-even-when-start-timeout-is-set-to-3600-seconds/8098/2

@ivanov
Member

ivanov commented Mar 27, 2024

Hey there, I'm going through old issues, and it seems to me that it makes sense to close this one; it sounds like a workaround was figured out.

Thanks everyone and happy hacking! :bowtie:

@ivanov closed this as not planned (won't fix, can't repro, duplicate, stale) on Mar 27, 2024