
Repeatedly logging [DEBUG] Closing connection. #2568

Closed
ntorba opened this issue Oct 21, 2020 · 12 comments

Comments

@ntorba

ntorba commented Oct 21, 2020

Describe the bug

Hello,
Recently, my seldon deployments started logging large sequences like this:

[2020-10-21 18:57:25 +0000] [22] [DEBUG] Closing connection.
[2020-10-21 18:57:25 +0000] [22] [DEBUG] Closing connection.
[2020-10-21 18:57:27 +0000] [22] [DEBUG] Closing connection.
[2020-10-21 18:57:30 +0000] [22] [DEBUG] Closing connection.
[2020-10-21 18:57:30 +0000] [22] [DEBUG] Closing connection.
[2020-10-21 18:57:32 +0000] [22] [DEBUG] Closing connection.
[2020-10-21 18:57:35 +0000] [22] [DEBUG] Closing connection.
[2020-10-21 18:57:35 +0000] [22] [DEBUG] Closing connection.
[2020-10-21 18:57:37 +0000] [22] [DEBUG] Closing connection.
[2020-10-21 18:57:40 +0000] [22] [DEBUG] Closing connection.
[2020-10-21 18:57:40 +0000] [22] [DEBUG] Closing connection.
[2020-10-21 18:57:42 +0000] [22] [DEBUG] Closing connection.
[2020-10-21 18:57:45 +0000] [22] [DEBUG] Closing connection.
[2020-10-21 18:57:45 +0000] [22] [DEBUG] Closing connection.
[2020-10-21 18:57:47 +0000] [22] [DEBUG] Closing connection.
[2020-10-21 18:57:50 +0000] [22] [DEBUG] Closing connection.
[2020-10-21 18:57:50 +0000] [22] [DEBUG] Closing connection.
.... etc

This never happened until recently, and I'm not sure what I changed to cause it. Has anyone seen a similar effect?

To reproduce

Happens for any seldon deployment I make.

Expected behaviour

Until recently, the only logs shown were for startup and for requests sent to the model.

Environment

Running this on EKS cluster

echo "#### Kubernetes version:\n $(kubectl version) \n\n#### Seldon Images:\n$(kubectl get --namespace seldon-system deploy seldon-controller-manager -o yaml  | grep seldonio)"
Error from server (NotFound): namespaces "seldon-system" not found
#### Kubernetes version:\n Client Version: version.Info{Major:"1", Minor:"19", GitVersion:"v1.19.2", GitCommit:"f5743093fd1c663cb0cbc89748f730662345d44d", GitTreeState:"clean", BuildDate:"2020-09-16T21:51:21Z", GoVersion:"go1.15.2", Compiler:"gc", Platform:"darwin/amd64"}
Server Version: version.Info{Major:"1", Minor:"17+", GitVersion:"v1.17.9-eks-4c6976", GitCommit:"4c6976793196d70bc5cd29d56ce5440c9473648e", GitTreeState:"clean", BuildDate:"2020-07-17T18:46:04Z", GoVersion:"go1.13.9", Compiler:"gc", Platform:"linux/amd64"} \n\n#### Seldon Images:\n

Model Details

  • Images of your model: [Output of: kubectl get seldondeployment -n <yourmodelnamespace> <seldondepname> -o yaml | grep image: where <yourmodelnamespace>]
  • Logs of your model: [You can get the logs of your model by running kubectl logs -n <yourmodelnamespace> <seldonpodname> <container>]
@ntorba ntorba added bug triage Needs to be triaged and prioritised accordingly labels Oct 21, 2020
@axsaucedo
Contributor

Thank you for the heads up @ntorba - can you provide the version of the seldon containers for your model and for the seldon core controller?

@ntorba
Author

ntorba commented Oct 21, 2020

Here is a truncated description of my deployment. Let me know if anything you need is missing.

(amp) HQSML-1712547:src ntorba605$ kubectl describe deploy seldon-controller-manager
Name:                   seldon-controller-manager
  Service Account:  seldon-manager
  Containers:
   manager:
    Image:       docker.io/seldonio/seldon-core-operator:1.2.3
    Ports:       443/TCP, 8080/TCP
    Host Ports:  0/TCP, 0/TCP

      ENGINE_CONTAINER_IMAGE_AND_VERSION:           docker.io/seldonio/engine:1.2.3
      EXECUTOR_CONTAINER_IMAGE_AND_VERSION:         docker.io/seldonio/seldon-core-

@axsaucedo
Contributor

Thanks @ntorba - are you using prepackaged servers to deploy your model, or a language wrapper? If the latter, can you share the image you used to build it?

@ntorba
Author

ntorba commented Oct 21, 2020

We are using the language wrapper. Do you know where I can look to find which language wrapper image was used?

@axsaucedo
Contributor

Hmm, that is basically determined at containerisation time, i.e. by the builder image you pass to s2i (s2i build . <builder-image> <model-image>).

@axsaucedo
Contributor

This is not something we've observed, so I'm not sure what could be causing it - are you seeing this in the executor or in your model container?

@ukclivecox ukclivecox added awaiting-feedback and removed triage Needs to be triaged and prioritised accordingly labels Oct 22, 2020
@ntorba
Author

ntorba commented Oct 22, 2020

It's in the model container

@ntorba
Author

ntorba commented Oct 22, 2020

I'm using seldon_core version 1.3.0.
What confuses me is that I can't find anywhere in the code where "Closing Connection" is logged.
Do you know where this message could be coming from?
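
For what it's worth, one way to check (a throwaway scan, nothing Seldon-specific; it only assumes gunicorn and seldon_core are importable in the container) would be to search both installed packages for the literal string:

# Throwaway sketch: scan the installed gunicorn and seldon_core packages
# for the literal "Closing connection" string to see which one emits it.
import pathlib

import gunicorn
import seldon_core

for pkg in (gunicorn, seldon_core):
    root = pathlib.Path(pkg.__file__).parent
    for path in sorted(root.rglob("*.py")):
        if "Closing connection" in path.read_text(errors="ignore"):
            print(path)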

@ukclivecox
Contributor

Is it coming from gunicorn, or, if you are using gRPC, from the gRPC server?

@ntorba
Author

ntorba commented Oct 22, 2020

gunicorn

@ntorba
Author

ntorba commented Oct 22, 2020

@cliveseldon @axsaucedo Adding the issue we found during the community call: benoitc/gunicorn#1952

I should have mentioned in the first comment that my endpoint is still working... sorry!

So this message is harmless for now, and it is likely being triggered by some service running on my infra.
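
In the meantime, if the noise ever becomes a problem, one option (a minimal sketch, not Seldon's documented API; MyModel is just a placeholder for the wrapped model class) would be to raise gunicorn's logger to INFO from the model module so its DEBUG chatter is dropped:

# Minimal sketch, assuming the model module is imported after gunicorn's
# logging is configured: raise the "gunicorn.error" logger (the logger
# gunicorn uses for server messages) from DEBUG to INFO so lines like
# "[DEBUG] Closing connection." are no longer emitted.
import logging

logging.getLogger("gunicorn.error").setLevel(logging.INFO)


class MyModel:  # placeholder name for the language-wrapped model class
    def predict(self, X, features_names=None):
        return X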

I will let your team know if I identify the source of these logs!

@axsaucedo
Contributor

@ntorba as discussed, this seems to be a common issue in gunicorn, so closing for now.
