
Not able to pass string as input to the predict function. #2553

Closed
girishpillai17 opened this issue Oct 16, 2020 · 17 comments

@girishpillai17

girishpillai17 commented Oct 16, 2020

I have deployed a website classification model via seldon.
The link to the repo is: Model file

Describe the bug

The model accepts a string as input and returns the category name as output.
E.g.: model.predict("www.microsoft.com") gives the output ['Computers']

I tried running the model by going inside the container, and it runs without any bugs. But when I send the request with curl and the JSON payload, I get the error below:

{ "status": { "code": 203, "info": "com.google.protobuf.InvalidProtocolBufferException: Expect message object but got: \"\u003c!DOCTYPE\"", "reason": "Microservice error", "status": "FAILURE" } }

I have tried the curl request in many combinations, but I don't get the desired output:

curl -v http://10........../api/v0.1/predictions -H "Content-Type: application/json" -d '{'url' : 'www.microsoft.com'}'

curl -v http://10........../api/v0.1/predictions -H "Content-Type: application/json" -d "{"url" : "www.microsoft.com"}"

curl -v http://10........../api/v0.1/predictions -d '{"url" : "www.microsoft.com"}' -H "Content-Type: application/json"

curl -v http://10........../api/v0.1/predictions -H 'Content-Type: application/json' -d '{ "data": { "ndarray": [["www.microsoft.com"]]}}'

curl -v http://10........../api/v0.1/predictions -H 'Content-Type: application/json' '{"data": {"names": [], "ndarray": [["www.microsoft.com"]]}}'

Expected behaviour

After receiving the string input, the model should return the category name as a list.
E.g.: ['Category_name']

Environment

Kubernetes version: 1.16.6
Docker version: 18.06.2
Seldon version: 1.0.1

@girishpillai17 girishpillai17 added bug triage Needs to be triaged and prioritised accordingly labels Oct 16, 2020
@RafalSkolasinski RafalSkolasinski self-assigned this Oct 19, 2020
@ukclivecox
Contributor

Can you try with 1.3.0 release?

@ukclivecox ukclivecox removed the triage Needs to be triaged and prioritised accordingly label Oct 22, 2020
@girishpillai17
Author

@cliveseldon could you let me know whether I am passing the proper input in the curl request?
How should I pass the input string?

@ukclivecox
Contributor

If you wrapped your Python model with the standard predict method in the class, then you can send the request as per your last example (note the -d flag, which was missing there):

curl -v http://10........../api/v0.1/predictions -H 'Content-Type: application/json' -d '{"data": {"names": [], "ndarray": [["www.microsoft.com"]]}}'

@girishpillai17
Author

girishpillai17 commented Oct 22, 2020

Can you please explain why it is not working on Seldon Version 1.0.1?
Is my file structure and code correct?

@ukclivecox
Contributor

Is the error in the logs of the seldon engine or in your model itself? It looks like you are not sending a valid SeldonMessage payload.

@girishpillai17
Author

No, the error is not in the model. I have tried running the predict function inside the container, and it gives the desired output.

@ukclivecox
Contributor

Can you try with 1.3?

@girishpillai17
Author

Yeah, I will try running it on 1.3.

@RafalSkolasinski
Contributor

Hi @girishpillai17, sorry for taking so long to get back to you.
Please use the strData field, e.g.:

curl -s -H 'Content-Type: application/json' \
	-d '{"strData": "my input string"}' \
	http://localhost:8003/seldon/seldon/seldon-mock-model/api/v1.0/predictions

I have just verified that this works with SC 1.0.1 (running the Engine instead of the Executor). Confirmed with a minimal model of the form:

class Model:
    def predict(self, features, names=[], meta=[]):
        print(features)
        return features

built using seldonio/seldon-core-s2i-python37:1.4.0 as the base image:

curl -s -H 'Content-Type: application/json' \
	-d '{"strData": "my input string"}' \
	http://localhost:8003/seldon/seldon/seldon-mock-model/api/v1.0/predictions  | jq .
{
  "meta": {
    "puid": "covbn9km55d3qp77fi8iv7k7lm",
    "tags": {},
    "routing": {},
    "requestPath": {
      "model": "mock-model:latest"
    },
    "metrics": []
  },
  "strData": "my input string"
}
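For clarity, the strData round trip above can be simulated in plain Python. This is only a sketch of the wrapper's behaviour, not Seldon's actual code; the handle_request helper is hypothetical:

```python
import json

# Minimal model, same shape as the example above.
class Model:
    def predict(self, features, names=[], meta=[]):
        print(features)
        return features

# Hypothetical stand-in for the Seldon Python wrapper: it extracts strData
# from the JSON body, hands the raw string straight to predict, and wraps a
# plain-string result back into strData in the response.
def handle_request(model, raw_body):
    payload = json.loads(raw_body)
    result = model.predict(payload["strData"])
    return {"strData": result}

response = handle_request(Model(), '{"strData": "my input string"}')
print(response)  # {'strData': 'my input string'}
```

The point is that predict receives the bare string, so a model whose predict expects a string works unchanged when called through strData.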

@girishpillai17
Author

girishpillai17 commented Oct 28, 2020

curl -v http://10.144.98.247:32027/seldon/seldon/wc-nbmodel-girish-pillai/api/v0.1/predictions -H 'Content-Type: application/json' -d '{"strData": "www.microsoft.com"}'

*   Trying 10.144.98.247:32027...
* TCP_NODELAY set
* Connected to 10.144.98.247 (10.144.98.247) port 32027(#0)
> POST /seldon/seldon/wc-nbmodel-girish-pillai/api/v0.1/predictions HTTP/1.1
> Host: 10.144.98.247:32027
> User-Agent: curl/7.65.0
> Accept: */*
> Content-Type: application/json
> Content-Length: 32
>
* upload completely sent off: 32 out of 32 bytes
* Mark bundle as not supporting multiuse
< HTTP/1.1 500 Internal Server Error
< x-content-type-options: nosniff
< vary: Origin,Access-Control-Request-Method,Access-Control-Request-Headers
< content-type: text/plain;charset=UTF-8
< content-length: 215
< date: Wed, 28 Oct 2020 10:52:50 GMT
< x-envoy-upstream-service-time: 9
< server: envoy
<
{
  "status": {
    "code": 203,
    "info": "com.google.protobuf.InvalidProtocolBufferException: Expect message object but got: \"\u003c!DOCTYPE\"",
    "reason": "Microservice error",
    "status": "FAILURE"
  }
* Connection #0 to host 10.144.98.247 left intact
}               

I tried using strData, but I am still getting the same error.

@RafalSkolasinski
Contributor

RafalSkolasinski commented Oct 28, 2020

Please provide the version of the seldon-core wrapper used to build the image, and the manifest YAML file used to create the deployment.

@RafalSkolasinski
Contributor

RafalSkolasinski commented Oct 28, 2020

Also, please post the logs from the model container. When trying to run the code from the repo you linked in the first message, I hit multiple problems:

  • discrepancy between model's class name and environmental variable setting
  • code syntax errors

Once those were fixed, there were still errors when unpickling your model, with both Python 3.6 and 3.7.

Please confirm that there are no Python errors reported in the model container, and please check whether the minimal example I posted works as expected.

@girishpillai17
Author

girishpillai17 commented Oct 28, 2020

seldon-core wrapper - seldonio/seldon-core-s2i-python3:0.13

Can you point me to a similar example for string input in the Seldon Core GitHub repo that I can refer to?

I didn't understand what a manifest YAML file is.

@RafalSkolasinski
Contributor

Please try with seldonio/seldon-core-s2i-python37:1.4.0, as I have confirmed it should be working.
A manifest file is what you kubectl apply in order to create deployments in the cluster.

For example:

apiVersion: machinelearning.seldon.io/v1
kind: SeldonDeployment
metadata:
  labels:
    app: seldon
  name: seldon-mock-model
spec:
  name: mock-deployment
  predictors:
  - componentSpecs:
    - spec:
        containers:
        - image: mock-model:latest
          imagePullPolicy: IfNotPresent
          name: model
    graph:
      name: model
      type: MODEL
    name: default
    replicas: 1

@girishpillai17
Author

Discrepancy between model's class name and environmental variable setting

I am using the below code

import pickle

class NBmodel(object):
    def __init__(self):
        self.model = pickle.load(open("NBmodel.pkl", "rb"))

    def predict(self, X, features_names=None):
        print(X)
        return self.model.predict([X])

And my environment variable is
MODEL_NAME=NBModel
API_TYPE=REST
SERVICE_TYPE=MODEL
PERSISTENCE=0

I have kept the model name the same as the class name, so what is the discrepancy?

@RafalSkolasinski
Contributor

NBmodel vs NBModel: the capitalisation of the m differs. This code will simply not run, and you will see that in the logs of the model container, which you can access using kubectl logs ...
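To illustrate the failure mode (a sketch, not Seldon's actual loader code): the wrapper looks the class up by the name given in MODEL_NAME, so a one-letter capitalisation mismatch makes the lookup fail before the model can serve anything. The load_model_class helper below is hypothetical:

```python
import types

# Build a throwaway module containing a class named NBmodel, mirroring the
# user's model file.
module = types.ModuleType("NBmodel")
exec("class NBmodel(object):\n    pass", module.__dict__)

# Hypothetical sketch of the wrapper's lookup: fetch the class whose name
# matches the MODEL_NAME environment variable.
def load_model_class(module, model_name):
    return getattr(module, model_name, None)

print(load_model_class(module, "NBmodel"))  # found: names match exactly
print(load_model_class(module, "NBModel"))  # None: MODEL_NAME as set fails
```

Python attribute lookup is case-sensitive, so MODEL_NAME must match the class name character for character.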

@girishpillai17
Author

Hi @RafalSkolasinski

Thank you for all the help and support you have provided.
I rectified every mistake, built the model again, and passed strData.
I have got the desired output.

Thanks once again for looking into it.

Input:
curl -v http://10.144.98.247:32027/seldon/seldon/wc-nbmodel-girish-pillai/api/v0.1/predictions -H 'Content-Type: application/json' -d '{"strData": "www.microsoft.com"}'

output:

{
  "meta": {
    "puid": "lgvnpmti0iv7konrpkn7v3hfoq",
    "tags": {
    },
    "routing": {
    },
    "requestPath": {
      "wc-nbmodel-girish-pillai": "10.169.24.144:5055/seldon/wc-nbmodel-girish-pillai:latest"
    },
    "metrics": []
  },
  "data": {
    "names": [],
    "ndarray": ["Computers"]
  }
* Connection #0 to host 10.144.98.247:32027 left intact
}  
