Explainer wrapper should not add model to path for Tensorflow protocol #2664

Closed
ukclivecox opened this issue Nov 19, 2020 · 0 comments · Fixed by #2671
ukclivecox (Contributor) commented:
The issue is that the model name may not match the name under which TensorFlow Serving loaded the model, which is based on the name of the graph element. Explanations operate at the predictor level, so they should simply call predict rather than target any particular model.

KFSERVING_PREDICTOR_URL_FORMAT = "http://{0}/v1/models/{1}:predict"

We should remove the {1} (model name) segment from this format string.

This is OK because we don't share this wrapper with KFServing.
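A minimal sketch of the proposed change, assuming the wrapper builds the predict URL with Python's `str.format`. The host value and the exact path that results from dropping `{1}` are assumptions for illustration; the actual path is decided in the fix (#2671):

```python
# Hypothetical predictor host, for illustration only.
host = "predictor.example.local"

# Current behaviour: the model name ({1}) is inserted into the path,
# which can diverge from the name TensorFlow Serving actually loaded.
KFSERVING_PREDICTOR_URL_FORMAT = "http://{0}/v1/models/{1}:predict"
url_before = KFSERVING_PREDICTOR_URL_FORMAT.format(host, "my-model")

# Proposed behaviour (assumed shape): drop the {1} segment so the
# request targets the predictor rather than a named model.
PREDICTOR_URL_FORMAT = "http://{0}/v1/models:predict"
url_after = PREDICTOR_URL_FORMAT.format(host)

print(url_before)  # http://predictor.example.local/v1/models/my-model:predict
print(url_after)   # http://predictor.example.local/v1/models:predict
```

The key point is that the explainer no longer needs to know (or guess) the model name loaded by TensorFlow Serving; it only needs the predictor endpoint.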

@ukclivecox ukclivecox added bug triage Needs to be triaged and prioritised accordingly labels Nov 19, 2020
@ukclivecox ukclivecox self-assigned this Nov 19, 2020
@ukclivecox ukclivecox added this to To do in 1.5 via automation Nov 19, 2020
@ukclivecox ukclivecox added this to the 1.5 milestone Nov 19, 2020
@ukclivecox ukclivecox removed the triage Needs to be triaged and prioritised accordingly label Nov 20, 2020
1.5 automation moved this from To do to Done Nov 20, 2020