Update docs for inference graph response code #289

Open · yuzisun opened this issue Sep 10, 2023 · 0 comments
yuzisun commented Sep 10, 2023

What is changing? (Please include as many details as possible.)

Add a `dependency` (soft/hard) field to the inference graph Step that determines whether graph execution terminates when a step returns an error response.

How will this impact our users?

By default the current behavior is preserved (soft); if the user specifies hard, graph execution terminates at the current step when it returns an error response.
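
The issue does not sketch the field itself; the snippet below is a rough illustration of how it might look on an InferenceGraph Step, assuming the existing node/step layout and capitalized `Soft`/`Hard` values. The service names and graph shape are made up for the example, and the final schema should be confirmed against kserve#3039.

```yaml
apiVersion: serving.kserve.io/v1alpha1
kind: InferenceGraph
metadata:
  name: example-graph
spec:
  nodes:
    root:
      routerType: Sequence
      steps:
        - name: preprocess
          serviceName: preprocessor
          # Hard: an error response from this step stops graph execution here
          dependency: Hard
        - name: classify
          serviceName: classifier
          # Soft (default): keeps the current behavior and continues past errors
          dependency: Soft
```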

In what release will this happen (to the best of your knowledge)?

Ex. v0.11.1

Context

Link to associated PRs or issues from other repos here.

  1. Inference Graph error response handling kserve#3039

Additional info
