Update EPP to expose metrics to represent ready inference models to serve #935
Conversation
[APPROVALNOTIFIER] This PR is NOT APPROVED

This pull-request has been approved by: shotarok

The full list of commands accepted by this bot can be found here.

Needs approval from an approver in each of these files:

Approvers can indicate their approval by writing /approve in a comment.
Hi @shotarok. Thanks for your PR. I'm waiting for a kubernetes-sigs member to verify that this patch is reasonable to test. If it is, they should reply with /ok-to-test. Once the patch is verified, the new status will be reflected by the ok-to-test label. I understand the commands that are listed here. Instructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the kubernetes-sigs/prow repository.
✅ Deploy Preview for gateway-api-inference-extension ready!
/ok-to-test overall I'm ok with this change. Just a heads up - InferenceModel is undergoing some rework and will probably change significantly. Not sure if now is a good time to add metrics.
@nirrozenbaum Thank you for your review and the heads up! I understand InferenceModel may change significantly. If it's better to hold this issue for a future version, I'm happy to close this PR and find another issue to work on.
Since @ahg-g is leading the effort on the InferenceModel redesign, I will let him comment on whether we should make progress with this or wait until things get clearer.
/retest
/test pull-gateway-api-inference-extension-test-e2e-main
@ahg-g Hello, I understand InferenceModel deprecation is being discussed in "Revisiting The InferenceModel API". Do you think it's worth merging this PR for the releases before InferenceModel is deprecated? If not, I'll close this PR and add a comment on #598, so please let me know. Thanks!
I'm starting the transition today, so it's probably okay to hold off.
@kfswain Thank you for letting me know! Then, I'll close this PR.
Resolve #598
Update EPP to expose metrics to represent ready inference models to serve.
inference_model_ready is a binary metric with the pool_name and model_name labels.
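
For context, here is a minimal sketch of how such a gauge could be registered and updated. It assumes the Prometheus client_golang library and a hypothetical helper name (SetInferenceModelReady); it is not the PR's actual implementation, which wires metrics through the EPP's own metrics package.

```go
package metrics

import "github.com/prometheus/client_golang/prometheus"

// inferenceModelReady reports 1 when an InferenceModel is ready to serve
// traffic and 0 otherwise, labeled by the owning pool and the model name.
// Hypothetical sketch; the real EPP may register against its own registry.
var inferenceModelReady = prometheus.NewGaugeVec(
	prometheus.GaugeOpts{
		Subsystem: "inference_model",
		Name:      "ready",
		Help:      "Whether an inference model is ready to serve (1) or not (0).",
	},
	[]string{"pool_name", "model_name"},
)

func init() {
	// Register with the default registry so the gauge shows up on scrape.
	prometheus.MustRegister(inferenceModelReady)
}

// SetInferenceModelReady records readiness for a (pool, model) pair.
func SetInferenceModelReady(poolName, modelName string, ready bool) {
	value := 0.0
	if ready {
		value = 1.0
	}
	inferenceModelReady.WithLabelValues(poolName, modelName).Set(value)
}
```

When scraped, the gauge would appear on the EPP's metrics endpoint as a line like inference_model_ready{model_name="...",pool_name="..."} 1, with the label values shown here purely illustrative.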