
Export :tensorflow:serving:... metrics by signature names #1959

@jeongukjae

Description


Feature Request

If this is a feature request, please fill out the following form in full:

Describe the problem the feature is intended to solve

Currently, TensorFlow Serving exports metrics per model, as shown below.

...
:tensorflow:serving:request_count{model_name="test_model",status="OK"} 6
...
:tensorflow:serving:request_latency_bucket{model_name="test_model",API="predict",entrypoint="REST",le="10"} 0
:tensorflow:serving:request_latency_bucket{model_name="test_model",API="predict",entrypoint="REST",le="18"} 0
...
:tensorflow:serving:runtime_latency_bucket{model_name="test_model",API="Predict",runtime="TF1",le="10"} 0
:tensorflow:serving:runtime_latency_bucket{model_name="test_model",API="Predict",runtime="TF1",le="18"} 0
:tensorflow:serving:runtime_latency_bucket{model_name="test_model",API="Predict",runtime="TF1",le="32.4"} 0
...

We cannot collect metrics per signature, even when the latencies of different signatures differ significantly.

Related code:

Describe the solution

It would be better if runtime latency and request latency were recorded with signature names.
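
For illustration, here is a rough sketch of what the exported series could look like with a per-signature label. The label name signature_name and the signature values serving_default and embed are assumptions here; the actual names would depend on the implementation.

...
:tensorflow:serving:request_latency_bucket{model_name="test_model",API="predict",entrypoint="REST",signature_name="serving_default",le="10"} 0
:tensorflow:serving:request_latency_bucket{model_name="test_model",API="predict",entrypoint="REST",signature_name="embed",le="10"} 0
...
:tensorflow:serving:runtime_latency_bucket{model_name="test_model",API="Predict",runtime="TF1",signature_name="serving_default",le="10"} 0
:tensorflow:serving:runtime_latency_bucket{model_name="test_model",API="Predict",runtime="TF1",signature_name="embed",le="10"} 0
...

With a label like this, the latency distribution of each signature can be queried and alerted on separately instead of being aggregated per model.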

Describe alternatives you've considered

Additional context

Activity

self-assigned this on Jan 3, 2022
self-assigned this on Jun 8, 2023
singhniraj08 commented on Jun 8, 2023

@jeongukjae,

Are you still looking for a resolution? We are planning to prioritize issues based on community interest. Please let us know if this issue still persists with the latest TF Serving 1.12.1 release so that we can work on fixing it. Thank you for your contributions.

linked a pull request that will close this issue on Jun 16, 2023
jeongukjae (Author) commented on Jun 16, 2023

@singhniraj08 I opened PR #2152 for this issue.
I think those patches are enough to cover it. Could you review it?

singhniraj08 commented on Jun 16, 2023

@jeongukjae, Thank you for your contributions. We will discuss this internally and update this thread. Thanks

jeongukjae (Author) commented on Jun 22, 2023

@singhniraj08 Thank you.

Also, I opened another issue similar to this one: #2157.
Could you discuss that issue internally as well?

Participants: @jeongukjae, @bmzhao, @SiqiaoWu1993, @nniuzft, @sanatmpa1
