
tracing: Add spans for outgoing calls made by the memcached client #3148

Conversation

@igorwwwwwwwwwwwwwwwwwwww (Contributor) commented Sep 10, 2020:

  • I added a CHANGELOG entry for this change.
  • Change is not relevant to the end user.

Changes

This adds tracing instrumentation to the memcached client, which should help diagnose cache issues, whether the problem is latency or cache effectiveness.
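
To illustrate the pattern (a sketch only; the helper and span name below are assumptions for illustration, the actual changes are in the diff further down), each outgoing memcached call is wrapped in a span via the Thanos tracing helpers:

```go
package cacheutil

import (
	"context"

	"github.com/bradfitz/gomemcache/memcache"
	"github.com/thanos-io/thanos/pkg/tracing"
)

// getMultiTraced is a sketch only: it wraps the underlying gomemcache call in
// a span so cache latency shows up in traces. The helper and the span name
// "memcached_client_get_multi" are illustrative assumptions, not this PR's
// exact code.
func getMultiTraced(ctx context.Context, client *memcache.Client, keys []string) (map[string]*memcache.Item, error) {
	span, _ := tracing.StartSpan(ctx, "memcached_client_get_multi")
	defer span.Finish()

	return client.GetMulti(keys)
}
```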

Verification

Still need to figure out a good way to test this.

Signed-off-by: Igor Wiedler <iwiedler@gitlab.com>
@igorwwwwwwwwwwwwwwwwwwww (Contributor Author) commented:

Note that the upstream memcached client does not support OpenTracing; see bradfitz/gomemcache#84.

@@ -340,11 +345,16 @@ func (c *memcachedClient) SetAsync(_ context.Context, key string, value []byte,
start := time.Now()
c.operations.WithLabelValues(opSet).Inc()

span, _ := tracing.StartSpan(ctx, "memcached_client_set")
@igorwwwwwwwwwwwwwwwwwwww (Contributor Author) commented on this line:
Note that by its async nature, this span will appear later in the trace, after its parent has already finished. AFAIK most tracing tools should handle this case ok.
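
A minimal sketch of why this happens (names are illustrative, not this PR's exact code): the span is started synchronously, but Finish() only runs inside the asynchronously executed operation, so the span is reported after its parent has already ended.

```go
package cacheutil

import (
	"context"

	"github.com/bradfitz/gomemcache/memcache"
	"github.com/thanos-io/thanos/pkg/tracing"
)

// setAsyncTraced is a sketch only. The span is started while the caller is
// still active, but Finish() runs in the background goroutine, so the span
// shows up in the trace after its parent has already completed.
func setAsyncTraced(ctx context.Context, client *memcache.Client, item *memcache.Item) {
	span, _ := tracing.StartSpan(ctx, "memcached_client_set")

	go func() {
		defer span.Finish()
		// Errors would be logged or recorded on the span here; omitted in this sketch.
		_ = client.Set(item)
	}()
}
```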

@SuperQ (Contributor) left a comment:

Nice

@kakkoyun (Member) left a comment:

This looks solid. Thanks a lot for contributing.

I believe that in order to truly track performance issues with the cache, we need to make sure the encapsulating actions also have the necessary spans. AFAIK we have spans for the caching bucket, but I'm not sure whether spans are already in place for index caching. It'd be great if you could make sure we don't miss any spans for either the index cache or the caching bucket.

WDYT?

pkg/cacheutil/memcached_client.go (outdated review thread, resolved)
@bwplotka (Member) commented Sep 11, 2020:

I recall the traces being unreadable, with thousands of requests against Memcached for a single request, but we can enable this if needed - maybe it's worth enabling only when some flag is specified? Otherwise I agree with @kakkoyun: let's try to unify this for all kinds of caching requests 🤔

@pracucci (Contributor) commented:

> I recall the traces being unreadable, with thousands of requests against Memcached for a single request

Yes, this was a deal breaker for us (Cortex). Maybe we could add an option to enable these spans, so that we can keep them disabled in Cortex?

@bwplotka (Member) left a comment:

Yes, if we want this, it has to be behind a flag / option (:

stale bot commented Nov 21, 2020:

Is this still relevant? If so, what is blocking it? Is there anything you can do to help move it forward?

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs.

stale bot added the stale label on Nov 21, 2020.
@igorwwwwwwwwwwwwwwwwwwww (Contributor Author) commented:

@bwplotka I've brought this up to date with master and made the tracing configurable and opt-in; please take another look :)
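
Roughly, the opt-in gating can be sketched like this (the enableTracing option name and the helper are assumptions for illustration, not necessarily what this PR uses):

```go
package cacheutil

import (
	"context"

	opentracing "github.com/opentracing/opentracing-go"
	"github.com/thanos-io/thanos/pkg/tracing"
)

// startSpanIfEnabled is a sketch of the opt-in behaviour: only emit spans when
// the (hypothetical) enableTracing option is set; otherwise hand back a no-op
// span so call sites can stay unconditional.
func startSpanIfEnabled(ctx context.Context, enableTracing bool, name string) (opentracing.Span, context.Context) {
	if !enableTracing {
		return opentracing.NoopTracer{}.StartSpan(name), ctx
	}
	return tracing.StartSpan(ctx, name)
}
```

Call sites could then unconditionally run startSpanIfEnabled followed by defer span.Finish(), whether or not tracing is enabled.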

Signed-off-by: Igor Wiedler <iwiedler@gitlab.com>
stale bot commented Jan 25, 2021:

Is this still relevant? If so, what is blocking it? Is there anything you can do to help move it forward?

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs.

stale bot added the stale label on Jan 25, 2021.
stale bot closed this on Feb 2, 2021.