Add grpc client and server plugin #547
This PR uses a lot of constructs that are incompatible with Node 4; they should be refactored to support it. You can use https://node.green/ if you're not sure whether a feature is supported.
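To illustrate the kind of refactor being asked for: destructured and default parameters are typical ES2015+ features that Node 4 cannot parse, while `var` plus manual defaulting works everywhere. The function name and shape below are illustrative, not taken from the PR.

```javascript
// Node >= 6 style (fails to parse on Node 4 — destructuring and
// default parameters are not supported there):
//
//   function startSpan ({ method, metadata } = {}) { ... }
//
// Node 4 compatible refactor of the same signature:
function startSpan (options) {
  options = options || {}
  var method = options.method
  var metadata = options.metadata
  return { method: method, metadata: metadata }
}
```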
Thanks for the PR! I did a first pass of review, but in general the instrumentation itself looks good. I scheduled this for 0.12.0 since we don't plan to release it before at least a couple of weeks, so we should be able to include this feature in the release.
We have tests that run nightly, so we should be able to find anything that breaks pretty quickly.
Just noticed that the tests are not running in CI; they should be added to .circleci/config.yml.
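For reference, a hedged sketch of what wiring the tests into CircleCI might look like, assuming the existing config defines one job per integration and a workflow that lists them. The job name, Docker image, and script name here are illustrative, not taken from the repo's actual config:

```yaml
jobs:
  node-grpc:
    docker:
      - image: node:8        # illustrative image
    steps:
      - checkout
      - run: yarn install
      - run: yarn test:grpc  # illustrative script name

workflows:
  version: 2
  build:
    jobs:
      - node-grpc
```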
Thanks for the in-depth review, @rochdev, and for your willingness to accept the PR! :) I'll update the branch with your comments addressed shortly.
Missed that, thank you. Added.
(force-pushed from bffe55d to 273bc20)
Ignore the
(force-pushed from fba8a62 to 8ae25e1)
Found it.
Your implementation is a lot cleaner than
(force-pushed from d1c4388 to 872bf59)
@rochdev sorry for taking so long to get back to this. Had a busy week. I've updated the PR with support for cancellation and bidi streams (as well as tests, of course). I believe all actionable comments have been addressed. Mind doing another review?
Overall looks good. I'll do one last pass when everything is resolved.
@nunofgs One thing I'm wondering about is the context propagation. We are currently working on an HTTP/2 integration which will handle context propagation, but right now you are doing it directly in gRPC. What do you think the expected behavior should be? Should we just drop propagation from gRPC at that point, or is there still value in having it there?
(force-pushed from 872bf59 to e134834)
🤔 gRPC uses HTTP/2 as its transport protocol, but it supports others, and it's also possible to write your own. It might be useful to keep the gRPC context propagation for those cases.
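The argument above is that propagation living at the gRPC layer is transport-agnostic: it reads and writes call metadata, regardless of what carries it. A minimal sketch of that idea, with the `Metadata` object modeled as a plain `Map` for illustration (the real `grpc.Metadata` API differs; the header names follow Datadog's `x-datadog-*` convention):

```javascript
// Inject the active span's context into outgoing call metadata.
function inject (spanContext, metadata) {
  metadata.set('x-datadog-trace-id', String(spanContext.traceId))
  metadata.set('x-datadog-parent-id', String(spanContext.spanId))
}

// Extract a parent context from incoming call metadata, if present.
function extract (metadata) {
  if (!metadata.has('x-datadog-trace-id')) return null
  return {
    traceId: metadata.get('x-datadog-trace-id'),
    parentId: metadata.get('x-datadog-parent-id')
  }
}
```

Because this only touches metadata, it keeps working even when gRPC runs over a transport the HTTP/2 integration never sees.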
(force-pushed from 83bd64f to fd1c54e)
Let's also go through the plugin checklist and make sure we got everything.
Plugin Checklist
Let's add some tests to ensure that we aren't modifying any metadata (except for the context propagation).
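A sketch of the kind of test being suggested: injection should add only the propagation keys and must not mutate the user's metadata. `inject` here is a hypothetical stand-in for the plugin's injection step, not the PR's actual code:

```javascript
// Hypothetical injection step: copy the user's metadata, then add
// only the propagation key — never mutate the input in place.
function inject (metadata) {
  const out = Object.assign({}, metadata)
  out['x-datadog-trace-id'] = '123'
  return out
}

// A test would then assert that the original object is untouched and
// that user-supplied headers survive alongside the propagation key.
```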
I think we also need to add tests to check if the listener runs in the parent context for stream requests.
👍
Hi DataDog team!
This PR adds automatic instrumentation support for grpc, both client and server. The `client` plugin is implemented using client interceptors and should be pretty resilient to upstream changes. The `server` plugin, however, has no such support, so I took a stab at implementing it, finding the best injection point I could. It is fully functional, but I cannot guarantee that it'll be as resilient as grpc-node evolves. Let me know what you think!
Note: `require-dir` was removed since it doesn't support nested folders. There was already a file (`src/plugins/index.js`) with all of the supported plugins, which I believe is even more appropriate to use.
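For readers unfamiliar with the pattern the client plugin relies on, here is a minimal sketch of the interceptor idea: a wrapper that gets to observe or modify call metadata before delegating to the next layer. This is a simplified illustration, not the real grpc-node interceptor API (which is callback-based and richer); the header name and function names are made up for the example.

```javascript
// Wrap a call function so an interceptor runs before it.
function withInterceptor (makeCall, interceptor) {
  return function intercepted (metadata, message) {
    return interceptor(metadata, message, makeCall)
  }
}

// Hypothetical tracing interceptor: tag the outgoing metadata,
// then delegate to the next layer unchanged.
function tracingInterceptor (metadata, message, next) {
  metadata['x-example-traced'] = 'true'
  return next(metadata, message)
}
```

Because interceptors are a supported extension point, instrumentation built on them survives internal refactors of the client; the server side has no equivalent hook, which is why that half of the PR is more fragile.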