
Add eio implementation #21

Merged · 26 commits merged into main from grpc-eio on Jul 10, 2023

Conversation

@quernd (Collaborator) commented Jan 7, 2023

Both unary and streaming RPCs are implemented, for both client and server. The API mostly follows the existing one, but we can take this opportunity to overhaul it.

@quernd force-pushed the grpc-eio branch 2 times, most recently from d4c75c6 to 38d8fd8 on January 7, 2023 09:25
@wokalski (Contributor):
We'll iterate on it for our internal project and merge when ready.

@tmcgilchrist (Collaborator) left a comment:

These changes are now required to run the greeter example code.

examples/greeter-client-eio/dune (outdated, resolved)
examples/greeter-server-eio/dune (outdated, resolved)
examples/greeter-client-eio/greeter_client_eio.ml (outdated, resolved)
@tmcgilchrist (Collaborator):
The last two commits on https://github.com/tmcgilchrist/ocaml-grpc/tree/grpc-eio provide the fixes and port the route guide example to EIO.

@quernd force-pushed the grpc-eio branch 3 times, most recently from 702d4b5 to c56d70e on May 23, 2023 08:59
dune-project (outdated, resolved)
quernd and others added 19 commits May 24, 2023 08:05
This fixes a race condition: the trailer was ignored because the read body was not fully processed, so the status promise was not resolved by the trailer handler in time.
We don't need a separate request queue if we just fork
We might wait forever, causing a deadlock. This race condition may not occur in the real world, because we have already resolved the status promise, meaning a well-behaved client should terminate the request; but we can't rely on that. It also occurs with mocked request streams in testing, and overall there is no reason to exhaust the request stream here.
Co-authored-by: Tim McGilchrist <timmcgil@gmail.com>
@quernd (Collaborator, Author) commented May 24, 2023

Thanks @tmcgilchrist. I think the rebasing went well this time. CI is now all green except for the experimental lint workflows. I converted the greeter EIO examples to ocaml-protoc-plugin because the shared greeter library changed. Since you ported the routeguide tutorial to EIO, there are probably a few changes needed in the instructions. I only auto-generated the MDX code blocks. It's great that we already have a tutorial that uses EIO!

@quernd marked this pull request as ready for review May 24, 2023 06:56
@quernd requested a review from tmcgilchrist May 24, 2023 06:57
dune-project (resolved)
Seq.iter
(fun encoder ->
let payload = Grpc.Message.make encoder in
H2.Body.Writer.write_string body payload)
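For context on the snippet above: each `payload` produced by `Grpc.Message.make` is a gRPC length-prefixed message (a 1-byte compressed flag, a 4-byte big-endian length, then the message bytes). A stdlib-only sketch of that framing, for illustration only — the library's real encoding lives in `Grpc.Message.make`:

```ocaml
(* Stdlib-only sketch of gRPC's length-prefixed message framing,
   per the gRPC-over-HTTP/2 wire protocol. Illustration only. *)
let frame ?(compressed = false) (msg : string) : string =
  let len = String.length msg in
  let buf = Buffer.create (5 + len) in
  (* 1-byte compressed flag *)
  Buffer.add_char buf (if compressed then '\001' else '\000');
  (* 4-byte big-endian message length *)
  Buffer.add_char buf (Char.chr ((len lsr 24) land 0xff));
  Buffer.add_char buf (Char.chr ((len lsr 16) land 0xff));
  Buffer.add_char buf (Char.chr ((len lsr 8) land 0xff));
  Buffer.add_char buf (Char.chr (len land 0xff));
  Buffer.add_string buf msg;
  Buffer.contents buf
```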
Collaborator:

Should this be calling flush after each iteration?
I'm undecided about whether it should. The way it is now will batch up the writes and send them in a smaller number of network packets. Flushing after each write_string would send them immediately to the server, which might be preferable.

Collaborator (Author):

That's a good question. It could be preferable when aiming for low latency. I'm thinking maybe it should just be opt-in.

On this note, we ran some load tests but they were inconclusive because we stumbled upon a bug in Eio: ocaml-multicore/eio#572
It has been fixed, so I hope we'll have better numbers soon.

We're also experimenting with a different interface for the reader. Lazy sequences might be a bit too opinionated, and they introduce a bit of overhead, so the alternative interface is similar to H2, with on_msg/on_eof callbacks and a conversion to Seq.t as a convenience helper: https://github.com/dialohq/ocaml-grpc/tree/grpc-eio-noseq/lib/grpc-eio
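The alternative reader interface mentioned here can be sketched with the stdlib alone: a push-style `on_msg`/`on_eof` source adapted into a `Seq.t`. In this sketch the producer (`subscribe`, a hypothetical name) runs to completion before the sequence is read; the real Eio version would park the consumer on a stream rather than buffer eagerly:

```ocaml
(* Stdlib-only sketch: adapt an on_msg/on_eof callback reader to Seq.t.
   Here the producer finishes before we read; a real Eio implementation
   would block the consumer on an Eio stream instead of buffering. *)
let seq_of_callbacks
    (subscribe : on_msg:('a -> unit) -> on_eof:(unit -> unit) -> unit) :
    'a Seq.t =
  let buf = Queue.create () in
  subscribe ~on_msg:(fun msg -> Queue.add msg buf) ~on_eof:(fun () -> ());
  Queue.to_seq buf
```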

Collaborator:

Interesting experiment; it will be good to see some performance numbers from it. Keeping the Seq.t as a simple interface is a good idea for introducing the library and for less performance-sensitive applications.

Opt-in sounds good, and I'm happy to leave that till later. There are a few other configuration pieces, like auth and compression, that will need a way to configure them too.
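One possible shape for the opt-in flushing discussed above, sketched against a stand-in writer record so it runs with the stdlib alone. The `flush_each` flag and the `writer` type are hypothetical; the real code would call `H2.Body.Writer.write_string` and the h2 writer's flush:

```ocaml
(* Hypothetical opt-in per-message flushing. [writer] is a stdlib-only
   stand-in for H2.Body.Writer.t so the sketch is self-contained. *)
type writer = { write : string -> unit; flush : unit -> unit }

let send_all ?(flush_each = false) (w : writer) (msgs : string Seq.t) =
  Seq.iter
    (fun payload ->
      w.write payload;
      (* Flushing per message trades batching for latency. *)
      if flush_each then w.flush ())
    msgs;
  if not flush_each then w.flush () (* one final flush for the batch *)
```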

else respond_with `Not_acceptable)
| Some _ ->
(* TODO: not sure if there is a specific way to handle this in grpc *)
respond_with `Bad_request
Collaborator:

This documents the responses required for unsupported encodings: https://github.com/grpc/grpc/blob/master/doc/compression.md#test-cases
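Per the linked compression spec, a server receiving a message with an unknown `grpc-encoding` should fail the RPC with status UNIMPLEMENTED (code 12) and advertise the codings it does support in `grpc-accept-encoding`. A sketch of such trailers — the header names are from the gRPC spec, but how they are attached to a response is library-specific:

```ocaml
(* Trailers for an unsupported grpc-encoding, per the gRPC compression
   spec: UNIMPLEMENTED (12) plus the supported codings, e.g. "identity,gzip". *)
let unsupported_encoding_trailers ~(supported : string list) =
  [ ("grpc-status", "12") (* UNIMPLEMENTED *);
    ("grpc-message", "unsupported message encoding");
    ("grpc-accept-encoding", String.concat "," supported) ]
```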

Collaborator (Author):

Thanks! I'll take a look. By the way, with the Eio implementation and your Async server implementation, this code is now duplicated and could be factored out.

Collaborator:

I'll do another pass over the tutorial code after this is merged.

dune-project (outdated, resolved)
examples/routeguide-tutorial.md (outdated, resolved)
examples/greeter-client-eio/greeter_client_eio.ml (outdated, resolved)
examples/routeguide/src/client.ml (outdated, resolved)
print_endline "Try running:";
print_endline "";
print_endline
{| dune exec -- examples/greeter-client-lwt/greeter_client_lwt.exe <your_name> |};
Collaborator:

Suggested change:
- {| dune exec -- examples/greeter-client-lwt/greeter_client_lwt.exe <your_name> |};
+ {| dune exec -- examples/greeter-client-eio/greeter_client_eio.exe <your_name> |};

@tmcgilchrist (Collaborator):
Merging this to unblock other PRs.

@tmcgilchrist merged commit 4fa8b53 into main Jul 10, 2023
@tmcgilchrist deleted the grpc-eio branch July 10, 2023 23:49