
Add condition variables and concurrent-mutexes #210

Closed
wants to merge 6 commits

Conversation

TheLortex (Contributor) commented:

Hi, when porting the Lwt-based Mirage networking stack to Eio I encountered extensive use of the Lwt_condition and Lwt_mutex modules. This PR adds Eio equivalents: Condition and Eio_mutex (the latter is prefixed to avoid clashing with the existing Stdlib.Mutex).

Both features have straightforward implementations built on existing modules (Semaphore / Waiters), but I think having them can ease the transition from Lwt to Eio.

This is obviously open for discussion.
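For context, here is a hedged sketch of the kind of Lwt_condition / Lwt_mutex pattern the networking stack relies on; this is not code from the PR, and the queue, producer and consumer names are purely illustrative:

```ocaml
(* Illustrative only: a typical Lwt producer/consumer using the two modules
   that this PR proposes Eio equivalents for. *)
let buffer : int Queue.t = Queue.create ()
let cond : unit Lwt_condition.t = Lwt_condition.create ()
let lock = Lwt_mutex.create ()

(* Producer: push an item and wake one waiter. *)
let push item =
  Queue.push item buffer;
  Lwt_condition.signal cond ()

(* Consumer: hold the mutex, wait until the queue is non-empty, then pop.
   [Lwt_condition.wait ~mutex] releases the mutex while blocked and
   re-acquires it before returning, so the emptiness check is re-done
   under the lock. *)
let pop () =
  Lwt_mutex.with_lock lock (fun () ->
      let rec loop () =
        if Queue.is_empty buffer then
          Lwt.bind (Lwt_condition.wait ~mutex:lock cond) loop
        else Lwt.return (Queue.pop buffer)
      in
      loop ())
```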

talex5 (Collaborator) left a comment:

Thanks, I agree we need something like these. But we should be careful not simply to copy the Lwt API, as we know it has problems!

I find Lwt_condition tends to be misused. I've been wondering about splitting it into two separate concepts (a rough sketch of possible interfaces follows the list):

  • Events (no payload).
  • Variables (store a value; notify everyone if it changes; you can read the current value).
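Hedged sketch of what those two concepts might look like; the module names and signatures below are illustrative only, not part of this PR or an agreed Eio design:

```ocaml
(* Hypothetical interfaces for the proposed split. *)

(* An event carries no payload: fibers block until someone fires it. *)
module type EVENT = sig
  type t
  val create : unit -> t
  val await : t -> unit   (* block until the next [fire] *)
  val fire : t -> unit    (* wake every fiber currently awaiting *)
end

(* A variable stores a value: readers can inspect the current value or block
   until it changes; every update notifies all waiters. *)
module type VARIABLE = sig
  type 'a t
  val create : 'a -> 'a t
  val get : 'a t -> 'a              (* read the current value *)
  val set : 'a t -> 'a -> unit      (* update and broadcast *)
  val await_change : 'a t -> 'a     (* block until the next [set] *)
end
```

The point of the split is that an event has no state to race on, while a variable makes "read the current value" and "wait for a change" explicit, separate operations.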

lib_eio/condition.ml (hunk under review):

```ocaml
    id = Ctf.mint_id ()}

let wait ?mutex t =
  Option.iter Eio_mutex.unlock mutex;
```
talex5 (Collaborator) commented on this hunk:
Do you have examples of the mutex argument being used in Lwt code? Unless it's very common, it might be clearer to let the caller do this.

TheLortex (Contributor, author) replied:

I agree that it's not common, and it took me some time to understand what it does.
One example is in mirage-vnetif: TheLortex/mirage-vnetif@9b38e59#diff-f1be8e184de1d6478035d749a7d4e1099f4accac53644d09a68f12d6c90b0ec3R97

talex5 (Collaborator) replied:

Hmm, that use of a mutex looks unnecessary. Incrementing, decrementing and checking an int field are all atomic anyway with Lwt.
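For illustration only (a much-simplified version of that pattern, not mirage-vnetif's actual code): under Lwt's cooperative scheduler no other promise can run between plain OCaml operations, so a counter like this can be updated without any mutex.

```ocaml
(* Illustrative sketch: tracking in-flight sends with a plain int field.
   There is no yield point between the read and the write, so the updates
   cannot be interleaved under Lwt. *)
type t = {
  mutable in_flight : int;          (* sends currently outstanding *)
  cond : unit Lwt_condition.t;      (* broadcast when [in_flight] drops *)
}

(* [write] is passed in so the sketch stands alone. *)
let send t write packet =
  t.in_flight <- t.in_flight + 1;
  Lwt.finalize
    (fun () -> write packet)
    (fun () ->
       t.in_flight <- t.in_flight - 1;
       Lwt_condition.broadcast t.cond ();
       Lwt.return_unit)

(* Wait until every in-flight send has completed. *)
let rec drain t =
  if t.in_flight = 0 then Lwt.return_unit
  else Lwt.bind (Lwt_condition.wait t.cond) (fun () -> drain t)
```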

talex5 commented Jun 13, 2022

I think the main question here is whether to make conditions safe to use across domains or not.

A condition that's only used in a single domain can have a simpler API, because when await returns you know the condition will remain true until you block.

However, if it's used across domains then the API needs to take a mutex because the caller needs to prevent other domains from changing the variable until it has finished using it.

So maybe we need two different modules here (e.g. Condition_single_domain and Condition_multidomain or something).
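For concreteness, here is the classic cross-domain protocol the second module would need, sketched with Stdlib.Mutex and Stdlib.Condition rather than any Eio API (which is still to be designed):

```ocaml
(* The standard condition-variable protocol for use across domains. *)
let ready = ref false
let mutex = Mutex.create ()
let cond = Condition.create ()

(* Another domain sets the flag and wakes the waiters. *)
let signal_ready () =
  Mutex.lock mutex;
  ready := true;
  Condition.broadcast cond;
  Mutex.unlock mutex

(* Waiter: the mutex guarantees [!ready] cannot change between the check and
   the wait; [Condition.wait] releases and re-acquires it atomically. *)
let wait_ready () =
  Mutex.lock mutex;
  while not !ready do
    Condition.wait cond mutex
  done;
  Mutex.unlock mutex
```

In the single-domain case the whole mutex dance disappears, which is the argument for keeping the two modules separate.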

talex5 mentioned this pull request on Jun 13, 2022
talex5 commented Jun 27, 2022

Needs rebasing now that #223 is merged.

talex5 mentioned this pull request on Aug 11, 2022
talex5 commented Aug 11, 2022

(fixed in #277)

talex5 closed this on Aug 11, 2022