HazelcastAsyncMultiMap: Near-cache is not used under constant load #73
mhstnsc changed the title from "HazelcastAsyncMultiMap implementation of near-cache does not converge under load" to "HazelcastAsyncMultiMap: Near-cache is not used under constant load" on May 8, 2017
tsegismont added a commit to tsegismont/vert.x that referenced this issue on Jul 26, 2017:

The user API for context data is unchanged. However, module implementers will be able to use:
- the ConcurrentMap API
- all sorts of keys

The motivation for this change is fixing vert-x3/vertx-hazelcast#73

Signed-off-by: Thomas Segismont <tsegismont@gmail.com>
tsegismont added a commit to tsegismont/vertx-hazelcast that referenced this issue on Jul 26, 2017:

… constant load

The order has to be guaranteed:
- at the context level for event loops and workers
- at the thread level for multithreaded contexts

Synchronizing on a context-bound queue for get requests, we:
- comply with the ordering rules
- do not hurt performance on event loops (biased locking)

Depends on eclipse-vertx/vert.x#2071
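A minimal sketch of the ordering idea the commit message describes: get requests synchronize on a queue bound to the calling context, so handlers run in submission order, and on an event loop the lock is uncontended (cheap with biased locking). The names here (`ContextBoundGets`, `pending`, `lookup`) are illustrative assumptions, not the actual vertx-hazelcast implementation.

```java
import java.util.ArrayDeque;
import java.util.Queue;
import java.util.function.Consumer;

public class ContextBoundGets {
    // Queue bound to one context; all gets from that context synchronize on it.
    private final Queue<Runnable> pending = new ArrayDeque<>();

    void get(String key, Consumer<String> handler) {
        synchronized (pending) { // uncontended on a single event loop
            pending.add(() -> handler.accept(lookup(key)));
            drain();
        }
    }

    private void drain() {
        Runnable task;
        while ((task = pending.poll()) != null) {
            task.run(); // tasks run in the order they were enqueued
        }
    }

    private String lookup(String key) {
        return "value-of-" + key; // placeholder for the cache/cluster lookup
    }

    public static void main(String[] args) {
        ContextBoundGets ctx = new ContextBoundGets();
        StringBuilder order = new StringBuilder();
        for (int i = 0; i < 3; i++) {
            final int n = i;
            ctx.get("k" + n, v -> order.append(n));
        }
        System.out.println(order); // handlers observed in submission order
    }
}
```

The point of the design is that ordering is enforced per context (event loop, worker, or multithreaded context at the thread level) rather than globally, so no cross-context coordination is needed.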
tsegismont added a commit to vert-x3/vertx-infinispan that referenced this issue on Aug 3, 2017
This results in poor performance: a Hazelcast address lookup on every event-bus message drops the rate by 10x.
The problem seems to come from this commit; the algorithm does not converge:
7199d39
Under constant concurrent access to HazelcastAsyncMultiMap, the variable inprogressCount
never reaches 0, so the cache is never used even though it contains data.
It takes a consumer unregister or register to bring the value back to 1, and then concurrent accesses keep it high again.
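The failure mode can be illustrated with a small deterministic simulation. This is a sketch of the behavior described above, not the actual code from commit 7199d39; the counter name and cache-check logic are assumptions. The cache is consulted only when no other get is in flight, so under constant load (each get arriving before the previous one completes) the counter never reads 0 and every request falls through to a cluster lookup.

```java
public class NonConvergingCounter {
    public static void main(String[] args) {
        int inProgressCount = 1; // one get already in flight when load starts
        int cacheHits = 0, remoteFetches = 0;

        // Under constant load, each new get arrives while the previous one
        // is still in flight, so the counter is never observed at 0 and the
        // near-cache is never consulted.
        for (int i = 0; i < 1000; i++) {
            if (inProgressCount == 0) {
                cacheHits++;        // near-cache would be used here
            } else {
                remoteFetches++;    // falls back to a cluster lookup instead
                inProgressCount++;  // this get is now in flight too
            }
            inProgressCount--;      // the older get completes
        }

        System.out.println("cacheHits=" + cacheHits
                + " remoteFetches=" + remoteFetches);
    }
}
```

The counter only drains to 0 during a lull (or, per the report, after a consumer register/unregister resets it), which is why the near-cache is effectively dead under sustained traffic.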