
Rust message queues #744

Draft · wants to merge 8 commits into main
Conversation

Half-Shot (Contributor)

Arguably the biggest blocker to porting various components to Rust is that hookshot relies on a central message queue for passing messages between components. While building a Redis-compatible one would be trivial, building an in-process one is trickier. This PR adds a nearly functionally complete Rust version of our "LocalMQ" class.

Performance is... patchy. Rust does slightly better when the message contains no data, but is on the whole worse than JS when the data payload is populated. This is likely due to crossing the N-API barrier and having to clone the content of the message each time it is used.

I'm sure more experienced Rust folks could tidy up the implementation, but the performance issues only really show at the 1000 Hz and above range, and hookshot will certainly bottleneck elsewhere before we ever reach that kind of throughput. For now, with some cleanup, I think this would let us start porting the RSS feed component to Rust.
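For readers who haven't seen the PR diff: the shape of an in-process, topic-based queue like the one described above can be sketched in pure Rust with standard-library channels. This is a minimal illustration, not the PR's actual code; the `LocalMq` name, `subscribe`/`push` methods, and `String` payloads are all assumptions made for the example (the real implementation also has to cross the N-API boundary, which this sketch deliberately leaves out).

```rust
use std::collections::HashMap;
use std::sync::mpsc::{channel, Receiver, Sender};

/// Hypothetical in-process message queue, loosely modelled on the
/// idea of hookshot's "LocalMQ". Names and types are illustrative.
struct LocalMq {
    subscribers: HashMap<String, Vec<Sender<String>>>,
}

impl LocalMq {
    fn new() -> Self {
        LocalMq { subscribers: HashMap::new() }
    }

    /// Register interest in an event name; returns the receiving end.
    fn subscribe(&mut self, event: &str) -> Receiver<String> {
        let (tx, rx) = channel();
        self.subscribers
            .entry(event.to_string())
            .or_default()
            .push(tx);
        rx
    }

    /// Deliver a message to every subscriber of `event`.
    /// Note the per-subscriber clone of the payload, analogous to the
    /// cloning cost the PR discussion attributes to the N-API barrier.
    fn push(&self, event: &str, data: &str) {
        if let Some(subs) = self.subscribers.get(event) {
            for tx in subs {
                // A closed receiver just drops the message.
                let _ = tx.send(data.to_string());
            }
        }
    }
}

fn main() {
    let mut mq = LocalMq::new();
    let rx = mq.subscribe("feed.entry");
    mq.push("feed.entry", "hello");
    println!("{}", rx.recv().unwrap());
}
```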

tadzik (Contributor) commented May 8, 2023

I don't think performance impact should be much of a concern.

The randomBytes(512).toJSON() used in the tests stringifies to a very ugly JS object (1844 characters of JSON), and is unlikely to be the kind of thing we'll pass through in a real-world scenario. And if it ever becomes a concern, there's still some room for improvement.

For comparison, on my machine:

# original code, passing randomBytes(512).toJSON():
JS:testSimpleEvent: 8.565ms (omitted in later results)
RS:testSimpleEvent: 1.942ms (as above)
JS:testWithData: 3.315ms
RS:testWithData: 7.481ms

# real-life-ish scenario, passing the feed entry from feed entry tests:
JS:testWithData: 1.417ms
RS:testWithData: 0.92ms

# passing JSON.stringify(randomBytes(512).toJSON()) to ease napi load
JS:testWithData: 4.779ms
RS:testWithData: 2.709ms

With this tested over 100 iterations, even in the worst-case scenario (the one currently in tree anyway), we're paying a price of about 40μs per message, which will easily be eclipsed by all the other load associated with handling the message. I don't think we need to worry about this.
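The ~40μs figure above follows directly from the first benchmark pair: the Rust path costs 7.481 ms versus 3.315 ms for JS over 100 iterations, so the overhead per message is the difference divided by the iteration count. A tiny sketch of that arithmetic (the function name and parameters are illustrative, not from the PR):

```rust
/// Per-message overhead in microseconds, given total times in
/// milliseconds for `iterations` messages through each path.
fn per_message_overhead_us(js_ms: f64, rs_ms: f64, iterations: f64) -> f64 {
    // (ms difference / iterations) converted to microseconds.
    (rs_ms - js_ms) / iterations * 1000.0
}

fn main() {
    // Worst-case numbers from the benchmark above.
    let overhead = per_message_overhead_us(3.315, 7.481, 100.0);
    println!("{:.2} us per message", overhead); // roughly 41.66 us
}
```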

Half-Shot and others added 4 commits May 8, 2023 11:45
Saves us about 10% in testWithData() on my machine

Co-authored-by: Tadeusz Sośnierz <tadeusz@sosnierz.com>
Co-authored-by: Will Hunt <will@half-shot.uk>