Fragile caching mechanism #735


Open · mochetts opened this issue May 15, 2025 · 1 comment

mochetts commented May 15, 2025

Describe the bug
Split's caching system isn't thread- or process-safe. When a request ends, the cache remains set in the process.

This is because Split stores the cache in class instance variables.

In a setup where Puma runs multiple processes to handle requests, this is not ideal.

If one process removes or resets an experiment, the cache only gets reset in that single process, not in all the other processes still holding the cached values.
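
To illustrate, here is a minimal sketch of the failure mode, assuming a cache kept in a class instance variable (the names are hypothetical, not Split's actual internals):

```ruby
# Hypothetical stand-in for a cache stored in a class instance variable.
class ExperimentCache
  class << self
    def fetch(key)
      @cache ||= {}
      @cache[key] ||= yield
    end

    def clear
      @cache = {}
    end
  end
end

# Request 1 in a given Puma worker:
ExperimentCache.fetch("my_experiment") { "control" } # => "control"

# Request 2 in the same worker, after another worker reset the experiment
# and cleared only its own cache: the stale value is still served here,
# because @cache is per-process and survives across requests.
ExperimentCache.fetch("my_experiment") { "new_winner" } # => "control"
```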

To Reproduce
Using puma, Split with cache enabled, and 2 browser windows:

  1. Run two requests simultaneously in two windows that query the same experiment.
  2. Both should cache the same variant.
  3. Now set a different variant as the winner in the Split console.
  4. Reload both windows
  5. You should see one window still running the non-winner variant.

Suggestions

  • Add an option to make the Split cache request-scoped (similar to Rails' CurrentAttributes), so the cache only lives for the duration of a request.
  • Add a way to use CurrentAttributes for the caching mechanism instead of the current class-instance-variable approach (see the sketch below).
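
A rough sketch of what a request-scoped cache built on ActiveSupport::CurrentAttributes could look like (SplitCurrentCache and load_experiment_from_redis are hypothetical names, not part of Split):

```ruby
require "active_support"
require "active_support/current_attributes"

# Hypothetical request-scoped cache; attributes are reset automatically
# between requests by the Rails executor, so values never leak across requests.
class SplitCurrentCache < ActiveSupport::CurrentAttributes
  attribute :experiments

  # Memoize a value under `key` for the current request only.
  def fetch(key)
    self.experiments ||= {}
    experiments[key] ||= yield
  end
end

# Usage inside a request:
winner = SplitCurrentCache.fetch("my_experiment") do
  load_experiment_from_redis("my_experiment") # stand-in for the real Redis lookup
end
```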
@andrehjr (Member) commented

Thanks for bringing this up! Yes, the local caching mechanism should only live over a single request lifecycle. I think we can improve this by hooking the cache onto the Split::Helper instead.

The ideal solution would be to address what's causing the performance impact in the first place: the number of Redis calls made per request. But that needs a lot more work.
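
For reference, a rough sketch of what hooking the cache onto Split::Helper could look like (illustrative only; assumes Split::ExperimentCatalog.find as the underlying lookup):

```ruby
# Memoize on the helper instance instead of the class. Split::Helper is mixed
# into each controller, and Rails builds a fresh controller per request, so
# @split_cache below is naturally request-scoped.
module Split
  module Helper
    def split_cache
      @split_cache ||= {}
    end

    def cached_experiment(name)
      # Caching the catalog lookup avoids repeated Redis calls within
      # the same request, without leaking state into the next one.
      split_cache[name] ||= Split::ExperimentCatalog.find(name)
    end
  end
end
```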
