[BUG] Problems with the KeySpace if the number of keys is high. #10404
Comments
@KhDanial Is each of your clients using a different keyspace pattern?
@sundb In 4.0.14 they can use the same pattern; for example, both users subscribe to the key "SampleKey". Imagine a situation where a single client subscribes to 1e7 keys: Redis blocks and stops working properly.
@KhDanial You mean you use a client that subscribes to a large number of patterns? |
@sundb If you mean a lot of patterns (a lot of keys), yes, that's my problem. In one scenario, imagine a single user subscribing to a large number of keys via keyspace notifications. As the number of keys increases, inserting new keys becomes very slow, because after each insertion Redis iterates over all the keys (patterns), and that is not acceptable.
If you use keyspace notifications with a large number of patterns, iterating through all of them is unavoidable.
@sundb If you subscribe to these keys, Redis becomes too slow.
It's important for me to understand what happened to some of the keys. |
@KhDanial Heavy use of pattern matching is bound to consume a lot of CPU time, just as we recommend using
What does this do?

I used your pattern, but I can't write in the same format as you did :))))
|
Describe the bug
When you subscribe to a huge number of keys with keyspace notifications, Redis blocks.
When I traced the code, I found that for keyspace notifications Redis stores all subscribed patterns in a list and iterates over that list on every event to check whether any client is subscribed to the key. When you subscribe to a huge number of keys, the list grows very large and each iteration takes a long time.
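The behaviour described above can be sketched with a toy model (in Python; `PatternBroker` and its methods are hypothetical names for illustration, not Redis internals): a flat pattern list that is scanned in full on every event, so per-event cost grows linearly with the number of subscribed patterns.

```python
import fnmatch

class PatternBroker:
    """Toy model of pattern-based subscriptions (illustrative only).

    Mirrors the reported behaviour: all subscribed patterns sit in one
    flat list, and every keyspace event scans the whole list, O(P) work
    per event for P patterns.
    """

    def __init__(self):
        self.patterns = []  # flat list of glob-style patterns

    def psubscribe(self, pattern):
        self.patterns.append(pattern)

    def notify(self, channel):
        # Linear scan over ALL patterns for every single event.
        return [p for p in self.patterns if fnmatch.fnmatchcase(channel, p)]

broker = PatternBroker()
# One client registering one per-key pattern for many keys, as in this issue:
for i in range(10_000):
    broker.psubscribe(f"__keyspace@0__:key:{i}")

# Every single write event now pays a scan over all 10,000 patterns:
hits = broker.notify("__keyspace@0__:key:42")
print(hits)  # ['__keyspace@0__:key:42']
```

With 1e7 patterns, as described in the report, this scan runs on every key update, which is why writes slow down as the subscription count grows.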
To reproduce
Enable keyspace notifications, subscribe to a huge number of keys, and then update them.
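As a concrete sketch of these steps (assuming a local Redis on the default port; the key names are illustrative), the pattern-subscription path can be exercised with redis-cli:

```shell
# Enable keyspace notifications: K = keyspace events, E = keyevent events,
# A = all event classes.
redis-cli CONFIG SET notify-keyspace-events KEA

# Register one pattern per key; a real reproduction would issue a huge
# number of these from a client library rather than via redis-cli:
redis-cli PSUBSCRIBE "__keyspace@0__:key:1" "__keyspace@0__:key:2"

# In another terminal, update the keys; each write now triggers a scan
# of the subscribed-pattern list:
redis-cli SET key:1 value
```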
Expected behavior
Redis works normally and does not block.