Role Manager performance / scaling issue #893
Comments
@tangyang9464 @closetool @sagilio
@dahu33 I have a few opinions and questions
Looking forward to your reply!
@dahu33 Yes, the built-in role manager uses a cache so that `HasLink()` is fast. okta-role-manager and auth0-role-manager may have a suboptimal implementation that queries the DB on every call. They should be modified to cache the fetched roles and serve `HasLink()` queries locally from memory, so they will be fast too. Actually, okta-role-manager and auth0-role-manager are more examples than production code. If you decide to use them, we can further polish their code.
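A minimal sketch of that caching approach, assuming a hypothetical bulk loader `fetchAllLinks` and showing only the `HasLink()`/`GetRoles()` methods of the role-manager interface (this is an illustration of the caching idea, not the actual okta-role-manager or auth0-role-manager code):

```go
package rolemanager

import "sync"

// CachedRoleManager is an illustrative role manager that loads all role
// links from the external service once and answers queries from memory.
// Only direct user->role links are handled; transitive role inheritance
// is left out to keep the sketch short.
type CachedRoleManager struct {
	mu    sync.RWMutex
	roles map[string]map[string]bool // user -> set of role names
}

// NewCachedRoleManager takes a hypothetical bulk loader that returns every
// user's roles in one query (e.g. one Okta/Auth0 API call or one SQL query).
func NewCachedRoleManager(fetchAllLinks func() (map[string][]string, error)) (*CachedRoleManager, error) {
	links, err := fetchAllLinks() // the only round-trip to the datastore
	if err != nil {
		return nil, err
	}
	rm := &CachedRoleManager{roles: make(map[string]map[string]bool, len(links))}
	for user, rs := range links {
		set := make(map[string]bool, len(rs))
		for _, r := range rs {
			set[r] = true
		}
		rm.roles[user] = set
	}
	return rm, nil
}

// HasLink is served entirely from memory, so Enforce() never blocks on the datastore.
func (rm *CachedRoleManager) HasLink(name1, name2 string, domain ...string) (bool, error) {
	if name1 == name2 {
		return true, nil
	}
	rm.mu.RLock()
	defer rm.mu.RUnlock()
	return rm.roles[name1][name2], nil
}

// GetRoles returns the cached roles of a user in one shot.
func (rm *CachedRoleManager) GetRoles(name string, domain ...string) ([]string, error) {
	rm.mu.RLock()
	defer rm.mu.RUnlock()
	roles := make([]string, 0, len(rm.roles[name]))
	for r := range rm.roles[name] {
		roles = append(roles, r)
	}
	return roles, nil
}
```

The external service is queried once when the manager is built (or on an explicit refresh), so every subsequent `HasLink()` call during `Enforce()` stays in memory.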
Is your feature request related to a problem? Please describe.
Using the standard RBAC model:
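(The model referenced here is presumably the standard `rbac_model.conf` example from the Casbin documentation; the `g(r.sub, p.sub)` term in the matcher is what triggers the role-manager lookups described below.)

```
[request_definition]
r = sub, obj, act

[policy_definition]
p = sub, obj, act

[role_definition]
g = _, _

[policy_effect]
e = some(where (p.eft == allow))

[matchers]
m = g(r.sub, p.sub) && r.obj == p.obj && r.act == p.act
```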
The `(*RoleManager).HasLink` function is called N times, where N is the number of unique subjects found in the policies. While this may work OK if the relationships are stored in memory, it just doesn't scale at all if the relationships are stored in an external datastore. Examples with a model containing 100 unique subjects that all have at least one policy:
Every `(*Enforcer).Enforce` call results in 100 `(*RoleManager).HasLink` calls (one per unique subject), so each millisecond of datastore round-trip latency adds 100 ms to every `(*Enforcer).Enforce` call. As you can see, this cannot really scale past a few unique subjects, and I'm surprised that this issue hasn't been raised before.
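A small sketch of where the cost shows up, assuming the standard model above is stored in local files `rbac_model.conf` and `policy.csv` (hypothetical file names) and Casbin v2 for Go:

```go
package main

import (
	"fmt"

	"github.com/casbin/casbin/v2"
)

func main() {
	// rbac_model.conf is the standard RBAC model shown above; policy.csv is a
	// hypothetical policy file containing p/g rules for 100 distinct subjects.
	e, err := casbin.NewEnforcer("rbac_model.conf", "policy.csv")
	if err != nil {
		panic(err)
	}

	// A single Enforce call evaluates the matcher against the policy rules,
	// and each g(r.sub, p.sub) evaluation goes through (*RoleManager).HasLink.
	// With 100 unique subjects that means on the order of 100 HasLink calls,
	// so a 1 ms datastore round-trip per call becomes ~100 ms per request.
	ok, err := e.Enforce("alice", "data1", "read")
	fmt.Println(ok, err)
}
```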
Describe the solution you'd like
Ideally, when enforcing a policy, instead of calling `(*RoleManager).HasLink` for each unique subject, the `(*RoleManager).GetRoles` function should be called to get all the relations of a subject in one shot.