Performance: Cache RoleMaps produced by RoleMap#newMatchingRoleMap() (#81)
Changes from 2 commits. Commits in this pull request: 4139acd, 86d94bd, 4846f65, b267c97, 7eb9afe, 1f9a783.
```diff
@@ -86,6 +86,16 @@ public class RoleMap {
             .expireAfterWrite(Settings.USER_DETAILS_CACHE_EXPIRATION_TIME_SEC, TimeUnit.SECONDS)
             .build();
 
+    /**
+     * {@link RoleMap}s are created again and again using {@link RoleMap#newMatchingRoleMap(String)}
+     * for different permissions for the same {@code itemName}, so cache them and avoid wasting time
+     * matching regular expressions.
+     */
+    private final Cache<String, RoleMap> matchingRoleMapCache = CacheBuilder.newBuilder()
+            .softValues()
+            .maximumSize(Settings.USER_DETAILS_CACHE_MAX_SIZE)
+            .expireAfterWrite(Settings.USER_DETAILS_CACHE_EXPIRATION_TIME_SEC, TimeUnit.SECONDS)
+            .build();
 
     RoleMap() {
         this.grantedRoles = new ConcurrentSkipListMap<Role, Set<String>>();
```

Review comment on the new `matchingRoleMapCache` field:

> The matching RoleMaps that are created also have a cache in them, which is not needed. Do we reorganize the class structure to avoid it, or do we not care about a little more memory use?

Reply:

> I would submit a follow-up ticket for it. Memory usage is not a problem for the plugin at the moment.

Review comment on the `.maximumSize(...)` line:

> The default value for the max size of the cache is 100. If there are more than 100 objects managed by Role Strategy, the cache would start evicting entries. Also, the max size of the cache cannot be increased after its creation. Should we care about this situation and create a larger cache using an […]

Reply:

> +1. The objects are not big, so bumping to 2048 or so would make sense.

AbhyudayaSharma marked this conversation as resolved.
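The change above memoizes expensive regex matching keyed by the pattern string. A JDK-only sketch of that memoization pattern (the `MatchingCache`/`Demo` names, sample data, and use of `ConcurrentHashMap` are hypothetical illustrations; the plugin itself uses Guava's `Cache`):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.regex.Pattern;

// Hypothetical stand-in for the cached computation: the result of an
// expensive, pure regex-matching pass is memoized per pattern string.
class MatchingCache {
    private final Map<String, Long> cache = new ConcurrentHashMap<>();
    private final String[] names = {"job-a", "job-b", "release-1"};  // sample item names
    long computations = 0;  // counts how often the expensive path actually runs

    long countMatching(String namePattern) {
        return cache.computeIfAbsent(namePattern, p -> {
            computations++;
            Pattern compiled = Pattern.compile(p);  // the costly step being cached
            long n = 0;
            for (String name : names) {
                if (compiled.matcher(name).matches()) {
                    n++;
                }
            }
            return n;
        });
    }
}

public class Demo {
    public static void main(String[] args) {
        MatchingCache c = new MatchingCache();
        long first = c.countMatching("job-.*");   // computes: 2 matches
        long second = c.countMatching("job-.*");  // served from cache, no recompute
        System.out.println(first + " " + second + " computations=" + c.computations);
    }
}
```

The second lookup returns the memoized value without recompiling or re-running the regex, which is exactly the saving the patch targets.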
```diff
@@ -359,8 +369,12 @@ public Set<String> getSidsForRole(String roleName) {
      * @param namePattern The pattern to match
      * @return A {@link RoleMap} containing only {@link Role}s matching the given name
      */
     public RoleMap newMatchingRoleMap(String namePattern) {
+        RoleMap cachedRoleMap = matchingRoleMapCache.getIfPresent(namePattern);
+        if (cachedRoleMap != null) {
+            return cachedRoleMap;
+        }
         SortedMap<Role, Set<String>> roleMap = new TreeMap<>();
         new RoleWalker() {
             public void perform(Role current) {
@@ -370,7 +384,9 @@ public void perform(Role current) {
                 }
             }
         };
-        return new RoleMap(roleMap);
+        RoleMap newMatchingRoleMap = new RoleMap(roleMap);
+        matchingRoleMapCache.put(namePattern, newMatchingRoleMap);
+        return newMatchingRoleMap;
     }
 
     /**
```
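The `getIfPresent`-then-`put` sequence in `newMatchingRoleMap` is a benign check-then-act race: two threads can both miss and both build the same RoleMap, wasting work but producing identical results because the computation is pure. A JDK-only sketch of the same idiom (the `RaceSketch` names are hypothetical; Guava's `Cache.get(key, loader)` variant would instead compute at most once per key):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.AtomicInteger;

// Sketch of the patch's check-then-put caching idiom. Under contention the
// worst case is redundant computation, never a wrong result, because
// buildExpensively is a pure function of its key.
public class RaceSketch {
    static final Map<String, String> cache = new ConcurrentHashMap<>();
    static final AtomicInteger builds = new AtomicInteger();

    static String buildExpensively(String key) {
        builds.incrementAndGet();   // stands in for the regex-matching work
        return key.toUpperCase();   // deterministic result per key
    }

    // The patch's idiom: check the cache, build on a miss, then publish.
    static String getOrCreate(String key) {
        String cached = cache.get(key);
        if (cached != null) {
            return cached;          // fast path: already computed
        }
        String created = buildExpensively(key);
        cache.put(key, created);    // a racing thread may also put; same value
        return created;
    }

    public static void main(String[] args) {
        System.out.println(getOrCreate("job-.*"));  // computes
        System.out.println(getOrCreate("job-.*"));  // cached
        System.out.println("builds=" + builds.get());
    }
}
```

For a single-threaded sequence the expensive builder runs exactly once; a stricter at-most-once guarantee across threads would use `computeIfAbsent` or a loading cache.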
Review comment on the choice of cache implementation:

> Do we continue with the Cache from Guava (avoiding any new dependencies), or use something else like JCS or Ehcache?

Reply:

> It is fine to use Guava as long as we do not need newer versions. Maybe we could have a research task for comparing cache implementations with benchmarks in later phases.
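Whichever library is chosen, the size-bound concern raised earlier applies: a capacity fixed at construction starts evicting once exceeded. A minimal LRU illustration using only `java.util.LinkedHashMap` (hypothetical `BoundedCache` name; this is not what the plugin uses, just a demonstration of fixed-capacity eviction like `CacheBuilder.maximumSize`):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Fixed-capacity LRU cache: once size() exceeds maxSize, the
// least-recently-used entry is evicted on the next insertion.
public class BoundedCache<K, V> extends LinkedHashMap<K, V> {
    private final int maxSize;  // fixed at creation, like CacheBuilder.maximumSize

    public BoundedCache(int maxSize) {
        super(16, 0.75f, true);  // accessOrder=true gives LRU iteration order
        this.maxSize = maxSize;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > maxSize;  // evict the least-recently-used entry
    }

    public static void main(String[] args) {
        BoundedCache<String, Integer> cache = new BoundedCache<>(2);
        cache.put("a", 1);
        cache.put("b", 2);
        cache.put("c", 3);  // capacity 2 exceeded: evicts "a", the LRU entry
        System.out.println("a present=" + cache.containsKey("a") + " size=" + cache.size());
    }
}
```

This is why the reviewers worry about a default `maximumSize` of 100: with more than 100 managed objects, useful entries get evicted and must be recomputed, and the bound cannot be raised after the cache is built.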