Performance: Cache RoleMaps produced by RoleMap#newMatchingRoleMap() #81

Merged
merged 6 commits on Jun 27, 2019

Changes from 2 commits
@@ -86,6 +86,16 @@ public class RoleMap {
.expireAfterWrite(Settings.USER_DETAILS_CACHE_EXPIRATION_TIME_SEC, TimeUnit.SECONDS)
.build();

/**
* {@link RoleMap}s are created again and again using {@link RoleMap#newMatchingRoleMap(String)}
* for different permissions for the same {@code itemName}, so cache them and avoid wasting time
* matching regular expressions.
*/
private final Cache<String, RoleMap> matchingRoleMapCache = CacheBuilder.newBuilder()
Member Author:

Do we continue with the Cache from Guava (avoiding any new dependencies), or use something else like JCS or Ehcache?

Member:

It is fine to use Guava as long as we do not need newer versions.
Maybe we could add a research task in a later phase to compare cache implementations with benchmarks.
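
For reference, here is a minimal, illustrative sketch of the slice of the Guava Cache API this change relies on; any replacement such as JCS or Ehcache would need equivalents of these calls. The class and values below are examples, not code from the plugin.

import com.google.common.cache.Cache;
import com.google.common.cache.CacheBuilder;
import java.util.concurrent.TimeUnit;

class GuavaCacheSketch {
    public static void main(String[] args) {
        // Bounded, expiring cache, mirroring the builder calls used in this PR.
        Cache<String, String> cache = CacheBuilder.newBuilder()
                .softValues()                           // values may be reclaimed under memory pressure
                .maximumSize(100)                       // size-based eviction once the bound is reached
                .expireAfterWrite(60, TimeUnit.SECONDS) // entries expire some time after insertion
                .build();

        cache.put("itemA.*", "role map for pattern itemA.*"); // explicit insert
        String hit = cache.getIfPresent("itemA.*");           // returns null on a miss
        cache.invalidateAll();                                 // drop all entries, e.g. on reconfiguration
        System.out.println(hit);
    }
}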

Member Author:

The matching RoleMaps that are created also contain a cache of their own, which is not needed. Should we reorganize the class structure to avoid it, or not worry about the slightly higher memory use?

Member:

I would submit a follow-up ticket for it. Memory usage is not a problem for the plugin at the moment.

.softValues()
.maximumSize(Settings.USER_DETAILS_CACHE_MAX_SIZE)
Member Author:

The default value for the maximum size of the cache is 100. If Role Strategy manages more than 100 objects, the cache will start evicting entries. Also, the maximum size of the cache cannot be increased after it has been created. Should we handle this situation and create a larger cache, using an ItemListener and a ComputerListener, at the cost of higher memory usage?

Member:

+1. The objects are not big, so bumping to 2048 or so would make sense
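
A minimal sketch of what that bump could look like; the field and system-property names below are hypothetical, and a Guava cache cannot be resized after build(), so the bound has to be chosen up front.

// Hypothetical sketch only: a dedicated, larger limit for the matching-RoleMap cache,
// overridable at startup via a system property. Neither the field nor the property
// exists in the plugin; the diff currently reuses Settings.USER_DETAILS_CACHE_MAX_SIZE.
public static final int MATCHING_ROLE_MAP_CACHE_MAX_SIZE =
        Integer.getInteger("rolestrategy.matchingRoleMapCacheMaxSize", 2048);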

.expireAfterWrite(Settings.USER_DETAILS_CACHE_EXPIRATION_TIME_SEC, TimeUnit.SECONDS)
AbhyudayaSharma marked this conversation as resolved.
.build();

RoleMap() {
this.grantedRoles = new ConcurrentSkipListMap<Role, Set<String>>();
@@ -359,8 +369,12 @@ public Set<String> getSidsForRole(String roleName) {
* @param namePattern The pattern to match
* @return A {@link RoleMap} containing only {@link Role}s matching the given name
*/

public RoleMap newMatchingRoleMap(String namePattern) {
RoleMap cachedRoleMap = matchingRoleMapCache.getIfPresent(namePattern);
if (cachedRoleMap != null) {
return cachedRoleMap;
}

SortedMap<Role, Set<String>> roleMap = new TreeMap<>();
new RoleWalker() {
public void perform(Role current) {
@@ -370,7 +384,9 @@ public void perform(Role current) {
}
}
};
return new RoleMap(roleMap);
RoleMap newMatchingRoleMap = new RoleMap(roleMap);
matchingRoleMapCache.put(namePattern, newMatchingRoleMap);
return newMatchingRoleMap;
}
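
As a possible follow-up, Guava's Cache also exposes get(key, Callable), which computes and stores the value atomically on a miss, so the getIfPresent/put pair above could be collapsed into one call. A rough sketch, assuming the same matchingRoleMapCache field and a java.util.concurrent.ExecutionException import; the matching logic inside perform() is elided in the diff and only marked with a comment here.

public RoleMap newMatchingRoleMap(String namePattern) {
    try {
        // Compute-if-absent: the loader runs only on a cache miss, and concurrent
        // callers asking for the same pattern share a single computation.
        return matchingRoleMapCache.get(namePattern, () -> {
            SortedMap<Role, Set<String>> roleMap = new TreeMap<>();
            new RoleWalker() {
                public void perform(Role current) {
                    // Pattern-matching logic as in the existing method (not shown in this diff).
                }
            };
            return new RoleMap(roleMap);
        });
    } catch (ExecutionException e) {
        // The loader above throws no checked exceptions, so this should not be reachable.
        throw new IllegalStateException(e);
    }
}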

/**