Rewrite ConcurrentLruCache implementation
Prior to this commit, the `ConcurrentLruCache` implementation would not
perform well under certain conditions. As long as the cache capacity was
not reached, the cache would avoid maintaining an eviction queue
(reordering entries depending on which were least/most recently read).
Once the cache capacity was reached, the LRU queue was updated for each
read/write operation, which decreased performance significantly under
contention.

This commit completely rewrites the internals of `ConcurrentLruCache`.
`ConcurrentLruCache` is now a specialized version of the
`ConcurrentLinkedHashMap` [1]. This change focuses on buffering read and
write operations, only processing them at certain times to avoid
contention.

When a cached entry is read, a read operation is queued, and buffered
operations are drained once the buffer reaches a fixed limit. When a new
cache entry is added or removed, a write operation is queued and
immediately triggers a drain attempt. When the capacity is exceeded, the
cache polls items from the eviction queue, which keeps the least
recently used entries first. Entries are evicted until the size is back
under the capacity.
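
A minimal sketch of that buffering and draining scheme is shown below.
All names, constants, and data structures in it are illustrative
assumptions, not the actual `ConcurrentLruCache` or
`ConcurrentLinkedHashMap` internals, and removal is omitted for brevity:

import java.util.ArrayDeque;
import java.util.Deque;
import java.util.Queue;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentLinkedQueue;
import java.util.concurrent.atomic.AtomicLong;
import java.util.concurrent.locks.ReentrantLock;
import java.util.function.Function;

// Illustrative sketch only -- not the actual Spring implementation.
class BufferedLruSketch<K, V> {

	private static final int READ_DRAIN_THRESHOLD = 64; // "fixed limit" for buffered reads (assumed value)

	private final int capacity;
	private final Function<K, V> generator;
	private final ConcurrentHashMap<K, V> cache = new ConcurrentHashMap<>();
	private final Queue<K> readBuffer = new ConcurrentLinkedQueue<>();   // pending "recently read" signals
	private final Queue<K> writeBuffer = new ConcurrentLinkedQueue<>();  // pending "newly added" signals
	private final Deque<K> evictionQueue = new ArrayDeque<>();           // LRU order, least recently used first
	private final ReentrantLock evictionLock = new ReentrantLock();      // guards evictionQueue during drains
	private final AtomicLong readCount = new AtomicLong();

	BufferedLruSketch(int capacity, Function<K, V> generator) {
		this.capacity = capacity;
		this.generator = generator;
	}

	V get(K key) {
		V value = this.cache.get(key);
		if (value != null) {
			// Hit: record the read, only draining once in a while.
			this.readBuffer.add(key);
			if (this.readCount.incrementAndGet() % READ_DRAIN_THRESHOLD == 0) {
				tryDrain();
			}
			return value;
		}
		// Miss: compute the value, record the write, and always attempt a drain.
		value = this.cache.computeIfAbsent(key, this.generator);
		this.writeBuffer.add(key);
		tryDrain();
		return value;
	}

	// Only one thread drains at a time; the others skip and keep going.
	private void tryDrain() {
		if (this.evictionLock.tryLock()) {
			try {
				drainBuffers();
			}
			finally {
				this.evictionLock.unlock();
			}
		}
	}

	// Replays buffered operations against the LRU queue, then evicts until
	// the number of tracked entries is back under the capacity.
	private void drainBuffers() {
		K key;
		while ((key = this.readBuffer.poll()) != null) {
			if (this.evictionQueue.remove(key)) {    // still cached: move to most recently used
				this.evictionQueue.addLast(key);
			}
		}
		while ((key = this.writeBuffer.poll()) != null) {
			this.evictionQueue.remove(key);          // avoid duplicates from racy misses
			this.evictionQueue.addLast(key);
		}
		while (this.evictionQueue.size() > this.capacity) {
			K eldest = this.evictionQueue.pollFirst();
			this.cache.remove(eldest);
		}
	}
}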

The behavior described here and the buffer sizes are optimized with the
number of available processors in mind. Work is localized as much as
possible on a per-thread basis to avoid contention on the eviction queue.
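
As an illustration of that localization, the sketch below sizes a set
of read buffers from the processor count and picks a buffer from the
current thread's id, so threads rarely touch the same buffer. The
stripe count, the thread-to-buffer mapping, and all names are
assumptions, not the actual internals:

import java.util.Queue;
import java.util.concurrent.ConcurrentLinkedQueue;

// Illustrative sketch only -- not the actual Spring implementation.
class StripedReadBuffers<K> {

	// One buffer per "stripe"; the stripe count is the next power of two
	// >= the CPU count, so the index can be computed with a cheap bit mask.
	private final Queue<K>[] buffers;
	private final int mask;

	@SuppressWarnings("unchecked")
	StripedReadBuffers() {
		int cpus = Runtime.getRuntime().availableProcessors();
		int count = Integer.highestOneBit(cpus * 2 - 1); // next power of two >= cpus
		this.buffers = (Queue<K>[]) new Queue[count];
		this.mask = count - 1;
		for (int i = 0; i < count; i++) {
			this.buffers[i] = new ConcurrentLinkedQueue<>();
		}
	}

	// Each thread tends to hit its own buffer, so recording a read rarely
	// contends with other threads; only the drain touches all of them.
	void recordRead(K key) {
		int index = (int) (Thread.currentThread().getId() & this.mask);
		this.buffers[index].add(key);
	}
}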

The new implementation has been tested with the JMH benchmark provided
in this commit, comparing the former `ConcurrentLruCache`, the new
implementation, and the `ConcurrentLinkedHashMap` [1].

When testing with a cache at capacity, under contention, with a 10%
cache miss rate, we see a 40x improvement over the previous
implementation and performance on par with the reference
`ConcurrentLinkedHashMap`. See [2] for how to replicate the benchmark.
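
For reference, besides the Gradle setup described in [2], a benchmark
class like the one added in this commit can also be launched
programmatically through the standard JMH runner API; the fork and
thread counts below are arbitrary examples, not the settings behind the
numbers above:

import org.openjdk.jmh.runner.Runner;
import org.openjdk.jmh.runner.RunnerException;
import org.openjdk.jmh.runner.options.Options;
import org.openjdk.jmh.runner.options.OptionsBuilder;

public class ConcurrentLruCacheBenchmarkRunner {

	public static void main(String[] args) throws RunnerException {
		Options options = new OptionsBuilder()
				.include("ConcurrentLruCacheBenchmark")  // run only this benchmark class
				.forks(1)                                // single fork for a quick local run
				.threads(4)                              // several threads to exercise contention
				.build();
		new Runner(options).run();
	}
}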

[1] https://github.com/ben-manes/concurrentlinkedhashmap
[2] https://github.com/spring-projects/spring-framework/wiki/Micro-Benchmarks

Closes gh-26320
bclozel committed Aug 31, 2022
1 parent 706c1ec commit c470262
Showing 4 changed files with 597 additions and 93 deletions.
@@ -0,0 +1,76 @@
/*
* Copyright 2002-2022 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/

package org.springframework.util;

import java.util.ArrayList;
import java.util.List;
import java.util.Random;
import java.util.function.Function;

import org.openjdk.jmh.annotations.Benchmark;
import org.openjdk.jmh.annotations.BenchmarkMode;
import org.openjdk.jmh.annotations.Level;
import org.openjdk.jmh.annotations.Mode;
import org.openjdk.jmh.annotations.Param;
import org.openjdk.jmh.annotations.Scope;
import org.openjdk.jmh.annotations.Setup;
import org.openjdk.jmh.annotations.State;
import org.openjdk.jmh.infra.Blackhole;

/**
* Benchmarks for {@link ConcurrentLruCache}.
* @author Brian Clozel
*/
@BenchmarkMode(Mode.Throughput)
public class ConcurrentLruCacheBenchmark {

	@Benchmark
	public void lruCache(BenchmarkData data, Blackhole bh) {
		for (String element : data.elements) {
			String value = data.lruCache.get(element);
			bh.consume(value);
		}
	}

	@State(Scope.Benchmark)
	public static class BenchmarkData {

		ConcurrentLruCache<String, String> lruCache;

		@Param({"100"})
		public int capacity;

		@Param({"0.1"})
		public float cacheMissRate;

		public List<String> elements;

		public Function<String, String> generator;

		@Setup(Level.Iteration)
		public void setup() {
			this.generator = key -> key + "value";
			this.lruCache = new ConcurrentLruCache<>(this.capacity, this.generator);
			Assert.isTrue(this.cacheMissRate < 1, "cache miss rate should be < 1");
			Random random = new Random();
			// generate slightly more distinct keys than the capacity, so that
			// roughly cacheMissRate of the lookups miss once the cache is full
			int elementsCount = Math.round(this.capacity * (1 + this.cacheMissRate));
			this.elements = new ArrayList<>(elementsCount);
			random.ints(elementsCount).forEach(value -> this.elements.add(String.valueOf(value)));
			this.elements.sort(String::compareTo);
		}
	}
}
