Caches
A cache is a data structure used to amortize complex calculations by storing a subset of results efficiently as key/value pairs. A cache is usually based on some access heuristic and can be used in many different situations, for example, to speed up functions, methods, or getters.
This document describes the API of caches and their individual implementations. The toolkit provides a decorator that helps you cache the results of a function/method call; it is described at the end of this document.
Legend for tables
API
- `capacity` - the capacity of the cache
- `node` - a node object in the list
- `key` - a key
- `value` - a value
- `iterator` - an `Iterator` instance or an object implementing the iterator/iterable protocol
- `object` - an object to be decorated
- `prop` - a property of `object`, either a string or a symbol
- `fn` - a function
- `cache` - a cache object
Complexity
- O - complexity of an operation
- O(1) - constant time
- O(n) - linear time proportional to the cache size
- O(k) - linear time proportional to the argument size
This is a user-facing module. It exposes all exports from `cache/cache-lru.js` (see below) and the `CacheLRU` class as the default export. Additionally, it exports `cacheDecorator()`, a useful wrapper for methods and getters. It uses the `CacheLRU` class as the default cache.
import Cache, {cacheDecorator} from 'list-toolkit/cache.js';
class ComplexCalculations {
static factorial(n) {
if (n <= 1) return 1;
return n * ComplexCalculations.factorial(n - 1);
}
static fibonacci(n) {
if (n <= 1) return 1;
return ComplexCalculations.fibonacci(n - 1) + ComplexCalculations.fibonacci(n - 2);
}
}
cacheDecorator(ComplexCalculations, 'factorial', new Cache(100));
cacheDecorator(ComplexCalculations, 'fibonacci');
This module provides the `CacheLRU` class based on the least recently used (LRU) algorithm:
import CacheLRU from 'list-toolkit/cache/cache-lru.js';
const cache = new CacheLRU(100);
This class is used as a foundation for other caches. It provides the following methods:
Member | Return type | Description | O
---|---|---|---
`constructor(capacity = 10)` | `this` | create a new cache | O(1)
`capacity` | `number` | the capacity of the cache | O(1)
`isEmpty` | `boolean` | `true` if the cache is empty | O(1)
`size` | `number` | the number of elements in the cache | O(1)
`has(key)` | `boolean` | `true` if the cache contains an element with the specified key | O(1)
`find(key)` | `node` or `undefined` | the `ValueNode` of the element with the specified key | O(1)
`get(key)` | `node` or `undefined` | alias for `find(key)` | O(1)
`remove(key)` | `this` | remove the element with the specified key | O(1)
`delete(key)` | `this` | alias for `remove(key)` | O(1)
`register(key, value)` | `this` | register a value with the specified key | O(1)
`add(key, value)` | `this` | alias for `register(key, value)` | O(1)
`set(key, value)` | `this` | alias for `register(key, value)` | O(1)
`clear()` | `this` | clear the cache | O(1)
`[Symbol.iterator]()` | iterator | an iterator over the cached values | O(1)
`getReverseIterator()` | iterator | an iterator over the cached values in reverse order | O(1)
Internally `CacheLRU` uses a doubly linked value list and a map that stores references to the nodes. When the cache reaches its capacity, the least recently used element is removed and the new one is added.

`find(key)` returns the `ValueNode` of the element with the specified key and marks the element as the most recently used. The `value` property of the returned node is an object with two properties, `{key, value}`, which can be used to access the value and the associated key.
`has(key)` returns `true` if the cache contains an element with the specified key. It does not update the LRU status of the element.

`register(key, value)` registers a value with the specified key; if the capacity is reached, it evicts the least recently used element before adding the new one.

The default iterator enumerates `{key, value}` pairs in LRU order: the most recently used first. The reverse iterator goes in the opposite order.
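The eviction rule above can be sketched in a few lines. This is not the library's implementation (which pairs a doubly linked list with a map of node references for O(1) operations); it is a minimal illustration of the LRU idea using a plain `Map`, whose insertion order stands in for recency:

```javascript
// Minimal LRU sketch: Map insertion order tracks recency.
// Illustrative only; names and internals are not the toolkit's.
class MiniLRU {
  constructor(capacity = 10) {
    this.capacity = capacity;
    this.map = new Map();
  }
  has(key) {
    return this.map.has(key); // does not touch recency
  }
  get(key) {
    if (!this.map.has(key)) return undefined;
    const value = this.map.get(key);
    // re-insert to mark the key as the most recently used
    this.map.delete(key);
    this.map.set(key, value);
    return value;
  }
  set(key, value) {
    if (this.map.has(key)) {
      this.map.delete(key);
    } else if (this.map.size >= this.capacity) {
      // evict the least recently used key (first in insertion order)
      this.map.delete(this.map.keys().next().value);
    }
    this.map.set(key, value);
    return this;
  }
}

const lru = new MiniLRU(2);
lru.set('a', 1).set('b', 2);
lru.get('a');    // 'a' becomes the most recently used
lru.set('c', 3); // the cache is full, so 'b' (least recently used) is evicted
```

Note how the read in the middle protects `'a'` from eviction, while a `has()`-style check would not.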
The `CacheLRU` class is exported as `CacheLRU` and as the default export.
This module provides the `CacheFIFO` class based on the first in, first out (FIFO) algorithm:
import CacheFIFO from 'list-toolkit/cache/cache-fifo.js';
const cache = new CacheFIFO(100);
It has exactly the same API as `CacheLRU` above. The only difference is that `CacheFIFO` uses the FIFO algorithm to evict old elements.

The default iterator enumerates `{key, value}` pairs in FIFO order: the oldest first. The reverse iterator goes in the opposite order.
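The contrast with LRU is easiest to see in a sketch: under FIFO, reads do not protect an entry, so the oldest insertion is always the victim. Again, this is an illustration with a plain `Map`, not the toolkit's implementation:

```javascript
// Minimal FIFO sketch: eviction ignores access order entirely.
// Illustrative only; not the toolkit's implementation.
class MiniFIFO {
  constructor(capacity = 10) {
    this.capacity = capacity;
    this.map = new Map();
  }
  get(key) {
    return this.map.get(key); // reads do not update anything
  }
  set(key, value) {
    if (!this.map.has(key) && this.map.size >= this.capacity) {
      // evict the oldest insertion (first key in insertion order)
      this.map.delete(this.map.keys().next().value);
    }
    this.map.set(key, value);
    return this;
  }
}

const fifo = new MiniFIFO(2);
fifo.set('a', 1).set('b', 2);
fifo.get('a');    // unlike LRU, this does not protect 'a'
fifo.set('c', 3); // evicts 'a', the oldest entry
```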
This module provides the `CacheLFU` class based on the least frequently used (LFU) algorithm:
import CacheLFU from 'list-toolkit/cache/cache-lfu.js';
const cache = new CacheLFU(100);
It has exactly the same API as `CacheLRU` above. The main difference is that `CacheLFU` uses the LFU algorithm to evict old elements.

Internally it uses access counters and a heap to keep track of the least frequently used elements. The counters are kept in the `value` property of the nodes: `{key, value, counter}`.

As with `CacheLRU`, `find(key)` updates the counter, while `has(key)` does not.

The default iterator enumerates `{key, value, counter}` objects in no particular order. The reverse iterator goes in the opposite order.
To help with counters it exposes the following method:

Member | Return type | Description | O
---|---|---|---
`resetCounters(initialValue = 1)` | `this` | reset the counters | O(n)

`resetCounters()` resets the counters to the specified value (1 by default). It is useful for managing long-running caches: counter overflows can be avoided without clearing the cache.
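The counter mechanics can be sketched as follows. The real `CacheLFU` keeps a heap so that the eviction victim is found efficiently; this illustration substitutes a linear scan over a plain `Map` to keep the idea visible:

```javascript
// Minimal LFU sketch: each entry carries an access counter; eviction
// picks the smallest counter. A linear scan stands in for the heap.
// Illustrative only; not the toolkit's implementation.
class MiniLFU {
  constructor(capacity = 10) {
    this.capacity = capacity;
    this.map = new Map();
  }
  has(key) {
    return this.map.has(key); // does not touch the counter
  }
  get(key) {
    const entry = this.map.get(key);
    if (!entry) return undefined;
    ++entry.counter; // reads bump the frequency
    return entry.value;
  }
  set(key, value) {
    const existing = this.map.get(key);
    if (existing) {
      existing.value = value;
      return this;
    }
    if (this.map.size >= this.capacity) {
      // evict the least frequently used entry
      let victim, min = Infinity;
      for (const [k, e] of this.map)
        if (e.counter < min) { min = e.counter; victim = k; }
      this.map.delete(victim);
    }
    this.map.set(key, {value, counter: 1});
    return this;
  }
  resetCounters(initialValue = 1) {
    for (const e of this.map.values()) e.counter = initialValue;
    return this;
  }
}

const lfu = new MiniLFU(2);
lfu.set('a', 1).set('b', 2);
lfu.get('a'); lfu.get('a'); // 'a' is now the most frequently used
lfu.set('c', 3);            // evicts 'b', which has the lowest counter
```

`resetCounters()` here simply rewrites every counter to the initial value, which is exactly why it is O(n).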
This module provides the `CacheRandom` class, which evicts elements randomly:
import CacheRandom from 'list-toolkit/cache/cache-random.js';
const cache = new CacheRandom(100);
It has exactly the same API as `CacheLRU` above. The only difference is that `CacheRandom` evicts old elements randomly.

Internally it uses ids and a heap to keep track of the elements. The ids are kept in the `value` property of the nodes: `{key, value, id}`. Every new element gets a new id or receives the id of a previously evicted element.

The default iterator enumerates `{key, value, id}` objects in no particular order. The reverse iterator goes in the opposite order.
To help with ids it exposes the following method:

Member | Return type | Description | O
---|---|---|---
`resetIds()` | `this` | regenerate the ids | O(n)

`resetIds()` regenerates the ids. It is useful for managing long-running caches: id overflows can be avoided without clearing the cache.
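Random eviction itself is simple to picture. The real `CacheRandom` uses ids and a heap; this sketch just samples the key set of a plain `Map` to show the policy:

```javascript
// Minimal random-eviction sketch: any existing key may be the victim.
// Illustrative only; not the toolkit's id-and-heap implementation.
class MiniRandomCache {
  constructor(capacity = 10) {
    this.capacity = capacity;
    this.map = new Map();
  }
  has(key) {
    return this.map.has(key);
  }
  get(key) {
    return this.map.get(key);
  }
  set(key, value) {
    if (!this.map.has(key) && this.map.size >= this.capacity) {
      // pick a uniformly random key to evict
      const keys = [...this.map.keys()];
      this.map.delete(keys[Math.floor(Math.random() * keys.length)]);
    }
    this.map.set(key, value);
    return this;
  }
}

const rnd = new MiniRandomCache(2);
rnd.set('a', 1).set('b', 2).set('c', 3); // one of 'a'/'b' is evicted at random
```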
This module provides helpers to decorate functions, methods, and getters with a cache:
import decorator,
{
decorateFn,
decorateMethod,
decorate,
getCache
} from 'list-toolkit/cache/decorator.js';
The module exports the following functions:
Function | Return type | Description | O
---|---|---|---
`decorateFn(fn, cache)` | function | decorate a function | O(1)
`decorateMethod(object, prop, cache)` | function | decorate a method | O(1)
`decorate(object, prop, cache)` | function | decorate a property | O(1)
`getCache(object, prop)` | cache or `undefined` | get the cache of a property | O(1)
All `decorateXXX()` functions return a wrapped function that caches the results of the original. The wrapped function has two additional properties:

- `fn` - the original function
- `cache` - the cache object

`decorateFn()` is used to wrap standalone functions. Its return value should be used instead of the original function.
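The pattern behind such a wrapper is ordinary memoization. The following sketch uses a plain `Map` and a hypothetical `memoize()` helper rather than the library's `decorateFn()` (which works with the toolkit's cache node API); serializing the arguments with `JSON.stringify` is an assumption that only holds for JSON-serializable arguments:

```javascript
// Sketch of a decorateFn()-style wrapper: memoize by serialized arguments.
// `memoize` is a hypothetical name; this is not the library's code.
const memoize = (fn, cache = new Map()) => {
  const wrapped = (...args) => {
    const key = JSON.stringify(args); // assumption: JSON-serializable args
    if (cache.has(key)) return cache.get(key); // cache hit
    const result = fn(...args);
    cache.set(key, result); // cache miss: compute and remember
    return result;
  };
  wrapped.fn = fn;       // expose the original function
  wrapped.cache = cache; // expose the cache object
  return wrapped;
};

let calls = 0;
const slowSquare = n => (++calls, n * n);
const fastSquare = memoize(slowSquare);
fastSquare(4);
fastSquare(4); // served from the cache; slowSquare ran only once
```

The two extra properties mirror the `fn` and `cache` properties documented above, so the caller can reach both the original function and its cache.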
`decorateMethod()` is a simple helper that decorates a function value defined on an object.

`decorate()` is a simple helper that decorates a property value defined on an object. It will decorate a function value.

`getCache()` is a helper that retrieves the cache object from a wrapped property. It returns `undefined` if the property is not decorated and throws an error if the property is not defined.
All methods described above are exported. `decorate()` is exported as the default export.