feat(cache): enable lru cache #158
Conversation
Benchmark result:

By updating the stat code to:

```js
let num = 0;
const stat = new Set();
// ...
num++;
stat.add(input);
console.log(num, stat.size);
```

And here is the result of one generation of a hexo dummy site (hexo-theme-unit):

And for my own blogs:

It shows why we definitely need a cache.
@SukkaW I have some questions.

```js
const LFU = require('node-lfu-cache');
const cache = new LFU(2);
cache.set(1, 1);
cache.set(2, 2);
cache.get(1);    // returns 1
cache.set(3, 3); // evicts key 2
cache.get(2);    // returns undefined (not found)
cache.get(3);    // returns 3
cache.set(4, 4); // evicts key 1
cache.get(1);    // returns undefined (not found)
cache.get(3);    // returns 3
cache.get(4);    // returns 4
```

The expected result should be as in the comments above (https://leetcode.com/problems/lfu-cache/). But the actual result is:

```js
cache.get(1); // returns 1
cache.get(3); // returns 3
cache.get(4); // returns undefined (not found)
```

This means that when the cache is full, no more items can be added to the cache.
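For reference, the LeetCode-style eviction the comments above describe can be sketched as follows. This is only an illustration of the expected semantics (evict the least-frequently-used key, break ties by least-recent use), not the `node-lfu-cache` implementation:

```javascript
// Minimal LFU cache sketch: evicts the least-frequently-used key,
// breaking ties by evicting the least recently used one.
class SimpleLFU {
  constructor(capacity) {
    this.capacity = capacity;
    this.entries = new Map(); // key -> { value, freq, tick }
    this.clock = 0;           // logical time for recency tie-breaking
  }
  get(key) {
    const entry = this.entries.get(key);
    if (!entry) return undefined;
    entry.freq++;
    entry.tick = ++this.clock;
    return entry.value;
  }
  set(key, value) {
    if (this.capacity <= 0) return;
    const existing = this.entries.get(key);
    if (existing) {
      existing.value = value;
      existing.freq++;
      existing.tick = ++this.clock;
      return;
    }
    if (this.entries.size >= this.capacity) {
      // Find the victim: lowest frequency, then oldest access.
      let victim = null;
      for (const [k, e] of this.entries) {
        if (!victim || e.freq < victim.e.freq ||
            (e.freq === victim.e.freq && e.tick < victim.e.tick)) {
          victim = { k, e };
        }
      }
      this.entries.delete(victim.k);
    }
    this.entries.set(key, { value, freq: 1, tick: ++this.clock });
  }
}

const cache = new SimpleLFU(2);
cache.set(1, 1);
cache.set(2, 2);
console.log(cache.get(1)); // 1
cache.set(3, 3);           // evicts key 2 (freq 1 < freq 2 of key 1)
console.log(cache.get(2)); // undefined
console.log(cache.get(3)); // 3
cache.set(4, 4);           // keys 1 and 3 tie on freq; key 1 is older, so it is evicted
console.log(cache.get(1)); // undefined
console.log(cache.get(3)); // 3
console.log(cache.get(4)); // 4
```

This matches the comments in the snippet above: a full cache still accepts new items, evicting an old one each time.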
```js
cache.set(1, v);
cache.get(1);
cache.set(2, v);
cache.get(2);
...
cache.set(n, v);
cache.get(n);
// cache evicts key 1, 2 ...
...
```
We all want to make better use of the cache and not fill it up with once-used key-value pairs, right? Also, the code case you gave behaves exactly as expected.

I have provided a stat in my last comment:

Although there are 944 urls being processed by
OK, I understand what you mean.

This is not working. See my test code above, or you can run this test. I have tested using your branch of hexo-utils and confirmed it.
@dailyrandomphoto Maybe not. I have set up a PoC of an LFU cache:

```js
const LFU = require('node-lfu-cache');
const cache = new LFU(50);
const random = (min, max) => Math.round(Math.random() * (max - min)) + min;
const cacheFunc = (key, value) => {
  if (cache.has(key)) return cache.get(key);
  cache.set(key, value);
  return value;
};

// Will be called only once
cacheFunc('cold', '123');
// Will be called 10 times
for (let i = 1; i <= 10; i++) {
  cacheFunc('hot', '456');
}
// Let the cache size limit be exceeded
for (let i = 1; i <= 60; i++) {
  cacheFunc(String(100 + i), random(100, 900));
}

console.log('Cold: ' + cacheFunc('cold', '789'));
// Should be 789 not 123, because the cold entry will be evicted from the cache.
console.log('Hot: ' + cacheFunc('hot', '789'));
// Should be 456 not 789, because it is hot and remains in the cache.

// > "Cold: 789"
// > "Hot: 456"
```
This is a good example.

```js
const LFU = require('node-lfu-cache');
const cache = new LFU(50);
const random = (min, max) => Math.round(Math.random() * (max - min)) + min;
const cacheFunc = (key, value) => {
  if (cache.has(key)) return cache.get(key);
  cache.set(key, value);
  return value;
};

// Will be called only once
cacheFunc('cold', '123');
// Will be called 10 times
for (let i = 1; i <= 10; i++) {
  cacheFunc('hot', '456');
}
// Let the cache size limit be exceeded
for (let i = 1; i <= 60; i++) {
  // save to cache
  cacheFunc(String(100 + i), random(100, 900));
  // read from cache
  cacheFunc(String(100 + i), random(100, 900));
}

cacheFunc('new', '123');
console.log('Cold: ' + cacheFunc('cold', '789'));
// Should be 789 not 123, because the cold entry will be evicted from the cache.
console.log('Hot: ' + cacheFunc('hot', '789'));
// Should be 456 not 789, because it is hot and remains in the cache.
console.log('New: ' + cacheFunc('new', '789'));
// Should be 123 not 789, because it is a new item.
// You will find that, when the cache is full, no more items can be added to the cache.
console.log(cache.dump());
```

https://runkit.com/dailyrandomphoto/5df84e2da91f66001d9620c5

```js
console.log('New: ' + cacheFunc('new', '789'));
// Should be 123 not 789, because it is a new item.
```

BUT, it shows
I tested this patch with a large blog (which has 3000 dummy posts). The result shows:

Could you consider using a plain object? Details: I have a dummy blog which has 3000 posts. I run

And also, to follow @dailyrandomphoto's comment #158 (comment). Source code:

Result: Elapsed time:

Counter:

Environment:
@seaoak What about memory usage? I am still wondering if we need an LFU cache.
@seaoak The performance regression in your benchmark should be caused by the issue of
@dailyrandomphoto @seaoak I have changed LFU to LRU; please feel free to run a benchmark and see if any performance is gained.
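For readers unfamiliar with the difference: an LRU cache evicts by recency alone, which avoids the "cold item can never displace a hot one" behavior discussed above. A JavaScript `Map`'s insertion order makes this easy to sketch (an illustration only, not the actual code in this PR):

```javascript
// Minimal LRU cache sketch using Map's insertion-order iteration.
class SimpleLRU {
  constructor(capacity) {
    this.capacity = capacity;
    this.map = new Map();
  }
  get(key) {
    if (!this.map.has(key)) return undefined;
    // Delete and re-insert to mark the key as most recently used.
    const value = this.map.get(key);
    this.map.delete(key);
    this.map.set(key, value);
    return value;
  }
  set(key, value) {
    if (this.map.has(key)) this.map.delete(key);
    this.map.set(key, value);
    if (this.map.size > this.capacity) {
      // The first key in iteration order is the least recently used.
      this.map.delete(this.map.keys().next().value);
    }
  }
}

const lru = new SimpleLRU(2);
lru.set('a', 1);
lru.set('b', 2);
lru.get('a');    // 'a' becomes most recently used
lru.set('c', 3); // evicts 'b', the least recently used key
console.log(lru.get('b')); // undefined
console.log(lru.get('a')); // 1
```

Unlike the LFU behavior reported earlier in this thread, a new `set()` on a full LRU cache always succeeds by evicting the oldest entry.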
Should be superseded by #162.
I'm sorry I'm late. I found a tool, "memwatch". The result shows that using the cache has little effect on memory usage. Details: for each condition, measure 10 times and calculate the average. Condition changes:

Result:

(modified: the data for "seaoak" was mistaken)
I'm guessing the cache only stores a reference to the value, rather than a copy of the actual value.
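That guess is easy to verify for any `Map`-based cache in JavaScript: objects are stored by reference, not deep-copied, so caching adds little memory beyond the map entries themselves (a generic sketch, not hexo's actual cache code):

```javascript
// Demonstrates that a Map-based cache stores references to objects,
// not copies of them.
const cache = new Map();
const post = { title: 'hello', body: 'x'.repeat(1000) };
cache.set('post-1', post);

// The cached entry is the very same object:
console.log(cache.get('post-1') === post); // true

// A mutation through the original is visible through the cache:
post.title = 'changed';
console.log(cache.get('post-1').title); // 'changed'
```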
I have set up a stat by adding a few lines of code to `node_modules/hexo` in my local dummy site. It turns out that even with as few as 17 posts in hexo-theme-unit-test, the `full_url_for()` helper will be called 537 times and `isExternalLink()` will be called 2599 times in one generation! So I bring up this PR. `node-lfu-cache` is enabled for those functions: