feat: optional refresh strategy (stale-while-revalidate) #533 (#586)
* feat: optional refresh strategy (stale-while-revalidate) #533

* feat: optional refresh strategy (stale-while-revalidate) #533
test update

* feat: optional refresh strategy (stale-while-revalidate) #533
update: README.md

* feat: optional refresh strategy (stale-while-revalidate) #533
update: README.md

* feat: optional refresh strategy (stale-while-revalidate) #533
update: tests to be less time sensitive in slow environments
jonathanarezki committed Sep 8, 2023
1 parent 5cb7ab1 commit d38d6d2
Showing 5 changed files with 95 additions and 4 deletions.
31 changes: 29 additions & 2 deletions README.md
@@ -119,16 +119,43 @@ await multiCache.mset(
// This is done recursively until either:
// - all have been found
// - all caches have been fetched
- console.log(await multiCache.mget('key', 'key2');
+ console.log(await multiCache.mget('key', 'key2'));
// >> ['bar', 'bar2']

// Delete keys with mdel() passing arguments...
await multiCache.mdel('foo', 'foo2');

```

See unit tests in [`test/multi-caching.test.ts`](./test/multi-caching.test.ts) for more information.

### Refresh cache keys in background

Both the `caching` and `multicaching` modules support a mechanism to refresh expiring cache keys in the background when using the `wrap` function.
This is done by adding a `refreshThreshold` attribute while creating the caching store.

If `refreshThreshold` is set, then after a value is retrieved from the cache its remaining TTL is checked.
If the remaining TTL is less than `refreshThreshold`, the value is updated asynchronously,
following the same rules as a standard fetch. In the meantime, the old value is returned until it expires.

NOTES:

* In case of multicaching, the store that is checked for refresh is the one where the key is found first (highest priority).
* If the threshold is low and the worker function is slow, the key may expire and you may encounter a race condition when updating values.
* The background refresh mechanism currently does not support providing multiple keys to the `wrap` function.
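The refresh rule described above can be sketched as a small predicate (a minimal illustration; the names here are ours, not the library's API):

```typescript
type Milliseconds = number;

// Decide whether a cache hit should also trigger a background refresh:
// refresh only when a threshold is configured and the key is close to expiring.
function shouldRefresh(
  remainingTtl: Milliseconds,
  refreshThreshold?: Milliseconds,
): boolean {
  if (refreshThreshold === undefined) return false;
  return remainingTtl < refreshThreshold;
}

console.log(shouldRefresh(2_000, 3_000)); // true: 2s left, under the 3s threshold
console.log(shouldRefresh(9_000, 3_000)); // false: plenty of TTL remaining
console.log(shouldRefresh(1_000)); // false: no threshold configured
```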

For example, pass `refreshThreshold` to `caching` like this:

```typescript
const memoryCache = await caching('memory', {
max: 100,
ttl: 10 * 1000 /*milliseconds*/,
refreshThreshold: 3 * 1000 /*milliseconds*/,
});
```

When a value is retrieved from the cache with a remaining TTL of less than 3 seconds, it will be updated in the background.
## Store Engines
### Official and updated to last version
7 changes: 7 additions & 0 deletions src/caching.ts
@@ -2,6 +2,7 @@ import { MemoryCache, MemoryConfig, memoryStore } from './stores';

export type Config = {
ttl?: Milliseconds;
refreshThreshold?: Milliseconds;
isCacheable?: (val: unknown) => boolean;
};

@@ -85,6 +86,12 @@ export async function caching<S extends Store, T extends object = never>(
const cacheTTL = typeof ttl === 'function' ? ttl(result) : ttl;
await store.set<T>(key, result, cacheTTL);
return result;
} else if (args?.refreshThreshold) {
const cacheTTL = typeof ttl === 'function' ? ttl(value) : ttl;
const remainingTtl = await store.ttl(key);
if (remainingTtl < args.refreshThreshold) {
fn().then((result) => store.set<T>(key, result, cacheTTL));
}
}
return value;
},
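The `wrap` branch added above can be summarized as a self-contained sketch, assuming a plain in-memory `Map` store (all names here are illustrative, not the library's API): on a miss, compute and store; on a hit near expiry, fire the worker in the background and return the stale value immediately.

```typescript
type Entry<T> = { value: T; expiresAt: number };

class SketchCache<T> {
  private store = new Map<string, Entry<T>>();

  constructor(
    private ttl: number,
    private refreshThreshold?: number,
  ) {}

  async wrap(key: string, fn: () => Promise<T>): Promise<T> {
    const entry = this.store.get(key);
    const now = Date.now();
    if (!entry || entry.expiresAt <= now) {
      // Miss (or expired): fetch, store, return.
      const value = await fn();
      this.store.set(key, { value, expiresAt: now + this.ttl });
      return value;
    }
    const remainingTtl = entry.expiresAt - now;
    if (
      this.refreshThreshold !== undefined &&
      remainingTtl < this.refreshThreshold
    ) {
      // Stale-while-revalidate: refresh in the background (not awaited)
      // and return the old value right away.
      fn().then((value) =>
        this.store.set(key, { value, expiresAt: Date.now() + this.ttl }),
      );
    }
    return entry.value;
  }
}
```

Note the deliberate floating promise: the caller never waits on the refresh, which is also why a slow worker plus a low threshold can race with expiration, as the README notes warn.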
6 changes: 4 additions & 2 deletions src/multi-caching.ts
@@ -24,6 +24,7 @@ export function multiCaching<Caches extends Cache[]>(
) => {
await Promise.all(caches.map((cache) => cache.set(key, data, ttl)));
};

return {
get,
set,
@@ -50,9 +51,10 @@
return result;
} else {
const cacheTTL = typeof ttl === 'function' ? ttl(value) : ttl;
-          await Promise.all(
+          Promise.all(
             caches.slice(0, i).map((cache) => cache.set(key, value, cacheTTL)),
-          );
+          ).then();
caches[i].wrap(key, fn, ttl).then(); // call wrap on the store that had the hit, to apply its internal refreshThreshold logic; see src/caching.ts caching.wrap
}
return value;
},
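The multi-cache read path changed above (backfill higher-priority caches without awaiting, then return the hit) can be sketched as follows; `Cache`, `makeMapCache`, and `multiWrap` are hypothetical names for illustration, not the package's exports:

```typescript
type Cache<T> = {
  get: (key: string) => Promise<T | undefined>;
  set: (key: string, value: T) => Promise<void>;
};

// Tiny Map-backed cache for the sketch; `data` is exposed for inspection.
function makeMapCache<T>(): Cache<T> & { data: Map<string, T> } {
  const data = new Map<string, T>();
  return {
    data,
    get: async (key) => data.get(key),
    set: async (key, value) => void data.set(key, value),
  };
}

async function multiWrap<T>(
  caches: Cache<T>[],
  key: string,
  fn: () => Promise<T>,
): Promise<T> {
  for (let i = 0; i < caches.length; i++) {
    const value = await caches[i].get(key);
    if (value !== undefined) {
      // Backfill higher-priority caches in the background (not awaited,
      // mirroring the change from `await Promise.all(...)` to a floating promise).
      Promise.all(caches.slice(0, i).map((c) => c.set(key, value))).then();
      return value;
    }
  }
  // Miss everywhere: compute once and populate all caches.
  const value = await fn();
  await Promise.all(caches.map((c) => c.set(key, value)));
  return value;
}
```

Dropping the `await` means a hit in a lower-priority cache is returned immediately while the faster caches are warmed in parallel.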
25 changes: 25 additions & 0 deletions test/caching.test.ts
@@ -228,5 +228,30 @@ describe('caching', () => {
describe('issues', () => {
it('#183', () =>
expect(cache.wrap('constructor', async () => 0)).resolves.toEqual(0));

it('#533', () => {
expect(
(async () => {
cache = await caching('memory', {
ttl: 5 * 1000,
refreshThreshold: 4 * 1000,
});

await cache.wrap('refreshThreshold', async () => 0);
await new Promise((resolve) => {
setTimeout(resolve, 2 * 1000);
});
await cache.wrap('refreshThreshold', async () => 1);
await new Promise((resolve) => {
setTimeout(resolve, 500);
});
await cache.wrap('refreshThreshold', async () => 2);
await new Promise((resolve) => {
setTimeout(resolve, 500);
});
return cache.wrap('refreshThreshold', async () => 3);
})(),
).resolves.toEqual(1);
});
});
});
30 changes: 30 additions & 0 deletions test/multi-caching.test.ts
@@ -254,5 +254,35 @@ describe('multiCaching', () => {
await expect(cache0.get(key)).resolves.toEqual(value);
await expect(cache1.get(key)).resolves.toEqual(value);
});

it('#533', () => {
expect(
(async () => {
const cache0 = await caching('memory', {
ttl: 5 * 1000,
refreshThreshold: 4 * 1000,
});
const cache1 = await caching('memory', {
ttl: 10 * 1000,
refreshThreshold: 8 * 1000,
});
const multi = multiCaching([cache0, cache1]);

await multi.wrap('refreshThreshold', async () => 0);
await new Promise((resolve) => {
setTimeout(resolve, 2 * 1000);
});
await multi.wrap('refreshThreshold', async () => 1);
await new Promise((resolve) => {
setTimeout(resolve, 500);
});
await multi.wrap('refreshThreshold', async () => 2);
await new Promise((resolve) => {
setTimeout(resolve, 500);
});
return multi.wrap('refreshThreshold', async () => 3);
})(),
).resolves.toEqual(1);
});
});
});
