Changing public interface
corradodellorusso committed Nov 24, 2023
1 parent d444bf3 commit fec974e
Showing 8 changed files with 124 additions and 108 deletions.
77 changes: 42 additions & 35 deletions README.md
@@ -1,20 +1,20 @@
# polycache

[![codecov](https://codecov.io/gh/corradodellorusso/polycache/branch/master/graph/badge.svg?token=ZV3G5IFigq)](https://codecov.io/gh/corradodellorusso/polycache)
[![tests](https://github.com/corradodellorusso/polycache/actions/workflows/test.yml/badge.svg)](https://github.com/corradodellorusso/polycache/actions/workflows/test.yml)
[![license](https://img.shields.io/github/license/corradodellorusso/polycache)](https://github.com/corradodellorusso/polycache/blob/master/LICENSE)
[![npm](https://img.shields.io/npm/dm/polycache-core)](https://npmjs.com/package/polycache-core)
![npm](https://img.shields.io/npm/v/polycache-core)

# Flexible NodeJS cache module
## Flexible multi-cache module

A cache module for nodejs that allows easy wrapping of functions in cache, tiered caches, and a consistent interface.
A cache module for Node.js that provides easy function wrapping, tiered caches, and a consistent interface.

## Features

- Made with TypeScript and compatible with [ESModules](https://nodejs.org/docs/latest-v14.x/api/esm.html)
- Easy way to wrap any function in cache.
- Tiered caches -- data gets stored in each cache and fetched from the highest.
priority cache(s) first.
- Tiered caches -- data gets stored in each cache and fetched from the highest priority cache(s) first.
- Use any cache you want, as long as it has the same API.
- 100% test coverage via [vitest](https://github.com/vitest-dev/vitest).

@@ -27,12 +27,14 @@ A cache module for nodejs that allows easy wrapping of functions in cache, tiere
### Single Store

```typescript
import { caching } from 'polycache-core';
import { caching, createLruStore } from 'polycache-core';

const memoryCache = await caching('memory', {
max: 100,
ttl: 10 * 1000 /*milliseconds*/,
});
const memoryCache = caching(
createLruStore({
max: 100,
ttl: 10 * 1000 /*milliseconds*/,
}),
);

const ttl = 5 * 1000; /*milliseconds*/
await memoryCache.set('foo', 'bar', ttl);
@@ -54,28 +56,27 @@ console.log(await memoryCache.wrap(key, () => getUser(userId), ttl));
// >> { id: 123, name: 'Bob' }
```
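Conceptually, `wrap` is the cache-aside pattern: look the key up, and on a miss run the worker function and store its result. Below is a minimal, self-contained sketch of that idea; this `wrap` is hypothetical and ignores `ttl` and the promise coalescing the real implementation performs via `promise-coalesce`.

```typescript
// Cache-aside in miniature: check the cache first; on a miss, run the
// worker function and remember its result for next time.
const data = new Map<string, unknown>();

const wrap = async <T>(key: string, fn: () => Promise<T>): Promise<T> => {
  const cached = data.get(key) as T | undefined;
  if (cached !== undefined) return cached; // hit: skip the worker entirely
  const value = await fn(); // miss: compute...
  data.set(key, value); // ...and store for subsequent calls
  return value;
};
```

Repeated calls with the same key therefore invoke the worker only once until the entry is evicted.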

See unit tests in [`test/caching.test.ts`](./test/caching.test.ts) for more information.

#### Example setting/getting several keys with mset() and mget()
#### Example setting/getting several keys with setMany() and getMany()

```typescript
await memoryCache.store.mset(
await memoryCache.store.setMany(
[
['foo', 'bar'],
['foo2', 'bar2'],
],
ttl,
);

console.log(await memoryCache.store.mget('foo', 'foo2'));
console.log(await memoryCache.store.getMany('foo', 'foo2'));
// >> ['bar', 'bar2']

// Delete keys with mdel() passing arguments...
await memoryCache.store.mdel('foo', 'foo2');
// Delete keys with delMany() passing arguments...
await memoryCache.store.delMany('foo', 'foo2');
```

#### Custom Stores
Under construction...

Custom stores can be easily built by adhering to the `Store` type.
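For instance, a minimal Map-backed store might look like this. The `Store` type is reproduced from `src/types.ts`; `createMapStore` is a hypothetical example, not part of the package, and it performs no ttl-based eviction (its `ttl()` reports -1, as for entries without expiry metadata).

```typescript
type Milliseconds = number;

// Mirrors the Store type from src/types.ts.
type Store = {
  name: string;
  get<T>(key: string): Promise<T | undefined>;
  set<T>(key: string, data: T, ttl?: Milliseconds): Promise<void>;
  del(key: string): Promise<void>;
  reset(): Promise<void>;
  setMany(args: [string, unknown][], ttl?: Milliseconds): Promise<void>;
  getMany(...args: string[]): Promise<unknown[]>;
  delMany(...args: string[]): Promise<void>;
  keys(pattern?: string): Promise<string[]>;
  ttl(key: string): Promise<number>;
};

// A trivial Map-backed store satisfying the Store type.
const createMapStore = (): Store => {
  const data = new Map<string, unknown>();
  return {
    name: 'map',
    get: async <T>(key: string) => data.get(key) as T | undefined,
    set: async (key, value) => {
      data.set(key, value);
    },
    del: async (key) => {
      data.delete(key);
    },
    reset: async () => data.clear(),
    setMany: async (args) => {
      for (const [key, value] of args) data.set(key, value);
    },
    getMany: async (...keys) => keys.map((key) => data.get(key)),
    delMany: async (...keys) => {
      for (const key of keys) data.delete(key);
    },
    keys: async () => [...data.keys()],
    ttl: async () => -1, // no expiry metadata tracked
  };
};
```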

### Multi-Store

@@ -99,29 +100,27 @@ await multiCache.del('foo2');

// Sets multiple keys in all caches.
// You can pass as many key, value tuples as you want
await multiCache.mset(
await multiCache.setMany(
[
['foo', 'bar'],
['foo2', 'bar2'],
],
ttl
ttl,
);

// mget() fetches from highest priority cache.
// getMany() fetches from highest priority cache.
// If the first cache does not return all the keys,
// the next cache is fetched with the keys that were not found.
// This is done recursively until either:
// - all have been found
// - all caches have been fetched
console.log(await multiCache.mget('key', 'key2'));
console.log(await multiCache.getMany('key', 'key2'));
// >> ['bar', 'bar2']

// Delete keys with mdel() passing arguments...
await multiCache.mdel('foo', 'foo2');
// Delete keys with delMany() passing arguments...
await multiCache.delMany('foo', 'foo2');
```
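The fetch waterfall described in the comments above can be sketched as a standalone helper; `getManyAcross` is hypothetical, mirroring the loop in `src/multi-caching.ts`.

```typescript
// Query tiers in priority order, filling in only the slots that are
// still undefined, and stop early once every key has been found.
type Tier = { getMany: (...keys: string[]) => Promise<unknown[]> };

const getManyAcross = async (tiers: Tier[], ...keys: string[]): Promise<unknown[]> => {
  const values: unknown[] = new Array(keys.length).fill(undefined);
  for (const tier of tiers) {
    if (values.every((v) => v !== undefined)) break; // everything found already
    try {
      const found = await tier.getMany(...keys);
      found.forEach((v, i) => {
        // keep the value from the highest-priority tier that had it
        if (values[i] === undefined && v !== undefined) values[i] = v;
      });
    } catch {
      // a failing tier is skipped, mirroring the library's behavior
    }
  }
  return values;
};
```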

See unit tests in [`test/multi-caching.test.ts`](./test/multi-caching.test.ts) for more information.

### Refresh cache keys in background

Both the `caching` and `multicaching` modules support a mechanism to refresh expiring cache keys in the background when using the `wrap` function.
@@ -133,26 +132,34 @@ following same rules as standard fetching. In the meantime, the system will retu

NOTES:

* In case of multicaching, the store that will be checked for refresh is the one where the key will be found first (highest priority).
* If the threshold is low and the worker function is slow, the key may expire and you may encounter a racing condition with updating values.
* The background refresh mechanism currently does not support providing multiple keys to `wrap` function.
* If no `ttl` is set for the key, the refresh mechanism will not be triggered. For redis, the `ttl` is set to -1 by default.
- In case of multicaching, the store checked for refresh is the one where the key is found first (highest priority).
- If the threshold is low and the worker function is slow, the key may expire and you may run into a race condition while updating values.
- The background refresh mechanism currently does not support providing multiple keys to the `wrap` function.
- If no `ttl` is set for the key, the refresh mechanism will not be triggered. For Redis, the `ttl` is set to -1 by default.

For example, pass the `refreshThreshold` to `caching` like this:

```typescript
const memoryCache = await caching('memory', {
max: 100,
ttl: 10 * 1000 /*milliseconds*/,
refreshThreshold: 3 * 1000 /*milliseconds*/,
});
import { createLruStore, caching } from 'polycache-core';

const memoryCache = caching(
createLruStore({
max: 100,
ttl: 10 * 1000 /*milliseconds*/,
}),
{
refreshThreshold: 3 * 1000 /*milliseconds*/,
},
);
```

When a value is retrieved with a remaining TTL of less than 3 seconds, it will be updated in the background.
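The refresh-ahead behavior can be sketched roughly as follows. `wrapWithRefresh` is a simplified, hypothetical helper assuming a store that exposes `ttl()` as in the `Store` type; it omits the promise coalescing the real `wrap` performs.

```typescript
type RefreshableCache<T> = {
  get: (key: string) => Promise<T | undefined>;
  set: (key: string, value: T, ttl?: number) => Promise<void>;
  ttl: (key: string) => Promise<number>;
};

const wrapWithRefresh = async <T>(
  cache: RefreshableCache<T>,
  key: string,
  fn: () => Promise<T>,
  ttl: number,
  refreshThreshold: number,
): Promise<T> => {
  const cached = await cache.get(key);
  if (cached === undefined) {
    // plain miss: compute, store, return
    const value = await fn();
    await cache.set(key, value, ttl);
    return value;
  }
  const remaining = await cache.ttl(key);
  if (remaining >= 0 && remaining < refreshThreshold) {
    // close to expiry: serve the cached value now and refresh it in the
    // background; refresh errors are swallowed so the caller is unaffected
    fn()
      .then((value) => cache.set(key, value, ttl))
      .catch(() => {});
  }
  return cached;
};
```

Note the race the notes above warn about: if the worker is slower than the remaining TTL, the key can expire before the background `set` completes.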

## Store Engines
Under construction...

* Built-in LRU store for in-memory caching.

Under construction...

## Contribute

14 changes: 7 additions & 7 deletions src/multi-caching.ts
@@ -2,7 +2,7 @@ import { coalesceAsync } from 'promise-coalesce';
import { Cache, Milliseconds, WrapOptions } from './types';
import { resolveTTL } from './utils';

export type MultiCache = Omit<Cache, 'store'> & Pick<Cache['store'], 'mset' | 'mget' | 'mdel'>;
export type MultiCache = Omit<Cache, 'store'> & Pick<Cache['store'], 'setMany' | 'getMany' | 'delMany'>;

/**
* Module that lets you specify a hierarchy of caches.
@@ -51,24 +51,24 @@ export const multiCaching = <C extends Cache[]>(caches: C): MultiCache => {
reset: async () => {
await Promise.all(caches.map((x) => x.reset()));
},
mget: async (...keys: string[]) => {
getMany: async (...keys: string[]) => {
const values = new Array(keys.length).fill(undefined);
for (const cache of caches) {
if (values.every((x) => x !== undefined)) break;
try {
const val = await cache.store.mget(...keys);
const val = await cache.store.getMany(...keys);
val.forEach((v, i) => {
if (values[i] === undefined && v !== undefined) values[i] = v;
});
} catch (e) {}
}
return values;
},
mset: async (args: [string, unknown][], ttl?: Milliseconds) => {
await Promise.all(caches.map((cache) => cache.store.mset(args, ttl)));
setMany: async (args: [string, unknown][], ttl?: Milliseconds) => {
await Promise.all(caches.map((cache) => cache.store.setMany(args, ttl)));
},
mdel: async (...keys: string[]) => {
await Promise.all(caches.map((cache) => cache.store.mdel(...keys)));
delMany: async (...keys: string[]) => {
await Promise.all(caches.map((cache) => cache.store.delMany(...keys)));
},
};
};
11 changes: 6 additions & 5 deletions src/stores/lru.ts
@@ -29,7 +29,7 @@ export type LRUStore = Store & {
/**
* Wrapper for lru-cache.
*/
export function lruStore(args?: LRUConfig): LRUStore {
export const createLruStore = (args?: LRUConfig): LRUStore => {
const shouldCloneBeforeSet = args?.shouldCloneBeforeSet !== false; // clone by default
const isCacheable = args?.isCacheable ?? ((val) => val !== undefined);

@@ -43,13 +43,14 @@ export function lruStore(args?: LRUConfig): LRUStore {
const lruCache = new LRUCache(lruOpts);

return {
name: 'memory-lru',
del: async (key) => {
lruCache.delete(key);
},
get: async <T>(key: string) => lruCache.get(key) as T,
keys: async () => [...lruCache.keys()],
mget: async (...args) => args.map((x) => lruCache.get(x)),
mset: async (args, ttl?) => {
getMany: async (...args) => args.map((x) => lruCache.get(x)),
setMany: async (args, ttl?) => {
const opt = { ttl: ttl !== undefined ? ttl : lruOpts.ttl } as const;
for (const [key, value] of args) {
if (!isCacheable(value)) {
@@ -63,7 +64,7 @@ export function lruStore(args?: LRUConfig): LRUStore {
}
}
},
mdel: async (...args) => {
delMany: async (...args) => {
for (const key of args) {
lruCache.delete(key);
}
@@ -100,4 +101,4 @@
*/
load: (...args: Parameters<LRU['load']>) => lruCache.load(...args),
};
}
};
9 changes: 5 additions & 4 deletions src/types.ts
@@ -1,13 +1,14 @@
export type Milliseconds = number;

export type Store = {
name: string;
get<T>(key: string): Promise<T | undefined>;
set<T>(key: string, data: T, ttl?: Milliseconds): Promise<void>;
del(key: string): Promise<void>;
reset(): Promise<void>;
mset(args: [string, unknown][], ttl?: Milliseconds): Promise<void>;
mget(...args: string[]): Promise<unknown[]>;
mdel(...args: string[]): Promise<void>;
setMany(args: [string, unknown][], ttl?: Milliseconds): Promise<void>;
getMany(...args: string[]): Promise<unknown[]>;
delMany(...args: string[]): Promise<void>;
keys(pattern?: string): Promise<string[]>;
ttl(key: string): Promise<number>;
};
@@ -27,6 +28,6 @@ export type Cache<S extends Store = Store> = {
get: <T>(key: string) => Promise<T | undefined>;
del: (key: string) => Promise<void>;
reset: () => Promise<void>;
wrap<T>(key: string, fn: () => Promise<T>, ttl?: WrapOptions<T>): Promise<T>;
wrap: <T>(key: string, fn: () => Promise<T>, ttl?: WrapOptions<T>) => Promise<T>;
store: S;
};
