feat(chunk): added support for chunking (by time or idle) filter and map operations
grantila committed Dec 30, 2021
1 parent eb021db commit 0280855
Showing 3 changed files with 353 additions and 10 deletions.
43 changes: 43 additions & 0 deletions README.md
@@ -252,6 +252,29 @@ const outArray = await filter( inArray, filterFun );
const outArray = await filter( inArray, { concurrency: 4 }, filterFun );
```

### filter operations chunked by idle time

Some filter operations (predicate functions) are computationally heavy. To avoid starving the system (e.g. a browser) of CPU resources, the filtering can be split into smaller chunks, scheduled with either `setTimeout( 0 )` or [`requestIdleCallback`](https://developer.mozilla.org/en-US/docs/Web/API/Window/requestIdleCallback).

The options object used to specify concurrency can instead specify `chunk`. This implies a concurrency of 1, i.e. no concurrency. Chunking is mostly useful for synchronously heavy operations, not asynchronous ones.

Specify a chunk time explicitly, e.g. 50ms:

```ts
import { filter } from 'already'

const outArray = await filter( inArray, { chunk: 50 }, filterFun );
```

or use `requestIdleCallback` to try to maintain a hang-free experience in browsers:

```ts
import { filter } from 'already'

const outArray = await filter( inArray, { chunk: 'idle' }, filterFun );
```
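
For a sense of when chunking pays off, here is a sketch with a hypothetical, synchronously heavy predicate (`isPrime` and `candidates` are illustrative only, not part of the library):

```ts
import { filter } from 'already'

// Hypothetical CPU-bound predicate: trial-division primality test
const isPrime = ( n: number ) =>
{
	for ( let d = 2; d * d <= n; ++d )
		if ( n % d === 0 )
			return false;
	return n > 1;
};

// Without chunking, a long run of isPrime calls would block the event
// loop; with { chunk: 'idle' } the work is spread over idle periods.
const primes = await filter( candidates, { chunk: 'idle' }, isPrime );
```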


## map

As with `filter`, `map` acts as if it awaits all promises in an array and then applies `array.map( )` to the result. Also, just like `filter`, it awaits the promises returned by the map callback (if they actually are promises).
@@ -286,6 +309,26 @@ const outArray = await map( inArray, mapFun );
const outArray = await map( inArray, { concurrency: 4 }, mapFun );
```

### map operations chunked by idle time

Some map operations (map callbacks) are computationally heavy, just like `filter` predicates. For [the same reasons](#filter-operations-chunked-by-idle-time), you can specify `chunk` to split a map operation into chunks, so that (synchronously) heavy map callbacks don't starve the system of CPU resources:

Specify a chunk time explicitly, e.g. 50ms:

```ts
import { map } from 'already'

const outArray = await map( inArray, { chunk: 50 }, mapFun );
```

or use `requestIdleCallback` to try to maintain a hang-free experience in browsers:

```ts
import { map } from 'already'

const outArray = await map( inArray, { chunk: 'idle' }, mapFun );
```
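
As a similar sketch for `map` (the `smallNumbers` array and naive `slowFib` are hypothetical), a synchronously heavy map callback can be chunked the same way; keep in mind that `chunk` implies a concurrency of 1:

```ts
import { map } from 'already'

// Hypothetical CPU-bound transform: naive recursive Fibonacci
const slowFib = ( n: number ): number =>
	n < 2 ? n : slowFib( n - 1 ) + slowFib( n - 2 );

// Each chunk runs for roughly 50 ms before yielding to the event loop
const fibs = await map( smallNumbers, { chunk: 50 }, slowFib );
```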


## flatMap

105 changes: 95 additions & 10 deletions lib/index.ts
@@ -228,8 +228,14 @@ export interface ConcurrencyOptions
{
concurrency: number;
}
export type FilterMapOptions = Partial< ConcurrencyOptions >;
const defaultFilterMapOptions: FilterMapOptions = { concurrency: Infinity };
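// `chunk` is either a time budget in milliseconds, or 'idle' to schedule
// chunks via requestIdleCallback when available (falling back to setTimeout).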
export interface ChunkOptions
{
chunk: number | 'idle';
}
export type FilterMapOptions = Partial< ConcurrencyOptions | ChunkOptions >;
const defaultFilterMapOptions: FilterMapOptions = {
concurrency: Infinity
};

export type MapArray< T > =
Array< T | PromiseLike< T > > |
@@ -349,25 +355,104 @@ export function map< T, U >(
? defaultFilterMapOptions
: < FilterMapOptions >arr;

const { concurrency = Infinity } = opts;

const promiseMapFn =
( t: T, index: number, arr: ConcatArray< T | PromiseLike< T > > ) =>
Promise.resolve( ( < MapFn< T, U > >mapFn )( t, index, arr ) );

const { concurrency = Infinity } = opts as Partial< ConcurrencyOptions >;
const { chunk } = opts as Partial< ChunkOptions >;

if ( typeof chunk !== 'undefined' )
{
if ( typeof chunk !== 'number' && chunk !== 'idle' )
throw new Error( `Invalid 'chunk' option to 'map': ${chunk}` );

const useIdle = chunk === 'idle';
const timeout =
chunk === 'idle'
? 15 // 15 ms, leaving ~1.666 ms of a 16.666 ms (60 fps) frame for other browser work
: chunk;

const looper = async ( cb: ( timeout: number ) => Promise< void > ) =>
new Promise< void >( ( resolve, reject ) =>
{
const resolver = async ( timeout: number ) =>
{
try
{
await cb( timeout );
resolve( );
}
catch ( err )
{
reject( err );
}
};

( useIdle && typeof requestIdleCallback !== 'undefined' )
? requestIdleCallback(
idleDeadline =>
resolver(
// Round down to whole milliseconds and subtract 1.
// That leaves a little time for the browser to do
// whatever it wants, and to some degree prevents us
// from stepping over the time budget.
Math.floor( idleDeadline.timeRemaining( ) ) - 1
),
{ timeout: 0 }
)
: setTimeout( ( ) => resolver( timeout ), 0 );
} );
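
// Chunked (non-concurrent) path: items are mapped one at a time, and
// control is yielded between chunks so the event loop isn't starved.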

return ( t: ConcatArray< T | PromiseLike< T > > )
: Promise< Array< U > > =>
{
return Promise.resolve( t )
.then( async ( values: ConcatArray< T | PromiseLike< T > > ) =>
{
const arr = toReadonlyArray( values );
const ret: Array< U > = [ ];
let i = 0;
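
// Map as many items as fit within `timeout` ms, then hand the remaining
// items back to `looper` (requestIdleCallback or setTimeout) to continue
// in a later chunk.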

const loop = async ( timeout: number ) =>
{
const start = Date.now( );

for ( ; i < arr.length; )
{
const u = await promiseMapFn( await arr[ i ], i, arr );
ret.push( u );
++i;

if ( Date.now( ) - start >= timeout )
{
await looper( loop );
break;
}
}
};

await loop( timeout );

return ret;
} )
.then( values => Promise.all( values ) );
};
}

const concurrently = concurrent( concurrency );

return ( t: ConcatArray< T | PromiseLike< T > > )
: Promise< Array< U > > =>
{
return Promise.resolve( t )
.then( ( values: ConcatArray< T | PromiseLike< T > > ) =>
toReadonlyArray( values ).map(
( val, index, arr ) =>
( ( ) => Promise.resolve( val ) )( )
.then( ( val: T ) =>
concurrently( promiseMapFn, val, index, arr )
)
toReadonlyArray( values )
.map( ( val, index, arr ) =>
( ( ) => Promise.resolve( val ) )( )
.then( ( val: T ) =>
concurrently( promiseMapFn, val, index, arr )
)
)
)
.then( values => Promise.all( values ) );
