Fix unhandled Promise rejections in pipeline.exec
1. Always return the same Promise in pipeline.exec.

   An example of why the original Promise should be returned, rather than
   a new Promise that merely resolves to the original Promise:

   ```javascript
   process.on('unhandledRejection', (reason) => console.log('unhandledRejection', reason));
   const x = Promise.reject(new Error());
   x.catch((e) => console.log('caught x', e));

   const causesUnhandledRejection = Promise.resolve().then(() => x);
   ```
2. In the unlikely event that the `script exists`/`script load` command
   fails (e.g. due to networking issues or corrupted bytes), give up:
   swallow the rejection and use `finally` to run the commands in the
   pipeline anyway.

   This isn't the best approach, but it's hopefully better than an
   unhandled Promise rejection, a resource/memory leak, or a hanging
   request caused by a redis Promise that never resolves or rejects for
   an individual command in this edge case.
3. There are calls to `this.exec(...);` throughout the Pipeline that don't
   check whether the returned Promise is caught, which I believe could
   lead to unhandled Promise rejections when retrying failed commands.

   lib/autopipelining.js also called pipeline.exec without checking for
   errors.

   Because exactly one Promise instance can now be returned, these call
   sites can no longer cause unhandled Promise rejections: the example
   snippet in point 1 shows why, and `standard-as-callback` catches
   rejections on that Promise.
4. pipeline.exec can now explicitly call reject for ioredis's
   Redis.Cluster client.
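A minimal sketch of the fallback described in point 2. This is not the actual ioredis code: `exec`, `loadScripts`, and `runCommands` are hypothetical stand-ins, and the shared promise plays the role of `this.promise`.

```javascript
// Sketch of point 2 (hypothetical stand-ins, not ioredis APIs):
// preload scripts, but run the pipeline's commands regardless of whether
// preloading succeeded, settling one shared promise that every caller sees.
function exec(loadScripts, runCommands) {
  let resolve, reject;
  const promise = new Promise((res, rej) => {
    resolve = res;
    reject = rej;
  });
  loadScripts()
    .catch(() => {}) // give up on preloading (network issues, corrupted bytes, ...)
    .finally(() => runCommands().then(resolve, reject));
  return promise; // the one shared instance, like this.promise
}

// Even when preloading rejects, the pipeline still runs:
exec(
  () => Promise.reject(new Error("script load failed")),
  () => Promise.resolve(["OK"])
).then((results) => console.log(results)); // prints [ 'OK' ]
```

Swallowing the preload error rather than rejecting is the same trade-off the commit message acknowledges: running the commands without a guaranteed script cache beats a promise that never settles.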
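Point 3 can be illustrated with plain promises (a standalone sketch, not ioredis code; `execShared` and `execDerived` are made-up names): returning the same instance from every call means one attached rejection handler covers all callers, whereas each `.then`-derived promise is a new instance that needs its own handler.

```javascript
// One shared rejected promise, handled once:
const shared = Promise.reject(new Error("command failed"));
shared.catch(() => {}); // e.g. standard-as-callback attaching the callback

// Returning `shared` itself can never create a new unhandled rejection:
function execShared() {
  return shared;
}

// Returning a derived promise creates a brand-new instance whose
// rejection is NOT yet handled, even though it settles to the same error:
function execDerived() {
  return shared.then((v) => v);
}

const derived = execDerived();
derived.catch(() => {}); // each derived instance needs its own handler

console.log(execShared() === shared); // prints true
console.log(derived === shared); // prints false
```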
TysonAndre committed Nov 23, 2021
1 parent 77ad094 commit e2892ad
Showing 1 changed file with 22 additions and 11 deletions.
```diff
--- a/lib/pipeline.ts
+++ b/lib/pipeline.ts
@@ -240,7 +240,15 @@ Pipeline.prototype.execBuffer = deprecate(function () {
   return execBuffer.apply(this, arguments);
 }, "Pipeline#execBuffer: Use Pipeline#exec instead");
 
-Pipeline.prototype.exec = function (callback: CallbackFunction) {
+// NOTE: To avoid an unhandled promise rejection, this will unconditionally always return this.promise,
+// which always has the rejection handled by standard-as-callback
+// adding the provided rejection callback.
+//
+// If a different promise instance were returned, that promise would cause its own unhandled promise rejection
+// errors, even if that promise unconditionally resolved to **the resolved value of** this.promise.
+Pipeline.prototype.exec = function (
+  callback: CallbackFunction
+): Promise<Array<any>> {
   // Wait for the cluster to be connected, since we need nodes information before continuing
   if (this.isCluster && !this.redis.slots.length) {
     if (this.redis.status === "wait") this.redis.connect().catch(noop);
@@ -338,17 +346,19 @@ Pipeline.prototype.exec = function (callback: CallbackFunction) {
 
   // In cluster mode, always load scripts before running the pipeline
   if (this.isCluster) {
-    return pMap(scripts, (script) => _this.redis.script("load", script.lua), {
+    pMap(scripts, (script) => _this.redis.script("load", script.lua), {
       concurrency: 10,
-    }).then(function () {
-      for (let i = 0; i < scripts.length; i++) {
-        _this.redis._addedScriptHashes[scripts[i].sha] = true;
-      }
-      return execPipeline();
-    });
+    })
+      .then(function () {
+        for (let i = 0; i < scripts.length; i++) {
+          _this.redis._addedScriptHashes[scripts[i].sha] = true;
+        }
+      })
+      .finally(execPipeline);
+    return this.promise;
   }
 
-  return this.redis
+  this.redis
     .script(
       "exists",
       scripts.map(({ sha }) => sha)
@@ -371,8 +381,9 @@ Pipeline.prototype.exec = function (callback: CallbackFunction) {
       for (let i = 0; i < scripts.length; i++) {
         _this.redis._addedScriptHashes[scripts[i].sha] = true;
       }
-      return execPipeline();
-    });
+    })
+    .finally(execPipeline);
+  return this.promise;
 
   function execPipeline() {
     let data = "";
```
