
Catch exceptions #224

Merged: Avaq merged 1 commit into master from avaq/catch on Feb 26, 2018

Conversation

@Avaq Avaq commented Feb 15, 2018

So I finally reached a point where all the original unit tests are passing again. This marks an important milestone in my work towards implementing a solution for #153.

But the suspicion I mentioned having in #196 has been confirmed: these changes appear to affect performance significantly. In particular, the creation of Futures has slowed down, in many cases to about half of the original speed, and in some cases to as little as 10% of it. I don't yet understand why this happens.

[screenshot: benchmark results]

Additional things to do:

  • Restore test coverage
  • Ensure the recovery branch has a consistent type
  • Ensure fork is not used in tests asserting _interpret - also rename test cases
  • Ensure that the @@type property is bumped
  • Fix the performance issues

/cc @Beanow

@Avaq Avaq self-assigned this Feb 15, 2018

codecov-io commented Feb 15, 2018

Codecov Report

Merging #224 into master will not change coverage.
The diff coverage is 100%.

Impacted file tree graph

@@          Coverage Diff          @@
##           master   #224   +/-   ##
=====================================
  Coverage     100%   100%           
=====================================
  Files          48     48           
  Lines        1046   1143   +97     
=====================================
+ Hits         1046   1143   +97
Impacted Files Coverage Δ
src/dispatchers/extract-right.js 100% <100%> (ø) ⬆️
src/parallel.js 100% <100%> (ø) ⬆️
src/internal/error.js 100% <100%> (ø)
src/dispatchers/chain-rej.js 100% <100%> (ø) ⬆️
src/internal/const.js 100% <100%> (ø) ⬆️
src/dispatchers/bimap.js 100% <100%> (ø) ⬆️
src/dispatchers/alt.js 100% <100%> (ø) ⬆️
src/core.js 100% <100%> (ø) ⬆️
src/encase-n2.js 100% <100%> (ø) ⬆️
src/dispatchers/promise.js 100% <100%> (ø) ⬆️
... and 35 more

Continue to review full report at Codecov.

Legend
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 6bbe351...9d07be2.

Avaq commented Feb 16, 2018

The benchmark results actually look quite different in Node v8. I was using v9 before.

┌─────────────────────────────────────────┬──────────────────┬──────────────────┬────────┬─────────┬───┐
│ suite                                   │ Old              │ New              │ diff   │ change  │ α │
├─────────────────────────────────────────┼──────────────────┼──────────────────┼────────┼─────────┼───┤
│ def.construct.Future                    │ 7,998,149 ±0.68% │ 7,258,871 ±1.14% │ 004.8% │ -009.2% │   │
│ def.construct.of                        │ 7,261,358 ±0.86% │ 6,617,226 ±1.37% │ 004.6% │ -008.9% │   │
│ def.construct.reject                    │ 7,370,939 ±0.69% │ 6,045,733 ±0.76% │ 009.9% │ -018.0% │   │
│ def.construct.after                     │ 6,954,759 ±1.51% │ 5,605,209 ±0.48% │ 010.7% │ -019.4% │ ✗ │
│ def.construct.attempt                   │ 6,064,616 ±2.66% │ 4,434,121 ±5.59% │ 015.5% │ -026.9% │ ✗ │
│ def.construct.cache                     │ 4,264,237 ±7.22% │ 3,729,538 ±4.98% │ 006.7% │ -012.5% │   │
│ def.construct.chainRec                  │ 5,914,881 ±2.39% │ 5,387,866 ±1.54% │ 004.7% │ -008.9% │   │
│ def.construct.encase                    │ 5,663,542 ±1.48% │ 4,448,120 ±2.83% │ 012.0% │ -021.5% │ ✗ │
│ def.construct.encaseN                   │ 4,734,732 ±6.55% │ 4,182,550 ±4.26% │ 006.2% │ -011.7% │   │
│ def.construct.encaseP                   │ 5,494,980 ±2.93% │ 5,058,328 ±0.81% │ 004.1% │ -007.9% │   │
│ def.construct.go                        │ 6,073,758 ±1.91% │ 5,529,125 ±1.60% │ 004.7% │ -009.0% │   │
│ def.construct.hook                      │ 4,436,183 ±2.24% │ 3,745,082 ±1.48% │ 008.4% │ -015.6% │   │
│ def.construct.node                      │ 6,075,830 ±1.22% │ 5,242,111 ±1.09% │ 007.4% │ -013.7% │   │
│ def.construct.parallel                  │ 4,740,475 ±0.70% │ 3,109,314 ±3.18% │ 020.8% │ -034.4% │ ✗ │
│ def.transform.ap                        │ 130,686 ±3.71%   │ 118,276 ±4.62%   │ 005.0% │ -009.5% │   │
│ def.transform.map                       │ 361,595 ±5.57%   │ 433,094 ±2.07%   │ 009.0% │ +019.8% │   │
│ def.transform.chain                     │ 174,833 ±2.76%   │ 155,470 ±4.46%   │ 005.9% │ -011.1% │   │
│ run.construct.parallel.empty            │ 5,790,409 ±2.71% │ 4,172,110 ±3.67% │ 016.2% │ -027.9% │ ✗ │
│ run.construct.parallel.small.sequential │ 366,858 ±3.93%   │ 375,237 ±2.77%   │ 001.1% │ +002.3% │   │
│ run.construct.parallel.small.concurrent │ 333,434 ±4.10%   │ 380,925 ±2.69%   │ 006.6% │ +014.2% │   │
│ run.construct.parallel.big.sequential   │ 14,344 ±4.75%    │ 16,291 ±6.95%    │ 006.4% │ +013.6% │   │
│ run.construct.parallel.big.concurrent   │ 19,117 ±2.92%    │ 19,891 ±1.49%    │ 002.0% │ +004.0% │   │
│ run.transform.sync.ap                   │ 112,592 ±2.48%   │ 109,112 ±2.37%   │ 001.6% │ -003.1% │   │
│ run.transform.sync.map                  │ 298,304 ±0.92%   │ 248,961 ±4.87%   │ 009.0% │ -016.5% │   │
│ run.transform.sync.chain.one            │ 99,727 ±9.64%    │ 125,796 ±2.92%   │ 011.6% │ +026.1% │ ✓ │
│ run.transform.sync.chain.many           │ 149 ±4.68%       │ 164 ±3.68%       │ 004.7% │ +009.9% │   │
│ run.transform.async.ap                  │ 51,058 ±1.89%    │ 47,973 ±3.26%    │ 003.1% │ -006.0% │   │
│ run.transform.async.map                 │ 97,264 ±2.62%    │ 92,465 ±2.03%    │ 002.5% │ -004.9% │   │
│ run.transform.async.chain.one           │ 57,758 ±1.72%    │ 55,297 ±2.03%    │ 002.2% │ -004.3% │   │
│ run.transform.async.chain.many          │ 100 ±4.14%       │ 99 ±2.95%        │ 000.4% │ -000.8% │   │
│ run.transform.async.race.fast-vs-slow   │ 59,531 ±11.25%   │ 64,229 ±4.42%    │ 003.8% │ +007.9% │   │
│ run.transform.async.race.slow-vs-fast   │ 69,577 ±3.11%    │ 64,617 ±2.15%    │ 003.7% │ -007.1% │   │
│ run.transform.async.race.slow-vs-slow   │ 782 ±1.57%       │ 780 ±0.81%       │ 000.1% │ -000.2% │   │
└─────────────────────────────────────────┴──────────────────┴──────────────────┴────────┴─────────┴───┘

Avaq commented Feb 19, 2018

Given:

const unsafe = Math.random() > 0.5 ? {foo: 1} : null
const noop = () => {}

I'm a bit concerned that the following:

Future((rej, res) => {
  const foo = unsafe.foo
  res(foo)
})._interpret(noop, noop, noop)

Will not be equal to:

const foo = unsafe.foo
Future((rej, res) => {
  res(foo)
})._interpret(noop, noop, noop)

Essentially, the exception-catching feature takes away some of our ability to reason about, and safely refactor, certain code.
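
A minimal sketch of the observable difference, assuming unsafe happens to be null and assuming the _interpret(crash, rej, res) argument order used in this PR (not stable API):

const crash = e => console.log('caught by Fluture:', e.message)

// Property access inside the computation: the new exception catching wraps
// the TypeError in a new Error and delivers it to the crash continuation.
Future((rej, res) => {
  res(unsafe.foo)
})._interpret(crash, noop, noop)

// Property access outside the computation: the TypeError is thrown
// synchronously, before any Future exists, and Fluture never sees it.
try {
  const foo = unsafe.foo
  Future((rej, res) => { res(foo) })._interpret(crash, noop, noop)
} catch (e) {
  console.log('thrown synchronously:', e.message)
}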

Beanow commented Feb 19, 2018

@Avaq I would not have considered these to be equivalent either way. Especially when considering potential failures.

Should unsafe.foo, for example, be a configuration value, I would strongly favour doing related evaluations as early as possible, so bad configuration can crash the program early. Additionally, if such a Future is generated repeatedly by a factory, it would be wasteful to evaluate unsafe.foo each time (unless unsafe is supposed to be mutable).

On the other hand, if this unsafe.foo is a value obtained through less predictable means, such as a database or an external API, it would be swell to contain the failure and prevent crashes, even if it would arguably be an implementation error to be fixed in the code rather than a user error.

@Avaq Avaq changed the title from "WIP: Catch exceptions" to "Catch exceptions" on Feb 22, 2018

@Avaq Avaq left a comment

So except for the potential performance issues (which I could only observe in Node 9.x), this is now no longer a WIP. Anybody care to review some of this?


if(typeof Object.create !== 'function') error('Please polyfill Object.create to use Fluture');
if(typeof Object.assign !== 'function') error('Please polyfill Object.assign to use Fluture');
if(typeof Array.isArray !== 'function') error('Please polyfill Array.isArray to use Fluture');

@Avaq:

This chunk was useless, because rollup changes the order of code evaluation in such a way that this code is reached only after some parts have already tried to use these functions. It's also not super important, as it only served to slightly improve the error messages.

@@ -20,7 +14,7 @@ Future.map = map;
Future.bimap = bimap;
Future.chain = chain;

-var Par = concurrify(Future, never, race, parallelAp);
+var Par = concurrify(Future, never, race, function parallelAp(a, b){ return b._parallelAp(a) });

@Avaq:

I changed parallelAp from a static constructor to a standard Action. This saved me from writing some extra code.

"test:build": "npm run clean && npm run build && mocha --require @std/esm --ui bdd --reporter dot --bail test/**.buildtest.js",
"test:coverage": "npm run clean && nyc --include src mocha --require @std/esm --ui bdd --reporter dot test/**.test.js || true",
"coverage:upload": "nyc report --reporter=text-lcov > coverage.lcov && codecov",
"coverage:report": "nyc report --reporter=html",

@Avaq:

These changes allow tests to fail during the coverage run -- annoyingly, when testing toString, the coverage instrumentation sometimes breaks the expected output.

Avaq commented Feb 22, 2018

I just flipped the libraries I'm comparing, so Old shows the results for the new code and New shows the results for the old code, and it turns out that the results are exactly the same. So we are getting false positives. It might be some optimisation introduced in v9 that for some reason doesn't apply to the right-hand side of the comparison.

┌────────────────────────┬──────────────────────────────┬─────────────────────────────┬────────┬─────────┬───┐
│ suite                  │ Old                          │ New                         │ diff   │ change  │ α │
├────────────────────────┼──────────────────────────────┼─────────────────────────────┼────────┼─────────┼───┤
│ def.construct.Future   │ 277,883,345 Hz ±0.66% (n 90) │ 25,408,997 Hz ±1.12% (n 83) │ 083.2% │ -090.9% │ ✗ │
├────────────────────────┼──────────────────────────────┼─────────────────────────────┼────────┼─────────┼───┤
│ def.construct.of       │ 166,840,514 Hz ±0.38% (n 86) │ 30,217,530 Hz ±1.00% (n 85) │ 069.3% │ -081.9% │ ✗ │
├────────────────────────┼──────────────────────────────┼─────────────────────────────┼────────┼─────────┼───┤
│ def.construct.reject   │ 177,376,212 Hz ±1.21% (n 78) │ 30,463,316 Hz ±2.22% (n 83) │ 070.7% │ -082.8% │ ✗ │
├────────────────────────┼──────────────────────────────┼─────────────────────────────┼────────┼─────────┼───┤
│ def.construct.after    │ 46,805,158 Hz ±0.48% (n 83)  │ 19,378,201 Hz ±0.93% (n 81) │ 041.4% │ -058.6% │ ✗ │
├────────────────────────┼──────────────────────────────┼─────────────────────────────┼────────┼─────────┼───┤
│ def.construct.attempt  │ 26,059,718 Hz ±0.74% (n 86)  │ 16,275,959 Hz ±1.09% (n 79) │ 023.1% │ -037.5% │ ✗ │
├────────────────────────┼──────────────────────────────┼─────────────────────────────┼────────┼─────────┼───┤
│ def.construct.cache    │ 10,537,069 Hz ±1.87% (n 87)  │ 9,691,659 Hz ±0.91% (n 91)  │ 004.2% │ -008.0% │   │
├────────────────────────┼──────────────────────────────┼─────────────────────────────┼────────┼─────────┼───┤
│ def.construct.chainRec │ 18,929,958 Hz ±2.29% (n 83)  │ 11,663,289 Hz ±1.81% (n 75) │ 023.8% │ -038.4% │ ✗ │
├────────────────────────┼──────────────────────────────┼─────────────────────────────┼────────┼─────────┼───┤
│ def.construct.encase   │ 20,333,009 Hz ±1.76% (n 76)  │ 11,697,816 Hz ±0.90% (n 88) │ 027.0% │ -042.5% │ ✗ │
├────────────────────────┼──────────────────────────────┼─────────────────────────────┼────────┼─────────┼───┤
│ def.construct.encaseN  │ 18,895,576 Hz ±1.30% (n 82)  │ 10,578,323 Hz ±0.99% (n 85) │ 028.2% │ -044.0% │ ✗ │
├────────────────────────┼──────────────────────────────┼─────────────────────────────┼────────┼─────────┼───┤
│ def.construct.encaseP  │ 18,758,250 Hz ±0.72% (n 86)  │ 9,836,303 Hz ±0.77% (n 90)  │ 031.2% │ -047.6% │ ✗ │
├────────────────────────┼──────────────────────────────┼─────────────────────────────┼────────┼─────────┼───┤
│ def.construct.go       │ 16,231,686 Hz ±2.15% (n 77)  │ 13,508,766 Hz ±0.58% (n 92) │ 009.2% │ -016.8% │   │
├────────────────────────┼──────────────────────────────┼─────────────────────────────┼────────┼─────────┼───┤
│ def.construct.hook     │ 9,871,054 Hz ±0.64% (n 90)   │ 8,061,882 Hz ±0.69% (n 89)  │ 010.1% │ -018.3% │ ✗ │
├────────────────────────┼──────────────────────────────┼─────────────────────────────┼────────┼─────────┼───┤
│ def.construct.node     │ 27,726,998 Hz ±0.94% (n 80)  │ 12,686,311 Hz ±0.71% (n 85) │ 037.2% │ -054.2% │ ✗ │
├────────────────────────┼──────────────────────────────┼─────────────────────────────┼────────┼─────────┼───┤
│ def.construct.parallel │ 9,710,786 Hz ±1.15% (n 90)   │ 6,811,319 Hz ±0.48% (n 91)  │ 017.5% │ -029.9% │ ✗ │
└────────────────────────┴──────────────────────────────┴─────────────────────────────┴────────┴─────────┴───┘

Avaq commented Feb 23, 2018

Seeing as the diff is so huge, I would like to highlight some of the important changes and decisions, so that they can be discussed.

  1. Consistently typed recovery branch. When Fluture catches a value that was thrown, it is turned into a new Error no matter the original value. The message of this error will contain any message that could be extracted from the value thrown, plus some contextual information.

    export function someError(action, e, s){
      var context = typeof s === 'string' ? '\n\n In: ' + s : '';
      try{
        var name = e && e.name ? String(e.name) : 'An error';
        var errorMessage = (
          e && e.message ? String(e.message) :
          e && typeof e.toString === 'function' ? e.toString() :
          String(e)
        );
        var message = errorMessage.trim().split('\n').map(indent).join('\n');
        return error(name + ' came up while ' + action + ':\n' + message + context + '\n');
      }catch(_){
        return error('Something came up while ' + action + ', ' +
                     'but it could not be converted to string' + context + '\n');
      }
    }

    This means that users will not be able to mix error types in order to handle them differently; this is by design - an error in the recovery branch is always considered a bug. It also means that users can rely on the recovery branch having a consistent type.

  2. Asynchronously thrown exceptions. When a Future is forked, it might call the continuations synchronously if the underlying computation happens synchronously. This meant that if an exception was thrown, an outer catch statement would sometimes get the opportunity to catch it. I've removed this opportunity by having fork always throw the exception in the recovery branch on the next tick (a minimal sketch of this behaviour follows this list).

    export var raise = immediately(function raise(x){
      throw x;
    });

    Fluture/src/core.js, lines 129 to 134 at 85d4eb8:

    Future.prototype.fork = function Future$fork(rej, res){
      if(!isFuture(this)) throwInvalidContext('Future#fork', this);
      if(!isFunction(rej)) throwInvalidArgument('Future#fork', 0, 'to be a Function', rej);
      if(!isFunction(res)) throwInvalidArgument('Future#fork', 0, 'to be a Function', res);
      return this._interpret(raise, rej, res);
    };

    This means that if you don't specify a recovery function, a bug will always crash the program (or end up in a global exception handler, which is expected to then exit the program).
    It also means that Future((rej, res) => { otherFuture().fork(rej, compose(res, f)) }) is no longer strictly equivalent to otherFuture().map(f) in cases where otherFuture() resolves synchronously. To me that sounds like a small price to pay.

  3. Throwing the recovery reason on .promise(). Promises mix bugs with expected failures and in my view that's a flaw. Fluture keeps them separate. But what to do when casting a Future to a Promise? Currently the error is thrown.

    Fluture/src/core.js, lines 150 to 155 at 85d4eb8:

    Future.prototype.promise = function Future$promise(){
      var _this = this;
      return new Promise(function Future$promise$computation(res, rej){
        _this._interpret(raise, rej, res);
      });
    };

    This means that .promise() potentially throws (which has always been the case). But since users asked for a Promise, they might actually want the bugs and failures to mix.
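
A minimal sketch of the next-tick raise behaviour from point 2, using setTimeout as a stand-in for the internal scheduler (illustrative only, not Fluture source):

const immediately = f => x => { setTimeout(f, 0, x) }
const raise = immediately(function raise(x){ throw x })

try {
  raise(new Error('bug')) // only schedules the throw for the next tick
} catch (_) {
  // never reached: the exception escapes every synchronous try/catch and
  // crashes the process (or reaches a global uncaughtException handler)
}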

@rpominov

> This meant that if an exception was thrown, an outer catch statement would sometimes get the opportunity to catch it.

It's kinda hard for me to judge because I wouldn't wrap big chunks of computation into a try...catch, but maybe for people who do, it's actually desirable to have that kind of exception caught.

Also, I'm guessing that the solution involves rethrowing the exception. In this case, a debugger won't pause on the original exception, and users won't be able to inspect a frozen program. I, for one, rely a lot on this feature of a debugger.

> This means that .promise() potentially throws (which has always been the case). But since users asked for a Promise, they might actually want the bugs and failures to mix.

A possible solution would be to direct bugs into promises' failure path and direct both errors and normal results into promises' success, using an improvised Either. Basically what fun-task does https://github.com/rpominov/fun-task/blob/master/docs/api-reference.md#tasktopromiseoptions 😅

Avaq commented Feb 24, 2018

> This meant that if an exception was thrown, an outer catch statement would sometimes get the opportunity to catch it.

> It's kinda hard for me to judge because I wouldn't wrap big chunks of computation into a try...catch

Generally this kind of thing happens if there is some underlying abstraction calling your code, such as when the Future is forked as part of an Express middleware.

> A possible solution would be to direct bugs into promises' failure path and direct both errors and normal results into promises' success, using an improvised Either.

I like this idea, but I don't like introducing a competing Either type. Perhaps I can change the signature to:

promise :: (a -> c) -> (b -> c) -> Future a b -> Promise c

One would use it like:

const toPromise = Future.promise(S.Left, S.Right);

toPromise(Future.reject('meh')) // Promise<Left('meh')>
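
A minimal sketch of what such a dispatcher could look like, assuming the internal _interpret(crash, rej, res) argument order used elsewhere in this PR; the names and structure are illustrative, not a committed design:

const promise = (mapRejection, mapResolution) => future =>
  new Promise((res, rej) => {
    future._interpret(
      rej,                          // bugs still reject the Promise
      e => res(mapRejection(e)),    // expected failures resolve, e.g. as Left
      x => res(mapResolution(x))    // successes resolve, e.g. as Right
    )
  })

With that, toPromise(Future.reject('meh')) resolves to Left('meh') instead of rejecting, and only a crash rejects the returned Promise.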

@rpominov

> Future.promise(S.Left, S.Right)

This looks great 👍

Regarding rethrowing the exception on the next tick, I still think it's a bad idea. Maybe it's just me, but I can get very angry at times when a library messes with exceptions somehow and I need to fix a bug asap.

Avaq commented Feb 24, 2018

> I can get very angry at times when a library messes with exceptions somehow

My reasoning for this behaviour was that I wanted to make the behaviour more consistent. Nine out of ten times when you fork a Future that ends up throwing, it will throw halfway through the computation in some unexpected tick. The only time you can catch an exception thrown during computation is if it happens to throw in the same tick in which .fork was called. In my experience this can lead to surprises. Eliminating the possibility for catching these exceptions altogether seemed to me like it would lead to fewer surprises.

However, I think it's a very compelling point that this is kind of anti-tooling. It's probably easier to track an error that was thrown, caught, and rethrown all in the same tick, using various debuggers. I also like the idea that Future((rej, res) => { otherFuture().fork(rej, compose(res, f)) }) remains equivalent to otherFuture().map(f). This allows people to write their own transformations using Future() and fork(), without losing anything compared to the built-in transformations.

So in conclusion, I will experiment with synchronously thrown errors, and likely amend this PR. Thank you for your feedback Roman! :D

Avaq commented Feb 25, 2018

> Future.promise(S.Left, S.Right)

I'm going to leave this for another PR, leaving the signature (and behaviour) of .promise() unchanged for this one.

@Avaq Avaq merged commit dd15a8d into master Feb 26, 2018
@Avaq Avaq deleted the avaq/catch branch February 26, 2018 11:59
Avaq added a commit that referenced this pull request Jul 26, 2018
Breaking changes

- #224 Exceptions are now caught and rethrown in `fork`.
- #230 Many of the TypeScript type definitions that used `never` now use a
  generic instead.
- #238 When unsubscribing from a Future created by `hook`, the cancellation
  signal is no longer sent to the disposal Future.
- #266 Exported TypeScript interfaces have been renamed to avoid naming
  conflicts with exported values.

New features

- #250 Included interoperability with Sanctuary Show
- #261 Added a new function, `forkCatch`, which allows for exception recovery.
Avaq added commits that referenced this pull request on Jan 20 and Jan 21, 2019
When a Future created from a Promise would enter a crashed state later along a synchronous pipeline, the original Promise would catch the error and change its state to rejected. That rejection would never be handled, causing the Promise to swallow it and triggering unhandledRejection events in some Promise implementations.

Related to #150 and #224.
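
A minimal sketch of the failure mode described above, in plain Promise code (not Fluture source): the rest of the synchronous pipeline runs inside .then, so a crash there is captured by the Promise instead of surfacing.

const p = Promise.resolve(42)

p.then(function runRestOfPipeline(x) {
  // a bug later in the synchronous pipeline throws...
  throw new TypeError('boom')
})
// ...the Promise catches the throw and turns it into a rejection of the
// derived promise; since nothing handles that rejection, the error is
// swallowed and an unhandledRejection event fires instead of a crash.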