
Async Reply functions (always emit errors) #1596

Merged
merged 6 commits into from Aug 1, 2019

Conversation

mastermatt
Member

@mastermatt mastermatt commented Jun 24, 2019

This is a POC for #1557 and is offered as an Option B to #1572.

The difference between this change and #1572 is that this takes a hard line on how to handle errors in reply functions. It always emits an error (105fda4).

I personally believe this is the right design long term but it does introduce a breaking change.

Using the example from the docs:

const scope = nock('http://www.google.com')
  .post('/echo')
  .reply(201, (uri, requestBody, cb) => {
    fs.readFile('cat-poems.txt', cb) // Error-first callback
  })

Currently, if readFile errs, the request will succeed with a response that has a 500 status code.
To achieve the same effect, the consumer would have to rewrite the code as:

const scope = nock('http://www.google.com')
  .post('/echo')
  .reply((uri, requestBody, cb) => {
    fs.readFile('cat-poems.txt', (err, contents) => {
      if (err) cb([500, err.stack])
      else cb([201, contents])
    })
  })

However, I would argue that this is neither the standard use case nor the most intuitive way to handle errors in a test.

This check is no longer reachable because `util.promisify` will ignore
subsequent calls to the callback method.
@mastermatt mastermatt force-pushed the async-reply-fns-always-emit-errors branch from 98ad7cb to 9f815e8 on July 15, 2019 20:58
@paulmelnikow
Member

Hey Matt, is your feeling still that this is a better way to go? If it is I will aim to dig in and leave some comments in the next day or so.

@mastermatt
Member Author

This version gets my vote unless we deem the breaking changes undesirable.

# Conflicts:
#	tests/test_reply_function_async.js
tests/test_reply_function_async.js
.get('/')
.reply(200, async (path, requestBody) => {
t.equal(path, '/')
t.equal(requestBody, '')
Member

It would make a better assertion if you passed something more specific in, probably both for the path and the body.

@@ -307,41 +306,29 @@ function RequestOverrider(req, options, interceptors, remove) {
if (interceptor.replyFunction) {
const parsedRequestBody = parseJSONRequestBody(req, requestBody)

-      if (interceptor.replyFunction.length === 3) {
+      let fn = interceptor.replyFunction
+      if (fn.length === 3) {
Member

You could also consider checking if it is an AsyncFunction (and in that case not promisifying, regardless of the number of arguments):

https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/AsyncFunction

It ensures that an async reply function is awaited, even if the arguments have gotten messed up.

Member Author

I have reservations about this. An async function and a function returning a promise should be treated the same in my book, but there is no way to sniff for the latter.

With some quick testing, it looks like even if an async fn gets wrapped in promisify it will resolve the whole chain as desired.

Member

I agree with the principle that an async function and a promise should be treated the same.

A conflicting principle (in this case) is that the number of arguments to a promise-returning function should not affect its behavior. I see why we did that here, and when we lived in the world of callbacks, why it would be necessary. Though going forward ideally we wouldn't care about the number of arguments to the function. My goal is to avoid surprises when someone makes a mistake.

I don't think we need to belabor this (and it's not so important in the scheme of things); I just wanted to convey where I was coming from.
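The two detection strategies discussed in this thread can be sketched as follows (the function bodies are hypothetical stand-ins for user-provided reply functions):

```javascript
// Hypothetical stand-ins for user-provided reply functions.
const cbFn = (uri, requestBody, cb) => cb(null, 'body')
const asyncFn = async (uri, requestBody) => 'body'
const promiseFn = (uri, requestBody) => Promise.resolve('body')

// Arity-based detection (what the diff above does): promisify only
// when the function declares a third, callback parameter.
console.log(cbFn.length === 3) // true
console.log(asyncFn.length === 3) // false

// AsyncFunction-based detection (the reviewer's suggestion): check
// the constructor name instead of the argument count.
console.log(asyncFn.constructor.name === 'AsyncFunction') // true
console.log(cbFn.constructor.name === 'AsyncFunction') // false

// The caveat raised above: a plain function that returns a promise
// cannot be sniffed this way.
console.log(promiseFn.constructor.name) // 'Function'
```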

@mastermatt
Member Author

@paulmelnikow Based on other convos we've had in other PRs, would the preference be to change the calls of t.rejects to use assert-rejects?

@paulmelnikow
Member

If we're going to change to a different test runner, I think we may as well keep using assertRejects, though I don't much care either way.

mastermatt and others added 2 commits July 28, 2019 21:47
Co-Authored-By: Paul Melnikow <github@paulmelnikow.com>
Co-Authored-By: Paul Melnikow <github@paulmelnikow.com>
@mastermatt mastermatt changed the title from "POC Async Reply functions (always emit errors)" to "Async Reply functions (always emit errors)" Jul 29, 2019
@mastermatt mastermatt requested a review from gr2m July 29, 2019 04:09
@mastermatt mastermatt merged commit 26fc08f into nock:beta Aug 1, 2019
@mastermatt mastermatt deleted the async-reply-fns-always-emit-errors branch August 1, 2019 19:47
@mastermatt mastermatt mentioned this pull request Aug 1, 2019
@nockbot
Collaborator

nockbot commented Aug 1, 2019

🎉 This PR is included in version 11.0.0-beta.31 🎉

The release is available on:

Your semantic-release bot 📦🚀

@nockbot
Collaborator

nockbot commented Aug 12, 2019

🎉 This PR is included in version 11.0.0 🎉

The release is available on:

Your semantic-release bot 📦🚀

gr2m pushed a commit that referenced this pull request Sep 4, 2019
Functions passed to `Interceptor.reply`, either to respond with an array of `[status, body, headers]` or with just the body, can now be async/promise-returning functions.

BREAKING CHANGE: uncaught errors thrown inside of user provided reply functions, whether async or not, will no longer be caught, and will no longer generate a successful response with a status code of 500. Instead, the error will be emitted by the request just like any other unhandled error during the request processing.