Add more tests #3

Closed · 5 tasks done
gr2m opened this issue Dec 21, 2021 · 5 comments
Comments

@gr2m
Owner

gr2m commented Dec 21, 2021

Follow-up to moll/node-mitm#76 (comment). Thanks @moll.

@gr2m
Owner Author

gr2m commented Dec 22, 2021

@moll I think I addressed all your concerns? Can you think of anything else I should be testing?

gr2m closed this as completed Dec 22, 2021
@moll

moll commented Dec 22, 2021

Oh, well done! You're of course bound to see more bugs as it's effectively the same path Nock took — hooking into the requests/streams from the outside and having to ensure the API remains the same.

For example, I have a feeling the 693fc94 test case isn't exactly right. You're calling response.pause in the user's code, but that's not necessary. As a user you can do the following and get the expected output:

var Http = require("http")

Http.get("http://example.com", function(res) {
	setTimeout(function() {
		res.on("data", (data) => console.log("Data:", data.toString()))
	}, 500)
})

The hook into the response's data event in the recording code is what changes behavior: it immediately switches the stream into flowing mode. I suppose you could call res.pause in the recording code, but I suspect that might then break the automatic unpausing that happens when the user code attaches its own data listener.
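
A standalone sketch of that stream behavior, not taken from either codebase: an early "data" listener switches a not-explicitly-paused Readable into flowing mode, so a listener attached later misses the chunks.

const { Readable } = require("stream")

const stream = Readable.from(["hello"])

// Attaching a "data" listener to a stream that has not been explicitly
// paused switches it into flowing mode, so chunks are emitted right away.
stream.on("data", (chunk) => console.log("recorder saw:", chunk))

// A listener attached later, like the user code in the test, misses the
// chunks that already flowed to the first listener.
setTimeout(() => {
	stream.on("data", (chunk) => console.log("user saw:", chunk))
}, 500)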

Some other possible issues:

  • Did you test when calling write, end et al. with a chunk and a callback? I believe https://github.com/gr2m/http-recorder/blob/main/index.js#L32 would then be calling Buffer.from with a function as encoding (see the sketch after this list).
  • I'd add a test for pipe, too — both ways. Can't be sure it doesn't do something internal to streams that bypasses your hooks.
  • A minor thing, but given readable.setEncoding, the recorder converts the decoded strings back to buffers, presumably assuming UTF-8(?).
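
To make the first point concrete, here is a minimal standalone sketch of the write/end overload in question; the URL is a placeholder and the snippet is not taken from the recorder's code:

const http = require("http")

// request.write(chunk[, encoding][, callback]) lets the callback sit in the
// position where an encoding string would normally go.
const req = http.request("http://example.com", { method: "POST" })
req.on("error", (error) => console.error("request failed:", error.message))

req.write("part one", () => console.log("first chunk flushed"))

// Hooked code that blindly forwards its second argument as an encoding would
// effectively call Buffer.from("part two", callbackFn) here, which throws.
req.end("part two", () => console.log("request finished"))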

@gr2m
Owner Author

gr2m commented Dec 22, 2021

Did you test when calling write, end et al. with a chunk and a callback

Another great catch, thanks! I made an issue here: #17

I'd add a test for pipe, too — both ways. Can't be sure it doesn't do something internal to streams that bypasses your hooks.

I looked into it, but I couldn't figure out how to pipe with the low-level http APIs. I tested it with a request using got via #2. Can you share a both-ways pipe example using http.request? I'd really appreciate it.

That's a minor thing, but given readable.setEncoding, the recorder is converting strings back to buffers presuming UTF-8(?).

My thinking here was to normalize the requestBodyChunks passed to the record event listeners. I'm not worried about the performance overhead as this is a dev tool. Do you see any other problem? I will default the encoding to utf-8 as part of #17.
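
A hypothetical sketch of what that normalization could look like; the helper name and shape are assumptions, not the actual #17 change:

// Hypothetical helper: normalize the arguments of
// request.write(chunk[, encoding][, callback]) before recording a chunk.
function normalizeChunk(chunk, encoding) {
	if (typeof encoding === "function" || encoding === undefined) {
		encoding = "utf8" // assumed default, in the spirit of #17
	}
	return typeof chunk === "string" ? Buffer.from(chunk, encoding) : chunk
}

// normalizeChunk("hello", someCallback) -> <Buffer 68 65 6c 6c 6f>
// normalizeChunk(Buffer.from([0xde, 0xad])) -> the buffer, untouched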

@gr2m
Owner Author

gr2m commented Dec 22, 2021

it's effectively the same path Nock took

nock overwrites the entire http.ClientRequest class. The approach here is much more resilient.

@moll

moll commented Dec 22, 2021

I looked into it, but I couldn't figure out how to pipe with the low-level http APIs. I tested it with a request using got via #2. Can you share a both-ways pipe example using http.request? I'd really appreciate it

I suppose you'd create a stream you can write to, call pipe on it, and pass it the request. Something like the following (untested):

const { PassThrough } = require("stream")

const stream = new PassThrough() // any stream you can both write to and pipe from
stream.pipe(req)
stream.write("test")
stream.end()

That covers both ways: first into req, then from res.
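
For completeness, a hedged sketch of what a both-ways pipe around http.request might look like; the URL, the PassThrough body, and the file destination are placeholder assumptions:

const http = require("http")
const { PassThrough } = require("stream")
const fs = require("fs")

const req = http.request("http://example.com", { method: "POST" }, (res) => {
	// second direction: pipe *from* the response into any writable
	res.pipe(fs.createWriteStream("/tmp/response-body"))
})
req.on("error", (error) => console.error("request failed:", error.message))

// first direction: pipe *into* the request from any readable
const body = new PassThrough()
body.pipe(req)
body.write("test")
body.end()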

My thinking here was to normalize the requestBodyChunks passed to the the record event listeners. I'm not worried about the performance overhead as this is a dev tool. Do you see any other problem? I will default encoding to utf-8 as part of #17

Oh, I wouldn't care for performance either. Just correctness. I'm not sure you're normalizing the response; it seems more like it's converting decoded data back into a buffer, but what encoding is that presuming? Not all HTTP responses are text...

nock is overwriting the entire http.ClientRequest module. This is much more resilient.

The less overwriting, the better, of course, though you're still forced to mirror the behavior of the API parts you've overwritten, and you risk Node internals going behind the WritableStream public API. Then again, there's no real alternative.
