Added test for issue 231 (infinity for await of). #232

Closed · wants to merge 2 commits

Conversation

@mars-dlx mars-dlx commented Sep 14, 2020

#231
nodejs/node#35190

The pipeline for 12.18.3 failed with a timeout, as expected; 10 and 14 passed. Older Node versions don't support for await...of, so I'm going to exclude them from this test later.
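
For context, here is a rough sketch of the shape of such a test, not the actual diff from the PR: it consumes a part stream with for await...of, and the test times out when that loop never completes. The getFile name is taken from the discussion below; the stream wiring and fixture details are assumptions.

```js
'use strict';

const { Readable } = require('stream');

// Hypothetical helper; the name getFile comes from the discussion below,
// and its body is a guess at the general shape of the failing code.
async function getFile(part) {
  const chunks = [];
  // On the affected Node/V8 builds this loop sometimes never completes
  // even though the underlying stream has ended, so the test times out.
  for await (const chunk of part) {
    chunks.push(chunk);
  }
  return Buffer.concat(chunks);
}

// Standalone illustration with a plain Readable; in the PR the stream is a
// multipart "part" produced by parsing an upload of a binary fixture
// (binaryfile.tar.gz, per the comments below).
async function main() {
  const part = Readable.from([Buffer.from('chunk-1'), Buffer.from('chunk-2')]);
  const body = await getFile(part);
  console.log('received %d bytes', body.length);
}

main();
```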

@dougwilson (Contributor)

Thank you for this PR. I'm not sure if you've tried it or not, but adding a debugger into the mix fixes the issue, which has been quite maddening :) I have been trying to figure out what is going on for the last few hours, and since attaching a debugger (or even adding certain console.logs to the code) somehow gets it "unstuck", it has been nearly impossible to pin down so far.

@mars-dlx (Author)

The first time, I "fixed" it by increasing the test timeout. That worked... but on CI it broke again. I then ran all the tests locally, and it failed there too.
When I was writing this test, I first tried the pf1y5.png file, but the test passed with it. I replaced the file with binaryfile.tar.gz, and the test failed.
I also tried removing the Promise to make the code simpler, but then it passed again...

So, yes, under some conditions I cannot reproduce it.

@dougwilson (Contributor)

Yeah, it seems to be very timing-sensitive, which is probably why it always works whenever I have a debugger connected... :(

@dougwilson (Contributor) commented Sep 15, 2020

So far, from what I've seen while trying to tackle it, I am leaning towards this being a V8 issue, as this release did come with a V8 upgrade. I have seen V8 issues before in their code generation, i.e. emitting the x86 instructions during JIT. There are some tell-tale signs that this is likely the cause of the stall: attaching a debugger prevents it from happening, for instance. Even adding part.on = part.on; to your getFile fixes it (* sometimes; for me it makes the test sometimes pass and sometimes fail), as it changes how the functions are optimized, etc. by the underlying V8 engine.
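
To make the tweak described above concrete, here is roughly where such a self-assignment would sit, building on the hypothetical getFile sketch from earlier; as the next comment notes, it turned out not to help reliably.

```js
async function getFile(part) {
  // Self-assignment is a no-op at the JavaScript level, but it was suspected
  // to change how V8 optimizes the surrounding code, which appeared to
  // (unreliably) avoid the stall.
  part.on = part.on;

  const chunks = [];
  for await (const chunk of part) {
    chunks.push(chunk);
  }
  return Buffer.concat(chunks);
}
```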

@dougwilson (Contributor)

Even adding part.on = part.on; to your getFile fixes it (* sometimes; for me it makes the test sometimes pass and sometimes fail)

Well, after trying again with the code directly in this branch, it sometimes gets stuck and sometimes doesn't... so I think I only imagined that it was making a difference 😭

@bnoordhuis

Long shot but does it make a difference if you start node with --predictable or --noopt or maybe --optimize_for_size?

@dougwilson (Contributor)

Thanks! I was looking around to see what those do. Using a not statistically rigorous method of running the test in a for loop and counting errors vs. successes, the options don't really seem to influence how likely the stall is to happen. It generally stalls on about 35% of the runs, with or without any of those flags :/
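
For the record, a minimal sketch of that kind of loop; the repro script name, run count, timeout threshold, and flag sets are made up for illustration.

```js
'use strict';

const { spawnSync } = require('child_process');

const RUNS = 20;
const FLAG_SETS = [[], ['--predictable'], ['--noopt'], ['--optimize_for_size']];

for (const flags of FLAG_SETS) {
  let stalls = 0;
  for (let i = 0; i < RUNS; i++) {
    const result = spawnSync(
      process.execPath,
      [...flags, 'test/stall-repro.js'], // hypothetical repro script
      { timeout: 10000 }                 // treat a 10-second hang as a stall
    );
    if (result.error || result.status !== 0) stalls++;
  }
  console.log('%s: %d/%d runs stalled', flags.join(' ') || '(no flags)', stalls, RUNS);
}
```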

@dougwilson (Contributor)

See #231 (comment)

@dougwilson dougwilson closed this Dec 19, 2021