Use spawn to check perl scripts #3368
Conversation
OK, so the main reason for #3155 failing with a timeout is likely to be found in here. 🙂
This avoids ERR_CHILD_PROCESS_STDIO_MAXBUFFER being thrown if we have too much output.
Mmm… the test has been running for 2 hours already.
I suspect some background process gets killed, which causes the build to not finish in the allotted amount of time. The Travis build runs the same script and shows some interesting log messages. Different issues taken out of context:
Obviously, the script and build should not exit with a success status if a single file causes an error; they should return a non-zero exit code instead.
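A small sketch of what that failure propagation could look like (the helper name and result shape are illustrative, not from the actual script): collect per-file failures and report them through `process.exitCode` so CI fails when any single script is broken.

```javascript
// results: array of { path, code } from the individual perl -c checks.
// Returns the number of failed files and sets the process exit code.
function summarize(results) {
  const failed = results.filter((r) => r.code !== 0);
  for (const r of failed) {
    console.error(`syntax check failed: ${r.path}`);
  }
  // Setting process.exitCode (instead of calling process.exit) lets any
  // pending stdout/stderr output flush before the process ends.
  process.exitCode = failed.length > 0 ? 1 : 0;
  return failed.length;
}
```

With this in place, a single broken script makes the whole check, and therefore the CI build, fail.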
I think I got it. In the current version of the script, the perl processes actually get started in parallel, which obviously is too much for the build pipelines. The only reason why the Travis CI build works is that it handles this particular problem a bit differently than GitHub Workflows. Let's see what I can do to fix it. 😁
…ction, since this requires too much memory and CPU in most CI scenarios
Adapted from the refresh_taxonomies.js script. The syntax checking is slow (5–6 minutes on GitHub), but checking over 100 scripts simultaneously is also unworkable: openfoodfacts#3368. The limit is configurable, in case the environment running the checks ever has more than the 2 cores it currently does. Increasing the limit on 2 cores works (at least with 3 and 4), but is no faster.
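The commit message above describes a concurrency-limited runner. A minimal sketch of the pattern (names are illustrative, not the actual implementation): start a fixed number of workers that pull tasks from a shared queue, so at most `limit` perl processes run at once instead of 100+ starting simultaneously.

```javascript
// Runs the given async task functions with at most `limit` in flight at a
// time. Results are returned in the same order as the input tasks.
async function runLimited(tasks, limit) {
  const results = [];
  let next = 0;
  async function worker() {
    // Each worker keeps pulling the next unstarted task until none remain.
    while (next < tasks.length) {
      const i = next++;
      results[i] = await tasks[i]();
    }
  }
  const workers = Array.from(
    { length: Math.min(limit, tasks.length) },
    () => worker()
  );
  await Promise.all(workers);
  return results;
}
```

Each task would wrap one `perl -c <script>` invocation; raising `limit` beyond the core count buys nothing, which matches the observation that 3 or 4 workers on 2 cores are no faster.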