
Use spawn to check perl scripts #3368

Merged
hangy merged 2 commits into openfoodfacts:master from spawn on May 5, 2020
Conversation

hangy (Member) commented May 4, 2020

This avoids ERR_CHILD_PROCESS_STDIO_MAXBUFFER being thrown if we have too much output.
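
For context, a minimal sketch of the difference (not the actual PR code; the script path is illustrative): Node's `exec` buffers the child's entire output in memory and fails with `ERR_CHILD_PROCESS_STDIO_MAXBUFFER` once it exceeds `maxBuffer`, while `spawn` streams the output instead.

```js
const { exec, spawn } = require('child_process');

// exec() buffers all stdout/stderr in memory; once the output grows past
// maxBuffer (default 1 MiB), Node kills the child and the callback gets
// an error whose code is ERR_CHILD_PROCESS_STDIO_MAXBUFFER.
exec('perl -c scripts/import_csv_file.pl', (err, stdout, stderr) => {
  if (err) console.error(err.code);
});

// spawn() streams the output instead of buffering it, so there is no
// buffer limit to hit, however chatty `perl -c` gets.
const child = spawn('perl', ['-c', 'scripts/import_csv_file.pl']);
child.stderr.pipe(process.stderr);
child.on('close', (code) => {
  if (code !== 0) process.exitCode = 1;
});
```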

@hangy hangy added the CI Continuous integration label May 4, 2020
@hangy hangy self-assigned this May 4, 2020
@hangy hangy requested a review from a team May 4, 2020 20:50
hangy (Member, Author) commented May 4, 2020

OK, so the main reason for #3155 failing with a timeout is likely to be found here. 🙂

This avoids ERR_CHILD_PROCESS_STDIO_MAXBUFFER being thrown if we have too much output.
stephanegigandet (Contributor) commented

Mmm.. the test has been running for 2 hours already.

hangy (Member, Author) commented May 5, 2020

> Mmm.. the test has been running for 2 hours already.

I suspect some background process gets killed, which prevents the build from finishing in the allotted amount of time. The Travis build runs the same script and shows some interesting log messages. Several distinct issues, taken out of context:

[scripts/import_csv_file.pl] Out of memory!
[scripts/update_all_products_from_dir_in_mongodb.pl] result: null
[scripts/import_csv_file.pl] result: 1

Obviously, the script and build should not exit with a success status if even a single file fails with result 1. The null return value is also interesting, as I wouldn't know how a spawned process could exit without a status; I suspect I made a mistake when checking the return value. Additionally, the `Out of memory!` error is odd, but might be caused by too many subprocesses running at the same time. This is likely why the GitHub workflow doesn't manage to finish the build. 🙂
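
For what it's worth, Node's child_process reports the exit status as `(code, signal)` on the `'close'` event, and `code` is null whenever the child was killed by a signal (such as a SIGKILL from the kernel's OOM killer), which would be consistent with the `Out of memory!` and `result: null` lines above. A sketch of checking both values (the file path and handler are illustrative, not the actual script):

```js
const { spawn } = require('child_process');

const child = spawn('perl', ['-c', 'scripts/import_csv_file.pl']);

// 'close' fires with (code, signal): code is the exit status, or null if
// the process was terminated by a signal instead of exiting on its own.
child.on('close', (code, signal) => {
  if (code === 0) {
    console.log('syntax OK');
  } else {
    console.error(`result: ${code}`, signal ? `(killed by ${signal})` : '');
    // Propagate the failure so the build does not exit with success.
    process.exitCode = 1;
  }
});
```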

hangy (Member, Author) commented May 5, 2020

I think I got it. In the current version of the script, the Perl processes all get started in parallel, which is obviously too much for the build pipelines. The only reason the Travis CI build works is that it handles this particular problem a bit differently than GitHub Workflows do. Let's see what I can do to fix it. 😁

…ction, since this requires too much memory and cpu in most CI scenarios
@hangy hangy merged commit b02e5e1 into openfoodfacts:master May 5, 2020
@hangy hangy deleted the spawn branch May 5, 2020 21:17
svensven added a commit to svensven/openfoodfacts-server that referenced this pull request Oct 7, 2020
Adapted from the refresh_taxonomies.js script.

The syntax checking is slow (5-6 minutes on GitHub), but checking over 100 scripts simultaneously is also unworkable:
openfoodfacts#3368

The limit is configurable, in case the environment running the checks ever has more than the 2 cores it currently does. Increasing the limit on 2 cores works (at least with 3 and 4), but is no faster.
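
A minimal sketch of that kind of bounded concurrency (not the actual commit; `checkAll` and the default limit of 2 are illustrative assumptions): start at most `limit` checks at once and launch the next script only when a running child exits.

```js
const { spawn } = require('child_process');

// Check every script with `perl -c`, running at most `limit` children
// at a time; resolves with 1 if any check failed, 0 otherwise.
function checkAll(scripts, limit = 2) {
  return new Promise((resolve) => {
    if (scripts.length === 0) return resolve(0);
    let next = 0;
    let running = 0;
    let failed = false;

    const startNext = () => {
      if (next >= scripts.length) {
        // Nothing left to start; resolve once the last child exits.
        if (running === 0) resolve(failed ? 1 : 0);
        return;
      }
      const file = scripts[next++];
      running++;
      const child = spawn('perl', ['-c', file], { stdio: 'inherit' });
      child.on('close', (code) => {
        running--;
        if (code !== 0) failed = true;
        startNext();
      });
    };

    // Prime the pool with up to `limit` concurrent checks.
    for (let i = 0; i < limit; i++) startNext();
  });
}

checkAll(process.argv.slice(2)).then((rc) => { process.exitCode = rc; });
```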
Labels: CI Continuous integration, 👩‍💻 DevOps