
Facing exception while processing 1 million records #5

Closed
aswinjose89 opened this issue Apr 27, 2021 · 6 comments

Comments

@aswinjose89

Refer to the exception below:

A worker process managed by the executor was unexpectedly terminated. This could be caused by a segmentation fault while calling the function or by an excessive memory usage causing the Operating System to kill the worker.

The exit codes of the workers are {SIGKILL(-9), SIGKILL(-9)}

@atif-hassan

@atif-hassan
Owner

If n_jobs > 1, PyImpetus internally makes that many copies of the data. You can try setting n_jobs to 1 to stop PyImpetus from making multiple copies. This will make the algorithm slower, but you shouldn't run into the above error.
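As a rough illustration of why this fails on large datasets: peak memory grows roughly linearly with n_jobs, so you can estimate how many worker copies fit before choosing a value. This is a minimal sketch, assuming the one-copy-per-worker behavior described above; the helper name and the rule of thumb are hypothetical, not part of PyImpetus:

```python
def max_safe_n_jobs(data_bytes, available_bytes, requested_n_jobs):
    """Cap n_jobs so that n_jobs copies of the data fit in memory.

    Assumes each worker holds one full copy of the dataset, per the
    explanation above. Always allows at least 1 job.
    """
    affordable = max(1, available_bytes // data_bytes)
    return int(min(requested_n_jobs, affordable))

# 1 million rows x 50 float64 features ~= 400 MB per copy.
data_bytes = 1_000_000 * 50 * 8

# With ~2 GB of free RAM, only 5 copies fit, so n_jobs=70 gets capped to 5.
print(max_safe_n_jobs(data_bytes, 2 * 1024**3, 70))
```

If even one extra copy does not fit, this estimate falls back to n_jobs = 1, which matches the workaround suggested here.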

@aswinjose89
Author

Thanks for the quick response.

@atif-hassan
Owner

Please note that the "n_jobs > 1" case also includes "n_jobs = -1".

@aswinjose89
Author

Noted. I have tried n_jobs=70 (one per core) and n_jobs=-1 (to consume all cores); both cases threw a similar exception.

@atif-hassan
Owner

Yes. The reason is that any value other than 1 for n_jobs results in internal copies of the data, which will definitely lead to a memory error for such a large dataset.

@aswinjose89
Author

Thanks for the inputs.
