This repository has been archived by the owner on Jul 29, 2023. It is now read-only.

Trainer.py doesn't use multiple workers for training when workers >1 #109

Closed
smguo opened this issue Aug 16, 2019 · 0 comments · Fixed by #129

smguo (Collaborator) commented Aug 16, 2019

The use_multiprocessing option needs to be set to True for model.fit_generator to spawn multiple generator instances; setting workers > 1 on its own is not enough.

A nice post demonstrating the behavior of model.fit_generator with and without multiprocessing:
https://keunwoochoi.wordpress.com/2017/08/24/tip-fit_generator-in-keras-how-to-parallelise-correctly/

Will fix this in the next PR
