Neg n_workers are now the same as zero #4019


Merged — 3 commits merged into pytorch:master on Dec 5, 2017

Conversation

@Erotemic (Contributor) commented Dec 4, 2017

I foolishly passed the kwarg num_workers=-1 to a DataLoader expecting that it would default to serial processing (I correctly recalled that it should be a non-positive number, but failed to remember that it must be zero to get this behavior).

Normally this would be fine, but the error message that it spat out when I did this was very confusing:

AttributeError: 'DataLoaderIter' object has no attribute 'rcvd_idx'

What made matters worse is that I had recently put a print statement in that file to test something, so I thought I had broken PyTorch, but I couldn't figure out where the problem was. After spending too much time hunting for stale cached .pyc files, I finally realized my mistake.

I wanted to submit a patch so it would at least raise a ValueError. However, I think an even simpler change is to accept negative numbers: this patch makes setting num_workers to any non-positive number behave the same as zero.
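The behavior originally proposed here can be sketched as a one-line normalization (a hypothetical helper for illustration, not the actual DataLoader code):

```python
def normalize_num_workers(num_workers):
    """Treat any non-positive num_workers the same as 0, i.e. load data
    serially in the main process instead of spawning worker processes.
    Sketch of the behavior this PR originally proposed."""
    return max(0, num_workers)
```

Under this scheme, `num_workers=-1` would silently fall back to serial loading rather than crashing later with an unrelated AttributeError.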

@kmike (Contributor) commented Dec 4, 2017

FWIW, scikit-learn and joblib use a different convention: -1 means "use the same number of workers as CPU cores", -2 means "use num_cores-1", etc. From joblib docs:

If -1 all CPUs are used. If 1 is given, no parallel computing code is used at all, which is useful for debugging. For n_jobs below -1, (n_cpus + 1 + n_jobs) are used. Thus for n_jobs = -2, all CPUs but one are used.
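The joblib/scikit-learn convention quoted above can be expressed as a small function (a hypothetical helper, assuming the rule `n_cpus + 1 + n_jobs` for negative values as the docs state):

```python
def joblib_effective_jobs(n_jobs, n_cpus):
    """Map joblib's n_jobs convention to an effective worker count:
    positive n_jobs is taken literally; -1 means all CPUs; for n_jobs
    below -1, (n_cpus + 1 + n_jobs) CPUs are used."""
    if n_jobs > 0:
        return n_jobs
    if n_jobs < 0:
        return n_cpus + 1 + n_jobs
    raise ValueError("n_jobs == 0 has no meaning under this convention")
```

For example, on an 8-CPU machine `n_jobs=-1` yields 8 workers and `n_jobs=-2` yields 7, matching the joblib documentation.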

@apaszke (Contributor) commented Dec 5, 2017

Taking into account @kmike's comment I feel like it'd be best to raise ValueError for negative values.

@Erotemic (Contributor, Author) commented Dec 5, 2017

Changed it to raise a ValueError.

@@ -219,6 +219,9 @@ def __init__(self, loader):
# prime the prefetch loop
for _ in range(2 * self.num_workers):
self._put_indices()
elif self.num_workers < 0:


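The diff above shows where the check lands; the merged validation can be sketched as follows (names and message are illustrative, not the exact PyTorch source):

```python
def check_num_workers(num_workers):
    """Reject negative num_workers up front with a clear ValueError,
    instead of failing later inside the prefetch loop with the confusing
    "'DataLoaderIter' object has no attribute 'rcvd_idx'" AttributeError."""
    if num_workers < 0:
        raise ValueError("num_workers must be non-negative; "
                         "use num_workers=0 to disable multiprocessing")
```

Failing fast at construction time is the design choice @apaszke argued for: it avoids silently assigning a meaning to -1 that conflicts with the joblib/scikit-learn convention.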
@Erotemic force-pushed the fix_neg_num_workers branch from 1675c26 to cb44b81 on December 5, 2017 at 16:18
@apaszke merged commit 5c13c69 into pytorch:master on Dec 5, 2017
@apaszke (Contributor) commented Dec 5, 2017

Thank you!
