Pytorch timeout=30 #1520
Conversation
hub/integrations/pytorch/dataset.py (outdated)
```diff
@@ -244,7 +245,7 @@ def __iter__(self):

         while any(self.active_workers):
             try:
-                wid, data = self.data_queue.get(timeout=5)
+                wid, data = self.data_queue.get(timeout=PYTORCH_DATALOADER_TIMEOUT)
```
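For context, a minimal sketch of the pattern in this diff, with an assumed constant definition and a simplified loop. `PYTORCH_DATALOADER_TIMEOUT = 30` follows this PR's title; the constant's real definition lives elsewhere in the repo, and `drain` is an invented name standing in for the actual `__iter__` body:

```python
from queue import Empty

# assumed value and location; the actual constant is defined in the repo
PYTORCH_DATALOADER_TIMEOUT = 30  # seconds, per this PR's title

def drain(data_queue, active_workers):
    # simplified version of the __iter__ loop shown in the diff
    while any(active_workers):
        try:
            wid, data = data_queue.get(timeout=PYTORCH_DATALOADER_TIMEOUT)
        except Empty:
            # nothing arrived within the timeout; re-check worker liveness
            continue
        yield wid, data
```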
Can we make this an argument that the user passes, and default to this value?
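A hypothetical illustration of that suggestion: expose the timeout as a caller-supplied argument with the constant as its default. The helper name and signature are invented for illustration, not the library's actual API:

```python
PYTORCH_DATALOADER_TIMEOUT = 30  # the default proposed in this PR

def get_next(data_queue, timeout: int = PYTORCH_DATALOADER_TIMEOUT):
    # hypothetical helper: users could pass their own timeout,
    # otherwise the PR's 30-second default applies
    return data_queue.get(timeout=timeout)
```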
I think ideally there should be no timeout and the workers should be able to report exceptions to the main thread, so it's better not to change the API right now.
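A minimal sketch of the alternative described above, where a worker pushes its exception onto the shared queue so the main thread can surface it without relying on a timeout. All names here (sentinel, `worker_loop`, `main_loop`) are illustrative, not the repo's actual implementation:

```python
import traceback

_ERROR = "__worker_error__"  # illustrative sentinel, not from the repo

def worker_loop(wid, items, data_queue):
    try:
        for item in items:
            data_queue.put((wid, item))
    except Exception:
        # report the failure instead of letting the main thread block forever
        data_queue.put((wid, (_ERROR, traceback.format_exc())))

def main_loop(data_queue, active_workers):
    while any(active_workers):
        wid, data = data_queue.get()  # no timeout: failures arrive as messages
        if isinstance(data, tuple) and data and data[0] == _ERROR:
            active_workers[wid] = False
            raise RuntimeError(f"worker {wid} failed:\n{data[1]}")
        yield wid, data
```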
Codecov Report
```
@@           Coverage Diff            @@
##             main    #1520    +/-  ##
========================================
- Coverage   92.35%   92.31%   -0.04%
========================================
  Files         189      189
  Lines       16453    16455       +2
========================================
- Hits        15195    15191       -4
- Misses       1258     1264       +6
```
Flags with carried forward coverage won't be shown.
🚀 🚀 Pull Request
Checklist:
- coverage-rate up

Changes
Set the PyTorch dataloader timeout to 30 seconds.