Progress bar doesn't show up on Kaggle TPU with `num_workers` greater than 0 #9814
Comments
Ok, I will check the issue and check back.
Hi, can you please send me the `train_dataset` that you are using? Thank you.
Here is the data loader function:

import multiprocessing

from torch.utils.data import DataLoader
from torchvision import datasets, transforms

# IMAGE_SIZE, BATCH_SIZE, and DATASET are constants defined elsewhere in the notebook.

def get_dataset_loader():
    transform = transforms.Compose([
        transforms.Resize(IMAGE_SIZE),           # the shorter side is resized to match IMAGE_SIZE
        transforms.CenterCrop(IMAGE_SIZE),
        transforms.ToTensor(),                   # to tensor in [0, 1]
        transforms.Lambda(lambda x: x.mul(255))  # convert back to [0, 255]
    ])
    train_dataset = datasets.ImageFolder(DATASET, transform)
    train_loader = DataLoader(train_dataset,
                              batch_size=BATCH_SIZE,
                              shuffle=True,
                              num_workers=multiprocessing.cpu_count(),
                              drop_last=True)
    return train_loader

# Load train dataset
train_dataloader = get_dataset_loader()
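For completeness, a minimal usage sketch; the constant values below are hypothetical placeholders, not taken from the original notebook:

```python
# Placeholder values for the constants the loader expects; the original
# notebook's actual values are not shown in this thread.
IMAGE_SIZE = 256
BATCH_SIZE = 4
DATASET = "/kaggle/input/my-images"  # hypothetical dataset path

train_dataloader = get_dataset_loader()
images, _ = next(iter(train_dataloader))
print(images.shape)  # e.g. torch.Size([4, 3, 256, 256])
```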
Ok, thank you, I will check it.
Hi, I also tried this train loader function with a GPU and a Kaggle TPU, but even when `num_workers` is 0 it doesn't give a progress bar. https://www.kaggle.com/ranugadisansagamage/notebook353e8aa184 is the notebook in Kaggle that I used. I installed PyTorch 1.8. Also, are there any differences in this code?
I can't see the notebook; you haven't saved a version with execution. Btw, for me GPU execution is not a problem even if I set CPU cores to the max, i.e. 4 in the case of Kaggle.
Hi, I updated the Kaggle notebook. Can you please check it and confirm that the code is correct: https://www.kaggle.com/ranugadisansagamage/notebook353e8aa184
This is the same in https://www.kaggle.com/ranugadisansagamage/notebook353e8aa184.
There seems to be a training loop there. My notebook's output when I set …
Ok, I will check this error.
Please try running this notebook on a TPU instance. Try incrementally changing the values of …
Ok, I will check it.
Hi, I found some similar issues. Regards.
Ok, I will check the notebook. Regards.
The notebook doesn't exist.
Forgot to save changes after setting visibility to public. Please check it now.
Ok.
@Programmer-RD-AI Btw, I am stuck on a problem where I need to implement … Sorry, this is not the right place for this question, but I feel like I went everywhere on the Internet and didn't get anything correct.
Ok, I will check if I can find anything. Regards.
Hi, I have checked but I couldn't find anything, sorry. Regards,
Thank you so much!
Can you please help with this issue? Thank you. Regards.
Btw, please look no further if you are trying to. I have found some helpful potential implementations (although not perfect, i.e. they don't give the same results in terms of signs, but I'll search more).
Ok, I am sorry I couldn't solve the problem. I will try to find a resource for the error. Regards.
Hi, I've gone through all those references, but it doesn't look like they relate to this issue. 🤔 This issue might have something to do with the library itself.
@tchaton Can you please check this issue? Thanks. With best regards,
I am new to PyTorch Lightning so I am not sure, but @tchaton can help. With best regards,
@RahulBhalley @Programmer-RD-AI I will take a stab at this with GCP TPUs.
Ok, thank you @kaushikb11. With best regards,
This issue doesn't really affect my work, but I'm just curious whether there's been any progress on resolving it.
🐛 Bug
As the issue title says: the progress bar doesn't show up on Kaggle TPU with `num_workers` greater than 0.

Disclaimer: I haven't tested this program on Google Colab TPU.
To Reproduce
Set `num_workers` to any number greater than zero, up to the maximum number of CPU cores; on Kaggle, the reproduction code sets it to 4.
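The original reproduction snippet isn't reproduced in this thread, so here is a minimal sketch of the kind of setup being described, with a hypothetical stand-in model and assuming the PyTorch Lightning 1.x `tpu_cores` Trainer argument used on Kaggle TPU kernels at the time:

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset
import pytorch_lightning as pl


class BoringModel(pl.LightningModule):
    """Hypothetical stand-in for the reporter's actual model."""

    def __init__(self):
        super().__init__()
        self.layer = nn.Linear(32, 2)

    def training_step(self, batch, batch_idx):
        x, y = batch
        return nn.functional.cross_entropy(self.layer(x), y)

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=0.1)


# Synthetic dataset, only to exercise the DataLoader and progress bar.
train_loader = DataLoader(
    TensorDataset(torch.randn(256, 32), torch.randint(0, 2, (256,))),
    batch_size=32,
    num_workers=4,  # the progress bar reportedly disappears when this is > 0
)

# On a Kaggle TPU kernel; `tpu_cores` was the PL 1.x way to request TPU cores.
trainer = pl.Trainer(tpu_cores=8, max_epochs=1)
trainer.fit(BoringModel(), train_loader)
```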
The training is successful, but instead of showing the progress bar, the following output is shown:
Expected behavior
The progress bar must show up.
Environment
How you installed PyTorch (conda, pip, source): pip
torch.__config__.show(): N/A

Additional context

N/A
cc @kaushikb11 @rohitgr7 @tchaton