Trainer.predict returns None when tpu_cores=8 #7824
Replies: 5 comments 4 replies
-
cc @kaushikb11 😎
-
@KijitoraButi
-
Dear @KijitoraButi, supporting trainer.predict on TPU could result in OOM, so we decided it would be unwise. However, we have a built-in callback called BasePredictionWriter: https://pytorch-lightning.readthedocs.io/en/stable/extensions/generated/pytorch_lightning.callbacks.BasePredictionWriter.html#pytorch_lightning.callbacks.BasePredictionWriter. It is meant to let you save your predictions however you want.
Best,
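To make that suggestion concrete, here is a minimal sketch of such a callback. The `CustomWriter` name, the output directory, and the pickle-based saving are illustrative choices, not part of the library; only `BasePredictionWriter` and its `write_on_epoch_end` hook come from the linked docs. If `pytorch_lightning` is not installed, a small stand-in base class keeps the sketch self-contained:

```python
# Sketch of a prediction-writing callback in the style of
# pytorch_lightning.callbacks.BasePredictionWriter. The stand-in base
# class below is only used when pytorch_lightning is unavailable, so the
# shape of the hook can still be exercised.
import os
import pickle

try:
    from pytorch_lightning.callbacks import BasePredictionWriter
except ImportError:  # stand-in with the same hook names
    class BasePredictionWriter:
        def __init__(self, write_interval="batch"):
            self.write_interval = write_interval


class CustomWriter(BasePredictionWriter):
    """Saves each process's predictions to its own file instead of
    gathering them back to a single process (the gather step is what can
    OOM when predicting across 8 TPU cores)."""

    def __init__(self, output_dir, write_interval="epoch"):
        super().__init__(write_interval)
        self.output_dir = output_dir

    def write_on_epoch_end(self, trainer, pl_module, predictions, batch_indices):
        # One file per process; trainer.global_rank distinguishes TPU cores.
        rank = getattr(trainer, "global_rank", 0)
        os.makedirs(self.output_dir, exist_ok=True)
        path = os.path.join(self.output_dir, f"predictions_rank{rank}.pkl")
        with open(path, "wb") as f:
            pickle.dump(predictions, f)
```

Assuming a recent Lightning version, you would register it with `Trainer(..., callbacks=[CustomWriter("preds/")])` and, if your version supports it, call `trainer.predict(model, dataloaders, return_predictions=False)` so results go to disk rather than being gathered in memory.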
-
@tchaton, thanks for your suggestion to use the BasePredictionWriter callback. Here's the code to reproduce the issue; it is based on the TPU tutorial with a few modifications. Package installation:
Code:
It works correctly using a single core with tpu_cores=1.
-
Hi, can CustomWriter be used only with TPUs? I want to write a custom prediction writer for GPU. I tried adding this callback, but it doesn't work.
-
🐛 Bug
The original Trainer.predict error was fixed in this issue, but another problem remains:
when tpu_cores=8, Trainer.predict returns None.
Please reproduce using the BoringModel
https://colab.research.google.com/drive/1evjVtSbQpVfiWpTsiZknT6PO6Fa7u7sY?usp=sharing
To Reproduce
Set tpu_cores=8 and run Trainer.predict; None is returned instead of the predictions.
Expected behavior
Predicted values are returned from Trainer.predict.
Environment