
[Question/Feature Request] You mentioned it works with GPU, does Fast-TabNet work with TPUs? #129

Closed
neomatrix369 opened this issue Jun 5, 2020 · 2 comments

Feature request

What is the expected behavior?
The same outcome as on CPUs and GPUs.

What is the motivation or use case for adding/changing the behaviour?
Better training performance.

How should this be implemented in your opinion?
Similarly to how TensorFlow/PyTorch send the data to the TPU (see the sketch after this list).

Are you willing to work on this yourself?
Happy to contribute alongside another experienced developer.
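
For illustration only, here is a minimal sketch (not pytorch-tabnet code) of how a plain PyTorch training step is pointed at a TPU via the torch_xla package; the tiny model and random batch below are placeholders:

```python
# Rough sketch of "sending the data to the TPU" in plain PyTorch with
# torch_xla; TabNet itself is not shown, and nothing here is pytorch-tabnet code.
import torch
import torch.nn as nn
import torch_xla.core.xla_model as xm

device = xm.xla_device()                       # a TPU core as a torch device

# toy placeholder model, not TabNet
model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 2)).to(device)
optimizer = torch.optim.Adam(model.parameters(), lr=2e-2)
loss_fn = nn.CrossEntropyLoss()

X = torch.randn(64, 10).to(device)             # batch moved to the TPU,
y = torch.randint(0, 2, (64,)).to(device)      # exactly like .to("cuda") on a GPU

optimizer.zero_grad()
loss = loss_fn(model(X), y)
loss.backward()
xm.optimizer_step(optimizer, barrier=True)     # TPU-aware optimizer step
```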

neomatrix369 added the enhancement label Jun 5, 2020
neomatrix369 changed the title from "[Question] You mentioned it works with GPU, does Fast-TabNet work with TPUs?" to "[Question/Feature Request] You mentioned it works with GPU, does Fast-TabNet work with TPUs?" Jun 5, 2020
Optimox (Collaborator) commented Jun 6, 2020

Hey @neomatrix369, I'm not sure I understand the question correctly.

fast-tabnet is the fastai wrapper of pytorch-tabnet; they are two different things.

Does fast-tabnet use TPUs: I don't think so.
Does pytorch-tabnet use TPUs: no.

I've never used PyTorch with TPUs, so if you have some links in that direction I'd be interested to know more!

Optimox added the Abhishek-eBook label Jun 6, 2020
neomatrix369 (Author) commented Jun 6, 2020

Thanks for your response. I'm working on this notebook: https://www.kaggle.com/neomatrix369/bert-on-steroids-using-pytorch-cpus-tpus (credit to @abhishekkrthakur, who inspired the notebook, as mentioned in it).

It uses PyTorch, BERT, Transformers, and TPUs on Kaggle (though it should work on Google Colab as well) to do multi-process TPU model training; a rough sketch of the pattern is below.

I hope this helps.
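
For reference, a very rough sketch of the one-process-per-TPU-core pattern that torch_xla exposes (the toy model and data are placeholders, not code from the notebook or from pytorch-tabnet):

```python
# Sketch of multi-process TPU training with torch_xla: one process per core,
# each driving its own copy of the model; gradients are synced at step time.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset
import torch_xla.core.xla_model as xm
import torch_xla.distributed.parallel_loader as pl
import torch_xla.distributed.xla_multiprocessing as xmp

def _mp_fn(rank):
    device = xm.xla_device()                      # one TPU core per process
    model = nn.Linear(10, 2).to(device)           # toy placeholder model
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()

    dataset = TensorDataset(torch.randn(256, 10), torch.randint(0, 2, (256,)))
    loader = DataLoader(dataset, batch_size=32)
    # ParallelLoader feeds batches to the device and handles XLA step marking
    device_loader = pl.ParallelLoader(loader, [device]).per_device_loader(device)

    for X, y in device_loader:                    # batches already on the TPU
        optimizer.zero_grad()
        loss_fn(model(X), y).backward()
        xm.optimizer_step(optimizer)              # all-reduce grads across cores

if __name__ == "__main__":
    # spawn one training process per TPU core (8 on a Kaggle TPU v3-8)
    xmp.spawn(_mp_fn, nprocs=8, start_method="fork")
```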

Labels: Abhishek-eBook, enhancement
Projects: None yet
Development: No branches or pull requests
5 participants