please use release with torch / tpu versions #645

Closed
ildoonet opened this issue Apr 30, 2019 · 5 comments

Comments

@ildoonet

I cannot use xla since it requires a TPU running TF 1.14, which has not been released yet.

Please use GitHub's 'release' feature to tag reproducible versions for torch, the TPU software, etc.

As it stands, this project is very discouraging to use.

@dlibenzi
Collaborator

Hi!
Thanks for your interest in the PyTorch/TPU project. The project is under heavy development, and unfortunately, as we move forward, we keep discovering new features that we need on the TPU VM side (the TF service).
We expect the project to stabilize in the next couple of months, and we are also speeding up the TF release process, so that the gap between cutting-edge TF and the version inside the TPU VM will be smaller.

@ildoonet
Author

@dlibenzi Thanks for the reply! So do you have any recommended versions for using the current xla (e.g. xla commit ID, torch commit ID, TPU pod TF version, ...)?

@ildoonet
Author

@dlibenzi Thanks for the hard work you have put in, but the current repository is misleading many people: it is not installable by following the current README. Please provide commit IDs (pytorch, tensorflow, xla) and a TPU pod version that we can test.

@dlibenzi
Collaborator

As I mentioned, the PyTorch/TPU project is under heavy development, and we have switched the architecture from using the PyTorch JIT to a new XTen (lazy tensor) approach.
To fetch a version that works with TF 1.13, you will have to fetch PyTorch and PyTorch/XLA commit IDs from before Dec 19th 2018, which is a LONG time ago.
And you will be fetching a version using the old JIT approach, which has limited support (if you are planning to train MNIST or resnets, that is OK; anything more, and you will likely hit "unsupported operator" errors).
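A minimal sketch of what pinning to that cutoff could look like in practice, assuming local clones of the pytorch and pytorch/xla repositories (the directory names, branch, and helper function are illustrative; only the Dec 19th 2018 cutoff comes from this thread, and the exact commit IDs compatible with TF 1.13 are not specified here):

```python
# Sketch: check out the last commit made before the Dec 19th 2018 cutoff mentioned above.
# Assumes pytorch and pytorch/xla are already cloned into ./pytorch and ./xla;
# the specific commit IDs compatible with TF 1.13 are not given in this thread.
import subprocess

CUTOFF = "2018-12-19"

def checkout_before(repo_dir: str, date: str) -> str:
    """Check out the last commit on master made before `date` and return its ID."""
    commit = subprocess.check_output(
        ["git", "rev-list", "-1", f"--before={date}", "master"],
        cwd=repo_dir, text=True,
    ).strip()
    subprocess.check_call(["git", "checkout", commit], cwd=repo_dir)
    return commit

for repo in ("pytorch", "xla"):
    print(repo, checkout_before(repo, CUTOFF))
```

(Note that pytorch itself uses git submodules, so after checking out an old commit you would also need to re-sync them, e.g. with `git submodule update --init --recursive`.)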

@ildoonet
Author

Thanks, I will look forward to your improvements!
