
pytorch model loader #37

Closed
faroit opened this issue Mar 30, 2018 · 3 comments
Labels
type:feature New feature or request

Comments


faroit commented Mar 30, 2018

deeplearn.js supported some level of conversion of PyTorch models via Python scripts.

Will this functionality be ported to tfjs, or is it out of scope now that the framework is aiming for a TensorFlow-only solution?

@faroit faroit changed the title support pytorch models pytorch model loader Mar 30, 2018
@caisq caisq added the type:feature New feature or request label Mar 31, 2018
@nsthorat
Contributor

We're not planning on supporting PyTorch in the immediate future; however, there is a path to using the old checkpoint loader with PyTorch.

The legacy loader, together with the PyTorch weight-dumper script, can be used: https://github.com/PAIR-code/deeplearnjs-legacy-loader

The legacy loader currently depends on deeplearn, so you're going to have a dependency problem on the JS side (the Python side will still work).

Thankfully, all of the logic lives in a single file with one dependency (Tensor), which means you can just copy that file into your project: https://github.com/PAIR-code/deeplearnjs-legacy-loader/blob/master/src/checkpoint_loader.ts

You can swap the import of 'deeplearn' with '@tensorflow/tfjs' or '@tensorflow/tfjs-core' and it should just work.
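
As a minimal sketch of that approach (assuming checkpoint_loader.ts has been copied into the project with its 'deeplearn' import already swapped; the CheckpointLoader class and getAllVariables() come from the linked legacy loader, and the weights directory is a placeholder):

```ts
// checkpoint_loader.ts copied from the legacy loader, with its
// import of 'deeplearn' changed to '@tensorflow/tfjs-core'.
import * as tf from '@tensorflow/tfjs-core';
import {CheckpointLoader} from './checkpoint_loader';

async function loadPytorchWeights(): Promise<{[name: string]: tf.Tensor}> {
  // 'weights/' is a placeholder for the directory containing the
  // manifest.json produced by the PyTorch weight-dumper script.
  const loader = new CheckpointLoader('weights/');
  return loader.getAllVariables();
}
```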


faroit commented Apr 1, 2018

Thanks a lot for the pointer. I will try that.

@faroit faroit closed this as completed Apr 1, 2018

slawojstanislawski commented May 3, 2018

I had a .pth file (containing a PyTorch model, I presume), and using the method outlined above I was able to get the output: a long list of _weight and _bias files, as well as one manifest.json file. This is in line with what's described here. Finally, calling the getAllVariables() method of the legacy loader gets me the variables.

However, I can't figure out how to move from this vars variable to a usable, tensorflowjs-importable model. Should I write out the model manually as described on the linked page (I'm not the author of the model, so that's problematic), use the tfjs converter in some way, or is there a different procedure?
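
(For illustration, the "write out the model manually" option would amount to re-implementing the network's forward pass with tfjs ops over the dumped variables, roughly like the hedged sketch below. The variable names and layer structure are hypothetical and depend entirely on the original PyTorch model.)

```ts
import * as tf from '@tensorflow/tfjs-core';

// Hypothetical forward pass for a single fully connected layer,
// fed with the variables returned by getAllVariables().
function forward(vars: {[name: string]: tf.Tensor}, x: tf.Tensor2D): tf.Tensor {
  const w = vars['fc1_weight'] as tf.Tensor2D; // hypothetical variable name
  const b = vars['fc1_bias'];                  // hypothetical variable name
  return tf.relu(tf.add(tf.matMul(x, w), b));  // y = relu(x · W + b)
}
```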

nsthorat pushed a commit that referenced this issue Aug 19, 2019
* Use nightly tensorflow for linux and revert to using bundled eager header.

* Add clean and only download libtensorflow once.

* Use custom libtensorflow still.
nsthorat pushed a commit that referenced this issue Aug 20, 2019
* Update render function docs
* update dev dependency