
Be able to load a float16 or mixed model into a float32 model #233

Closed
blisc opened this issue Sep 8, 2018 · 1 comment

blisc (Contributor) commented Sep 8, 2018

Similar to #213.
It would be nice to be able to train a model in mixed precision or fp16 and then run inference in fp32, or vice versa.
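A minimal sketch of what this could look like with TF1-era APIs (not the actual implementation in this repo): read each variable from a float16 or mixed-precision checkpoint, cast it to float32 on the host, and assign it into an already-built float32 graph. The checkpoint path is a placeholder, and this assumes the variable names and shapes match between the two graphs.

```python
import numpy as np
import tensorflow as tf

CHECKPOINT_PATH = "model.ckpt"  # hypothetical path to the fp16/mixed checkpoint

# Open the checkpoint and find out which variables it contains.
reader = tf.train.load_checkpoint(CHECKPOINT_PATH)
dtype_map = reader.get_variable_to_dtype_map()

# Assume the float32 inference graph has already been built at this point.
# For every graph variable present in the checkpoint, load its value,
# cast it to float32, and build an assign op.
assign_ops = []
for var in tf.global_variables():
    name = var.op.name
    if name in dtype_map:
        value = reader.get_tensor(name).astype(np.float32)
        assign_ops.append(var.assign(value))

with tf.Session() as sess:
    # Initialize first so variables absent from the checkpoint still get values,
    # then overwrite the matched ones with the cast checkpoint values.
    sess.run(tf.global_variables_initializer())
    sess.run(assign_ops)
```

Going the other direction (fp32 checkpoint into an fp16/mixed graph) would be the same loop with `astype(np.float16)`, though that direction loses precision rather than gaining it.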

blisc (Contributor, Author) commented Nov 20, 2018

This is currently a WIP on the transfer_learning_dtype branch.

@chiphuyen can you double-check that it works for your models?
