Only train on Decoder variables? #8

Open
tiancheng2000 opened this issue May 2, 2020 · 0 comments

@tiancheng2000
Contributor

Hi, I'm trying to migrate this sample to TL2.
But I found in train.py:116 that the optimizer is applied only to the Decoder's variables, and that there is code which restores weights from a .npz file into the Encoders before training (those Encoder weights are never updated during training).
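For reference, the "frozen Encoder, trainable Decoder" setup described above can be sketched like this in plain TF2. The layer shapes and names are illustrative stand-ins, not the repo's actual classes, and the .npz restore step is assumed rather than shown:

```python
import tensorflow as tf

# Illustrative stand-ins for the repo's Encoder/Decoder; in train.py the
# encoder weights would be restored from a pretrained .npz file here.
encoder = tf.keras.layers.Dense(8)   # weights assumed restored, then frozen
decoder = tf.keras.layers.Dense(4)
encoder.build([None, 16])
decoder.build([None, 8])

optimizer = tf.keras.optimizers.Adam(1e-4)
x = tf.random.normal([2, 16])
y = tf.random.normal([2, 4])

with tf.GradientTape() as tape:
    loss = tf.reduce_mean(tf.square(decoder(encoder(x)) - y))

# Only the Decoder's variables are handed to the optimizer, so the
# restored Encoder weights stay fixed during training.
grads = tape.gradient(loss, decoder.trainable_variables)
optimizer.apply_gradients(zip(grads, decoder.trainable_variables))
```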
To migrate this to TL2 without pretrained Encoder weights at hand, I would need to build a model composed of 3 Encoders and 1 Decoder, and then configure a tf.keras.optimizers.Adam optimizer to apply gradients to all of the model's trainable variables. Am I right?
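If that understanding is correct, a minimal sketch of the end-to-end version might look like the following. All class and layer names here are hypothetical placeholders for the repo's actual 3 Encoders and 1 Decoder:

```python
import tensorflow as tf

# Hypothetical stand-in for one Encoder; the real encoders would be the
# repo's own modules, trained from scratch instead of restored from .npz.
def make_encoder():
    return tf.keras.Sequential([tf.keras.layers.Dense(8, activation="relu")])

class FullModel(tf.keras.Model):
    """3 Encoders feeding 1 Decoder, all trained jointly."""
    def __init__(self):
        super().__init__()
        self.encoders = [make_encoder() for _ in range(3)]
        self.decoder = tf.keras.layers.Dense(4)

    def call(self, x):
        feats = tf.concat([enc(x) for enc in self.encoders], axis=-1)
        return self.decoder(feats)

model = FullModel()
optimizer = tf.keras.optimizers.Adam(learning_rate=1e-4)

x = tf.random.normal([2, 16])
y = tf.random.normal([2, 4])
with tf.GradientTape() as tape:
    loss = tf.reduce_mean(tf.square(model(x) - y))

# Unlike train.py:116, gradients are applied to *all* trainable
# variables -- the three Encoders and the Decoder alike.
grads = tape.gradient(loss, model.trainable_variables)
optimizer.apply_gradients(zip(grads, model.trainable_variables))
```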
