
Unexpected breaking change: Optimizer.get_weights() removed #442

Open
adriangb opened this issue Sep 4, 2022 · 25 comments
@adriangb
Contributor

adriangb commented Sep 4, 2022

It seems that Optimizer.get_weights() is being removed. SciKeras was using it to serialize optimizer weights, since SavedModel silently fails to do so (see tensorflow/tensorflow#44670 and other linked issues; this is a longstanding bug that hasn't been fixed). Could someone fill me in on the plans going forward? Pickling models is an essential part of how Scikit-Learn operates, so SciKeras is completely broken if TensorFlow models can't be serialized.

@adriangb adriangb changed the title Optimizer.get_weights() removed Unexpected breaking change: Optimizer.get_weights() removed Sep 4, 2022
@tilakrayal
Collaborator

@gowthamkpr,
I was able to reproduce the issue on tensorflow v2.8, v2.9 and nightly. Kindly find the gist of it here.

@mattdangerw
Member

@chenmoneygithub can you take a look here?

@chenmoneygithub
Contributor

@adriangb Thanks for reporting the issue!

There has not been any change to get_weights() for months. To restore optimizer weights, please make sure you call load_weights(). For example:

def roundtrip(model: keras.Model) -> keras.Model:
    save_dir = "/tmp/mymodel"
    model.save(save_dir)
    restored = keras.models.load_model(save_dir)
    restored.load_weights(save_dir)
    return restored

@adriangb
Contributor Author

adriangb commented Sep 8, 2022

On tensorflow==2.8.0:

>>> import tensorflow as tf
>>> tf.keras.optimizers.get("rmsprop").get_weights
<bound method OptimizerV2.get_weights of <keras.optimizer_v2.rmsprop.RMSprop object at 0x7f9f580e9360>>

On tf-nightly (9/8/2022):

>>> import tensorflow as tf
>>> tf.keras.optimizers.get("rmsprop").get_weights
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
AttributeError: 'RMSprop' object has no attribute 'get_weights'

This is because the new keras.optimizers.optimizer_experimental.Optimizer does not have a get_weights() method, while the old keras.optimizers.optimizer_v2.optimizer_v2.OptimizerV2 does.
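A version-tolerant accessor can bridge the two APIs until this is settled (a sketch; optimizer_state is a name I'm making up, and it assumes the new optimizer exposes its state via variables, either as a method, as in 2.11, or as a property in later releases):

```python
def optimizer_state(opt):
    """Return optimizer state as a plain list of values, working on both
    the old OptimizerV2 API (get_weights) and the new optimizer API."""
    if hasattr(opt, "get_weights"):
        # Old API (the default through TF 2.10).
        return opt.get_weights()
    variables = opt.variables
    if callable(variables):
        # TF 2.11 new optimizer: variables() is a method, not a property.
        variables = variables()
    return [v.numpy() for v in variables]
```

Either way the result is a plain list of arrays, which can then be pickled.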

@adriangb
Contributor Author

@chenmoneygithub any updates on this? This is breaking SciKeras (and presumably other downstream projects).

Here are notebooks demonstrating the breakage:

@adriangb
Contributor Author

@mattdangerw would you mind giving an update now that 2.11.0 was released? Please let me know if I am doing something wrong or if there are alternatives, but as far as I can tell this was an unannounced breaking change with no alternative API.

@chenmoneygithub
Contributor

The get_weights() method was removed in the new optimizer; if you need to access the weights, please call the variables() method.

But I don't think this issue is caused by that removal: at the time the issue was created, the new optimizer was still in the experimental namespace, so it should be an issue with serializing the old optimizer.

On the current version, I would encourage reworking the optimizer serialization strategy to not rely on the get_weights() method. If you can point me to the exact code you are using, we can check what an alternative solution could be. Thanks!

@adriangb
Contributor Author

Digging a bit, I see that there is at least a public .variables API. I can use it to get the weights, but setting them for Adam does not work as expected:

import tensorflow as tf

model = tf.keras.Sequential(
    [
        tf.keras.Input(shape=(1,)),
        tf.keras.layers.Dense(1, activation="softmax"),
    ]
)
model.compile(optimizer="adam", loss="categorical_crossentropy")
model.fit([[1]], [0])

optimizer = model.optimizer
variables = optimizer.variables()
print(len(variables))  # 5

new = tf.keras.optimizers.Adam()
print(len(new.variables()))  # 1
new.build(variables)
print(len(new.variables()))  # 11!
new.set_weights(variables)  # fails
# ValueError: Optimizer variable m/iteration_568 has shape () not compatible with provided weight shape (1, 1)

Maybe that's an unrelated bug? I would expect this roundtrip to work.

@adriangb
Contributor Author

Here's the current code: https://github.com/adriangb/scikeras/blob/master/scikeras/_saving_utils.py

It's the only reliable way that I know of to serialize optimizers with state like Adam.

@adriangb
Contributor Author

> at the time the issue got created the new optimizer was still in the experimental namespace

I reported the issue because I test with tf-nightly, precisely so I can catch these sorts of breaking changes before they are released.

@chenmoneygithub
Contributor

There is no set_weights() method. The current workaround is to loop over the variable list and set each variable individually.

Some more context: we no longer keep set_weights() and get_weights() because optimizer variables are now stored as common TF variables, so checkpoint saving/restoring works the same as for Keras layers. Can you try using the same code you use for serializing Keras layers on the optimizer? Theoretically it should work.
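The per-variable restore loop described above might look like this (a sketch; restore_optimizer_state is my own name, not a Keras API, and it assumes the target optimizer is built against an identically-structured model so the two variable lists line up in order):

```python
def restore_optimizer_state(optimizer, saved_values, trainable_variables):
    """Restore state on a new-style optimizer, which has no set_weights():
    build the slot variables first, then assign each saved value in order."""
    optimizer.build(trainable_variables)  # create the slot variables
    variables = optimizer.variables
    if callable(variables):
        # variables() is a method on the TF 2.11 optimizer.
        variables = variables()
    if len(variables) != len(saved_values):
        raise ValueError(
            f"expected {len(variables)} values, got {len(saved_values)}"
        )
    for var, value in zip(variables, saved_values):
        var.assign(value)
```

The saved_values list here would come from reading optimizer.variables() on the original optimizer before serialization.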

@adriangb
Contributor Author

> the checkpoint saving/restoring is the same as Keras layers. Can you try using the same code for serializing Keras layer on optimizer? Theoretically it should work.

That's good news! It may "just work" then and I can remove these workarounds. Let me try.

@adriangb
Contributor Author

Exciting news! I think all of the bugs with optimizer serialization have been fixed, so I think all of the following tickets are resolved:
keras-team/keras#15661
tensorflow/tensorflow#44670
#70

I'll update SciKeras and run tests in CI to confirm.

@adriangb
Contributor Author

Nope, I got my hopes up too soon. The bug reported in tensorflow/tensorflow#44670 is still there.

So, @chenmoneygithub, can you think of a way to serialize and deserialize stateful optimizers in 2.11.0?

@chenmoneygithub
Contributor

You need to call load_weights() to restore the weights; otherwise it will be lazily loaded, to my knowledge.

For your code snippet, you want to do new.load_weights("model") after the load_model() call.

@adriangb
Contributor Author

I think Model.save_weights() and Model.load_weights() deal with the model weights, not the optimizer variables/state/weights. Also, Model.save() and load_model() do save/load the model weights (but not the optimizer state/variables/weights).

If I'm missing something, maybe you can give me a self-contained example where a model is saved and re-loaded preserving the optimizer state?

@chenmoneygithub
Contributor

import tensorflow as tf

model = tf.keras.Sequential(
    [
        tf.keras.Input(shape=(1,)),
        tf.keras.layers.Dense(1, activation="softmax"),
    ]
)
model.compile(optimizer="adam", loss="categorical_crossentropy")
model.fit([[1]], [0])

model.save("model")
new = tf.keras.models.load_model("model")
new.load_weights("model")

It loads the optimizer state. You cannot do the length-equality assertion because iteration is somehow no longer a variable right after restoring, but you can still access it, with the right value, via new.optimizer.iterations.

@adriangb
Contributor Author

adriangb commented Nov 18, 2022

Here's what I'm getting:

import tensorflow as tf

model = tf.keras.Sequential(
    [
        tf.keras.Input(shape=(1,)),
        tf.keras.layers.Dense(1, activation="softmax"),
    ]
)
model.compile(optimizer="adam", loss="categorical_crossentropy")
model.fit([[1]], [0])

model.save("model")
new = tf.keras.models.load_model("model")
new.load_weights("model")

print([v.name for v in model.optimizer.variables()])  # ['iteration:0', 'Adam/m/dense/kernel:0', 'Adam/v/dense/kernel:0', 'Adam/m/dense/bias:0', 'Adam/v/dense/bias:0']
print([v.name for v in new.optimizer.variables()])  # ['iteration:0']

My understanding is that iteration corresponds to the first variable (of shape ()).
The issue here is that the rest of the variables are not restored at all. Notice how model.optimizer.variables() has 5 variables while new.optimizer.variables() has just 1 variable (iteration). This means that the optimizer state/weights/variables were not restored and attempting to resume training would result in no errors but probably really bad or at least unexpected results.

So I think the snippet you posted does not work.

@adriangb
Contributor Author

@chenmoneygithub looping back here. Am I missing something or does your suggestion indeed not work? Thanks

@chenmoneygithub
Contributor

@adriangb What's your TF and Keras version? The snippet works as expected on my testing.

@adriangb
Contributor Author

adriangb commented Nov 29, 2022

https://colab.research.google.com/drive/1p9XOAE9SwU3ZATKVHmzxWIDHK1BGF7S3?usp=sharing

This notebook confirms my results. The TF and Keras versions are printed out as well (2.11.0 for both).

Is there something I'm missing or wrong with this notebook?

@chenmoneygithub
Contributor

chenmoneygithub commented Nov 29, 2022

In 2.11 the optimizer does lazy loading; if you want to explicitly restore the variable values, you need to call optimizer.build(model.trainable_variables), which is otherwise called automatically the first time a variable value is updated.

A little more context: the Keras team wrote a new-version optimizer, available via tf.keras.optimizers.experimental.XXX in the 2.9/2.10 releases, and we made it the default in 2.11. The legacy optimizer has been moved under the legacy namespace. Please see more details here: https://github.com/tensorflow/tensorflow/releases

For serialization/deserialization purposes, I don't know what the current approach is. One potential solution I am thinking about is to explicitly call optimizer.variables to get all the variables during serialization, and to set the variables one by one during deserialization. Could you point me to the code in SciKeras that does this work? I can try taking a closer look, thanks!

@chenmoneygithub
Contributor

import tensorflow as tf

print(tf.__version__)
print(tf.keras.__version__)

model = tf.keras.Sequential(
    [
        tf.keras.Input(shape=(1,)),
        tf.keras.layers.Dense(1, activation="softmax"),
    ]
)
model.compile(optimizer="adam", loss="categorical_crossentropy")
model.fit([[1]], [0], verbose=0)

model.save("model")
new = tf.keras.models.load_model("model")
new.load_weights("model")

new.optimizer.build(model.trainable_variables)

print([v.name for v in model.optimizer.variables()])  # ['iteration:0', 'Adam/m/dense/kernel:0', 'Adam/v/dense/kernel:0', 'Adam/m/dense/bias:0', 'Adam/v/dense/bias:0']
print([v.name for v in new.optimizer.variables()])  # ['iteration:0', 'm/dense_2/kernel:0', 'v/dense_2/kernel:0', 'm/dense_2/bias:0', 'v/dense_2/bias:0']

Please check if this works for you, thanks!

@adriangb
Contributor Author

Yes, I think that works! I'll give it some more in-depth testing and confirm. Thank you for your help.
