This repository has been archived by the owner on Nov 3, 2022. It is now read-only.
OS: Windows and Linux
Tensorflow version: 1.12.0
Keras version: 2.1.6-tf
Devices: CPU and GPU
I am using keras.applications.inception_v3.InceptionV3 as a feature extractor and I am getting different feature vectors in each session. This seems related to random variable initialization, because setting tf.set_random_seed(1234) makes the results reproducible. I still cannot pinpoint which variables are being initialized randomly instead of being assigned the saved weights.
@pgenevski, Keras-applications and TF-Hub have different usage patterns.
Keras has its own default session, and in your code the Keras model is created and initialized in that default session. So once you create another session and work in it, you lose the pretrained weights: the Keras results come from random initialization.
TF-Hub is not coupled to the default session, and it defines global_variables_initializer as a loader of the pretrained weights. That is why you get the same results in every newly created session in your code: the results are always computed with the pretrained weights.
Thank you! Indeed, it turned out that Keras loads the weights into its own session as soon as the model is instantiated. A possible workaround is to create the tf session first, set it explicitly on the Keras backend, and then use the model's output as a tensor in the tf graph (via the functional Keras API).
I have updated the notebook to reflect this approach.
As a side note, although this works, IMHO it would have been cleaner if Keras used tf variable initializers instead of eagerly materializing the model weights upon creation.
Using inception v3 from tensorflow hub works as expected (same values in every session).
I have illustrated the problem in this notebook.