I am trying to load a model from TensorFlow Hub using the example code. It works perfectly with FP32, but as soon as I add tf.keras.mixed_precision.set_global_policy('mixed_float16') to enable mixed precision, it raises an error. It looks like a dimension issue, yet the same code works fine with FP32.
Relevant code
import tensorflow as tf
import tensorflow_hub as hub

IMAGE_SIZE = (224, 224)
class_names = ['cat', 'dog']

# If you comment out the following line, the code works fine.
tf.keras.mixed_precision.set_global_policy('mixed_float16')
# --------
model_handle = "https://tfhub.dev/google/imagenet/resnet_v1_50/feature_vector/5"
do_fine_tuning = False
print("Building model with", model_handle)

model = tf.keras.Sequential([
    tf.keras.layers.InputLayer(input_shape=IMAGE_SIZE + (3,)),
    hub.KerasLayer(model_handle, trainable=do_fine_tuning),
    tf.keras.layers.Dropout(rate=0.2),
    tf.keras.layers.Dense(len(class_names),
                          kernel_regularizer=tf.keras.regularizers.l2(0.0001))
])
model.build((None,) + IMAGE_SIZE + (3,))
model.summary()
I tried to reproduce the same error, but I'm unable to get it after passing dtype=tf.float32 to hub.KerasLayer(). For your reference, I've added a Gist file here.
Could you please confirm whether this issue is resolved for you? Please feel free to close the issue if it is.
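The dtype=tf.float32 override mentioned above can be illustrated without downloading a hub model: under the mixed_float16 policy a layer's compute dtype defaults to float16, but passing dtype=tf.float32 to an individual layer (the same keyword argument hub.KerasLayer accepts) keeps that one layer in full precision. A minimal sketch, using a plain Dense layer as a stand-in for the hub layer:

```python
import tensorflow as tf

# Enable mixed precision globally, as in the issue report.
tf.keras.mixed_precision.set_global_policy('mixed_float16')

# Under the global policy, layers compute in float16 by default.
default_layer = tf.keras.layers.Dense(4)

# Pinning dtype=tf.float32 overrides the policy for this layer only;
# the same override applied to hub.KerasLayer avoids the dtype mismatch.
pinned_layer = tf.keras.layers.Dense(4, dtype=tf.float32)

x = tf.ones((1, 8))
print(default_layer(x).dtype)  # float16 under the mixed policy
print(pinned_layer(x).dtype)   # float32, policy overridden

# Restore the default policy so later code is unaffected.
tf.keras.mixed_precision.set_global_policy('float32')
```

This is why the error only appears once the global policy is set: the SavedModel behind the hub handle expects float32 inputs, while the policy makes the surrounding Keras layers cast activations to float16.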
Relevant log output
tensorflow_hub Version
0.12.0 (latest stable release)
TensorFlow Version
other (please specify)
Other libraries
tensorflow-gpu==2.9.1
Python Version
3.x
OS
Linux