Is there any plan to expose intermediate CNN outputs in hub with KerasLayer? #453
Currently, all CNNs in Hub provide only either the feature vector after global pooling or the logits. However, it would be very useful to have access to the intermediate feature maps as well, while retaining the ability to fine-tune the CNN using KerasLayer in TF 2.0. Is there any plan to expose intermediate layer outputs in hub with KerasLayer?
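For illustration, a minimal sketch of the status quo described here, using one of the modules discussed later in this thread; calling the layer yields only the pooled feature vector:

```python
import tensorflow as tf
import tensorflow_hub as hub

# Standard usage: the layer maps a batch of images to one pooled feature
# vector per image; no intermediate feature maps are exposed.
layer = hub.KerasLayer(
    "https://tfhub.dev/google/imagenet/inception_v1/feature_vector/4")
features = layer(tf.zeros([1, 224, 224, 3]))
print(features.shape)  # (1, 1024) for InceptionV1: just the pooled vector.
```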
Hi there,
Mainly CNNs that are unavailable via `tf.keras.applications`, such as
InceptionV1 and InceptionV4.
Thanks.
Best regards,
Tan Jia Huei
University of Malaya.
On Sat, Dec 21, 2019, gowthamkpr wrote:
> Do you have any specific module in mind @jiahuei?
Hi @jiahuei, there are no current plans to expose intermediate outputs systematically. However, there is an undocumented way to get them out of some TF2 SavedModels exported from TF-Slim, such as https://tfhub.dev/google/imagenet/inception_v1/feature_vector/4: passing `return_endpoints=True` via the `arguments` dict. NOTE: This interface is subject to change or removal, and has known issues.

```python
import tensorflow as tf
import tensorflow_hub as hub

l = hub.KerasLayer(
    "https://tfhub.dev/google/imagenet/inception_v1/feature_vector/4",
    trainable=True,  # Or not, as you please.
    arguments=dict(return_endpoints=True))
images = tf.keras.layers.Input((224, 224, 3))
outputs = l(images)  # A dict mapping endpoint names to tensors.
for k, v in sorted(outputs.items()):
    print(k, v.shape)
```

There are further issues to be aware of, too.
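As a follow-up usage sketch: once the endpoints dict is available, an intermediate feature map can be wired into a Keras model like any other tensor. The endpoint key `"InceptionV1/Mixed_4f"` below is an assumption for illustration; take the real names from the printed keys above.

```python
import tensorflow as tf
import tensorflow_hub as hub

l = hub.KerasLayer(
    "https://tfhub.dev/google/imagenet/inception_v1/feature_vector/4",
    arguments=dict(return_endpoints=True))
images = tf.keras.layers.Input((224, 224, 3))
outputs = l(images)
# Hypothetical key: look it up in sorted(outputs.keys()) first.
feature_map = outputs["InceptionV1/Mixed_4f"]
model = tf.keras.Model(inputs=images, outputs=feature_map)
print(model(tf.zeros([1, 224, 224, 3])).shape)
```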
@jiahuei, have you since been able to easily do this with models such as EfficientNet? It really is annoying that we can't easily see intermediate layers. Even when I've tried to do it by using `tf.keras.applications` models just to transfer over my weights, differences in the architectures make it nearly impossible, even though the documentation states "they are identical."
Hi, it has been a while, but I might be able to retrieve intermediate layers. Check out the code at https://github.com/jiahuei/TF2-pretrained-CNN
Hi @arnoegw! Are there any changes expected to support access to intermediate layers? I've found many issues in this repository about accessing intermediate outputs for BERT and various CNN models, and this one seems to be the only one where a solution is given. Please consider implementing some official way to use intermediate outputs, as it would significantly expand the capabilities of tfhub models.
@arnoegw |
Currently, this issue is not under active development. However, given that in TF any variable can be retrieved by name using `tf.get_variable`, one somewhat hacky approach could be to inspect the graph (say, in tensorboard.dev), find the names of the relevant tensors, and read them directly.
On Wed, Dec 2, 2020, Ujjwal wrote:
> Is this issue receiving any attention? This is one of the most important concerns right now.
Hi, in order to see your model's architecture, you can try the following (example with EfficientNet-Lite0):

```python
import tensorflow_hub as hub

efficientnet_lite0_base_layer = hub.KerasLayer(
    "https://tfhub.dev/tensorflow/efficientnet/lite0/feature-vector/2",
    output_shape=[1280],
    trainable=False)

print("Number of weight tensors:", len(efficientnet_lite0_base_layer.weights))
print("{:<80} {:<20} {:<10}".format("Layer", "Shape", "Type"))
for i in range(len(efficientnet_lite0_base_layer.weights)):
    # Parse the shape and dtype out of the variable's string representation.
    model_weights_raw_string = str(efficientnet_lite0_base_layer.weights[i])
    model_weights_wo_weights = model_weights_raw_string.split(", numpy", 1)[0]
    dtype = model_weights_wo_weights.split(" dtype=")[1]
    shape = model_weights_wo_weights.split(" shape=")[1].split(" dtype=")[0]
    print("{:<80} {:<20} {:<10}".format(
        efficientnet_lite0_base_layer.weights[i].name, shape, dtype))
```

This prints one row per weight tensor with its name, shape, and dtype.
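As a side note, the same table can be produced without parsing the repr string, since each entry in `.weights` is a `tf.Variable` with `.name`, `.shape`, and `.dtype` attributes:

```python
for w in efficientnet_lite0_base_layer.weights:
    print("{:<80} {:<20} {:<10}".format(w.name, str(w.shape), w.dtype.name))
```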
@akhorlin Hmmm, could you elaborate a bit? I am wondering how I can get the outputs of intermediate layers from a `hub.KerasLayer`.