
Using Hidden layers output in the loss function #43151

Closed
Otakarlp opened this issue Sep 11, 2020 · 6 comments
Assignees
Labels
comp:keras Keras related issues stat:awaiting response Status - Awaiting response from author TF 2.3 Issues related to TF 2.3 type:bug Bug

Comments

@Otakarlp

Please make sure that this is a bug. As per our
GitHub Policy,
we only address code/doc bugs, performance issues, feature requests and
build/installation issues on GitHub. tag:bug_template

System information

  • Have I written custom code (as opposed to using a stock example script provided in TensorFlow): yes
  • OS Platform and Distribution (e.g., Linux Ubuntu 16.04): *
  • Mobile device (e.g. iPhone 8, Pixel 2, Samsung Galaxy) if the issue happens on mobile device: *
  • TensorFlow installed from (source or binary):
  • TensorFlow version (use command below): 2.3 / tf-nightly
  • Python version: 3.6-3.7-3.8
  • Bazel version (if compiling from source): *
  • GCC/Compiler version (if compiling from source): *
  • CUDA/cuDNN version: *
  • GPU model and memory: *

Describe the current behavior

The outputs of the model's hidden layers cannot be accessed outside of the function-building code.

Describe the expected behavior

To be able to use hidden-layer outputs in my loss function.

Standalone code to reproduce the issue

https://colab.research.google.com/drive/1laEpykHax2QbAV4SB-8Srwfh9SvmAt4B?usp=sharing

Other info / logs

TypeError: An op outside of the function building code is being passed
a "Graph" tensor. It is possible to have Graph tensors
leak out of the function building context by including a
tf.init_scope in your function building code.
For example, the following function will fail:
  @tf.function
  def has_init_scope():
    my_constant = tf.constant(1.)
    with tf.init_scope():
      added = my_constant * 2
The graph tensor has name: dense_6/Relu:0

During handling of the above exception, another exception occurred:

_SymbolicException Traceback (most recent call last)
9 frames
/usr/local/lib/python3.6/dist-packages/tensorflow/python/eager/execute.py in quick_execute(op_name, num_outputs, inputs, attrs, ctx, name)
72 raise core._SymbolicException(
73 "Inputs to eager execution function cannot be Keras symbolic "
---> 74 "tensors, but found {}".format(keras_symbolic_tensors))
75 raise e
76 # pylint: enable=protected-access

_SymbolicException: Inputs to eager execution function cannot be Keras symbolic tensors, but found [<tf.Tensor 'dense_6/Relu:0' shape=(None, 128) dtype=float32>]

When I used tf.config.run_functions_eagerly(True), my loss was equal to 0.0000e+00.

@bhack
Contributor

bhack commented Sep 11, 2020

Can you use add_loss for your case? See https://www.tensorflow.org/guide/keras/custom_layers_and_models#the_add_loss_method
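As a rough illustration of the pattern that guide describes (this is not the code from the user's Colab; the layer sizes and penalty rate below are made up), an intermediate layer's output can be routed through a small custom layer that registers a loss on it via self.add_loss, so the compiled loss never has to close over a symbolic tensor:

```python
import numpy as np
import tensorflow as tf

class ActivityPenalty(tf.keras.layers.Layer):
    """Passes its input through unchanged, but registers a loss on it."""
    def __init__(self, rate=1e-3, **kwargs):
        super().__init__(**kwargs)
        self.rate = rate

    def call(self, inputs):
        # add_loss inside call() is evaluated on the concrete tensors of
        # each batch, so no symbolic graph tensor leaks into the loss.
        self.add_loss(self.rate * tf.reduce_sum(tf.square(inputs)))
        return inputs

inputs = tf.keras.Input(shape=(10,))
hidden = tf.keras.layers.Dense(128, activation="relu")(inputs)
hidden = ActivityPenalty(rate=1e-3)(hidden)  # penalty on the hidden layer
outputs = tf.keras.layers.Dense(1)(hidden)
model = tf.keras.Model(inputs, outputs)

# The compiled loss only ever sees (y_true, y_pred); the add_loss term
# is added on top of it automatically during training.
model.compile(optimizer="adam", loss="mse")

x = np.random.rand(8, 10).astype("float32")
y = np.random.rand(8, 1).astype("float32")
history = model.fit(x, y, epochs=1, verbose=0)
```

Because the penalty is attached inside call(), it is recomputed on each batch during fit and simply adds to the compiled (y_true, y_pred) loss, which sidesteps the _SymbolicException entirely.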

@Otakarlp
Author

Thanks for your answer!

I'm actually getting a ValueError: No gradients provided for any variable error.

In fact, I'm trying to reproduce this model: https://github.com/IAmSuyogJadhav/3d-mri-brain-tumor-segmentation-using-autoencoder-regularization/blob/master/model.py

It uses two model outputs: a dice loss for one output (GT), and a loss function (VAE) depending on two layers (z_mean, z_var) of the model.

I think I've tried everything to get it to work (layer output in the loss function), so if someone can manage to make the standalone code from my first post work, with the ability to generalize to multiple outputs, it would be huge! Thanks.
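For the VAE-regularization case described above, the same add_loss-in-a-layer pattern can carry the KL term built from z_mean and z_var, while the segmentation loss stays a normal compiled loss. This is only a hedged sketch, not the referenced model: the tiny dense encoder, the 0.1 weight, and the binary-crossentropy stand-in for the dice loss are all illustrative.

```python
import numpy as np
import tensorflow as tf

class KLPenalty(tf.keras.layers.Layer):
    """Registers a KL-divergence loss from (z_mean, z_log_var)."""
    def call(self, inputs):
        z_mean, z_log_var = inputs
        kl = -0.5 * tf.reduce_mean(
            1.0 + z_log_var - tf.square(z_mean) - tf.exp(z_log_var))
        self.add_loss(0.1 * kl)  # illustrative weight, not the paper's
        return inputs

# Toy stand-in for the encoder path: two intermediate layers feed the
# VAE penalty; a separate head plays the role of the segmentation (GT)
# output. The penalty layer must sit on the path to an output, or Keras
# prunes it and the loss never fires.
inputs = tf.keras.Input(shape=(16,))
h = tf.keras.layers.Dense(32, activation="relu")(inputs)
z_mean = tf.keras.layers.Dense(8, name="z_mean")(h)
z_log_var = tf.keras.layers.Dense(8, name="z_var")(h)
z_mean, z_log_var = KLPenalty()([z_mean, z_log_var])
decoded = tf.keras.layers.Concatenate()([z_mean, z_log_var])
gt_out = tf.keras.layers.Dense(1, activation="sigmoid", name="GT")(decoded)

model = tf.keras.Model(inputs, gt_out)
model.compile(optimizer="adam", loss="binary_crossentropy")  # dice stand-in

x = np.random.rand(8, 16).astype("float32")
y = np.random.randint(0, 2, size=(8, 1)).astype("float32")
history = model.fit(x, y, epochs=1, verbose=0)
```

The same idea generalizes to multiple outputs: compile with one per-output loss as usual, and attach every intermediate-layer term with add_loss instead of trying to reach those layers from inside the compiled loss functions.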

@ravikyram ravikyram added comp:keras Keras related issues TF 2.3 Issues related to TF 2.3 labels Sep 12, 2020
@ravikyram
Contributor

@Otakarlp

I have tried in Colab with the TF nightly version (2.4.0-dev20200912) and I am not seeing any issue. Please find the gist here. Please verify once and close the issue. Thanks!

@ravikyram ravikyram added the stat:awaiting response Status - Awaiting response from author label Sep 12, 2020
@bhack
Contributor

bhack commented Sep 12, 2020

@ravikyram It seems to me that you have not reproduced the same user example with nightly, because your Colab gist is missing the + tf.math.reduce_sum(layer2) from the user's original loss2.

@Otakarlp As I said, I think you could follow the documentation example I mentioned with add_loss. Besides the documentation, you could also see keras-team/keras#5563 (comment)

@Otakarlp
Author

First, thank you both for trying to help me.

Secondly, I want to make my mea culpa here: @bhack was absolutely right. I used add_loss (before compiling the model(s)) and it worked perfectly. Thank you again @bhack, you are huge.

For those who are trying to add intermediate layers to the loss function with respect to a certain path of the model, please use the gist here.

