ValueError: Graph disconnected: cannot obtain value for tensor Tensor…The following previous layers were accessed without issue: #42017

Closed
boluoyu opened this issue Aug 4, 2020 · 11 comments
Assignees
Labels
comp:keras Keras related issues stale This label marks the issue/pr stale - to be closed automatically if no activity stat:awaiting response Status - Awaiting response from author TF 2.3 Issues related to TF 2.3 type:support Support issues

Comments

@boluoyu

boluoyu commented Aug 4, 2020

I want to obtain the output of intermediate sub-model layers with tf2.keras. Here is a model composed of two sub-modules:

    import numpy as np
    import tensorflow as tf

    input_shape = (100, 100, 3)

    def model1():
        input = tf.keras.layers.Input(input_shape)
        cov = tf.keras.layers.Conv2D(filters=32, kernel_size=3, strides=1, name='cov1')(input)
        embedding_model = tf.keras.Model(input, cov, name='model1')
        return embedding_model

    def model2(embedding_model):
        input_sequence = tf.keras.layers.Input((None,) + input_shape)
        sequence_embedding = tf.keras.layers.TimeDistributed(embedding_model, name='time_dis1')

        emb = sequence_embedding(input_sequence)
        att = tf.keras.layers.Attention()([emb, emb])
        dense1 = tf.keras.layers.Dense(64, name='dense1')(att)
        outputs = tf.keras.layers.Softmax()(dense1)

        final_model = tf.keras.Model(inputs=input_sequence, outputs=outputs, name='model2')
        return final_model

    embedding_model = model1()
    model2 = model2(embedding_model)
    print(model2.summary())

output:

Model: "model2"
__________________________________________________________________________________________________
Layer (type)                     Output Shape                  Param #   Connected to
==================================================================================================
input_2 (InputLayer)             [(None, None, 100, 100, 3)]   0
__________________________________________________________________________________________________
time_dis1 (TimeDistributed)      (None, None, 98, 98, 32)      896       input_2[0][0]
__________________________________________________________________________________________________
attention (Attention)            (None, None, 98, 98, 32)      0         time_dis1[0][0]
                                                                         time_dis1[0][0]
__________________________________________________________________________________________________
dense1 (Dense)                   (None, None, 98, 98, 64)      2112      attention[0][0]
__________________________________________________________________________________________________
softmax (Softmax)                (None, None, 98, 98, 64)      0         dense1[0][0]
==================================================================================================
Total params: 3,008
Trainable params: 3,008
Non-trainable params: 0

And then I want to get the output of intermediate layers of model1 and model2:

    model1_output_layer = model2.get_layer('time_dis1').layer.get_layer('cov1')
    output1 = model1_output_layer.get_output_at(0)
    output2 = model2.get_layer('dense1').get_output_at(0)

    output_tensors = [output1,output2]
    model2_input = model2.input
    submodel = tf.keras.Model([model2_input],output_tensors)
    input_data2 = np.zeros((1,10,100,100,3))

    result = submodel.predict([input_data2])
    print(result)

Running on TF 2.3, the error I am getting is:

 File "/Users/bouluoyu/anaconda/envs/tf2/lib/python3.6/site-packages/tensorflow/python/keras/engine/functional.py", line 115, in __init__
    self._init_graph_network(inputs, outputs)
  File "/Users/bouluoyu/anaconda/envs/tf2/lib/python3.6/site-packages/tensorflow/python/training/tracking/base.py", line 457, in _method_wrapper
    result = method(self, *args, **kwargs)
  File "/Users/bouluoyu/anaconda/envs/tf2/lib/python3.6/site-packages/tensorflow/python/keras/engine/functional.py", line 191, in _init_graph_network
    self.inputs, self.outputs)
  File "/Users/bouluoyu/anaconda/envs/tf2/lib/python3.6/site-packages/tensorflow/python/keras/engine/functional.py", line 931, in _map_graph_network
    str(layers_with_complete_input))
ValueError: Graph disconnected: cannot obtain value for tensor Tensor("input_1:0", shape=(None, 100, 100, 3), dtype=float32) at layer "cov1". The following previous layers were accessed without issue: ['time_dis1', 'attention', 'dense1']

But the following code works:

    model1_input = embedding_model.input
    model2_input = model2.input

    submodel = tf.keras.Model([model1_input,model2_input],output_tensors)

    input_data1 = np.zeros((1,100,100,3))
    input_data2 = np.zeros((1,10,100,100,3))

    result = submodel.predict([input_data1,input_data2])
    print(result)

But this is not what I want. It is strange: model1 is part of model2, so why do we need to feed an extra tensor input_data1? Sometimes it is hard to construct such an extra tensor, especially for complex models. What should I do? Do we need a new API to support this functionality?
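For reference, a condensed, self-contained version of the setup (the Attention layer is omitted for brevity) together with one possible workaround: time-distribute a cov1-only view of model1, which shares model1's layers and weights, so every output traces back to model2's single input. This is only a sketch, not an endorsed API:

```python
import numpy as np
import tensorflow as tf

input_shape = (100, 100, 3)

# Condensed rebuild of model1 and model2 (Attention omitted).
inp = tf.keras.layers.Input(input_shape)
cov = tf.keras.layers.Conv2D(32, 3, name='cov1')(inp)
embedding_model = tf.keras.Model(inp, cov, name='model1')

seq_in = tf.keras.layers.Input((None,) + input_shape)
emb = tf.keras.layers.TimeDistributed(embedding_model, name='time_dis1')(seq_in)
dense1 = tf.keras.layers.Dense(64, name='dense1')(emb)
model2 = tf.keras.Model(seq_in, tf.keras.layers.Softmax()(dense1), name='model2')

# Workaround: a cov1-only view of model1 shares its layers (and weights),
# so time-distributing it keeps the whole graph connected to seq_in.
cov1_view = tf.keras.Model(embedding_model.input,
                           embedding_model.get_layer('cov1').output)
cov1_seq = tf.keras.layers.TimeDistributed(cov1_view)(seq_in)
submodel = tf.keras.Model(seq_in,
                          [cov1_seq, model2.get_layer('dense1').output])

r1, r2 = submodel.predict(np.zeros((1, 10, 100, 100, 3)), verbose=0)
```

Because the two views share weights, no extra input tensor is needed; only one sequence batch is fed.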

@amahendrakar

Was able to reproduce the issue with TF v2.3 and TF-nightly. Please find the gist of it here. Thanks!

@boluoyu

boluoyu commented Aug 17, 2020

Any updates?

@gowthamkpr

@boluoyu Looks like this is an implementation issue. Please take a look at this issue here and let me know if it helps. Thanks!

@boluoyu

boluoyu commented Sep 17, 2020

@gowthamkpr No, this issue is very different from that one. They are two completely unrelated issues.

    embedding_model = model1()
    model2 = model2(embedding_model)
    print(model2.summary())

    input_data2 = np.zeros((1,10,100,100,3))

    result = model2.predict(input_data2)
    print(result) #works

Do you see it? When I use the raw model2 to predict, I just feed one tensor and there is no problem. But I must feed two tensors when I want to get the output of intermediate layers of model1 and model2.

    output1 = model1_output_layer.get_output_at(0)
    output2 = model2.get_layer('dense1').get_output_at(0)

    output_tensors = [output1,output2]
    submodel = tf.keras.Model([model1_input, model2_input], output_tensors)  # requires two input tensors

    input_data1 = np.zeros((1,100,100,3))
    input_data2 = np.zeros((1,10,100,100,3))

    result = submodel.predict([input_data1,input_data2])

Model1 is part of model2, so we should not need the extra tensor input_data1. I hope a new API changes this.

@arvoelke

I have encountered this issue as well. In general, it happens any time you use a functional sub-model as a layer in your model and then want to use a tensor from that sub-model in the outer model (e.g., exposing it as an extra output to debug or analyze the model).

Example:

import tensorflow as tf

inp1 = tf.keras.Input(1)
h1 = tf.keras.layers.Dense(1)(inp1)
out1 = tf.keras.layers.Dense(1)(h1)
model1 = tf.keras.Model(inputs=inp1, outputs=out1)

inp2 = tf.keras.Input(1)
h2 = model1(inp2)
out2 = tf.keras.layers.Dense(1)(h2)
model2 = tf.keras.Model(inputs=inp2, outputs=[out2, model1.layers[1].output])

=>

ValueError: Graph disconnected: cannot obtain value for tensor Tensor("input_1:0", shape=(None, 1), dtype=float32) at layer "dense". The following previous layers were accessed without issue: ['functional_1']

The issue does not occur if the sub-model is at the same "level" as the outer model, or if you explicitly make the required tensor an output of the sub-model. It's not clear to me whether there are other ways around this, but this seems to be the intent of the functional model API.
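For the example above, the second workaround looks like this (a sketch: model1 is rebuilt with the hidden tensor h1 as an explicit second output, and tuple shapes are used for the inputs):

```python
import numpy as np
import tensorflow as tf

inp1 = tf.keras.Input((1,))
h1 = tf.keras.layers.Dense(1)(inp1)
out1 = tf.keras.layers.Dense(1)(h1)
# Expose the hidden tensor as a second output of the sub-model.
model1 = tf.keras.Model(inputs=inp1, outputs=[out1, h1])

inp2 = tf.keras.Input((1,))
out_main, h_inner = model1(inp2)  # both tensors now live in model2's graph
out2 = tf.keras.layers.Dense(1)(out_main)
model2 = tf.keras.Model(inputs=inp2, outputs=[out2, h_inner])

preds = model2.predict(np.zeros((4, 1)), verbose=0)
```

Since h_inner is produced by calling model1 inside model2's graph, no "Graph disconnected" error occurs.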

@michaelpulis

I am also experiencing this issue. I wish to perform feature extraction on my siamese network model (which contains two sub-models) and I am facing this issue.

@Yannik1337

As a workaround, you can add the embedding layers to the model's outputs:

model = tf.keras.Model(inputs=[A, B], outputs=[distance, embeddingA, embeddingB])

where A and B are the inputs to the sub-networks, and embeddingA and embeddingB are the outputs (the embeddings) of these sub-networks. Not beautiful, but it works in a custom train loop.
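A minimal sketch of that workaround (the shared encoder, input sizes, and Subtract-based distance here are hypothetical placeholders for the actual siamese network):

```python
import numpy as np
import tensorflow as tf

# Hypothetical shared encoder and the two branch inputs A and B.
encoder = tf.keras.Sequential([tf.keras.layers.Dense(8)], name='encoder')
A = tf.keras.Input((4,), name='A')
B = tf.keras.Input((4,), name='B')
embeddingA = encoder(A)
embeddingB = encoder(B)
distance = tf.keras.layers.Subtract()([embeddingA, embeddingB])

# Listing the embeddings as extra outputs keeps them reachable at predict time.
model = tf.keras.Model(inputs=[A, B], outputs=[distance, embeddingA, embeddingB])
outs = model.predict([np.zeros((2, 4)), np.zeros((2, 4))], verbose=0)
```

predict then returns the embeddings alongside the distance, with no disconnected-graph error.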

@sanatmpa1

@boluoyu,

Can you take a look at the above workaround proposed by @Yannik1337 and let us know if it helps? Thanks!

@google-ml-butler

This issue has been automatically marked as stale because it has no recent activity. It will be closed if no further activity occurs. Thank you.

@google-ml-butler

Closing as stale. Please reopen if you'd like to work on this further.

