Layer with multiple outputs #3061

Closed
AdityaGudimella opened this issue Jun 24, 2016 · 12 comments

@AdityaGudimella

How do I implement a custom layer which returns multiple outputs instead of a single one?

Please make sure that the boxes below are checked before you submit your issue. Thank you!

  • Check that you are up-to-date with the master branch of Keras. You can update with:
    pip install git+git://github.com/fchollet/keras.git --upgrade --no-deps
  • If running on Theano, check that you are up-to-date with the master branch of Theano. You can update with:
    pip install git+git://github.com/Theano/Theano.git --upgrade --no-deps
  • Provide a link to a GitHub Gist of a Python script that can reproduce your issue (or just copy the script here if it is short).
@AdityaGudimella
Author

@rilut I'm not talking about implementing a model with multiple outputs. I'm talking about implementing a custom layer with multiple outputs. Something along the lines of:

class CustomLayer(Layer):
    def __init__(self, *args, **kwargs):
        # do initialization here
        super(CustomLayer, self).__init__(**kwargs)

    def call(self, inputs):
        # do something with inputs
        return output1, output2, output3

    def build(self, input_shape):
        # initialize weights here
        super(CustomLayer, self).build(input_shape)

Unfortunately, when I do the above, it raises AttributeError: 'tuple' object has no attribute '_keras_shape'. What is the correct way to do this?

@andreatg-zz

I am having the same issue. Did you find a solution?

@eakbas

eakbas commented Mar 28, 2017

An upvote for this issue!

@waleedka
Contributor

Another upvote. My guess is that this option is not available; I haven't seen any of the built-in Keras layers return more than one output.

My hacky work-around is to merge the outputs into one tensor, and then later split it back into multiple tensors. It would be more elegant, though, if Keras supported multiple outputs.
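A minimal self-contained sketch of this work-around under tf.keras (an assumption; TF 2.x API). The PackedOutputs name and the toy arithmetic are hypothetical, purely for illustration: the layer concatenates its two logical outputs into one tensor, and downstream Lambda layers slice it back apart.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

# Hypothetical layer that computes two logical outputs but packs them
# into a single tensor along the last axis.
class PackedOutputs(layers.Layer):
    def call(self, x):
        out1 = x * 2.0          # first logical output
        out2 = x + 3.0          # second logical output
        return tf.concat([out1, out2], axis=-1)

inp = tf.keras.Input(shape=(4,))
packed = PackedOutputs()(inp)
# Downstream, Lambda layers slice the packed tensor back apart.
first = layers.Lambda(lambda t: t[:, :4])(packed)
second = layers.Lambda(lambda t: t[:, 4:])(packed)
model = tf.keras.Model(inp, [first, second])

a, b = model.predict(np.ones((1, 4), dtype="float32"), verbose=0)
```

The slicing indices are hard-coded here for brevity; in real code they would be derived from the width of each logical output.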

@rjrowekamp

I seem to have a layer with multiple outputs working.

  1. I manually set _keras_shape for each tensor the layer returned in call.
  2. I wrote compute_mask to return n_output * [None].
  3. I wrote compute_output_shape to return the shapes of the outputs.

@waleedka
Contributor

waleedka commented Jun 6, 2017

@rjrowekamp Thanks! I gave it another try and managed to get it working as well, but with a slightly different approach.

  1. Let compute_output_shape() return a list of shapes.
  2. Let call() return a list of outputs.
  3. For easier debugging, set the shapes of returned tensors as such:
output_shapes = self.compute_output_shape([K.int_shape(i) for i in inputs])
for o, s in zip(outputs, output_shapes):
    o.set_shape(s)

This is mostly working, but it broke the TensorBoard callback (which seems to assume that all layers return one tensor). I disabled the histogram summary in that callback, which let me side-step the bug for now.
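A minimal self-contained sketch of steps 1 and 2 under tf.keras (assuming TF 2.x, where the functional API accepts a list returned from call()); the TwoOutputs layer and its toy math are hypothetical:

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

class TwoOutputs(layers.Layer):
    def call(self, inputs):
        # Step 2: call() returns a list of outputs.
        return [inputs * 2.0, inputs - 1.0]

    def compute_output_shape(self, input_shape):
        # Step 1: compute_output_shape() returns a list of shapes.
        return [input_shape, input_shape]

inp = tf.keras.Input(shape=(3,))
double, minus_one = TwoOutputs()(inp)
model = tf.keras.Model(inp, [double, minus_one])

a, b = model.predict(np.ones((2, 3), dtype="float32"), verbose=0)
```

Step 3 (manually setting shapes on the returned tensors) is not needed here, since tf.keras tracks symbolic shapes itself.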

@vqdang

vqdang commented Jun 6, 2017

@waleedka
May I ask what errors were thrown by the TensorBoard callback? I'm wondering if it is the same bug I encountered. This is what I did in a custom layer, with a hint from @rjrowekamp.

import tensorflow as tf
from keras.layers import Layer

class SplitLayer(Layer):

    def __init__(self, n_splits, split_axis=3, **kwargs):
        self.n_splits = n_splits
        self.split_axis = split_axis
        super(SplitLayer, self).__init__(**kwargs)

    def call(self, x):
        sub_tensors = tf.split(x, self.n_splits, axis=self.split_axis)
        return sub_tensors

    def compute_output_shape(self, input_shape):
        # Each output keeps the input shape, except the channel (last)
        # axis shrinks to 1/n_splits of its size.
        sub_tensor_shape = list(input_shape)
        num_channels = sub_tensor_shape[-1]
        sub_tensor_shape[-1] = int(num_channels / self.n_splits)
        sub_tensor_shape = tuple(sub_tensor_shape)
        return [sub_tensor_shape] * self.n_splits

    def compute_mask(self, inputs, mask=None):
        return self.n_splits * [None]

My callback still worked, but the graph would not display; everything else seemed to be fine, though.
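For reference, a usage sketch of a layer like this under tf.keras (an assumption; TF 2.x API). The class is repeated in compact form, with split_axis=1 for a 2-D input, so the snippet is self-contained; compute_output_shape is omitted because tf.keras can infer the output shapes from call() here.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras.layers import Layer

class SplitLayer(Layer):
    def __init__(self, n_splits, split_axis=1, **kwargs):
        self.n_splits = n_splits
        self.split_axis = split_axis
        super(SplitLayer, self).__init__(**kwargs)

    def call(self, x):
        # Split the input into n_splits equal parts along split_axis.
        return tf.split(x, self.n_splits, axis=self.split_axis)

    def compute_mask(self, inputs, mask=None):
        return self.n_splits * [None]

inp = tf.keras.Input(shape=(8,))
left, right = SplitLayer(n_splits=2)(inp)
model = tf.keras.Model(inp, [left, right])

# predict() on a multi-output model returns one array per output.
parts = model.predict(np.arange(8, dtype="float32").reshape(1, 8), verbose=0)
```

Each of the two outputs can then feed its own downstream branch of the model.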

@waleedka
Contributor

waleedka commented Jun 6, 2017

@vqdang The TensorBoard callback was trying to create a histogram of the outputs of each layer, and I believe it assumes that all layers generate one output (or similarly shaped outputs). So it was complaining that the shapes and data types of the outputs didn't match. I disabled the histogram feature by setting its frequency to zero as such:

keras.callbacks.TensorBoard(log_dir=log_path, histogram_freq=0, 
                            write_graph=False, write_images=False)

@stale

stale bot commented Sep 4, 2017

This issue has been automatically marked as stale because it has not had recent activity. It will be closed after 30 days if no further activity occurs, but feel free to re-open a closed issue if needed.

@agamemnonc

agamemnonc commented Dec 15, 2017

It seems that the plot_model function does not handle this situation well: when a layer returns multiple outputs, the visualization ignores this information and draws a single arrow for all of the output variables.

@kureta

kureta commented Mar 27, 2018

@agamemnonc At some point, plot_model used to handle this situation properly, as seen on this web page at the output of KLDivergenceLayer. But now it behaves as you've stated.

@Zhangtd

Zhangtd commented Nov 14, 2019

Same problem. Are there any solutions?
