Multiple Input Layers #14

Closed · lleuven opened this issue Jun 7, 2021 · 2 comments
Labels: bug (Something isn't working)

lleuven commented Jun 7, 2021

Dear developers,

I really like your tool; for me, it is a very good way to visualize my networks. I am currently using a model with multiple input branches that are concatenated at a later stage. Unfortunately, Net2Vis assumes all input layers to be splits of the same input, even when their shapes differ. I would appreciate it if your tool could support multiple input branches.

Please find attached the graphical output and my model code (a minimal reproduction follows the full model code below).

[Screenshot: Net2Vis output for the model below, with all input branches rendered as splits of a single input]

# You can freely modify this file.
# However, you need to have a function that is named get_model and returns a Keras Model.
import keras


def get_model():
    window = 65
    variables = 9
    n_branches = 4

    # One shape per input branch; the last branch carries one extra variable.
    input_shapes = [(window, 1, variables),
                    (window, 1, variables),
                    (window, 1, variables),
                    (window, 1, variables + 1)]
    output_shape = 4

    conf = [64, 32, 16]
    bn = False
    activation = keras.layers.ReLU
    activation_output = keras.layers.Activation
    dropout = True
    dropout_rate = 0.2

    # Input tails: one Flatten followed by a Dense stack per branch.
    x_inputs, x_tails = [], []
    for shape in input_shapes:
        x_input = keras.layers.Input(shape=shape)
        x = keras.layers.Flatten()(x_input)
        for n_hidden in conf:
            x = keras.layers.Dense(n_hidden)(x)
            if bn:
                x = keras.layers.BatchNormalization()(x)
            x = activation()(x)
            if dropout:
                x = keras.layers.Dropout(dropout_rate)(x)
        x_inputs.append(x_input)
        x_tails.append(x)

    # Concatenate all input branches.
    x_concat = keras.layers.Concatenate()(x_tails)

    # Output tail: add a Dense block for every power of output_shape that is
    # smaller than the concatenated width (16 * 4 = 64); with these settings
    # only 4 ** 2 = 16 qualifies, so a single Dense(16) block is added.
    n_neurons_concat = int(conf[-1]) * n_branches
    for exp in reversed(range(2, n_branches + 1)):
        n_neurons = output_shape ** exp
        if n_neurons < n_neurons_concat:
            x_concat = keras.layers.Dense(n_neurons)(x_concat)
            if bn:
                x_concat = keras.layers.BatchNormalization()(x_concat)
            x_concat = activation()(x_concat)
            if dropout:
                x_concat = keras.layers.Dropout(dropout_rate)(x_concat)
    x_concat = keras.layers.Dense(output_shape)(x_concat)
    out = activation_output("linear")(x_concat)

    return keras.Model(inputs=x_inputs, outputs=[out])
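For reference, the merging behavior does not seem to depend on the details of the model above. A minimal sketch with two differently shaped inputs (arbitrary, hypothetical shapes; the assumption is that any model with more than one Input layer triggers the same rendering) would be:

# Minimal sketch (assumption: any model with more than one Input layer
# shows the same merged-input rendering; the shapes here are arbitrary).
import keras

def get_model():
    in_a = keras.layers.Input(shape=(65, 1, 9))
    in_b = keras.layers.Input(shape=(65, 1, 10))
    # Flatten each input and concatenate the two branches.
    x = keras.layers.Concatenate()([
        keras.layers.Flatten()(in_a),
        keras.layers.Flatten()(in_b),
    ])
    out = keras.layers.Dense(4)(x)
    return keras.Model(inputs=[in_a, in_b], outputs=[out])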
Sparkier self-assigned this Jun 13, 2021
Sparkier added the bug label Jun 13, 2021
Sparkier commented

Thank you for submitting this issue, and sorry I could not get to it earlier. This should be fixed with commit 47523d7.

Sparkier commented

I copied your code here: https://viscom.net2vis.uni-ulm.de/multi_in_paths, and this is how it now looks for me:

[Screenshot: updated Net2Vis rendering of the model]
