Using Evaluate with multiple inputs #552

Closed
cynthia166 opened this issue Jun 18, 2021 · 6 comments
Assignees
Labels
user support nothing is wrong with Talos

Comments

@cynthia166

cynthia166 commented Jun 18, 2021

Hello, I was wondering if you could please help me.

Confirm the below:
- I have looked for an answer in the Docs: yes.
- My Python version is 3.5 or higher: Python 3.8.5, TensorFlow 2.4.1.
- I have searched through the issues for a duplicate.
- I've tested that my Keras model works as a stand-alone: yes, I have been running it with Keras.

My Python is 3.8, and I am trying to hyperparameter-tune a deep autoencoder. In my model function I need to pass the X value, and I get the following error:
ValueError: Layer model_4 expects 2 input(s), but it received 1 input tensors. Inputs received: [<tf.Tensor 'IteratorGetNext:0' shape=(None, 470) dtype=float32>]
Here is my code:
def autoEncoder(X_train, y_train, x_val, y_val, params):
    '''
    Autoencoder for Collaborative Filter Model
    '''
    #model = Sequential()
    users_items_matrix, content_info = X
    #content_info = X[:, 420:X.shape[1]]
    #users_items_matrix = X[:, 0:420]

    # Input
    input_layer = Input(shape=(users_items_matrix.shape[1],), name='UserScore')
    input_content = Input(shape=(content_info.shape[1],), name='Itemcontent')

    # Encoder
    # -----------------------------
    enc = Dense(512, activation=params["activation"], name='EncLayer1')(input_layer)

    # Content Information
    # Embedding turns positive integers (indexes) into dense vectors of fixed size.
    x_content = Embedding(100, 256, input_length=content_info.shape[1])(input_content)
    x_content = Flatten()(x_content)
    x_content = Dense(256, activation=params["activation"],
                      name='ItemLatentSpace')(x_content)

    # Latent Space
    # -----------------------------
    # Dropout randomly sets input units to 0 with frequency `rate` at each step
    # during training, which helps prevent overfitting. Inputs not set to 0 are
    # scaled up by 1/(1 - rate) so that the sum over all inputs is unchanged.
    lat_space = Dense(256, activation=params["activation"], name='UserLatentSpace')(enc)
    lat_space = add([lat_space, x_content], name='LatentSpace')
    lat_space = Dropout(params["dropout"], name='Dropout')(lat_space)

    # Decoder
    # -----------------------------
    dec = Dense(512, activation=params["activation"], name='DecLayer1')(lat_space)

    # Output
    output_layer = Dense(users_items_matrix.shape[1], activation='linear', name='UserScorePred')(dec)

    # this model maps an input to its reconstruction
    model = Model([input_layer, input_content], output_layer)
    #model.compile(optimizer=SGD(lr=0.0001), loss='mse')
    model.compile(optimizer=params['optimizer'],
                  loss='mean_squared_error')
    model.summary()
    history = model.fit(X, y,
                        validation_data=(X_test, y_test),
                        batch_size=params['batch_size'],
                        epochs=params['epochs'],
                        verbose=0)
    return history, model

p = {'lr': (0.5, 5, 10),
     'hidden_layers': [0, 1, 2],
     'batch_size': (20, 30, 50),
     'epochs': [50],
     #'times': [216, 300, 600],
     #'neurons': [416, 600, 1200],
     'dropout': (0, 0.5, 5),
     'weight_regulizer': [None],
     'emb_output_dims': [None],
     'optimizer': [Adam, "Nadam", "RMSprop"],
     'activation': ["relu", "selu"],
     }

t = ta.Scan(x=X,
            y=y,
            model=autoEncoder,
            #grid_downsample=1,
            params=p,
            val_split=0,
            experiment_name='im')
Thank you very much.
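For context, the ValueError above is what Keras raises when a functional model with two Input layers is fit with a single array instead of a list of arrays, one per input. A minimal sketch with made-up shapes (not the model from this issue):

```python
import numpy as np
from tensorflow.keras.layers import Input, Dense, concatenate
from tensorflow.keras.models import Model

# Two Input layers -> fit() must receive a list/tuple with one array per input.
inp_a = Input(shape=(4,), name='a')
inp_b = Input(shape=(3,), name='b')
out = Dense(1)(concatenate([inp_a, inp_b]))
model = Model([inp_a, inp_b], out)
model.compile(optimizer='adam', loss='mse')

x1 = np.zeros((8, 4))
x2 = np.zeros((8, 3))
y = np.zeros((8, 1))

history = model.fit([x1, x2], y, epochs=1, verbose=0)  # works: one array per Input
# model.fit(np.hstack([x1, x2]), y)  # would raise the "expects 2 input(s)" ValueError
```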

@mikkokotila
Contributor

@cynthia166 Could you clean up the formatting of the above so it becomes more readable?

@mikkokotila mikkokotila self-assigned this Jun 21, 2021
@mikkokotila mikkokotila added the user support nothing is wrong with Talos label Jun 21, 2021
@cynthia166
Author

cynthia166 commented Jun 21, 2021

Hello Mikko,

I cleaned up the formatting.
My question is: how can I make talos.Evaluate(scan_object) work? I get an error because my x is a list of multiple inputs, and the error says that a list has no shape.

My code:

def autoEncoder(x_train, y_train, x_val, y_val, params):
    '''
    Autoencoder for Collaborative Filter Model
    '''
    #model = Sequential()
    #users_items_matrix, content_info = X
    #content_info = X[:,420:X.shape[1]]
    #users_items_matrix = X[:,0:420]
    # Input
    input_layer   = Input(shape=(420,), name='UserScore')
    input_content = Input(shape=(50,), name='Itemcontent')
    
    # Encoder
    # -----------------------------
    enc = Dense(512, activation=params["activation"], name='EncLayer1')(input_layer)

    # Content Information
    #embbeding Turns positive integers (indexes) into dense vectors of fixed size.
    x_content = Embedding(100, params['firtr_neurons'], input_length=50)(input_content)
    x_content = Flatten()(x_content)
    x_content = Dense(params['firtr_neurons'], activation=params["activation"], 
                                name='ItemLatentSpace')(x_content)
    # Latent Space
    # -----------------------------
    # Dropout layer randomly sets input units to 0 with a frequency of rate at each step during training time, which helps prevent overfitting. Inputs not set to 0 are scaled up by 1/(1 - rate) such that the sum over all inputs is unchanged.
    lat_space = Dense(params['firtr_neurons'], activation=params["activation"], name='UserLatentSpace')(enc)
    
    lat_space= add([lat_space, x_content], name='LatentSpace')
    lat_space = Dropout(params["dropout"], name='Dropout')(lat_space) # Dropout

    # Decoder
    # -----------------------------
    dec = Dense(params['firtr_neurons']*2, activation=params["activation"], name='DecLayer1')(lat_space)

    # Output
    output_layer = Dense(420, activation='linear', name='UserScorePred')(dec)

    # this model maps an input to its reconstruction
    model = Model([input_layer, input_content], output_layer)    
    

    #model.compile(optimizer=SGD(lr=0.0001), loss='mse')
    model.compile(optimizer="Adam",
                  loss='mean_squared_error')
    model.summary()
    out = model.fit(x=x_train,
                    y=y_train,
                    validation_data=(x_val, y_val),
                    epochs=50,
                    batch_size=params['batch_size'],
                    verbose=0)
    return out, model


p = {#'lr': (0.5, 5, 10),
     #'hidden_layers':[0, 1, 2],
     'batch_size': [50,100,150],
     'epochs': [50,60],
     #'times':[216,300,600],
     'firtr_neurons':[216,316],
     'dropout': [  .8,.9],
     #'weight_regulizer':[None],
     #'emb_output_dims': [None],
     #'optimizer': ["Adam", "Nadam", "RMSprop"],
     'activation':[ "selu","relu"],
     }

scan_object = ta.Scan(x=[x_train, x1_train],
                      y=y_train,
                      x_val=[x_val, x1_val],
                      y_val=y_val,
                      params=p,
                      model=autoEncoder,
                      experiment_name="1")

talos.Evaluate(scan_object)
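One possible workaround, sketched below with hypothetical shapes, mirrors the slicing in the commented-out lines of the model function: hand Scan/Evaluate a single concatenated array (which has a `.shape`, so the "list has no shape" error cannot occur) and split it back into the two inputs inside `autoEncoder`:

```python
import numpy as np

# Hypothetical data matching the shapes above: 420 user-score columns
# followed by 50 content columns.
users_items_matrix = np.random.rand(100, 420)
content_info = np.random.rand(100, 50)

# A single 2-D array instead of a list of two arrays.
x = np.hstack([users_items_matrix, content_info])

# Inside autoEncoder, recover the two inputs by column slicing
# before building the model:
x_users = x[:, :420]
x_content = x[:, 420:]
```

`model.fit` would then be called with `[x_users, x_content]` as its x argument, while Scan and Evaluate only ever see the single concatenated array.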

@cynthia166
Author

Thank you so much :)

@cynthia166
Author

[screenshot attached]

@mikkokotila
Contributor

Sorry for not replying earlier. Have you looked at this example for multiple inputs? https://autonomio.github.io/talos/#/Examples_Multiple_Inputs

@mikkokotila mikkokotila changed the title #troubleshoot Using Evaluate with multiple inputs Jan 29, 2022
@mikkokotila
Contributor

This will be handled in #582 so merging with that.

This issue was closed.