Hi!
The issue here is that both the title and description encoders need GlobalMaxPooling1D, not just MaxPool1D.
Global max pooling pools over the entire time axis so the resulting tensor will be 2-dimensional [batch, units]. You can then safely concatenate over "units" axis.
I'd also recommend using more than one unit in dense layers.
If it still doesn't work after adding global max pooling, please ping me again in this very issue.
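To make the difference concrete, here is a minimal NumPy sketch (not the Keras implementation itself) of what the two pooling layers compute over a `[batch, time, units]` tensor:

```python
import numpy as np

def max_pool1d(x, pool_size):
    """Local max pooling over the time axis:
    [batch, time, units] -> [batch, time // pool_size, units] (still 3-D)."""
    b, t, u = x.shape
    t2 = t // pool_size
    return x[:, : t2 * pool_size].reshape(b, t2, pool_size, u).max(axis=2)

def global_max_pool1d(x):
    """Global max pooling collapses the entire time axis:
    [batch, time, units] -> [batch, units] (2-D)."""
    return x.max(axis=1)

x = np.random.rand(4, 10, 16)
print(max_pool1d(x, 2).shape)      # (4, 5, 16) — time axis survives
print(global_max_pool1d(x).shape)  # (4, 16)    — 2-D, safe to concatenate
```

Because global pooling removes the variable-length time axis entirely, the resulting `[batch, units]` tensors from different branches can be concatenated along the feature axis regardless of sequence length.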
I'm stuck building the model in week 2. Here is my code.
```python
def build_model(n_tokens=len(tokens), n_cat_features=len(categorical_vectorizer.vocabulary_), hid_size=64):
    l_title = L.Input(shape=[None], name="Title")
    l_descr = L.Input(shape=[None], name="FullDescription")
    l_categ = L.Input(shape=[None], name="Categorical")
```
I ran into the following problem:
Expected the last dense layer to have 3 dimensions, but got array with shape (100, 1)
The reason I changed the shape of the categorical input layer to None is that I could not concatenate a layer with a defined shape with the other two layers, whose shapes are undefined, in the final step.
I chose an embedding layer in the categorical encoder because of the error "The shape of the input to "Flatten" is not fully defined (got (None, 1). Make sure to pass a complete "input_shape" or "batch_input_shape" argument to the first layer in your model.", so the embedding layer keeps all three branches at the same dimensionality.
Could you please help me with these problems? Thank you in advance.
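Following the suggestion above, here is a hedged sketch (not the assignment's reference solution) of how the model could be completed: each text branch is embedded and reduced with `GlobalMaxPooling1D` to a 2-D `[batch, hid]` tensor, and the categorical input is given a fixed-length shape (as produced by a bag-of-features vectorizer), which avoids both the concatenation error and the Flatten shape error. The defaults `n_tokens=1000` and `n_cat_features=20` are placeholders standing in for `len(tokens)` and `len(categorical_vectorizer.vocabulary_)`:

```python
from tensorflow import keras
L = keras.layers

def build_model(n_tokens=1000, n_cat_features=20, hid_size=64):
    l_title = L.Input(shape=[None], name="Title")
    l_descr = L.Input(shape=[None], name="FullDescription")
    # Fixed-length categorical vector: its shape is fully defined,
    # so no Flatten/embedding workaround is needed on this branch.
    l_categ = L.Input(shape=[n_cat_features], name="Categorical")

    def text_encoder(inp):
        x = L.Embedding(n_tokens, hid_size)(inp)         # [batch, time, hid]
        x = L.Conv1D(hid_size, 3, activation="relu")(x)  # [batch, time-2, hid]
        return L.GlobalMaxPooling1D()(x)                 # [batch, hid] — 2-D

    t = text_encoder(l_title)
    d = text_encoder(l_descr)
    c = L.Dense(hid_size, activation="relu")(l_categ)    # [batch, hid]

    # All three branches are now 2-D, so concatenation is safe.
    h = L.Concatenate()([t, d, c])                       # [batch, 3 * hid]
    h = L.Dense(hid_size, activation="relu")(h)
    out = L.Dense(1)(h)
    return keras.Model([l_title, l_descr, l_categ], out)

model = build_model()
model.summary()
```

With this shape discipline the final `Dense(1)` receives a 2-D tensor, so the "Expected the last dense layer to have 3 dimensions" error no longer applies.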