[BUG] NeuMF - PyTorch backend - leaky_relu unrecognised #679

@Gabriel-Kissin

Description

In help(NeuMF):

 |  act_fn: str, default: 'relu'
 |      Name of the activation function used for the MLP layers.
 |      Supported functions: ['sigmoid', 'tanh', 'elu', 'relu', 'selu, 'relu6', 'leaky_relu']

the Leaky ReLU is spelt as leaky_relu.

However, in backend_pt.py#L19 the corresponding dictionary key is spelt leakyrelu, without the underscore:

activation_functions = {
    "sigmoid": nn.Sigmoid(),
    "tanh": nn.Tanh(),
    "elu": nn.ELU(),
    "selu": nn.SELU(),
    "relu": nn.ReLU(),
    "relu6": nn.ReLU6(),
    "leakyrelu": nn.LeakyReLU(),   # whoops
}

This unsurprisingly results in KeyError: 'leaky_relu' when the activation_functions dict is looked up in backend_pt.py#L80: mlp_layers.append(activation_functions[act_fn.lower()]).
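
For completeness, here is roughly how I hit it. This is a sketch rather than a polished repro: it assumes the act_fn and backend arguments documented above, and the error only surfaces once fit() builds the MLP layers in the PyTorch backend.

import cornac
from cornac.datasets import movielens
from cornac.eval_methods import RatioSplit

# Small dataset, only needed so that fit() actually builds the model
data = movielens.load_feedback(variant="100K")
ratio_split = RatioSplit(data=data, test_size=0.2, seed=123)

model = cornac.models.NeuMF(
    num_factors=8,
    layers=(32, 16, 8),
    act_fn="leaky_relu",   # spelt exactly as the docs say
    backend="pytorch",
    num_epochs=1,
    seed=123,
)

# Raises KeyError: 'leaky_relu' when backend_pt.py builds the MLP layers,
# because the PyTorch dict only knows the key "leakyrelu"
model.fit(ratio_split.train_set)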

Note that in backend_tf.py#L27 this is spelt consistently with the docs:

act_functions = {
    "sigmoid": tf.nn.sigmoid,
    "tanh": tf.nn.tanh,
    "elu": tf.nn.elu,
    "selu": tf.nn.selu,
    "relu": tf.nn.relu,
    "relu6": tf.nn.relu6,
    "leaky_relu": tf.nn.leaky_relu,
}
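
So the simplest fix would presumably be to rename the key in backend_pt.py so it matches both the docs and the TF backend (untested sketch):

import torch.nn as nn

activation_functions = {
    "sigmoid": nn.Sigmoid(),
    "tanh": nn.Tanh(),
    "elu": nn.ELU(),
    "selu": nn.SELU(),
    "relu": nn.ReLU(),
    "relu6": nn.ReLU6(),
    "leaky_relu": nn.LeakyReLU(),  # now consistent with the docs and backend_tf.py
}

Alternatively, the lookup at backend_pt.py#L80 could normalise the name (e.g. act_fn.lower().replace("_", "")), but aligning the key seems cleaner.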
