Remove unused beta param for silu, use torch op directly
The beta param was only accepted on the tensorflow/torch backends
and not in the `keras.ops` API, nor was it tested. I think it is best
just to drop it, since no one could be relying on it.
mattdangerw committed Apr 1, 2024
1 parent 104fe8e commit c008881
Showing 2 changed files with 4 additions and 4 deletions.
4 changes: 2 additions & 2 deletions keras/backend/tensorflow/nn.py
@@ -40,8 +40,8 @@ def softsign(x):
     return tf.nn.softsign(x)
 
 
-def silu(x, beta=1.0):
-    return tf.nn.silu(x, beta=beta)
+def silu(x):
+    return tf.nn.silu(x)
 
 
 def log_sigmoid(x):
4 changes: 2 additions & 2 deletions keras/backend/torch/nn.py
@@ -47,9 +47,9 @@ def softsign(x):
     return tnn.softsign(x)
 
 
-def silu(x, beta=1.0):
+def silu(x):
     x = convert_to_tensor(x)
-    return x * sigmoid(beta * x)
+    return tnn.silu(x)
 
 
 def log_sigmoid(x):
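For reference on what the removed parameter did: with `beta`, the old torch backend computed the Swish generalization `x * sigmoid(beta * x)`, which reduces to SiLU at `beta=1.0` (the default, and the only value the `keras.ops` API ever exposed). A minimal pure-Python sketch of the relationship (scalar-only, for illustration; `swish` here is a hypothetical helper, not a Keras API):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def swish(x, beta=1.0):
    # The generalized form the removed `beta` param enabled: x * sigmoid(beta * x).
    return x * sigmoid(beta * x)

def silu(x):
    # SiLU is swish with beta fixed at 1, i.e. x * sigmoid(x),
    # which is what the native tf.nn.silu / torch silu ops compute.
    return x * sigmoid(x)
```

Since `swish(x, beta=1.0) == silu(x)` for all `x`, dropping the parameter changes nothing for anyone calling through the documented `keras.ops` API.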
