fix attention, no depthwise convs by default
Change the attention keys and values to the module inputs instead of the convolution outputs.
Guitaricet committed Oct 8, 2020
1 parent d1d92e9 commit bd665b0
Showing 1 changed file with 1 addition and 1 deletion.
unet_transformer/unet_transformer_layer.py (1 addition, 1 deletion)

@@ -39,7 +39,7 @@ def __init__(
     model_dim=None,
     ffn_hidden=None,
     conv_skip_connection=False,
-    depthwise_conv=True,
+    depthwise_conv=False,
 ):
     super().__init__()
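The commit message says the fix routes attention keys and values from the module inputs rather than from the convolution outputs. A minimal NumPy sketch of that idea is below; the `conv` stand-in, shapes, and variable names are illustrative assumptions, not the repository's actual implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(query, key, value):
    # Scaled dot-product attention over (seq_len, dim) arrays.
    d = query.shape[-1]
    scores = query @ key.swapaxes(-1, -2) / np.sqrt(d)
    return softmax(scores) @ value

def conv(x, w):
    # Hypothetical stand-in for the layer's convolution: any
    # learned transformation of the input.
    return x @ w

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))   # module input: (seq_len, dim)
w = rng.normal(size=(8, 8))   # stand-in convolution weights
q = conv(x, w)                # queries from the conv output

# Before the fix: keys and values come from the convolution output.
out_before = attention(q, conv(x, w), conv(x, w))

# After the fix: keys and values come from the module input itself.
out_after = attention(q, x, x)

assert out_before.shape == x.shape
assert out_after.shape == x.shape
```

Under this sketch, the change only swaps which tensor feeds the key/value projections; output shapes are unchanged, so it is a drop-in modification to the layer.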
