This repository has been archived by the owner on Mar 3, 2024. It is now read-only.

Commit

#10 Update default attention width
CyberZHG committed Jan 31, 2019
1 parent 67cd314 commit 9268631
Showing 2 changed files with 2 additions and 2 deletions.
2 changes: 1 addition & 1 deletion keras_self_attention/seq_self_attention.py
@@ -54,7 +54,7 @@ def __init__(self,
         self.return_attention = return_attention
         self.history_only = history_only
         if history_only and attention_width is None:
-            self.attention_width = int(1e10)
+            self.attention_width = int(1e9)

         self.use_additive_bias = use_additive_bias
         self.use_attention_bias = use_attention_bias
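
The change above swaps the sentinel "unbounded" attention width used when `history_only=True` and no explicit width is given. A plausible motivation (an assumption, not stated in the commit) is numeric safety: the width typically enters float32 tensor arithmetic, where 1e10 is risky while 1e9 still exceeds any realistic sequence length. A minimal NumPy sketch of the banded history mask such a width implies (the `history_mask` helper is hypothetical, not this library's API):

```python
import numpy as np

def history_mask(seq_len, attention_width):
    # Hypothetical illustration: position i may attend to position j
    # only if j <= i (history only) and i - j < attention_width.
    idx = np.arange(seq_len)
    return (idx[None, :] <= idx[:, None]) & (idx[:, None] - idx[None, :] < attention_width)

# With the effectively unbounded default width, the mask degenerates
# to a plain lower-triangular (causal) mask.
wide = history_mask(5, int(1e9))
assert np.array_equal(wide, np.tril(np.ones((5, 5), dtype=bool)))

# A small width limits how far back each position can look:
# here each step sees only itself and one step back.
narrow = history_mask(5, 2)
```

Under this reading, the default only needs to be larger than any sequence the layer will see, so shrinking it from 1e10 to 1e9 changes nothing for ordinary inputs.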
2 changes: 1 addition & 1 deletion setup.py
@@ -2,7 +2,7 @@

 setup(
     name='keras-self-attention',
-    version='0.32.0',
+    version='0.33.0',
     packages=['keras_self_attention'],
     url='https://github.com/CyberZHG/keras-self-attention',
     license='MIT',
