
Commit: Be explicit
dirkgr committed Sep 12, 2023
1 parent 61004d4 commit 8d094b6
Showing 4 changed files with 8 additions and 8 deletions.
configs/v1-mix-medium-mcli.yaml: 2 additions & 2 deletions

@@ -20,8 +20,8 @@ model:
   include_bias: false
   block_type: sequential
   layer_norm_type: low_precision
-  #layer_norm_with_affine: false
-  bias_for_layer_norm: true
+  layer_norm_with_affine: true # workaround for the layer norm bug
+  bias_for_layer_norm: true # workaround for the layer norm bug
   activation_type: swiglu
   residual_dropout: 0.0
   embedding_dropout: 0.0
configs/v1-mix-medium.yaml: 2 additions & 2 deletions

@@ -20,8 +20,8 @@ model:
   include_bias: false
   block_type: sequential
   layer_norm_type: low_precision
-  #layer_norm_with_affine: false
-  bias_for_layer_norm: true
+  layer_norm_with_affine: true # workaround for the layer norm bug
+  bias_for_layer_norm: true # workaround for the layer norm bug
   activation_type: swiglu
   residual_dropout: 0.0
   embedding_dropout: 0.0
configs/v1-mix-small-mcli.yaml: 2 additions & 2 deletions

@@ -20,8 +20,8 @@ model:
   include_bias: false
   block_type: sequential
   layer_norm_type: low_precision
-  #layer_norm_with_affine: false
-  bias_for_layer_norm: true
+  layer_norm_with_affine: true # workaround for the layer norm bug
+  bias_for_layer_norm: true # workaround for the layer norm bug
   activation_type: swiglu
   residual_dropout: 0.0
   embedding_dropout: 0.0
configs/v1-mix-small.yaml: 2 additions & 2 deletions

@@ -20,8 +20,8 @@ model:
   include_bias: false
   block_type: sequential
   layer_norm_type: low_precision
-  #layer_norm_with_affine: false
-  bias_for_layer_norm: true
+  layer_norm_with_affine: true # workaround for the layer norm bug
+  bias_for_layer_norm: true # workaround for the layer norm bug
   activation_type: swiglu
   residual_dropout: 0.0
   embedding_dropout: 0.0
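
For context on the two keys this commit makes explicit: below is a minimal, hypothetical sketch of a PyTorch-style layer norm in which the elementwise scale (layer_norm_with_affine) and shift (bias_for_layer_norm) are independently optional. The class name and constructor arguments are illustrative assumptions, not the repository's actual implementation.

import torch
import torch.nn as nn

class ConfigurableLayerNorm(nn.Module):
    # Hypothetical module: a layer norm whose affine scale and bias are
    # independently optional, mirroring the two YAML keys in the diff above.
    def __init__(self, size: int, with_affine: bool = True,
                 with_bias: bool = True, eps: float = 1e-5):
        super().__init__()
        self.eps = eps
        # layer_norm_with_affine: true -> keep a learnable elementwise scale
        self.weight = nn.Parameter(torch.ones(size)) if with_affine else None
        # bias_for_layer_norm: true -> keep a learnable elementwise shift
        self.bias = nn.Parameter(torch.zeros(size)) if with_bias else None

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Normalize over the last dimension, then apply whichever affine
        # parameters were configured.
        mean = x.mean(dim=-1, keepdim=True)
        var = x.var(dim=-1, keepdim=True, unbiased=False)
        x = (x - mean) / torch.sqrt(var + self.eps)
        if self.weight is not None:
            x = x * self.weight
        if self.bias is not None:
            x = x + self.bias
        return x

Under this reading, setting both keys to true, as the commit does in all four configs, makes the module keep both parameters explicitly rather than relying on a commented-out default.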
