Fixed fp16 training/inference with factors-combine concat (#926)
arturnn committed Mar 22, 2022
Parent: 78bef7a · Commit: 23c36ec
Showing 2 changed files with 2 additions and 2 deletions.
CHANGELOG.md: 1 addition, 0 deletions
@@ -13,6 +13,7 @@ and this project adheres to [Semantic Versioning](http://semver.org/spec/v2.0.0.
 ### Fixed
 - Scripts using PyYAML now use `safe_load`; see https://msg.pyyaml.org/load
 - Fixed check for `fortran_ordering` in cnpy
+- Fixed fp16 training/inference with factors-combine concat method
 
 ### Changed
 - Make guided-alignment faster via sparse memory layout, add alignment points for EOS, remove losses other than ce
src/layers/embedding.cpp: 1 addition, 2 deletions
@@ -57,8 +57,7 @@ Embedding::Embedding(Ptr<ExpressionGraph> graph, Ptr<Options> options)
   auto lemmaEmbs = rows(E_, lemmaIndices);
   int dimFactors = FactorEmbMatrix_->shape()[0];
   auto factEmbs
-      = dot(graph->constant(
-                {(int)data.size(), dimFactors}, inits::fromVector(factorIndices), Type::float32),
+      = dot(graph->constant({(int)data.size(), dimFactors}, inits::fromVector(factorIndices)),
             FactorEmbMatrix_);
 
   return concatenate({lemmaEmbs, factEmbs}, -1);
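The change drops the hard-coded Type::float32 from the graph->constant(...) call, so the constant holding the factor indices now inherits the graph's default element type and, presumably, matches FactorEmbMatrix_ inside dot(...) when the graph runs in fp16 — which is what the commit title points at. To illustrate that failure mode, here is a hypothetical, self-contained C++ sketch; the toy Graph, Expr, and dot below are illustrative stand-ins for the pattern, not Marian's actual API:

#include <cassert>
#include <cstdio>

// Minimal stand-ins for expression-graph pieces (not Marian's real types).
enum class Type { float16, float32 };

struct Expr { Type type; };

struct Graph {
  Type defaultType;  // float16 when the graph runs in fp16 mode

  // Pre-fix pattern: the caller pins the constant to an explicit type.
  Expr constant(Type explicitType) { return Expr{explicitType}; }
  // Post-fix pattern: the constant inherits the graph's default type.
  Expr constant() { return Expr{defaultType}; }
};

// A dot product typically requires both operands to share one element type.
Expr dot(Expr a, Expr b) {
  assert(a.type == b.type && "dot(): operand element types must match");
  return Expr{a.type};
}

int main() {
  Graph graph{Type::float16};            // fp16 training/inference
  Expr factorMatrix = graph.constant();  // FactorEmbMatrix_ analogue, fp16

  // Old pattern: indices pinned to float32 would trip the assertion here:
  // dot(graph.constant(Type::float32), factorMatrix);

  // Fixed pattern: both operands share the graph's element type.
  dot(graph.constant(), factorMatrix);
  std::printf("ok: dot() operands share the graph's element type\n");
  return 0;
}

The concatenate({lemmaEmbs, factEmbs}, -1) in the diff then appends the factor embeddings to the lemma embeddings along the last axis, which is why both halves have to agree with the rest of the graph's element type.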
