Fixed fp16 training/inference with factors-combine concat (#926)

Artur Nowakowski 2022-03-22 11:07:41 +01:00 committed by GitHub
parent 78bef7aeba
commit 23c36ec1a3
2 changed files with 2 additions and 2 deletions

@@ -13,6 +13,7 @@ and this project adheres to [Semantic Versioning](http://semver.org/spec/v2.0.0.html).
 ### Fixed
 - Scripts using PyYAML now use `safe_load`; see https://msg.pyyaml.org/load
 - Fixed check for `fortran_ordering` in cnpy
+- Fixed fp16 training/inference with factors-combine concat method
 ### Changed
 - Make guided-alignment faster via sparse memory layout, add alignment points for EOS, remove losses other than ce

@@ -57,8 +57,7 @@ Embedding::Embedding(Ptr<ExpressionGraph> graph, Ptr<Options> options)
   auto lemmaEmbs = rows(E_, lemmaIndices);
   int dimFactors = FactorEmbMatrix_->shape()[0];
   auto factEmbs
-      = dot(graph->constant(
-                {(int)data.size(), dimFactors}, inits::fromVector(factorIndices), Type::float32),
+      = dot(graph->constant({(int)data.size(), dimFactors}, inits::fromVector(factorIndices)),
             FactorEmbMatrix_);
   return concatenate({lemmaEmbs, factEmbs}, -1);
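The hardcoded Type::float32 was the bug: under --fp16 the graph's default element type is float16, so FactorEmbMatrix_ lives in fp16 while the factor-indicator constant was forced to fp32, and dot() was handed mismatched operand types. Dropping the explicit type lets the constant inherit the graph's default, so the operands agree in both fp32 and fp16 runs.

Below is a minimal sketch of the pattern, not taken from the commit itself; it assumes Marian's ExpressionGraph API (New<ExpressionGraph>(), setDefaultElementType, inits::fromVector, inits::glorotUniform) and a CUDA device with fp16 support, and the "FactorEmb" name and toy shapes are made up for illustration.

#include "marian.h"
#include <vector>

using namespace marian;

int main() {
  auto graph = New<ExpressionGraph>();
  graph->setDevice({0, DeviceType::gpu});
  graph->setDefaultElementType(Type::float16);  // roughly what --fp16 selects
  graph->reserveWorkspaceMB(128);

  // Parameter created in the graph's default element type: float16 here.
  auto factorEmb = graph->param("FactorEmb", {8, 4}, inits::glorotUniform());

  // Toy factor-indicator matrix (2 tokens x 8 factors), multi-hot.
  std::vector<float> indicators(2 * 8, 0.f);
  indicators[1] = 1.f;
  indicators[8 + 3] = 1.f;

  // Buggy pattern: pinning the constant to Type::float32 hands dot()
  // a float32 * float16 operand pair when the graph runs in fp16.
  // auto bad = dot(
  //     graph->constant({2, 8}, inits::fromVector(indicators), Type::float32),
  //     factorEmb);

  // Fixed pattern: no explicit type, so the constant inherits float16
  // and both dot() operands match.
  auto factEmbs = dot(graph->constant({2, 8}, inits::fromVector(indicators)), factorEmb);

  graph->forward();
  return 0;
}

The same reasoning explains why the fix covers both training and inference: the constant is built on every embedding lookup, so any graph executed with a float16 default hit the mismatch regardless of mode.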