# Simultaneous Translation

Examples of simultaneous translation in fairseq.