Summary:
Now that we are moving to dataclasses to define the fairseq configuration, having aliases for options is no longer practical. This PR removes the "max-sentences" argument while keeping its alias "batch-size", which is the more appropriate name.
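A minimal sketch of why dataclass-based configs make aliases impractical (the class and field names here are hypothetical, not fairseq's actual config):

```python
from dataclasses import dataclass, field

@dataclass
class DatasetConfig:
    # A dataclass field has exactly one name, so a single option cannot
    # be spelled both --max-sentences and --batch-size; the clearer name
    # becomes the single field.
    batch_size: int = field(
        default=1,
        metadata={"help": "number of examples in a batch"},
    )

cfg = DatasetConfig(batch_size=32)
print(cfg.batch_size)  # 32
```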
Pull Request resolved: https://github.com/fairinternal/fairseq-py/pull/1333
Reviewed By: shruti-bh
Differential Revision: D24121305
Pulled By: alexeib
fbshipit-source-id: 34343cea54c8f2c8b059c38ef9f29b66e76df9fb
Summary:
Pull Request resolved: https://github.com/fairinternal/fairseq-py/pull/1268
We previously had a memory leak when using sharded datasets. In particular,
each sharded dataset is a new FairseqDataset instance, and the cache is keyed
by the `dataset` instance. Since we never clear the cache, this would
eventually cause the system to run out of CPU RAM.
This diff disables caching when using sharded datasets.
Note that we also change the signature of `get_batch_iterator`, which needs to
propagate to many places. We previously avoided this update when adding
`data_buffer_size`, so I'm also adding that everywhere.
Reviewed By: ngoyal2707
Differential Revision: D23319135
fbshipit-source-id: 6bcd6aee141ad9cc234448c49106a8dbf8ea1800
Summary:
# Before submitting
- [ ] Was this discussed/approved via a Github issue? (no need for typos, doc improvements)
- [ ] Did you read the [contributor guideline](https://github.com/pytorch/fairseq/blob/master/CONTRIBUTING.md)?
- [ ] Did you make sure to update the docs?
- [ ] Did you write any new necessary tests?
## What does this PR do?
Fixes # (issue).
## PR review
Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in Github issues there's a high chance it will not be merged.
## Did you have fun?
Make sure you had fun coding!
Pull Request resolved: https://github.com/fairinternal/fairseq-py/pull/1119
Reviewed By: myleott
Differential Revision: D20712488
fbshipit-source-id: 941ef251c9e2deb8933d88188fac56ee8c5be9b7
Summary:
Pull Request resolved: https://github.com/fairinternal/fairseq-py/pull/1113
Reviewed By: myleott
Differential Revision: D20670665
fbshipit-source-id: 8e2846637195b7200f1f60a8421d2fe5ffab789b
Summary:
We are somewhat inconsistent in whether we're using 0-based or 1-based indexing for epochs. This should fix things to be 0-based internally, with logging and checkpoint naming still using 1-based indexing.
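The convention can be sketched with two hypothetical helpers (illustrative only, not fairseq's actual functions): the epoch value passed around internally is 0-based, and the `+ 1` happens only at the display boundary.

```python
def checkpoint_name(epoch: int) -> str:
    """`epoch` is 0-based internally; the filename stays 1-based."""
    return f"checkpoint{epoch + 1}.pt"

def log_epoch(epoch: int) -> str:
    """Logging likewise shows the 1-based epoch number."""
    return f"begin epoch {epoch + 1}"

print(checkpoint_name(0))  # checkpoint1.pt
print(log_epoch(0))        # begin epoch 1
```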
Pull Request resolved: https://github.com/fairinternal/fairseq-py/pull/1053
Reviewed By: spencerp
Differential Revision: D20160715
Pulled By: myleott
fbshipit-source-id: 4ed94f9c371e1bfe29bcfa087fa6756507d6e627
Summary: This is needed to support other build environments (e.g., Windows).
Reviewed By: ngoyal2707
Differential Revision: D19409984
fbshipit-source-id: e970510781abf92f1b02d0961bc30e1210b524dd