Miscellaneous documentation improvements (#868)

Summary:
- More clearly document the correspondence between FairseqAdam and torch.optim.AdamW
- Add ResamplingDataset to Sphinx docs
Pull Request resolved: https://github.com/fairinternal/fairseq-py/pull/868

Differential Revision: D17523244

Pulled By: jma127

fbshipit-source-id: 8e7b34b24889b2c8f70b09a52a625d2af135734b
Author: Jerry Ma
Date: 2019-09-23 12:25:44 -07:00
Committed by: Facebook Github Bot
Parent: 3b09b98b66
Commit: 3f4fc50163
2 changed files with 8 additions and 0 deletions

docs/data.rst

@@ -30,6 +30,8 @@ provide additional functionality:
     :members:
 .. autoclass:: fairseq.data.ConcatDataset
     :members:
+.. autoclass:: fairseq.data.ResamplingDataset
+    :members:
 .. autoclass:: fairseq.data.RoundRobinZipDatasets
     :members:
 .. autoclass:: fairseq.data.TransformEosDataset

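For context (not part of this diff): a minimal usage sketch of the newly documented ResamplingDataset. The toy FairseqDataset subclass and the size_ratio/seed/epoch arguments below are assumptions for illustration, not taken from this commit.

import torch
from fairseq.data import FairseqDataset, ResamplingDataset

class ToyDataset(FairseqDataset):
    """Trivial dataset of integer tensors, just to exercise the wrapper."""
    def __init__(self, n):
        self.items = [torch.tensor([i]) for i in range(n)]

    def __getitem__(self, index):
        return self.items[index]

    def __len__(self):
        return len(self.items)

base = ToyDataset(1000)
# Assumed arguments: draw a 50% subsample of `base`, reshuffled per epoch.
resampled = ResamplingDataset(base, size_ratio=0.5, seed=0, epoch=1)
print(len(resampled))   # 500 items sampled from the base dataset
resampled.set_epoch(2)  # advance the epoch to draw a fresh sample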
fairseq/optim/adam.py

@@ -15,6 +15,12 @@ from . import FairseqOptimizer, register_optimizer

 @register_optimizer('adam')
 class FairseqAdam(FairseqOptimizer):
+    """Adam optimizer for fairseq.
+
+    Important note: this optimizer corresponds to the "AdamW" variant of
+    Adam in its weight decay behavior. As such, it is most closely
+    analogous to torch.optim.AdamW from PyTorch.
+    """

     def __init__(self, args, params):
         super().__init__(args)
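To make the documented correspondence concrete, here is an illustrative sketch (not from this commit) of the two weight-decay behaviors in stock PyTorch: torch.optim.Adam couples weight decay into the gradient as an L2 penalty, while torch.optim.AdamW, like fairseq's 'adam', decouples it and applies it directly to the parameters.

import torch

param = torch.nn.Parameter(torch.ones(3))
lr, wd = 1e-3, 0.1

# Coupled decay (torch.optim.Adam): weight_decay is added to the gradient,
# so it gets rescaled by Adam's adaptive step size.
adam = torch.optim.Adam([param], lr=lr, weight_decay=wd)

# Decoupled decay (torch.optim.AdamW): weight_decay shrinks the parameters
# directly, independent of the adaptive step; this is the behavior that
# FairseqAdam's new docstring says it matches.
adamw = torch.optim.AdamW([param], lr=lr, weight_decay=wd)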