Compatibility fix with Hydra 1.1 (#3722)

Summary:
One of the changes in Hydra 1.1 is a new default composition order.
This is documented [here](https://hydra.cc/docs/upgrades/1.0_to_1.1/default_composition_order).
In Hydra 1.1, the primary config overrides values introduced by its defaults list, whereas in Hydra 1.0 the defaults list overrode the values in the config.

fairseq currently depends on the previous behavior:
the class `FairseqConfig` defines config values and expects them to be overridden by the defaults list.
As a result, running `fairseq_cli/hydra_train.py` produces a different composed config under Hydra 1.0 than under Hydra 1.1.

Hydra 1.1 introduced the `_self_` keyword in the defaults list to control the composition order. In order to achieve the behavior of Hydra 1.0, `_self_` should be added as the first item in the defaults list.
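
As an illustration, here is a minimal, hypothetical config in the spirit of the example from the Hydra upgrade docs (the `db` group, `mysql.yaml`, and the `host` values are invented); the position of `_self_` determines which side wins:

```yaml
# config.yaml (hypothetical)
defaults:
  - db: mysql   # assume db/mysql.yaml sets db.host: mysql.example.com
  - _self_      # the implicit Hydra 1.1 default position: values below win

db:
  host: localhost

# Composed with _self_ last (Hydra 1.1 default): db.host == localhost.
# With _self_ moved to the top of the list, db/mysql.yaml is composed
# after the config body, so db.host == mysql.example.com (Hydra 1.0 behavior).
```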

To allow for a smoother migration, Hydra 1.0 ignores `_self_` starting from 1.0.7 (earlier versions raise an error on it).

This diff adds `_self_` as the first item in the defaults list of the fairseq config, and requires a Hydra version that is at least 1.0.7 (`hydra-core>=1.0.7,<1.1`).

### Testing:
I verified that the following yield the same composed config with Hydra 1.0.6, 1.0.7, and 1.1.0:

- the default config
- `examples/wav2vec/config/finetuning/base_10h.yaml`

This can be checked by printing the composed config with `--cfg job` (e.g., `python fairseq_cli/hydra_train.py --cfg job`) and comparing the outputs.

Pull Request resolved: https://github.com/pytorch/fairseq/pull/3722

Reviewed By: dianaml0

Differential Revision: D29917677

Pulled By: jieru-hu

fbshipit-source-id: 7e645b83cccb03fc80a6702e302c4643d2b14a78
Omry Yadan 2021-07-26 16:35:40 -07:00 committed by Facebook GitHub Bot
parent 67ff6baa42
commit 53802e7812
3 changed files with 3 additions and 2 deletions


@@ -5,6 +5,7 @@ hydra:
     dir: .
 defaults:
+  - _self_
   - task: null
   - model: null
   - criterion: cross_entropy
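
For context, a sketch (file contents invented for illustration): with `_self_` first, the values `FairseqConfig` contributes to the primary config are composed before the group entries, so a group file can still override them as it did under Hydra 1.0.

```yaml
# criterion/cross_entropy.yaml — hypothetical contents for illustration
_name: cross_entropy
sentence_avg: true  # composed after _self_, so this overrides whatever
                    # the primary config (FairseqConfig) declared for it
```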


@@ -479,7 +479,7 @@ class MemoryEfficientFP16Optimizer(
                 "Unsupported optimizer: {}".format(optimizer.__class__.__name__)
             )
-        super().__init__(cfg.optimizer)
+        super().__init__(getattr(cfg, "optimizer", None))
         self.wrapped_optimizer = optimizer
         if getattr(cfg.common, "fp16_scale_window", None) is None:


@@ -201,7 +201,7 @@ def do_setup(package_data):
         "cffi",
         "cython",
         'dataclasses; python_version<"3.7"',
-        "hydra-core<1.1",
+        "hydra-core>=1.0.7,<1.1",
         "omegaconf<2.1",
         'numpy<1.20.0; python_version<"3.7"',
         'numpy; python_version>="3.7"',