Make learned positional embedding optional
Summary:
- Add a learned positional embedding binary flag to the masked LM model.
- Add a base arch config for the masked LM model that sets all the binary parameters to False. Otherwise, some of the binary flag parameters would always be overridden by the config in `xlm_architecture` (e.g. `encoder_learned_pos`).

Reviewed By: liezl200
Differential Revision: D15054487
fbshipit-source-id: d78827f352b9160a89c9dc4f45b9fce15a2f234d
Showing 5 changed files:
- fairseq/models/masked_lm.py: 45 additions, 22 deletions
- fairseq/models/transformer.py: 1 addition, 15 deletions
- fairseq/modules/__init__.py: 2 additions, 0 deletions
- fairseq/modules/positional_embedding.py: 25 additions, 0 deletions
- fairseq/modules/transformer_sentence_encoder.py: 4 additions, 14 deletions
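The core of this change is making the positional embedding type switchable by a flag: when the flag is off, fixed sinusoidal encodings are used; when it is on, a trainable lookup table is used instead. As a rough illustration of that dispatch (not fairseq's actual code: the function names below are hypothetical, and the real `fairseq/modules/positional_embedding.py` wraps PyTorch modules rather than plain lists), a minimal pure-Python sketch:

```python
import math

def sinusoidal_positions(num_positions, dim):
    """Fixed (non-learned) sinusoidal positional encodings:
    even dims get sin(pos / 10000^(2i/d)), odd dims get the matching cos."""
    table = []
    for pos in range(num_positions):
        row = []
        for i in range(dim):
            angle = pos / (10000 ** (2 * (i // 2) / dim))
            row.append(math.sin(angle) if i % 2 == 0 else math.cos(angle))
        table.append(row)
    return table

def build_positional_embedding(num_positions, dim, learned=False):
    """Hypothetical factory mirroring the binary flag: a learned table is
    trainable (zero-initialized here as a stand-in for nn.Embedding),
    otherwise the fixed sinusoidal table is returned."""
    if learned:
        return [[0.0] * dim for _ in range(num_positions)]
    return sinusoidal_positions(num_positions, dim)
```

The point of routing this through one factory, as the patch does by adding `positional_embedding.py`, is that model code asks for "a positional embedding" once and the architecture config (e.g. `encoder_learned_pos`) decides which kind it gets.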