Update upgrade_state_dict in transformer.py to upgrade_state_dict_named (#317)
Summary: Pull Request resolved: https://github.com/pytorch/fairseq/pull/317

When upgrading the `state_dict`, the `upgrade_state_dict` function in TransformerEncoder/TransformerDecoder does not handle models with multiple encoders/decoders, which is exactly the case D10052908 introduces. Before this change, loading a checkpoint for the multilingual_transformer model in D10052908 hit error message [1]. This diff fixes that by renaming the hook to `upgrade_state_dict_named` and passing each module's name (its key prefix) into it.

Reviewed By: myleott, liezl200

Differential Revision: D10375418

fbshipit-source-id: 7104c1a463e78f3fa33d8479a37c51608be50610
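To make the failure mode concrete, here is a minimal Python sketch (not the actual fairseq diff) of the renamed hook: `upgrade_state_dict_named` receives the module's key prefix, so each of several encoders can rewrite only its own legacy keys. The `TransformerEncoderSketch` class, the specific key names, and the plain-float stand-ins for tensors are illustrative assumptions.

```python
# A minimal sketch of why the upgrade hook needs the module's name. With
# several encoders/decoders in one model (e.g. multilingual_transformer),
# each module's parameters live under a distinct key prefix, so a
# prefix-less upgrade_state_dict(state_dict) cannot tell which keys
# belong to it. Key names below are illustrative, not the real mapping.

class TransformerEncoderSketch:
    def upgrade_state_dict_named(self, state_dict, name):
        """Upgrade legacy keys under this module's own prefix.

        `name` is this module's key prefix inside the full model
        state_dict, e.g. "models.en-de.encoder".
        """
        # Hypothetical legacy rename: drop stored positional-embedding
        # weights and add the new buffer, only for keys under our prefix.
        old_key = f"{name}.embed_positions.weights"
        if old_key in state_dict:
            del state_dict[old_key]
            state_dict[f"{name}.embed_positions._float_tensor"] = 1.0
        return state_dict


# Usage: a multi-encoder model calls each sub-module's hook with its prefix.
encoders = {"en-de": TransformerEncoderSketch(), "en-fr": TransformerEncoderSketch()}
state_dict = {"models.en-de.encoder.embed_positions.weights": 0.0}
for pair, enc in encoders.items():
    enc.upgrade_state_dict_named(state_dict, f"models.{pair}.encoder")
print(state_dict)  # legacy key for en-de rewritten; en-fr keys untouched
```

Each sub-module touches only keys under its own `name`, which is what the prefix-less signature could not guarantee once a checkpoint contained more than one encoder or decoder.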