Commit
Workaround transformers overwriting model_type when saving DPR models (#765)

* Workaround transformers overwriting model_type when saving DPR models
* Added test for saving/loading CamemBERT DPR model
* Setting base_model.bert_model for DPR models loaded in FARM style
* Fix loading DPR model with standard BERT
* Removed whitespace from test case question
* Renaming names of model weights when saving DPR models
* Assign transformers model as dpr_encoder...bert_model
* Rename save directory to prevent two tests using same directory
* Fix loading DPR with standard BERT models
* Extending DPR test case to different models
* Adjust names of model weights only if non-standard BERT DPR model
* DPREncoder classes handle renaming of model weights instead of LanguageModel parent class

Co-authored-by: Timo Moeller <timo.moeller@deepset.ai>
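The weight-renaming workaround described in the bullets above can be sketched roughly as follows. This is a minimal illustration only: the helper function, the key prefixes, and the toy weight dictionary are assumptions for demonstration, not the actual Haystack/FARM implementation. The idea is that when a DPR encoder wraps a non-standard BERT model (e.g. CamemBERT), its weights carry an extra `bert_model` prefix, and the keys are renamed on save so transformers can reload them under the expected DPR layout.

```python
# Minimal sketch (hypothetical prefixes and helper name, not the real
# Haystack/FARM code): rename state-dict keys so that weights stored
# under a wrapped "bert_model" attribute match the key layout that
# transformers expects when reloading a DPR encoder.

def rename_dpr_state_dict_keys(state_dict, old_prefix, new_prefix):
    """Return a copy of state_dict with old_prefix replaced by new_prefix."""
    renamed = {}
    for key, value in state_dict.items():
        if key.startswith(old_prefix):
            renamed[new_prefix + key[len(old_prefix):]] = value
        else:
            renamed[key] = value
    return renamed


# Toy example: weights as they would be stored by the wrapped model.
weights = {
    "question_encoder.bert_model.embeddings.weight": [0.1, 0.2],
    "question_encoder.bert_model.encoder.layer.0.weight": [0.3],
    "projection.weight": [0.4],
}

saved = rename_dpr_state_dict_keys(
    weights,
    old_prefix="question_encoder.bert_model.",
    new_prefix="question_encoder.",
)
```

Per the last bullet, such renaming would live in the DPREncoder classes themselves (applied only for non-standard BERT backbones), rather than in the shared LanguageModel parent class.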
1 parent 1f1fe4c · commit 84f67e0

Showing 2 changed files with 338 additions and 7 deletions.