
Size mismatch error when loading a pretrained model #87

Open
jackywangtj66 opened this issue Aug 16, 2023 · 1 comment

Comments

@jackywangtj66

I tried to load a pretrained MobileViT model downloaded from the model zoo for fine-tuning on a classification task with 6 labels. I did exclude the last layer:
[screenshot of the training configuration with the classifier layer excluded]
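
The attached screenshot is not reproduced here. Judging from the resolution in the follow-up comment, the exclusion was presumably configured inside the model.classification section, roughly as in the sketch below; the scope pattern "classifier.fc.*" and the surrounding keys are assumptions for illustration, not the poster's exact config.

```yaml
# Presumed (non-working) arrangement: the exclude scope is nested inside
# model.classification, where it is not picked up when the checkpoint is loaded.
model:
  classification:
    name: "mobilevit"                                  # assumed model name
    n_classes: 6
    pretrained: "./TrainedModels/Models/MobileVitv1/mobilevit_xxs.pt"
    resume_exclude_scopes: [ "classifier.fc.*" ]       # assumed scope pattern
```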

But I still encountered the following error:
2023-08-16 15:05:22 - ERROR - Unable to load pretrained weights from ./TrainedModels/Models/MobileVitv1/mobilevit_xxs.pt. Error: Error(s) in loading state_dict for MobileViT:
size mismatch for classifier.fc.weight: copying a param with shape torch.Size([1000, 320]) from checkpoint, the shape in current model is torch.Size([6, 320]).
size mismatch for classifier.fc.bias: copying a param with shape torch.Size([1000]) from checkpoint, the shape in current model is torch.Size([6]).. Exiting!!!

Do you have any idea?

@jackywangtj66
Author

Found the problem myself.
The resume_exclude_scopes option should be under 'model' instead of 'model.classification'.
Suggestion: would you consider putting the scope arguments under 'pretrained'? They are useless without a pretrained model anyway.
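
A minimal sketch of the working arrangement described above, with resume_exclude_scopes placed directly under the model section; as before, the scope pattern and the other keys are assumptions for illustration.

```yaml
# Working arrangement: the exclude scope sits at the top level of `model`,
# so the 1000-class classifier head in the checkpoint is skipped and the
# freshly initialized 6-class head is kept.
model:
  resume_exclude_scopes: [ "classifier.fc.*" ]         # assumed scope pattern
  classification:
    name: "mobilevit"                                  # assumed model name
    n_classes: 6
    pretrained: "./TrainedModels/Models/MobileVitv1/mobilevit_xxs.pt"
```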
