
Unexpected key(s) in state_dict: "transformer.text_encoder.embeddings.position_ids". #106

Open
vdorbala opened this issue Jun 26, 2024 · 2 comments

@vdorbala

When running the demo script, I get:
Unexpected key(s) in state_dict: "transformer.text_encoder.embeddings.position_ids".

while trying to load the model using:
model, postprocessor = torch.hub.load('ashkamath/mdetr:main', 'mdetr_efficientnetB5', pretrained=True, return_postprocessor=True)

Does the model not work anymore?

@iremeyiokur

iremeyiokur commented Aug 11, 2024

It might be related to the transformers version. I solved the problem by passing strict=False to load_state_dict, as proposed in the following link; this simply ignores non-matching keys. I hope it works for you.

https://discuss.pytorch.org/t/missing-keys-unexpected-keys-in-state-dict-when-loading-self-trained-model/22379/6
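A minimal sketch of what strict=False does, using a toy nn.Linear instead of the actual MDETR model (the module and the extra key name here are stand-ins for illustration, not the real checkpoint contents):

```python
import torch

# Toy module standing in for the MDETR model (illustration only).
model = torch.nn.Linear(2, 2)

# Simulate a checkpoint carrying an extra key, analogous to the
# "transformer.text_encoder.embeddings.position_ids" buffer that newer
# transformers versions no longer register on the text encoder.
state_dict = model.state_dict()
state_dict["embeddings.position_ids"] = torch.arange(4)

# strict=True (the default) would raise a RuntimeError here.
# strict=False loads the matching keys and merely reports the rest.
result = model.load_state_dict(state_dict, strict=False)
print(result.unexpected_keys)  # ['embeddings.position_ids']
```

For the hub model specifically, the strict load happens inside torch.hub.load when pretrained=True, so one would likely need to instantiate with pretrained=False and then load the downloaded checkpoint manually with strict=False.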

@Buchimi

Buchimi commented Nov 21, 2024

How can this be done in Colab?
