How to make the embedding changeable in backpropagation #24

Open
Z-Jeff opened this issue Apr 29, 2020 · 1 comment

Comments

@Z-Jeff

Z-Jeff commented Apr 29, 2020

It seems that the word embeddings are kept static during training.
How can I make the embedding weights trainable during backpropagation?

@rafaelgreca

I know this is an old issue, but just set the `requires_grad` attribute of the embedding weights to `True` (which is the default), like this:

```python
# create the embedding layer
self.embedding = nn.Embedding(vocab_size, embedding_dim, padding_idx=0)
# load the pretrained weights into the layer
self.embedding.load_state_dict({'weight': embedding_weights})
# make sure the weights are updated during backpropagation
self.embedding.weight.requires_grad = True
```

In this case, I am loading pretrained embedding weights into an embedding layer and making it trainable during the training process.
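For reference, a minimal self-contained sketch of the same idea using `nn.Embedding.from_pretrained` with `freeze=False` (the `vocab_size`, `embedding_dim`, and random `embedding_weights` below are placeholders, not values from the original model):

```python
import torch
import torch.nn as nn

# placeholder pretrained weights of shape (vocab_size, embedding_dim)
vocab_size, embedding_dim = 10000, 300
embedding_weights = torch.randn(vocab_size, embedding_dim)

# freeze=False keeps the weights trainable; the default freeze=True
# would set requires_grad to False and keep the embeddings static
embedding = nn.Embedding.from_pretrained(embedding_weights, freeze=False)

print(embedding.weight.requires_grad)  # True, so gradients flow into the embeddings
```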
