
Question about the projector backpropagation #7

Open
jinggqu opened this issue Sep 9, 2024 · 0 comments

Comments

jinggqu commented Sep 9, 2024

Hello and thank you for your excellent work.

I have a question about the projector. From the code of this repository, I noticed that the parameters of the projector are not included in the optimizer (shown below), which is also evident in Algorithm 1 of the paper. I would like to know the reason for not training the projector and classifier, as this is not mentioned in the paper.

params = list(self.model_1.parameters()) + list(self.model_2.parameters())
optimizer = torch.optim.Adam(params, lr=opt.lr)
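For context, here is a minimal, self-contained PyTorch sketch (all module names hypothetical, not from the repository) of what such an omission implies: gradients still flow *through* a module left out of the optimizer, so it shapes the updates of upstream parameters, but its own weights are never changed by `optimizer.step()`.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Hypothetical stand-ins: a backbone whose parameters are optimized,
# and a projector head deliberately left out of the optimizer.
backbone = nn.Linear(4, 4)
projector = nn.Linear(4, 2)

params = list(backbone.parameters())  # projector.parameters() omitted
optimizer = torch.optim.Adam(params, lr=1e-3)

weights_before = projector.weight.clone()

x = torch.randn(8, 4)
loss = projector(backbone(x)).pow(2).mean()
optimizer.zero_grad()
loss.backward()
optimizer.step()

# Gradients reached the projector (it sits on the backward path) ...
assert projector.weight.grad is not None
# ... but its weights are untouched, since the optimizer never sees them.
assert torch.equal(weights_before, projector.weight)
```

So the snippet above would keep the projector frozen at its initialization while still letting it influence the gradients received by `model_1` and `model_2`.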

Thanks again for your contribution to the open source community.

jinggqu changed the title from "Question about the projector" to "Question about the projector backpropagation" Sep 9, 2024