
Results for VisDA-C dataset #2

Open
Leesoon1984 opened this issue May 6, 2022 · 2 comments

Leesoon1984 commented May 6, 2022

While fine-tuning the distilled model, the performance on the VisDA-C dataset drops; the results are as follows:

seed=2019: 74.98 --> 74.80
seed=2020: 76.17 --> 75.17
seed=2021: 79.80 --> 78.60

The learning-rate setting for the VisDA-C dataset may matter.
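For context, codebases in this line of domain-adaptation work (e.g. SHOT, by the same author) commonly decay the learning rate polynomially over training, so the effective rate depends on both the initial value and the schedule. A minimal pure-Python sketch of that schedule; the `lr0 = 1e-3` starting value is an assumption for illustration, not a confirmed VisDA-C setting:

```python
def lr_schedule(lr0: float, iter_num: int, max_iter: int,
                gamma: float = 10.0, power: float = 0.75) -> float:
    """Polynomial learning-rate decay often used in this family of
    domain-adaptation code: lr = lr0 * (1 + gamma * p) ** (-power),
    where p is the fraction of training completed."""
    p = iter_num / max_iter
    return lr0 * (1.0 + gamma * p) ** (-power)

# lr0 = 1e-3 is an assumed value; the rate shrinks smoothly as training
# progresses, which is one knob to revisit if fine-tuning hurts accuracy.
for it in (0, 5000, 10000):
    print(f"iter {it}: lr = {lr_schedule(1e-3, it, 10000):.6f}")
```

If fine-tuning degrades results, a smaller `lr0` (or a stronger decay via `power`) for the fine-tuning stage than for distillation would be the first thing to try.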

@tim-learn (Owner)


Sorry for the late reply; I will upload the code for VisDA-C next month.

@zou-yawen


Hello, I tried to reproduce the results on the VisDA-C dataset with DINE but could not match them. Could you please share your parameters?
