While fine-tuning the distilled model, the performance on the VisDA-C dataset drops. The results are as follows:
- seed=2019: 74.98 --> 74.80
- seed=2020: 76.17 --> 75.17
- seed=2021: 79.8 --> 78.6

The learning-rate setting for the VisDA-C dataset may matter.
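For reference, here is a minimal sketch of how one might probe this learning-rate sensitivity in the fine-tuning stage. This is not the released DINE code; the ResNet-101 backbone, the 0.1x backbone multiplier, and the 1e-3 base rate are assumptions based on common SHOT-style conventions for VisDA-C.

```python
import torch
import torch.nn as nn
from torchvision import models

def build_optimizer(backbone, classifier, base_lr=1e-3):
    # SHOT-style convention (assumed here): the ImageNet-pretrained backbone
    # is updated at 0.1x the rate of the newly attached classifier head.
    param_groups = [
        {"params": backbone.parameters(), "lr": base_lr * 0.1},
        {"params": classifier.parameters(), "lr": base_lr},
    ]
    return torch.optim.SGD(param_groups, momentum=0.9,
                           weight_decay=1e-3, nesterov=True)

# VisDA-C has 12 classes and is typically run with a ResNet-101 backbone.
backbone = models.resnet101(weights=None)
backbone.fc = nn.Identity()          # expose the 2048-d pooled features
classifier = nn.Linear(2048, 12)

# Trying a smaller base_lr (e.g. 1e-3 instead of 1e-2) is one way to test
# whether the drop reported above is a learning-rate artifact.
optimizer = build_optimizer(backbone, classifier, base_lr=1e-3)
```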
Sorry for the late reply; I will upload the code for VisDA-C in the coming month.
Hello, I tried to reproduce the results on the VisDA-C dataset with DINE, but I can't match them. Could you please share your parameters?