DenoiseRep is a computation-free, label-optional, and model-agnostic algorithm that incrementally improves representation learning.
DenoiseRep: Denoising Model for Representation Learning.
Zhengrui Xu&, Guan'an Wang&^, Xiaowen Huang*, Jitao Sang.
NeurIPS 2024 (Oral)
&Equal Contribution
^Project Lead
*Contact Author
- 2024.12.27: code released. Thanks to Zhengrui Xu, the code developer, for his contribution.
- 2024.10.28: project initialized; code coming soon.
- release DenoiseRep basic code.
- implement `DenoiseLinear`.
- implement `DenoiseConv2d`.
- a tutorial on CIFAR-10.
- implement Person-ReID experiments.
- implement Classification (ImageNet) experiments.
- implement Detection / Segmentation experiments.
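Conceptually, a denoise layer applies a learned denoising step to the features produced by an ordinary layer. A minimal numpy sketch of what a `DenoiseLinear`-style forward pass could look like (the class name, the single-step linear denoiser `A`, and all shapes are illustrative assumptions, not the repository's actual implementation):

```python
import numpy as np

rng = np.random.default_rng(0)

class DenoiseLinearSketch:
    """Illustrative sketch: a linear layer followed by a learned
    single-step denoising correction on its output features.
    (Hypothetical; not the actual DenoiseRep implementation.)"""

    def __init__(self, in_dim, out_dim):
        self.W = rng.normal(size=(out_dim, in_dim)) * 0.1    # linear weight
        self.b = np.zeros(out_dim)                           # linear bias
        self.A = rng.normal(size=(out_dim, out_dim)) * 0.01  # denoise parameters

    def forward(self, x):
        y = x @ self.W.T + self.b  # ordinary linear projection
        noise = y @ self.A.T       # predicted feature noise
        return y - noise           # denoised representation

layer = DenoiseLinearSketch(8, 4)
x = rng.normal(size=(2, 8))
out = layer.forward(x)
print(out.shape)  # (2, 4)
```

A `DenoiseConv2d` would follow the same pattern, with the denoising step applied to the convolution's output feature maps.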
| Tasks | Model | Backbone | Dataset | Metric | Baseline | +DenoiseRep |
|---|---|---|---|---|---|---|
| Classification | ViT | ViT patch=4 | CIFAR-10 | acc@1 | 85.6% | 86.2% (model)(log) |
| Classification | Swin-Transformer | SwinV2-T | ImageNet | acc@1 | 81.8% | 82.1% (model)(log) |
| Person-ReID | TransReID-SSL | ViT-S | MSMT17 | mAP | 66.3% | 67.3% (model)(log) |
```shell
cd denoiserep_op
bash make.sh
pip show denoiserep  # verify the installation
```
1. Load your model trained with the original pipeline and convert it to DenoiseRep.
2. Train your model with the additional ploss term.
3. (Optional) apply the training trick to obtain better performance.

For more details, compare `train_cifar10.py` with `train_cifar10_denoise.py`.
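After training, the denoising step can be folded into the layer's own parameters, which is what makes inference computation-free. A hedged numpy sketch of this fusion idea, assuming a single-step linear denoiser `A` applied after a linear layer (the fusion rule shown is an illustration of the principle, not the repository's exact conversion code):

```python
import numpy as np

rng = np.random.default_rng(1)
in_dim, out_dim = 8, 4

W = rng.normal(size=(out_dim, in_dim))          # trained linear weight
b = rng.normal(size=out_dim)                    # trained linear bias
A = rng.normal(size=(out_dim, out_dim)) * 0.05  # learned denoise parameters

def two_step(x):
    """Linear layer followed by an explicit denoising step."""
    y = x @ W.T + b
    return y - y @ A.T

# Fuse the denoise step into the layer: (I - A)(W x + b) = W' x + b'
I = np.eye(out_dim)
W_fused = (I - A) @ W
b_fused = (I - A) @ b

def fused(x):
    """One linear layer with denoising folded into its parameters:
    identical output, zero extra inference cost."""
    return x @ W_fused.T + b_fused

x = rng.normal(size=(3, in_dim))
print(np.allclose(two_step(x), fused(x)))  # True
```

Since `(I - A)(Wx + b) = ((I - A)W)x + (I - A)b`, the converted model keeps the original architecture and runtime while producing the denoised representations.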
If you find denoise-rep useful in your research, please consider citing:
@inproceedings{xu2024denoiserep,
title={DenoiseRep: Denoising Model for Representation Learning},
author={Zhengrui Xu and Guan'an Wang and Xiaowen Huang and Jitao Sang},
booktitle={The Thirty-eighth Annual Conference on Neural Information Processing Systems},
year={2024},
url={https://openreview.net/forum?id=OycU0bAus6}
}
Our implementation is mainly based on the following codebases. We gratefully thank the authors for their wonderful works.
TransReID-SSL, Swin-Transformer, mmdetection, mmsegmentation.