DivIL: Unveiling and Addressing Over-Invariance for Out-of-Distribution Generalization


🚀🚀🚀 Official implementation of DivIL: Unveiling and Addressing Over-Invariance for Out-of-Distribution Generalization

💡 Highlights

  • We discover and theoretically characterize the over-invariance phenomenon, i.e., the loss of important invariant details when spurious features are suppressed, which affects almost all previous invariant learning (IL) methods.
  • We propose Diverse Invariant Learning (DivIL), which combines invariant constraints with unsupervised contrastive learning and a random masking mechanism to promote richer and more diverse invariance (see the sketch after this list).
  • Experiments on 12 benchmarks and 4 different invariant learning methods across 3 modalities (graphs, vision, and natural language) show that DivIL effectively improves out-of-distribution generalization, verifying the over-invariance insight.
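At a high level, DivIL adds an unsupervised contrastive term over randomly masked features on top of an existing invariant-learning objective. The following is a minimal PyTorch-style sketch of that idea; the encoder `f`, classifier `g`, the penalty function `invariance_penalty`, and the loss weights are illustrative assumptions, not the actual interfaces of this repository (see each subdirectory's README for the real entry points).

```python
# Minimal sketch of the DivIL-style objective described above (illustrative only).
import torch
import torch.nn.functional as F

def info_nce(z1, z2, temperature=0.5):
    """Unsupervised contrastive (InfoNCE) loss between two feature views."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / temperature               # pairwise similarities
    labels = torch.arange(z1.size(0), device=z1.device)
    return F.cross_entropy(logits, labels)           # positives on the diagonal

def random_mask(z, mask_ratio=0.3):
    """Randomly zero out feature dimensions to build a masked view."""
    keep = (torch.rand_like(z) > mask_ratio).float()
    return z * keep

def divil_loss(f, g, x, y, invariance_penalty, lam_inv=1.0, lam_con=0.1):
    z = f(x)                                         # invariant representation
    erm = F.cross_entropy(g(z), y)                   # standard risk term
    inv = invariance_penalty(g(z), y)                # invariance constraint (e.g. an IRM/CIGA-style penalty)
    con = info_nce(z, random_mask(z))                # contrastive term over randomly masked features
    return erm + lam_inv * inv + lam_con * con
```

Intuitively, the contrastive term pulls each sample toward its own masked view and away from other samples, encouraging the invariant representation to retain diverse details rather than collapsing onto a few discriminative dimensions.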

🛠️ Usage

We organize our code in the following structure. Detailed guidance is provided in the README.md of each subdirectory (Graph, ColoredMNIST, and GPT2_nli).

DivIL/
├── README.md
├── Graph/
│   ├── README.md
│   ├── datasets/
│   ├── dataset_gen/
│   ├── models/
│   ├── main-batch_aug.py
│   └── ...
├── ColoredMNIST/
│   ├── README.md
│   ├── train_coloredmnist.py
│   └── ...
├── GPT2_nli/
│   ├── README.md
│   ├── main.py
│   └── ...
├── synthetic_data_experiment/
└── ...

✒️ Citation

This repo benefits from CIGA and DomainBed. Thanks for their wonderful work.

If you find our work helpful for your research, please consider giving a star ⭐ and a citation 📝:

@misc{wang2025divil,
    title={DivIL: Unveiling and Addressing Over-Invariance for Out-of-Distribution Generalization}, 
    author={Jiaqi Wang and Yuhang Zhou and Zhixiong Zhang and Qiguang Chen and Yongqiang Chen and James Cheng},
    year={2025},
    eprint={2502.12413},
    archivePrefix={arXiv},
    primaryClass={cs.LG},
    url={https://arxiv.org/abs/2502.12413}, 
}
