Feb 21, 2025
- SigLIP 2 ViT image encoders added (https://huggingface.co/collections/timm/siglip-2-67b8e72ba08b09dd97aecaf9); a loading sketch follows at the end of this list
- Variable resolution / aspect NaFlex versions are a WIP
- Add 'SO150M2' ViT weights trained with SBB recipes; strong results, better ImageNet accuracy than the previous attempt with less training (usage sketch after this list)
  - vit_so150m2_patch16_reg1_gap_448.sbb_e200_in12k_ft_in1k - 88.1% top-1
  - vit_so150m2_patch16_reg1_gap_384.sbb_e200_in12k_ft_in1k - 87.9% top-1
  - vit_so150m2_patch16_reg1_gap_256.sbb_e200_in12k_ft_in1k - 87.3% top-1
  - vit_so150m2_patch16_reg4_gap_256.sbb_e200_in12k
- Updated InternViT-300M '2.5' weights
- Release 1.0.15
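
As noted above, the SigLIP 2 towers can be used as plain timm image encoders. Below is a minimal loading sketch; the model name is an assumed example from the linked collection and may differ from the exact published names.

```python
import torch
import timm

# Load a SigLIP 2 image tower as a feature extractor; num_classes=0 drops any head
# so the forward pass returns the pooled image embedding.
# NOTE: the model name below is an assumed example from the linked collection.
model = timm.create_model(
    'vit_base_patch16_siglip_256.v2_webli',
    pretrained=True,
    num_classes=0,
)
model.eval()

# Build preprocessing that matches the pretrained config (resize, normalization).
data_cfg = timm.data.resolve_model_data_config(model)
transform = timm.data.create_transform(**data_cfg, is_training=False)

with torch.no_grad():
    # Dummy input at the expected size; use transform(pil_image).unsqueeze(0) for real images.
    x = torch.randn(1, *data_cfg['input_size'])
    embedding = model(x)  # shape: (1, embed_dim)
```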
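
And a classification sketch for one of the SO150M2 weights listed above, following the standard timm inference pattern:

```python
import torch
import timm

# ImageNet-1k classification with one of the SO150M2 weights listed above.
model = timm.create_model(
    'vit_so150m2_patch16_reg1_gap_256.sbb_e200_in12k_ft_in1k',
    pretrained=True,
)
model.eval()

data_cfg = timm.data.resolve_model_data_config(model)
transform = timm.data.create_transform(**data_cfg, is_training=False)

with torch.no_grad():
    # Replace the random tensor with transform(pil_image).unsqueeze(0) for real images.
    x = torch.randn(1, *data_cfg['input_size'])
    logits = model(x)                              # (1, 1000)
    top5 = logits.softmax(dim=-1).topk(5, dim=-1)  # top-5 probabilities and class indices
```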
Feb 1, 2025
- FYI: PyTorch 2.6 & Python 3.13 are tested and working with the current main branch and the released version of timm
Jan 27, 2025
- Add Kron Optimizer (PSGD w/ Kronecker-factored preconditioner)
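
A minimal sketch of selecting the new optimizer through timm's optimizer factory; the registered name 'kron' and the hyper-parameters shown are assumptions for illustration, so check timm.optim.list_optimizers() for the exact name and any Kron-specific options.

```python
import torch
import timm
from timm.optim import create_optimizer_v2

# Small model purely for demonstration.
model = timm.create_model('resnet18', num_classes=10)

# Create the Kron (PSGD) optimizer via the factory.
# NOTE: opt='kron', lr, and weight_decay here are illustrative assumptions.
optimizer = create_optimizer_v2(model, opt='kron', lr=1e-3, weight_decay=1e-4)

# One dummy training step showing the usual loop.
x = torch.randn(8, 3, 224, 224)
y = torch.randint(0, 10, (8,))
loss = torch.nn.functional.cross_entropy(model(x), y)
loss.backward()
optimizer.step()
optimizer.zero_grad()
```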
What's Changed
- Fix metavar for --input-size by @JosuaRieder in #2417
- Add arguments to the respective argument groups by @JosuaRieder in #2416
- Add missing training flag to convert_sync_batchnorm by @collinmccarthy in #2423
- Fix num_classes update in reset_classifier and RDNet forward head call by @brianhou0208 in #2421
- timm: add `__all__` to `__init__` by @adamjstewart in #2399
- Fiddling with Kron (PSGD) optimizer by @rwightman in #2427
- Try to force numpy<2.0 for torch 1.13 tests, update newest tested torch to 2.5.1 by @rwightman in #2429
- Kron flatten improvements + stochastic weight decay by @rwightman in #2431
- PSGD: unify RNG by @ClashLuke in #2433
- Add vit so150m2 weights by @rwightman in #2439
- adapt_input_conv: add type hints by @adamjstewart in #2441
- SigLIP 2 by @rwightman in #2440
- timm.models: explicitly export attributes by @adamjstewart in #2442
New Contributors
- @collinmccarthy made their first contribution in #2423
- @ClashLuke made their first contribution in #2433
Full Changelog: v1.0.14...v1.0.15