How to approach TTA with albumentations? #136

Closed
sakvaua opened this issue Dec 5, 2018 · 3 comments

sakvaua commented Dec 5, 2018

Hi.
Is there a recommended approach for doing test-time augmentation? I would like to pick several random augmentations and apply them to the test and validation sets for later stacking. I'm not sure how to proceed: how do I make sure that one set of augmentations is applied to the whole test set and then to the validation set, rather than random augmentations to every image?
Thanks.


BloodAxe commented Dec 5, 2018

It is not supported out of the box, but you may use a simple trick: fix the random seed before applying the augmentation at test time:

import random

# TTA: fix the random seed so the sampled augmentation parameters are reproducible
random.seed(1234)
data = transform(**data)
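
One way to extend this trick to a whole dataset is to re-seed before every image, so that draw k of the augmentation parameters is identical across the test and validation sets. A minimal sketch, assuming same-sized images, an albumentations Compose called transform, and a model.predict call that stands in for whatever inference API you use:

import random
import numpy as np

def tta_predict_seeded(model, images, transform, seeds=(1234, 5678, 9012)):
    # Average predictions over several fixed augmentation draws.
    # Re-seeding before every image makes each draw identical across the
    # whole set, so test and validation images get the same augmentations.
    all_preds = []
    for seed in seeds:
        preds = []
        for image in images:
            random.seed(seed)      # albumentations samples parameters from random ...
            np.random.seed(seed)   # ... and some transforms also use numpy's RNG
            augmented = transform(image=image)["image"]
            preds.append(model.predict(augmented[None, ...]))
        all_preds.append(np.concatenate(preds))
    return np.mean(all_preds, axis=0)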


albu commented Dec 21, 2018

You can use p=1 transforms or always_apply=True transforms if it's something like a horizontal or vertical flip (deterministic apart from the application probability).
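
A minimal sketch of deterministic TTA with p=1 transforms, assuming a classification model with a Keras-style model.predict call (the transform list and function name are illustrative):

import albumentations as A
import numpy as np

# Each image receives exactly the same transform, so no seeding is needed.
tta_transforms = [
    A.Compose([]),                       # identity: the original image
    A.Compose([A.HorizontalFlip(p=1)]),  # always flip horizontally
    A.Compose([A.VerticalFlip(p=1)]),    # always flip vertically
]

def tta_predict_flips(model, image):
    # Average the model outputs over the deterministic augmentations.
    preds = [model.predict(t(image=image)["image"][None, ...]) for t in tta_transforms]
    return np.mean(preds, axis=0)

For segmentation, the predicted mask would also need to be flipped back before averaging.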

albu closed this as completed Dec 21, 2018
BloodAxe commented

@sakvaua Hi, albumentations does not support this by design. However, this is a popular task, so I'll take the chance to plug my library of handy PyTorch extensions, which includes TTA: https://github.com/BloodAxe/pytorch-toolbelt/blob/develop/pytorch_toolbelt/inference/tta.py
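
A usage sketch of the linked module, assuming a PyTorch segmentation model; the function name fliplr_image2mask is taken from that file and may differ between versions, so treat it as an assumption and check tta.py for the exact API:

import torch
from torch import nn
from pytorch_toolbelt.inference import tta

# Toy segmentation "model" just to make the sketch self-contained.
model = nn.Conv2d(3, 1, kernel_size=3, padding=1)
image = torch.rand(1, 3, 256, 256)

with torch.no_grad():
    # Run the model on the original and horizontally flipped input,
    # flip the second prediction back, and average the two.
    mask = tta.fliplr_image2mask(model, image)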
