Changing to AdaptiveAvgPool2d on SqueezeNet and ResNet #643
Conversation
The average pool layer in SqueezeNet and ResNet was hardcoded. This was changed, so now any input size is accepted.
@houseroad will this change break ONNX? I'm inclined to make all torchvision models take an …
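A minimal sketch of the kind of replacement described here (the kernel size and variable names are illustrative, not taken from the actual diff):

```python
import torch.nn as nn

# Before: the pooling kernel is hardcoded for the 224x224 ImageNet case,
# so other input resolutions break the model (illustrative kernel size).
pool_before = nn.AvgPool2d(kernel_size=13, stride=1)

# After: adaptive pooling collapses whatever spatial size arrives to 1x1,
# so any input large enough to reach this layer is accepted.
pool_after = nn.AdaptiveAvgPool2d((1, 1))
```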
https://github.com/pytorch/pytorch/blob/master/torch/onnx/symbolic.py#L599
The (1, 1) case is supported. I have manually tested SqueezeNet and it works fine. I think this change will NOT break the ONNX test.
Thanks.
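A rough smoke test along the lines of the manual check mentioned above, assuming a torchvision SqueezeNet and a plain `torch.onnx.export` trace; the dummy input and file name are placeholders:

```python
import torch
import torchvision.models as models

# Build SqueezeNet (now with the adaptive pooling head) and trace-export it to ONNX.
model = models.squeezenet1_1(pretrained=False).eval()
dummy = torch.randn(1, 3, 224, 224)  # placeholder input used for tracing

# If AdaptiveAvgPool2d((1, 1)) were unsupported by the ONNX exporter, this call would fail.
torch.onnx.export(model, dummy, "squeezenet.onnx")
```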
Thanks Adan!
Thanks to you and your work!!
The update allows AlexNet to process images larger or smaller than the prescribed ImageNet size using adaptive average pooling. This will be useful when fine-tuning or testing on images of a different resolution. Similar to #643 and #672. I did not include the adaptive avg pool in the features or classifier block, so these predefined blocks can be used as-is.
The update allows VGG to process images larger or smaller than the prescribed ImageNet size using adaptive average pooling. This will be useful when fine-tuning or testing on images of a different resolution. Similar to #643 and #672. I did not include the adaptive avg pool in the features or classifier block, so these predefined blocks can be used as-is.
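A minimal sketch of that approach for AlexNet, keeping the predefined `features` and `classifier` blocks untouched and pooling in between; the 6x6 target size (what AlexNet's classifier expects) and the 300x300 input are assumptions used for illustration:

```python
import torch
import torch.nn as nn
import torchvision.models as models

alexnet = models.alexnet(pretrained=False).eval()

def forward_any_size(x):
    x = alexnet.features(x)                           # predefined conv block, unchanged
    x = nn.functional.adaptive_avg_pool2d(x, (6, 6))  # force the 6x6 spatial size the classifier expects
    x = x.view(x.size(0), -1)                         # flatten to (batch, 256 * 6 * 6)
    return alexnet.classifier(x)                      # predefined classifier block, unchanged

out = forward_any_size(torch.randn(1, 3, 300, 300))   # larger-than-ImageNet input
print(out.shape)                                      # torch.Size([1, 1000])
```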
Hi,
I'm dealing with this problem of the hardcoded AvgPool2d in SqueezeNet; some people noticed it a year ago, and it also affects ResNet, so I made these changes. PyTorch should be as robust as possible. I left the other AvgPool2d layers alone because they serve a different purpose.
nn.AvgPool2d -> nn.AdaptiveAvgPool2d((1, 1)) ensures the pooled output in SqueezeNet and ResNet always has a shape like (batch_size, num_classes, 1, 1), regardless of the input size.
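A quick way to check that behaviour, assuming a torchvision ResNet-18 built from this branch; the input resolutions are arbitrary examples:

```python
import torch
import torchvision.models as models

model = models.resnet18(pretrained=False).eval()

# With AdaptiveAvgPool2d((1, 1)) the pooled features always collapse to a 1x1
# spatial size, so the final fully connected layer sees the same shape for
# every input resolution.
for size in (224, 256, 320):
    x = torch.randn(2, 3, size, size)
    with torch.no_grad():
        out = model(x)
    print(size, out.shape)  # torch.Size([2, 1000]) each time
```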