How to transform efficient-pytorch to efficient-onnx #20

Closed
peyer opened this issue Jun 6, 2019 · 14 comments

Comments

@peyer

peyer commented Jun 6, 2019

I have tried to convert EfficientNet-PyTorch to ONNX with torch.onnx.export, but I ran into the following error:
Failed to export an ONNX attribute, since it's not constant, please try to make things (e.g., kernel size) static if possible
How can I fix it?
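
For reference, the failing call looks roughly like this (a minimal sketch; the model variant and input size are assumptions for illustration, not taken from the report above):

import torch
from efficientnet_pytorch import EfficientNet

# Model variant and input size are placeholders for illustration.
model = EfficientNet.from_pretrained('efficientnet-b0')
model.eval()

dummy_input = torch.randn(1, 3, 224, 224)
# Before the fix discussed below, this call raised:
# "Failed to export an ONNX attribute, since it's not constant ..."
torch.onnx.export(model, dummy_input, 'efficientnet-b0.onnx')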

@lukemelas
Owner

Yes, I can try to make the kernel size static in the next significant update.

@peyer
Author

peyer commented Jun 7, 2019

@lukemelas Thanks!

@lukemelas
Owner

Done! I had to change some things with the padding, so it took a little while.

Here is an example:
https://colab.research.google.com/drive/1rOAEXeXHaA8uo3aG2YcFDHItlRJMV0VP

@lukemelas
Owner

lukemelas commented Jun 30, 2019

Closing this issue, but definitely re-open it if you run into any problems.

Also, let me know if exporting to ONNX / mobile works for you!
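
For anyone verifying the export, a minimal sanity check of the resulting file (assuming onnxruntime is installed; the file name matches the sketch above):

import numpy as np
import onnxruntime as ort  # assumed installed; not part of this repo

# Load the exported model and run a random input through it.
session = ort.InferenceSession('efficientnet-b0.onnx')
input_name = session.get_inputs()[0].name
dummy = np.random.randn(1, 3, 224, 224).astype(np.float32)
outputs = session.run(None, {input_name: dummy})
print(outputs[0].shape)  # expected (1, 1000) for the ImageNet classification head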

@peyer
Author

peyer commented Jul 1, 2019

@lukemelas Succeeded, thanks!

@lukemelas
Owner

Great to hear!

@deepconsc

@lukemelas Hey! Appreciate your great work.
For the latest version, exporting to ONNX fails due to Swish. Any ideas?

@bkhti4

bkhti4 commented Jan 15, 2021

Please use model.set_swish(memory_efficient=False).
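
In context, that workaround looks roughly like this (a minimal sketch; the model variant and input size are placeholders):

import torch
from efficientnet_pytorch import EfficientNet

model = EfficientNet.from_name('efficientnet-b0')
model.eval()
# Use the plain Swish module instead of the memory-efficient custom
# autograd Function, which the ONNX exporter cannot handle.
model.set_swish(memory_efficient=False)

dummy_input = torch.randn(1, 3, 224, 224)
torch.onnx.export(model, dummy_input, 'efficientnet-b0.onnx')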

@kendyChina

@bkhti4 Thanks! When I replaced SwishImplementation() with Swish(), I converted the model to ONNX successfully.

@lukemelas
Owner

lukemelas commented Apr 15, 2021

Now there's nn.SiLU as well, so I believe all models should be memory-efficient and exportable :)

The older functions are still included for backward compatibility with old versions of PyTorch.

@fangliang425

fangliang425 commented Apr 20, 2021

@lukemelas Hi, when I simply ran

model = EfficientNet.from_name(model_name='efficientnet-b0')
model.eval()
model.set_swish(memory_efficient=False)
torch.onnx.export(model, torch.rand(10, 3, 240, 240), "EfficientNet-B0.onnx")

and got the error below. Any ideas? Thanks.
RuntimeError: Exporting the operator silu to ONNX opset version 9 is not supported. Please open a bug to request ONNX export support for the missing operator.

environment:
Ubuntu 20.04
torch 1.7.1
cuda 11.0
efficientnet-pytorch 0.7.1

@aojue1109

@lukemelas Hi, when I simply ran

model = EfficientNet.from_name(model_name='efficientnet-b0')
model.eval()
model.set_swish(memory_efficient=False)
torch.onnx.export(model, torch.rand(10, 3, 240, 240), "EfficientNet-B0.onnx")

and got the error below. Any ideas? Thanks.
RuntimeError: Exporting the operator silu to ONNX opset version 9 is not supported. Please open a bug to request ONNX export support for the missing operator.

Environment:
Ubuntu 20.04
torch 1.7.1
cuda 11.0
efficientnet-pytorch 0.7.1

Is it resolved? I need your help!

@chilin0525

@aojue1109 I converted the model to ONNX successfully; here's my gist link.
If that still doesn't solve your problem, I think you should re-open the issue.
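
The gist itself is not reproduced in this thread. As a hedged illustration only (not necessarily the approach the gist takes), one way the silu/opset-9 failure can be worked around is to swap the nn.SiLU activations for an explicit x * torch.sigmoid(x) module before export, so the graph only uses ops that old opsets support:

import torch
import torch.nn as nn
from efficientnet_pytorch import EfficientNet

class ExportFriendlySwish(nn.Module):  # hypothetical helper, not part of the package
    def forward(self, x):
        return x * torch.sigmoid(x)  # same math as SiLU, exported as Mul + Sigmoid

def replace_silu(module):
    # Recursively swap nn.SiLU children for the export-friendly version.
    for name, child in module.named_children():
        if isinstance(child, nn.SiLU):
            setattr(module, name, ExportFriendlySwish())
        else:
            replace_silu(child)

model = EfficientNet.from_name('efficientnet-b0')
model.eval()
model.set_swish(memory_efficient=False)  # on recent PyTorch this selects nn.SiLU
replace_silu(model)

torch.onnx.export(model, torch.rand(10, 3, 240, 240), 'EfficientNet-B0.onnx')

Alternatively, upgrading to a newer PyTorch release (where aten::silu has ONNX export support) and passing a higher opset_version to torch.onnx.export may also resolve it.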

@dgdgksj

dgdgksj commented Nov 1, 2022

@chilin0525 thanks!!
