How to export ONNX? #37
Have you solved this problem? I met some problems when I tried to export ONNX.
I had to take the axe to the code to get ONNX export to work. You can use the results here: https://github.com/agoryuno/deepsolo-onnx
@agoryuno Thanks for providing the ONNX export notebook. During ONNX inference, I got these output nodes with shapes. Can you please guide me as to which output corresponds to what? I am interested in obtaining the bounding boxes for the text detected in the image.
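For reference, a minimal sketch of listing the exported model's output names and shapes with onnxruntime, which can help match each output node to a detection field (the model path here is a placeholder, not the notebook's actual filename):

```python
import onnxruntime as ort

# "deepsolo.onnx" is a placeholder for your exported model path.
sess = ort.InferenceSession("deepsolo.onnx", providers=["CPUExecutionProvider"])
for out in sess.get_outputs():
    # Print each output node's name and (possibly symbolic) shape.
    print(out.name, out.shape)
```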
Could you provide the versions of your torch and other relevant packages?
I also tried to use the notebook to convert the ONNX model, but I got an unsupported value type 'Instance' error. Is there any suggestion? Thank you.
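That error usually means the traced forward returns detectron2 `Instances` objects, which `torch.onnx.export` cannot serialize. A minimal sketch of the usual workaround, wrapping the model so export sees plain tensors (the field names and output layout below are assumptions, not DeepSolo's exact interface):

```python
import torch

class ExportWrapper(torch.nn.Module):
    """Wrap the model so export sees plain tensors instead of Instances."""

    def __init__(self, model):
        super().__init__()
        self.model = model

    def forward(self, images):
        # Assumed detectron2-style output: a list of dicts holding an Instances object.
        instances = self.model(images)[0]["instances"]
        # Illustrative fields; replace with the fields your model actually produces.
        return instances.pred_boxes.tensor, instances.scores
```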
My code for exporting the ONNX model is above. You can try it; pay attention to your torchvision version and use Python >= 3.9.
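To compare environments, something like this prints the versions that matter for export (assuming all four packages are installed):

```python
import sys

import torch
import torchvision
import onnx
import onnxruntime

print("python     :", sys.version.split()[0])
print("torch      :", torch.__version__)
print("torchvision:", torchvision.__version__)
print("onnx       :", onnx.__version__)
print("onnxruntime:", onnxruntime.__version__)
```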
@Gavinic Sorry, I cannot find why the results are different. I tried the same input; however, the outputs of the backbone are different, which is strange. Have you found the bug?
When I export the ONNX model, I run into some errors. Can I add your contact information to ask about more details?
```python
import shutil

import numpy as np

CHECKPOINT = "rects_res50_finetune.pth"  # if you use another .pth, change the CONFIG too
DIMS = (480, 480)

img = img.astype(np.uint8)  # `img` is loaded earlier in the notebook
```
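The fragment above only shows the setup; a hedged sketch of the export call that typically follows (the `model` variable, input/output names, and opset are assumptions, not the exact notebook code):

```python
import torch

# Assumed: `model` was loaded from CHECKPOINT and put in eval() mode earlier.
dummy = torch.randn(1, 3, *DIMS)  # NCHW dummy input matching DIMS = (480, 480)
torch.onnx.export(
    model,
    dummy,
    "deepsolo.onnx",
    input_names=["images"],
    output_names=["outputs"],
    opset_version=16,  # assumption; pick the opset your ops support
)
```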
@agoryuno @Gavinic The result inferred by the exported ONNX model seems different from the .pth file, and I can't use the exported ONNX model to get the final recognized text. Have you solved this problem? How is your result? Does the inference result really work?
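One way to check this is to feed the same input to both the .pth model and the ONNX session and compare numerically; a minimal sketch, assuming a single tensor output and placeholder names:

```python
import numpy as np
import onnxruntime as ort
import torch

x = torch.randn(1, 3, 480, 480)  # same dummy input for both backends

with torch.no_grad():
    ref = model(x)  # placeholder: the tensor output you exported

sess = ort.InferenceSession("deepsolo.onnx", providers=["CPUExecutionProvider"])
(onnx_out,) = sess.run(None, {"images": x.numpy()})

# Small float differences across backends are normal; large ones point to a real export bug.
np.testing.assert_allclose(ref.numpy(), onnx_out, rtol=1e-3, atol=1e-4)
```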
Can you tell me your environment for ONNX export? I tried to export my .pth file but failed, and I want to check the relevant export environment.
Do you know how to use the ONNX model's inference result? I also found that the result is different.
Can you tell me your environment for ONNX export? Thanks for your work!