.t7 to .onnx problem #39

Open
Printeger opened this issue May 4, 2022 · 0 comments

@Printeger

Hi, nice work.

I'm trying to convert a .t7 model to .onnx, but something goes wrong. I'm currently using torch.onnx.export():

    path = "/home/t72onnx/best_model.t7"
    model = PAConv(args).to(device)
    dummy_input = torch.randn(16, 3, 1024).to(device)
    model.load_state_dict({k.replace('module.', ''):v for k, v in torch.load(path, map_location=device).items()})
    model.eval()
   
    torch.onnx.export(model.to(device),               # model being run
                  dummy_input.to(device),                         # model input (or a tuple for multiple inputs)
                  "my_mobileface.onnx",   # where to save the model (can be a file or file-like object)
                  export_params=True,        # store the trained parameter weights inside the model file
                  opset_version=12,
                  verbose=False, 
                  do_constant_folding=True,  # whether to execute constant folding for optimization
                  input_names = ['input'],   # the model's input names
                  output_names = ['output'], # the model's output names
                #   Operator_export_types = torch.onnx.OperatorExportTypes.onnx_aten_fallack,
                 )

But I get a RuntimeError about the AssignScoreWithK operator:

Traceback (most recent call last):
  File "main.py", line 307, in <module>
    torch.onnx.export(model.to(device),               # model being run
  File "/opt/conda/lib/python3.8/site-packages/torch/onnx/__init__.py", line 271, in export
    return utils.export(model, args, f, export_params, verbose, training,
  File "/opt/conda/lib/python3.8/site-packages/torch/onnx/utils.py", line 88, in export
    _export(model, args, f, export_params, verbose, training, input_names, output_names,
  File "/opt/conda/lib/python3.8/site-packages/torch/onnx/utils.py", line 709, in _export
    proto, export_map = graph._export_onnx(
RuntimeError: ONNX export failed: Couldn't export Python operator AssignScoreWithK

Is that because torch.onnx.export does not support the PAConv model?
Many thanks!
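
From what I can tell, AssignScoreWithK is a custom torch.autograd.Function that calls into the compiled CUDA extension, so the exporter has no ONNX mapping for it and fails on the PythonOp. If that is the cause, would giving the Function a symbolic method be the right direction? Below is only a rough sketch: the forward signature, argument names, and the paconv:: domain are my guesses, not the real definitions from this repo.

    import torch

    # Hypothetical sketch of the existing Function, not the repo's real code.
    class AssignScoreWithK(torch.autograd.Function):

        @staticmethod
        def forward(ctx, scores, points, centers, knn_idx):
            # the real implementation calls the compiled CUDA kernel here
            raise NotImplementedError("placeholder for the CUDA kernel call")

        @staticmethod
        def symbolic(g, scores, points, centers, knn_idx):
            # Instead of failing, emit a node in a custom ONNX domain; whatever
            # runtime consumes the .onnx file must then provide a matching
            # custom-op implementation under the same domain/name.
            return g.op("paconv::AssignScoreWithK", scores, points, centers, knn_idx)

With a symbolic in place the export itself should go through, but I understand the resulting model would only run on a backend that implements that custom op.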
