
Python model successfully tested with Polygraphy TensorRT, but fails when loading with torch2trt #915

Open
ninono12345 opened this issue Jan 4, 2024 · 0 comments

Hello everyone.

I have been working on a project to convert my PyTorch model to TensorRT for faster inference.

I have successfully converted my model to ONNX and run a Polygraphy test:

```
polygraphy run tomp101_head_latest3.onnx --trt
```

The output for the test run shows that the model was converted successfully and that the inference test also passed:

(screenshot of the successful Polygraphy run)
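For reference, the ONNX export was done roughly like this (a sketch; the exact opset version and arguments I used may differ), where `om.head` is the same four-input head shown in the torch2trt snippet below:

```python
import torch

# Trace the head with the same four dummy inputs used later for torch2trt.
# opset_version=17 is an assumption, not necessarily what my script used.
dummy_inputs = (torch.randn(1, 1024, 18, 18).cuda(),
                torch.randn(1, 1024, 18, 18).cuda(),
                torch.randn(1, 1, 18, 18).cuda(),
                torch.randn(1, 4, 18, 18).cuda())

torch.onnx.export(om.head, dummy_inputs, "tomp101_head_latest3.onnx",
                  opset_version=17)
```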

But when I use this torch2trt code, the conversion fails:

```python
om = load_network("tomp101.pth.tar").eval().cuda()

input_data = (torch.randn((1, 1024, 18, 18)).cuda(),
              torch.randn((1, 1024, 18, 18)).cuda(),
              torch.randn((1, 1, 18, 18)).cuda(),
              torch.randn((1, 4, 18, 18)).cuda())

model_trt = torch2trt(om.head, input_data)
```
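Since the same ONNX file builds and runs fine under Polygraphy, a possible workaround I am considering (untested sketch; it assumes torch2trt's `TRTModule` can wrap a deserialized engine when given explicit input/output names, and the tensor names below are placeholders that would have to match the ONNX graph) is to build the engine from the ONNX directly and load it:

```python
import tensorrt as trt
from torch2trt import TRTModule

# Engine built beforehand from the ONNX that already passes the Polygraphy test,
# e.g. trtexec --onnx=tomp101_head_latest3.onnx --saveEngine=tomp101_head.engine
logger = trt.Logger(trt.Logger.INFO)
with open("tomp101_head.engine", "rb") as f, trt.Runtime(logger) as runtime:
    engine = runtime.deserialize_cuda_engine(f.read())

# Input/output names are assumptions; they must match the names in the ONNX graph.
model_trt = TRTModule(engine,
                      input_names=["input_0", "input_1", "input_2", "input_3"],
                      output_names=["output_0"])

out = model_trt(*input_data)
```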

Keep in mind that another model, converted with this code:

```python
x = torch.ones((1, 3, 288, 288)).cuda()

model_tompnet = torch2trt(model, [x])
outtt = model_tompnet(x)
```

converts and runs successfully with no errors, and the returned tensor has the correct shape, so we know that CUDA, cuDNN, and TensorRT were installed correctly.

Thank you everybody in advance for any help!

If any more information is needed, please ask and I will provide it.

Environment:

Windows 10
Python 3.10.13
PyTorch 2.1.2+cu121
CUDA 12.1 Update 1
cuDNN 8.9.7
TensorRT 8.6

Not running in Anaconda or anything, just plain Windows.
