Hello everyone.
I have been working on a project that converts my PyTorch model to TensorRT for faster inference.
I exported my model to ONNX successfully and ran a polygraphy test:
polygraphy run tomp101_head_latest3.onnx --trt
The output for this test run reports that the model was converted successfully and that an inference test also passed.
However, when using this torch2trt code:
```python
import torch
from torch2trt import torch2trt

# load_network is the project-specific checkpoint loader.
om = load_network("tomp101.pth.tar").eval().cuda()

# Four input tensors matching the head's expected shapes.
input_data = (torch.randn((1, 1024, 18, 18)).cuda(),
              torch.randn((1, 1024, 18, 18)).cuda(),
              torch.randn((1, 1, 18, 18)).cuda(),
              torch.randn((1, 4, 18, 18)).cuda())

model_trt = torch2trt(om.head, input_data)
```
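One common failure mode when converting a multi-input sub-module is a `forward()` that expects keyword or non-tensor arguments. A thin wrapper that pins the call down to positional tensors is a typical workaround; this is a sketch on CPU with an assumed head signature, not the project's actual code:

```python
import torch
import torch.nn as nn

class HeadWrapper(nn.Module):
    """Expose a strictly positional, tensor-only forward for conversion."""
    def __init__(self, head):
        super().__init__()
        self.head = head

    def forward(self, a, b, c, d):
        # Adjust the argument order to match the real head's signature.
        return self.head(a, b, c, d)

# Hypothetical stand-in head; the real call would be something like
# torch2trt(HeadWrapper(om.head).eval().cuda(), list(input_data)).
class DummyHead(nn.Module):
    def forward(self, a, b, c, d):
        return a + b + c + d

wrapped = HeadWrapper(DummyHead()).eval()
out = wrapped(torch.randn(1, 1, 18, 18), torch.randn(1, 1, 18, 18),
              torch.randn(1, 1, 18, 18), torch.randn(1, 1, 18, 18))
print(out.shape)  # torch.Size([1, 1, 18, 18])
```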
Keep in mind that another model, using this code:
```python
import torch
from torch2trt import torch2trt

x = torch.ones((1, 3, 288, 288)).cuda()
model_tompnet = torch2trt(model, [x])
outtt = model_tompnet(x)
```
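Beyond checking the output shape, a common way to validate a conversion is an element-wise comparison of the original and converted outputs (the torch2trt README uses this pattern). Here it is sketched on CPU with `torch.jit.trace` as a stand-in for torch2trt, since tracing also yields a converted, callable module:

```python
import torch
import torch.nn as nn

# Small stand-in model on CPU for illustration only.
model = nn.Conv2d(3, 8, 3, padding=1).eval()
x = torch.ones(1, 3, 288, 288)
model_traced = torch.jit.trace(model, x)

with torch.no_grad():
    # With a real conversion this would be model(x) vs model_trt(x).
    max_err = torch.max(torch.abs(model(x) - model_traced(x))).item()
print(max_err)
```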
converts and runs successfully, returning no error, and the returned tensor has the correct shape, so we know that CUDA, cuDNN, and TensorRT were installed correctly.
Thank you everybody in advance for any help! If any more information is needed, please ask and I will provide it.
Environment:
Windows 10
Python 3.10.13
PyTorch 2.1.2+cu121
CUDA 12.1 Update 1
cuDNN 8.9.7
TensorRT 8.6
Not running in Anaconda or any other environment manager; just a plain Windows install.