Use ONNX model for inference #75
Comments
As far as I know, NVIDIA has already provided an official sample in TensorRT 6.0 that runs the ONNX YOLOv3 model, and it is written in Python.
Thank you for your reply. I have solved my problem.
@JensenHJS In this repo they use a YOLO last layer (not the three convolutional output layers). Were you able to get that into your ONNX model?
ONNX doesn't support the YOLO layer. The output of the TRT engine is passed through the YOLO layer as a post-processing step; that is the final output.
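Since the YOLO layer is not part of the ONNX/TRT graph, it has to be applied to the raw convolutional outputs on the host. The sketch below shows, in plain NumPy, what that post-processing step typically does for one output scale (sigmoid on center offsets and objectness, exponential on width/height, scaled by the anchors). The function name, argument layout, and anchor values are illustrative assumptions, not code from this repo.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def decode_yolo_layer(feature_map, anchors, num_classes, img_size):
    """Decode one raw YOLO conv output of shape (N, A*(5+C), H, W).

    Illustrative sketch only: returns boxes as (cx, cy, w, h) in input-image
    pixels, plus objectness and per-class scores. A = number of anchors.
    """
    n, _, h, w = feature_map.shape
    a = len(anchors)
    stride = img_size / h  # pixels per grid cell

    # (N, A*(5+C), H, W) -> (N, H, W, A, 5+C) for easy slicing
    fm = feature_map.reshape(n, a, 5 + num_classes, h, w).transpose(0, 3, 4, 1, 2)

    grid_x = np.arange(w).reshape(1, 1, w, 1)
    grid_y = np.arange(h).reshape(1, h, 1, 1)
    anchors = np.asarray(anchors, dtype=np.float64).reshape(1, 1, 1, a, 2)

    # Box center: sigmoid offset within the cell, plus the cell index, times stride
    cx = (sigmoid(fm[..., 0]) + grid_x) * stride
    cy = (sigmoid(fm[..., 1]) + grid_y) * stride
    # Box size: exp of the raw prediction, scaled by the anchor dimensions
    bw = np.exp(fm[..., 2]) * anchors[..., 0]
    bh = np.exp(fm[..., 3]) * anchors[..., 1]

    obj = sigmoid(fm[..., 4])      # objectness score
    cls = sigmoid(fm[..., 5:])     # per-class scores

    boxes = np.stack([cx, cy, bw, bh], axis=-1).reshape(n, -1, 4)
    return boxes, obj.reshape(n, -1), cls.reshape(n, -1, num_classes)
```

In a full pipeline this would be run on each of the three output scales, followed by score thresholding and NMS.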
I can provide the ONNX model. Do you currently plan to support ONNX models with YOLOv3?
Its original model is yolov3.weights; using the Python API, I converted it to yolov3-608.onnx, then converted the ONNX model to yolov3-608.trt. yolov3.weights was downloaded from the official darknet website.
https://drive.google.com/drive/folders/1fRcxY5YgEQ8DUmS1tEdDKxL45SsKkvFh?usp=sharing
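The darknet-to-ONNX-to-TRT pipeline described above matches NVIDIA's `yolov3_onnx` Python sample shipped with TensorRT. A rough command sequence is sketched below; the install path is an assumption and may differ on your system, and both scripts require a working TensorRT Python environment.

```shell
# Assumed sample location; adjust to your TensorRT installation
cd /usr/src/tensorrt/samples/python/yolov3_onnx

# Step 1: darknet cfg/weights -> ONNX model
python yolov3_to_onnx.py

# Step 2: ONNX model -> serialized TensorRT engine (.trt), plus a test inference
python onnx_to_tensorrt.py
```

These commands are not runnable without TensorRT installed; they only illustrate the two-step conversion the comment describes.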