
Yolov3-tiny Does not show any detection results #58

Open
Techyee opened this issue Aug 7, 2019 · 4 comments

Comments

@Techyee

Techyee commented Aug 7, 2019

I tried to use the yolov3-tiny model from Mobilenet-YOLO.

Step 1. I modified the last layer of yolov3-tiny as below.

layer {
  bottom: "layer17-yolo"
  bottom: "layer23-conv"
  top: "yolo-det"
  name: "yolo-det"
  type: "Yolo"
}

Step 2. I modified YoloConfig.h as below, using the information from the tiny-yolo prototxt.

  //tinyYolo layer17 layer23
  YoloKernel yolo1 = {
      13,
      13,
      {81,82, 135,169, 344,319}
  };
  YoloKernel yolo2 = {
      26,
      26,
      {10,14, 23,27, 37,58}
  };

}

Step 3. Since tiny-yolo only has two yolo kernels, I commented out yolo3 in yololayer.cu:

    mYoloKernel.push_back(yolo1);
    mYoloKernel.push_back(yolo2);
    // mYoloKernel.push_back(yolo3);

It looked like I had completed all the conversion work, and the engine build and inference did not emit any errors. However, no detection boxes were drawn at all.
[Image: result from yolov3-608]
[Image: result from tinyyolo]

Any advice?

P.S. I also downloaded tinyyolo.caffemodel and used it during inference.
P.S. 2: My cmd line for execution was:
./install/runYolov3 --caffemodel=./yolov3-tiny.caffemodel --prototxt=./yolov3-tiny.prototxt --input=./test.jpg --W=416 --H=416 --class=80 --nms=0

@ElonKou

ElonKou commented Sep 6, 2019

Hi, did you solve this issue? I'd like to run tiny-yolov3 on my TX2, but I can't find "tinyyolo.caffemodel". Could you share the download link with me? Thank you very much.

@lewes6369
Owner

Can you show me the prototxt for your model? What is the type of "layer17-yolo"? Is there any log info from running the cmd line? Maybe you should try the CPU version of the yoloLayer and cout some information to debug it.
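
A minimal sketch of that kind of debug printout, assuming a hypothetical host-side Detection struct filled in after the CPU yolo layer has decoded its output (the names below are illustrative, not this repo's actual API):

#include <iostream>
#include <vector>

// Hypothetical detection record; field names are illustrative only.
struct Detection {
    float bbox[4];   // x, y, w, h
    int classId;
    float prob;
};

// Dump a quick summary of what the CPU yolo layer produced, so an
// empty result is visible immediately instead of silently drawing nothing.
void dumpDetections(const std::vector<Detection>& dets, float visThresh) {
    std::cout << "total candidates: " << dets.size() << std::endl;
    int kept = 0;
    for (const auto& d : dets) {
        if (d.prob < visThresh) continue;
        ++kept;
        std::cout << "class " << d.classId << " prob " << d.prob
                  << " box [" << d.bbox[0] << "," << d.bbox[1] << ","
                  << d.bbox[2] << "," << d.bbox[3] << "]" << std::endl;
    }
    std::cout << "kept above threshold: " << kept << std::endl;
}

If the candidate count is already zero before NMS, the decoding itself (grid sizes, anchors) is the suspect; if candidates exist but are all filtered out, look at the confidence threshold or NMS settings instead.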

@lidapengpeng

I encountered the same problem as you. How did you solve it in the end?

@liudakai2

I don't know if it is too late, but I happened to find that the key issue preventing yolov3-tiny from working is "layer12-maxpool". To be specific, darknet's yolov3-tiny uses a maxpool layer with kernel-size=2 and stride=1, while in Caffe, PyTorch, TensorRT, or anything else, the kernel-size/stride must be set to 1/1 or 3/1 respectively to avoid changing the tensor shape after the maxpool layer. That is the culprit.
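
For reference, a shape-preserving 3/1 version of that pooling layer in Caffe prototxt could look roughly like this (the blob/layer names just follow the numbering pattern of the prototxt above and are assumptions, not copied from an actual converted model):

layer {
  bottom: "layer11-conv"       # assumed name of the preceding conv output
  top: "layer12-maxpool"
  name: "layer12-maxpool"
  type: "Pooling"
  pooling_param {
    pool: MAX
    kernel_size: 3   # darknet cfg uses size=2; 3 with pad 1 keeps the 13x13 spatial size
    stride: 1
    pad: 1
  }
}

Neither 1/1 (effectively a no-op) nor 3/1 with padding matches darknet's 2/1 pooling exactly, which is why retraining as described below is the cleaner fix.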

One solution is to manually change the kernel-size in this maxpool layer (or just remove it), retrain your network in darknet, and then convert the darknet .weights to a Caffe .caffemodel. When you get your new trt engine, you will find it finally works.
