trt model conf is negative #39
Provide your model file (the ONNX model).
OK, thanks. Do you mean there was a problem when I converted to ONNX? I didn't understand the word "prevoid". I see your location is Xi'an, so I'll just use Chinese. TT
Please send me your model file and I'll take a look: linaom1214@163.com
I think I've roughly located the problem: it happens when converting .pth to ONNX, that is
Is the gap between the end2end model and the model without NMS really that large? It tested fine for me before.
How many classes did you test with?
80 classes, COCO.
The official one doesn't work.
????
I retested: both the multi-object and single-object code work fine, and the end-to-end model matches the model without NMS. Maybe it's a multi-batch issue; that's the only thing I can think of. I'll dig into it properly once I finish my paper.
Thanks. I'm still trying things on my end.
It is indeed a multi-batch issue. While updating to dynamic batch, I found the conversion fails. Any progress on your side?
I also ran into negative scores and checked the code many times without finding a problem. Then I noticed by chance that negative scores only appear with FP16; the FP32 results are normal.
FP16 does incur some precision loss.
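FP16 rounding is easy to demonstrate in isolation. A minimal NumPy illustration (this shows only that half precision loses digits; rounding alone would not flip a score's sign):

```python
import numpy as np

# A value FP32 can represent closely is rounded further when cast to
# FP16, which has only a 10-bit significand (~3 decimal digits).
x32 = np.float32(0.3333333)
x16 = np.float16(x32)

# The round trip does not recover the FP32 value exactly.
assert float(x16) != float(x32)
```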
The FP16 score and the score from the original .pt inference basically sum to 1, which is strange.
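The observation that the two scores sum to roughly 1 is consistent with a sign flip on the logit before the sigmoid, since sigmoid(-x) = 1 - sigmoid(x). A quick check of that identity (purely illustrative, not a confirmed diagnosis of the plugin):

```python
import math

def sigmoid(x: float) -> float:
    """Standard logistic sigmoid."""
    return 1.0 / (1.0 + math.exp(-x))

# For any logit x, sigmoid(-x) + sigmoid(x) == 1, so a negated logit
# produces a score that complements the correct one to 1.
x = 2.0
assert abs(sigmoid(-x) + sigmoid(x) - 1.0) < 1e-12
```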
Did you solve this? I'm also seeing negative scores. Could we discuss it?
No, not yet. It may be a problem with the TensorRT NMS plugin. For FP16 I eventually gave up on the end-to-end NMS approach and do the post-processing myself.
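Doing the post-processing outside the engine can be sketched as follows. This is a generic NumPy sketch, assuming YOLO-style raw output rows of `[cx, cy, w, h, obj, class scores...]` (the actual output layout of your exported model may differ):

```python
import numpy as np

def nms(boxes: np.ndarray, scores: np.ndarray, iou_thr: float = 0.45) -> list:
    """Greedy NMS over xyxy boxes; returns indices of kept boxes."""
    x1, y1, x2, y2 = boxes.T
    areas = (x2 - x1) * (y2 - y1)
    order = scores.argsort()[::-1]  # highest score first
    keep = []
    while order.size > 0:
        i = order[0]
        keep.append(i)
        # Intersection of the top box with the remaining boxes.
        xx1 = np.maximum(x1[i], x1[order[1:]])
        yy1 = np.maximum(y1[i], y1[order[1:]])
        xx2 = np.minimum(x2[i], x2[order[1:]])
        yy2 = np.minimum(y2[i], y2[order[1:]])
        inter = np.clip(xx2 - xx1, 0, None) * np.clip(yy2 - yy1, 0, None)
        iou = inter / (areas[i] + areas[order[1:]] - inter)
        order = order[np.where(iou <= iou_thr)[0] + 1]
    return keep

def postprocess(pred: np.ndarray, conf_thr: float = 0.25, iou_thr: float = 0.45):
    """pred: (N, 5 + num_classes) rows of [cx, cy, w, h, obj, cls...]."""
    scores = pred[:, 4] * pred[:, 5:].max(axis=1)   # obj * best class prob
    classes = pred[:, 5:].argmax(axis=1)
    mask = scores > conf_thr
    pred, scores, classes = pred[mask], scores[mask], classes[mask]
    # cxcywh -> xyxy
    boxes = np.empty((pred.shape[0], 4), dtype=pred.dtype)
    boxes[:, 0] = pred[:, 0] - pred[:, 2] / 2
    boxes[:, 1] = pred[:, 1] - pred[:, 3] / 2
    boxes[:, 2] = pred[:, 0] + pred[:, 2] / 2
    boxes[:, 3] = pred[:, 1] + pred[:, 3] / 2
    keep = nms(boxes, scores, iou_thr)
    return boxes[keep], scores[keep], classes[keep]
```

Because the sigmoid/score math then runs in ordinary FP32 NumPy, the engine can stay FP16 for the backbone while the scores remain in [0, 1].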
I trained a single-class model with YOLOv7 and converted it to a TensorRT engine:
python3 yolov7/export.py --grid --simplify
python3 T_F_Y_S/export.py -o xxx.onnx -e xxx.engine -p fp16 --end2end
When I run inference with the TRT engine, the boxes and classes match the .pth model, but the confidence is negative, always in [-0.6, -0.4].
Can you help me? Thanks!