
Large performance drop after minor modifications to the ViT backbone #63

Open
wanderer1230 opened this issue Aug 25, 2022 · 2 comments

@wanderer1230

Hello! Sorry to bother you. I have recently been trying to improve the baseline you proposed, for example by adding convolution layers to PatchEmbed_overlap, but performance drops significantly every time.

I saw your earlier answer: "The root cause is that your newly added conv layer is randomly initialized instead of pretrained, while all the other ViT layers are ImageNet-pretrained, so the inputs throughout the network are completely scrambled, which is why the accuracy drops."

Could you explain what "the inputs throughout the network are completely scrambled" means, and how this problem can be solved? (My current experimental resources do not allow me to re-pretrain the model on ImageNet.)
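
For reference, the kind of modification being described might look like the following minimal PyTorch sketch (illustrative names and hyperparameters only, not the repository's actual PatchEmbed_overlap implementation). The point made in the quoted answer is visible in the forward pass: the added conv block has no counterpart in the ImageNet checkpoint and starts from random initialization, so every pretrained layer behind it receives features it was never trained on.

```python
# Minimal sketch, NOT the TransReID PatchEmbed_overlap code: an overlapping
# patch embedding with an extra conv block inserted in front of the projection.
# All names and hyperparameters here are illustrative assumptions.
import torch.nn as nn


class PatchEmbedWithExtraConv(nn.Module):
    def __init__(self, patch_size=16, stride=16, in_chans=3, embed_dim=768):
        super().__init__()
        # Newly added conv block: it has no counterpart in the ImageNet
        # checkpoint, so it can only start from random initialization.
        self.extra_conv = nn.Sequential(
            nn.Conv2d(in_chans, in_chans, kernel_size=3, padding=1),
            nn.BatchNorm2d(in_chans),
            nn.ReLU(inplace=True),
        )
        # Original-style patch projection: in the unmodified model this layer,
        # and every transformer block after it, carries ImageNet weights.
        self.proj = nn.Conv2d(in_chans, embed_dim,
                              kernel_size=patch_size, stride=stride)

    def forward(self, x):
        x = self.extra_conv(x)  # randomly initialized features replace the raw image
        x = self.proj(x)        # pretrained weights now see an input distribution
                                # they were never trained on
        return x.flatten(2).transpose(1, 2)
```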

@michuanhaohao
Collaborator

There is no way around this. The fact that modifying the backbone hurts performance so badly has long been a major factor limiting the development of ReID backbone design, and it applies not only to ViT but also to the CNNs used in the past. That is why you will find that essentially no published work changes the backbone structure.

This problem can only be solved by pretraining: either pretrain on ImageNet, or pretrain with TransReID-SSL.
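
To make that concrete with a small, hedged demonstration (assumed names, not the repository's loading code): when a checkpoint is loaded into a modified backbone with `load_state_dict(..., strict=False)`, only parameters whose keys exist in the checkpoint receive pretrained values; the newly added layers are reported as missing keys and keep their random initialization, which is exactly the part that ImageNet or TransReID-SSL pretraining would have to provide.

```python
# Small demonstration of why the new layer stays untrained (assumed names,
# not TransReID's loading code). A stand-in for the unmodified, pretrained
# patch embedding is loaded into the modified module from the sketch above.
import torch.nn as nn

pretrained_embed = nn.Module()
pretrained_embed.proj = nn.Conv2d(3, 768, kernel_size=16, stride=16)  # stands in for the pretrained projection

modified = PatchEmbedWithExtraConv()
missing, unexpected = modified.load_state_dict(pretrained_embed.state_dict(),
                                               strict=False)
print(missing)  # -> extra_conv.* entries: these keep their random init and
                #    would only get meaningful values from ImageNet or
                #    TransReID-SSL pretraining of the modified backbone.
```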

@zzk2021

zzk2021 commented Jan 2, 2023

Without pretraining the results are much worse, especially the mAP, which drops severely. The same is true for CNNs.
