[General Issue] Why do results get worse after loading pretrained weights? #81
Comments
Hi, I looked into this. The BIT_LEVIRCD weights hosted on the cloud seem to have had a problem, possibly caused by file corruption. I have updated the weight file, so I suggest manually deleting the previously downloaded local copy. Also, specifying pretrained weights only requires passing the pretrain_weights argument.
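(A minimal sketch of such a call, for reference; only pretrain_weights and the value 'BIT_LEVIRCD' are confirmed by this thread, the other argument names are placeholders and assumptions:)

# Hypothetical sketch: only the pretrain_weights argument is taken from this thread.
model.train(
    num_epochs=100,                  # placeholder; matches the 100-epoch runs discussed below
    train_dataset=train_dataset,     # placeholder dataset objects
    eval_dataset=eval_dataset,
    pretrain_weights='BIT_LEVIRCD',  # name of the pretrained weights to download and load
    save_dir='output/bit_levircd',   # placeholder output directory
)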
Great, thank you for the reply; I'll try it shortly. In principle, shouldn't pretrained weights like these reach very high scores even without fine-tuning?
It was probably deleted by mistake during last week's maintenance. I have fixed the problem; please try again.
This issue has been automatically marked as stale because it has not had recent activity. It will be closed in 7 days if no further activity occurs. Thank you for your contributions.
Thank you for your question. Please provide the following information so that we can locate and resolve the problem quickly:
develop branch
PaddlePaddle 2.4.1.post17
Ubuntu 20.04
Python 3.9
CUDA 11.7, cuDNN 8.4.0, NCCL 2.14.3
I tested the BIT and CDNet networks on the LEVIR-CD dataset, training each for 100 epochs. Without loading pretrain_weights:
BIT test results: 'iou': 0.75169, 'f1': 0.85246, 'oacc': 0.98627, 'kappa': 0.85105;
CDNet test results: 'iou': 0.66071, 'f1': 0.79570, 'oacc': 0.97966, 'kappa': 0.78500.
However, after adding pretrain_weights='BIT_LEVIRCD' or pretrain_weights='cdnet_LEVIRCD' to model.train() in run_task.py, the results after 100 epochs instead drop to:
BIT: 'iou': 0.68747, 'f1': 0.81480, 'oacc': 0.98233, 'kappa': 0.80557;
CDNet: 'iou': 0.50427, 'f1': 0.67045, 'oacc': 0.97045, 'kappa': 0.65529.
I cannot understand why the accuracy drops so much after loading the pretrained weights.
The only difference between the two runs is that I added the pretrain_weights argument to model.train(), as follows:
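(The original code excerpt is not preserved in this export; the sketch below reconstructs the comparison under the assumption stated above that everything except pretrain_weights is identical between the two runs. Argument names other than pretrain_weights are placeholders, not confirmed by this issue.)

# Run 1: training from scratch, no pretrained weights (placeholder argument names).
model.train(
    num_epochs=100,
    train_dataset=train_dataset,
    eval_dataset=eval_dataset,
    save_dir='output/from_scratch',
)

# Run 2: identical call, with only pretrain_weights added.
model.train(
    num_epochs=100,
    train_dataset=train_dataset,
    eval_dataset=eval_dataset,
    pretrain_weights='BIT_LEVIRCD',  # or 'cdnet_LEVIRCD' for the CDNet run
    save_dir='output/with_pretrain',
)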
The training log from the run with pretrained weights is attached:
trainlog.log