
With more sparsity-training epochs, pruning at the same prune ratio gives a worse result! #4

Closed
Joejwu opened this issue Jul 15, 2022 · 3 comments


Joejwu commented Jul 15, 2022

Hello! Following your code with the same settings as experiment 13, I ran two pruning experiments.
Experiment 1: sparsity training epochs = 50;
pruning percent = 30%;
Results:

| Metric | Origin | After prune | Loss ratio |
| --- | --- | --- | --- |
| P | 0.6032 | 0.5359 | 0.1114 |
| R | 0.4673 | 0.3807 | 0.1854 |
| mAP@.5 | 0.5084 | 0.4142 | 0.1852 |
| mAP@.5:.95 | 0.3244 | 0.2414 | 0.2560 |

finetuning epochs = 100;
Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.319
After finetuning, the accuracy loss is roughly the same as in your experiments.
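For context on what "sparsity training" does to the network: this kind of pipeline typically adds an L1 penalty on the BatchNorm scale factors (γ) so that unimportant channels shrink toward zero (whether this repo does exactly that is an assumption on my part). A minimal NumPy sketch of one such update step, with hypothetical `lr` and penalty strength `s`:

```python
import numpy as np

def sparsity_step(gamma, grad, lr=0.01, s=1e-4):
    """One SGD update of BN scale factors (gamma) with an added
    L1 subgradient that pushes channel scales toward zero.

    gamma: per-channel BN scale factors
    grad:  gradient of the task loss w.r.t. gamma
    s:     sparsity penalty strength (hypothetical value)
    """
    return gamma - lr * (grad + s * np.sign(gamma))

# Even with zero task gradient, every step shrinks |gamma| a little,
# so more sparsity epochs means smaller channel scales overall.
gamma = np.array([0.5, -0.3, 0.01])
for _ in range(100):
    gamma = sparsity_step(gamma, np.zeros_like(gamma))
```

This is only a sketch, but it shows the mechanism behind the experiments above: each extra sparsity epoch pushes the scales further toward zero, which matters when comparing the 50-epoch and 100-epoch runs.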

Experiment 2: sparsity training epochs = 100;
pruning percent = 30%;
Results:

| Metric | Origin | After prune | Loss ratio |
| --- | --- | --- | --- |
| P | 0.6460 | 0.4668 | 0.2774 |
| R | 0.4777 | 0.3093 | 0.3525 |
| mAP@.5 | 0.5308 | 0.3079 | 0.4201 |
| mAP@.5:.95 | 0.3413 | 0.1745 | 0.4888 |

Because the directly pruned result is worse, I have not run finetuning for this experiment yet.

Question 1: I see that your experiment 13 also trained for 100 epochs, so I am curious why longer sparsity training actually gives a worse result. Does this mean that more sparsity training is not always better, analogous to the overfitting that can occur in normal training when training runs too long?

Question 2: Sparsity training itself causes an accuracy loss, pruning causes another, and even after finetuning some loss remains, so the whole pipeline suffers two accuracy drops. How do you view this problem? Also, can accuracy be improved by running more sparsity-training epochs, or conversely by running fewer sparsity epochs but finetuning for longer after pruning to recover more accuracy?
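For reference, "pruning percent = 30%" in BN-scale-based pruning pipelines usually means removing the 30% of channels with the smallest |γ| under one global threshold; whether this repo computes the threshold exactly this way is an assumption. A sketch of that selection:

```python
import numpy as np

def prune_masks(layer_gammas, percent=0.30):
    """Return per-layer boolean keep-masks: a channel survives if its
    |gamma| exceeds the global `percent` quantile taken over all layers.

    layer_gammas: list of 1-D arrays, one array of BN scales per layer
    """
    all_gammas = np.abs(np.concatenate(layer_gammas))
    threshold = np.quantile(all_gammas, percent)
    return [np.abs(g) > threshold for g in layer_gammas]

# Example: one layer with ten channels; at percent=0.30 the three
# smallest-magnitude channels fall below the global threshold.
masks = prune_masks(
    [np.array([0.01, 0.5, 0.9, 0.02, 0.7, 0.03, 0.8, 0.6, 0.04, 0.05])]
)
```

Note the same 30% cut removes different channels depending on how far sparsity training has pushed the γ distribution toward zero, which is part of why the two experiments prune so differently.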

Looking forward to your reply! Thanks!

uyzhang (Owner) commented Jul 15, 2022

Hi, the mAP@.5 result I reported for experiment 13 is the result after sparsity training only, without pruning. I chose experiment 13 for the follow-up experiments because its result after sparsity training was good, so I did not run a comparison of accuracy before and after pruning.

uyzhang (Owner) commented Jul 15, 2022

Sparsity training is not a case of "the more the better": as it continues, more weights are pushed toward zero and the network naturally degrades, though 100 epochs has probably not reached the overfitting state you describe; you can check the weight-distribution plot to be sure. On the second question, longer finetuning should recover more accuracy, but there is likely still a limit.
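The "distribution plot" check suggested above can also be summarized numerically: count what fraction of BN scale factors already sit near zero after sparsity training. A small sketch (the `eps` cutoff is an arbitrary choice of mine, not from the repo):

```python
import numpy as np

def near_zero_fraction(gammas, eps=1e-3):
    """Fraction of BN scale factors with |gamma| below eps; a high
    value means sparsity training has already suppressed many channels."""
    gammas = np.asarray(gammas, dtype=float)
    return float(np.mean(np.abs(gammas) < eps))

# Example: two of four channels are effectively zeroed out.
frac = near_zero_fraction([5e-4, 0.5, 2e-4, 0.9])
```

Comparing this number after 50 versus 100 sparsity epochs would quantify how much further the 100-epoch run has pushed the weights toward zero.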

Joejwu (Author) commented Jul 15, 2022

Got it, thank you very much! I will run some more comparison experiments and see how they turn out.

uyzhang closed this as completed Sep 29, 2022