about the pretrained models "90k" and "90k_bn" #14

Closed
irfanICMLL opened this issue Nov 29, 2017 · 3 comments

Comments


irfanICMLL commented Nov 29, 2017

I have evaluated the mIoU of these two pretrained models.
The results are as follows:
90k: 78.56%
90k_bn: 70.34%
Where does the pretrained model "90k_bn" come from?
Can I fine-tune the "90k" model?
Thanks a lot for your implementation, and looking forward to your reply~~ :)
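
For reference, here is a minimal sketch of how mIoU can be computed from a per-class confusion matrix (plain NumPy; the `NUM_CLASSES` and `IGNORE_LABEL` values and function names are my own, not taken from this repo's evaluation script):

```python
import numpy as np

NUM_CLASSES = 19    # Cityscapes; adjust for your dataset
IGNORE_LABEL = 255  # void label, excluded from the score

def update_confusion(conf, gt, pred):
    """Accumulate one image's pixels into a (C x C) confusion matrix."""
    mask = gt != IGNORE_LABEL
    idx = gt[mask].astype(np.int64) * NUM_CLASSES + pred[mask].astype(np.int64)
    conf += np.bincount(idx, minlength=NUM_CLASSES ** 2).reshape(NUM_CLASSES, NUM_CLASSES)
    return conf

def mean_iou(conf):
    """mIoU = mean over classes of TP / (TP + FP + FN)."""
    tp = np.diag(conf)
    union = conf.sum(axis=0) + conf.sum(axis=1) - tp
    return np.nanmean(tp / np.maximum(union, 1))
```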

@hellochick (Owner)

Hey @irfanICMLL.
The pretrained model "90k_bn" comes from the source code of ICNet:
https://github.com/hszhao/ICNet/tree/master/evaluation/model
You can fine-tune on it, but I don't think it will get much better.

The reason is that this model has already been pruned by the original author. So if you look at the number of kernels, you will find they are all half of those in PSPNet.

So the best approach is to fine-tune the model from before pruning, and then do the model compression.
Hope my answer helps you. Btw, I keep training on the ADE20k dataset with it, and the results are at the bottom of the README.
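
A quick way to see the pruning, as a minimal sketch (this assumes the weights are stored as TensorFlow checkpoints; the file paths below are placeholders, not the actual filenames in this repo): list the conv kernel shapes of both models and compare the output-channel counts against PSPNet.

```python
import tensorflow as tf

# Placeholder paths; point these at the checkpoint files you actually downloaded.
for name, path in [('90k', 'model/icnet_90k.ckpt'), ('90k_bn', 'model/icnet_90k_bn.ckpt')]:
    reader = tf.train.NewCheckpointReader(path)
    print('=== %s ===' % name)
    for var, shape in sorted(reader.get_variable_to_shape_map().items()):
        if len(shape) == 4:  # conv kernels are [kh, kw, in_channels, out_channels]
            print('%-60s out_channels = %d' % (var, shape[-1]))
```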

@irfanICMLL (Author)

So the decrease in accuracy is caused by the pruning?
And if I want to fine-tune the "90k" model, do I need to write another script? What is the loss function of ICNet? I found that ICNet only has one output branch, while ICNet_BN has three and the loss is a combined form. The loss function described in the article is also a combined form. I feel really confused...

@hellochick (Owner)

@irfanICMLL, I don't think the decrease in accuracy is caused by the pruning; I still can't find the reason for the strange accuracy. Maybe the two models are just different from each other.
You can directly modify train.py; I have written the loss formula in it. You can take a look at it.
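
For anyone else looking: the combined loss is a weighted sum of per-branch cross-entropies (the cascade label guidance from the paper). A minimal sketch, assuming three branch logits and labels already resized to each branch's resolution; the weights 0.16 / 0.4 / 1.0 are illustrative, so check train.py for the exact tensor names and values.

```python
import tensorflow as tf

def branch_loss(logits, labels, num_classes, ignore_label=255):
    """Sparse softmax cross-entropy on one branch, skipping void pixels."""
    labels = tf.reshape(tf.cast(labels, tf.int32), [-1])
    logits = tf.reshape(logits, [-1, num_classes])
    keep = tf.not_equal(labels, ignore_label)
    labels = tf.boolean_mask(labels, keep)
    logits = tf.boolean_mask(logits, keep)
    return tf.reduce_mean(
        tf.nn.sparse_softmax_cross_entropy_with_logits(labels=labels, logits=logits))

def cascade_loss(branch_logits, branch_labels, num_classes, weights=(0.16, 0.4, 1.0)):
    """Weighted sum over the low-, mid- and high-resolution branches."""
    losses = [w * branch_loss(l, y, num_classes)
              for w, (l, y) in zip(weights, zip(branch_logits, branch_labels))]
    return tf.add_n(losses)
```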
