
BiSeNet mean IoU for R18 #12

Open
ms-krajesh opened this issue Feb 18, 2019 · 20 comments

@ms-krajesh

Hi! I am only able to get a mean IoU of 70.446% for BiSeNet when R18 is used as the backbone. I trained BiSeNet on the Cityscapes leftImg8bit folder (using the gtFine folder for the ground truth) with a training input size of 1024x1024.
The achieved result is still a little below the one you mention on the GitHub repository page (mean IoU: 74.6). Did you use network parameters other than those in the config file uploaded to the repository? Thanks for your time.

@daodaofr

I get a similar mean IoU of 69.688% with R18. I trained the model from scratch because the pretrained R18 model is not released. Can you share the pretrained model?

@lxtGH

lxtGH commented Feb 19, 2019

Personally, I think you should use the OHEM loss for training, which the paper doesn't mention.
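
For reference, an OHEM (online hard example mining) cross-entropy for per-pixel segmentation can be sketched roughly as below. The `thresh` and `min_kept` values are illustrative assumptions, not values from the paper or this repository:

```python
import torch
import torch.nn.functional as F

def ohem_cross_entropy(logits, target, thresh=0.7, min_kept=100000,
                       ignore_index=255):
    # Per-pixel loss with no reduction, so pixels can be ranked by difficulty.
    pixel_loss = F.cross_entropy(logits, target,
                                 ignore_index=ignore_index,
                                 reduction='none').flatten()
    # Convert the probability threshold into a loss threshold:
    # a pixel with predicted prob < thresh has loss > -log(thresh).
    loss_thresh = -torch.log(torch.tensor(thresh))
    hard = pixel_loss[pixel_loss > loss_thresh]
    if hard.numel() < min_kept:
        # Fall back to the min_kept hardest pixels overall.
        hard, _ = pixel_loss.topk(min(min_kept, pixel_loss.numel()))
    return hard.mean()
```

Averaging only over hard pixels keeps easy background pixels from dominating the gradient, which is why OHEM tends to help on class-imbalanced scenes like Cityscapes.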

@daodaofr

> Personally, I think you should use the OHEM loss for training, which the paper doesn't mention.

Thank you, but I've already utilized OHEM.

@lxtGH

lxtGH commented Feb 19, 2019

Maybe you can try a larger crop size.

@yu-changqian
Owner

@ms-krajesh Which experiment did you use, cityscapes.bisenet.R18.speed or cityscapes.bisenet.R18? I will re-run it to check the performance.

@daodaofr

@ycszen I ran cityscapes.bisenet.R18 with one GPU and got a mean IoU of 69.688%.

@yu-changqian
Owner

@daodaofr I have re-run the cityscapes.bisenet.R18 experiment and the performance is normal. I ran it on 4 GPUs. Besides, I think training from scratch may be what causes your performance drop. You can load the official R18 model from PyTorch until I release the pre-trained model.
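
A minimal sketch of that swap-in is below. `Backbone` is a hypothetical stand-in for the R18 context path in this repo, and the source weights are toy tensors; in practice the ImageNet weights would come from `torchvision.models.resnet18(pretrained=True).state_dict()`. The point is loading only the tensors whose names and shapes match, so the ImageNet classifier (`fc.*`) and any segmentation-specific heads are skipped:

```python
import torch
import torch.nn as nn

class Backbone(nn.Module):
    """Hypothetical stand-in for the R18 context path."""
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(3, 64, 7, stride=2, padding=3, bias=False)
        self.head = nn.Conv2d(64, 19, 1)  # segmentation head, not in ResNet-18

def load_pretrained(model, pretrained_state):
    # Keep only tensors whose name and shape match the target model.
    own = model.state_dict()
    filtered = {k: v for k, v in pretrained_state.items()
                if k in own and v.shape == own[k].shape}
    model.load_state_dict(filtered, strict=False)
    return sorted(filtered)  # names actually loaded, for inspection

source = Backbone()  # pretend this holds ImageNet-pretrained weights
target = Backbone()
loaded = load_pretrained(target, {'conv1.weight': source.conv1.weight.data,
                                  'fc.weight': torch.zeros(1000, 512)})
```

Using `strict=False` plus the shape filter means the newly initialised segmentation layers stay untouched while the shared convolutional stem picks up the pretrained values.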

@ms-krajesh
Author

@daodaofr: I ran the cityscapes.bisenet.R18 experiment on 4 NVIDIA GeForce GTX 1080 Ti GPUs.
I trained BiSeNet from scratch on the Cityscapes leftImg8bit folder (using "labelTrainIds" instead of "labelIds" from the gtFine folder for the ground truth) with a training input size of 1024x1024.
I got a mean IoU of 70.446% on the validation split of the Cityscapes dataset.

@daodaofr

@ycszen @ms-krajesh
I trained the model from the official R18 model in PyTorch and got 72.753% mean IoU.
I think the gap comes from the pretrained model.

@ms-krajesh
Author

@daodaofr: Did you use "labelTrainIds" or "labelIds" for the ground truth?

@daodaofr

daodaofr commented Feb 27, 2019

@ms-krajesh I used labelIds.

@chenxiaoyu523

@daodaofr But labelIds contains 33 classes, which does not correspond to the code. Did you apply any other processing?

@daodaofr

daodaofr commented Mar 4, 2019

@chenxiaoyu523 Yes, I set the label of invalid classes to ignore_index.
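
That remapping can be sketched as follows, using the 19 standard Cityscapes training ids from the official cityscapesScripts labels table and sending every other id to `ignore_index=255` so the loss skips those pixels:

```python
import numpy as np

# The 19 valid Cityscapes labelIds, in trainId order (road, sidewalk,
# building, ..., bicycle), per the cityscapesScripts labels table.
VALID_IDS = [7, 8, 11, 12, 13, 17, 19, 20, 21, 22, 23,
             24, 25, 26, 27, 28, 31, 32, 33]
IGNORE_INDEX = 255

def labelid_to_trainid(label):
    # Lookup table: every id maps to IGNORE_INDEX unless it is valid.
    lut = np.full(256, IGNORE_INDEX, dtype=np.uint8)
    for train_id, label_id in enumerate(VALID_IDS):
        lut[label_id] = train_id
    return lut[label]
```

A lookup table keeps the conversion a single vectorised indexing operation per ground-truth image, which matters when remapping the whole dataset on the fly in a data loader.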

@jiaxue-ai

@ycszen I ran your R18.speed and R18 configs with 4 1080 Ti GPUs. The accuracies for the last-epoch models are 74.2 and 75.2, which is a little below your reported accuracies (74.6 and 76.3). Where could the problem be? Shall I check the accuracy for each epoch?

@alexanderfrey

Can someone upload a pretrained model, please?

@haitaobiyao

I can't get the right result, and the loss does not converge with BiSeNet. I don't know why. Can you give me some suggestions?

@yu-changqian
Owner

@jiaxue1993 Maybe you can evaluate the models of the last ten epochs.

@yu-changqian
Owner

@alexanderfrey The pre-trained models have been released, except for Xception39.

@yu-changqian
Owner

@haitaobiyao Could you give more details?

@zhenyouwei

Hi! I downloaded the pretrained model (R18) from the link you provided and trained the model, but I am only able to get a mean IoU of 65.2%. Is there any problem with the parameter settings?
