
vgg16-397923af.pth download #63

Closed
shangjianan2 opened this issue Aug 28, 2018 · 11 comments

Comments

@shangjianan2

Link: https://pan.baidu.com/s/1DFxRkVBRjbDBDKKXdND47A Password: u0xf

@shangjianan2
Author

Sometimes we cannot download vgg16-397923af.pth. Here is the vgg16-397923af.pth file I have already downloaded. I hope it will be useful to you.

@chenyuntc
Owner

Thank you! I'll add it to the README and close the issue here.

@Stephenfang51

Dear Sir,

Thanks for the Baidu Pan link.

However, convert_caffe_pretrain.py can't convert vgg16-397923af.pth into the torch version. I tried and it failed.

I guess the networks behind vgg16-397923af and vgg16-00b39a1b have different structures.

I suggest people download vgg16-00b39a1b.pth on Google Colab and convert it to the torch version using convert_caffe_pretrain.py, as Chenyun said in the README.

Thanks!
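
A quick way to check the structural-difference guess above (a minimal sketch, assuming both checkpoint files have already been downloaded into the current directory under these exact names):

```python
# Hedged sketch: compare the two checkpoints' state-dict keys to see whether
# the layer structure differs. Assumes vgg16-397923af.pth (torchvision) and
# vgg16-00b39a1b.pth (Caffe-converted) are both present locally.
import torch

torch_sd = torch.load('vgg16-397923af.pth', map_location='cpu')
caffe_sd = torch.load('vgg16-00b39a1b.pth', map_location='cpu')

print(sorted(torch_sd.keys())[:5])  # torchvision-style keys, e.g. 'classifier.0.bias'
print(sorted(caffe_sd.keys())[:5])
# Keys present in only one of the two checkpoints:
print(sorted(set(torch_sd.keys()) ^ set(caffe_sd.keys())))
```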

@greenyin

greenyin commented Apr 2, 2020

(Quoting @Stephenfang51's comment above.)

I also encountered the same problem. Did you solve it?

@shida666

shida666 commented Apr 4, 2020

Hey, everyone! The vgg16-397923af.pth file we downloaded is a model pretrained in the PyTorch environment, not in Caffe! So after downloading vgg16-397923af.pth into the "checkpoints" folder, we don't need to run python misc/convert_caffe_pretrain.py in the shell; we can go directly to python train.py train --env='fasterrcnn-caffe' --plot-every=100 --caffe-pretrain. Don't forget to rename the file first: mv vgg16-397923af.pth vgg16_caffe.pth
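
In code form, the steps described above look roughly like this (a hedged sketch, not a definitive script: it assumes you run from the repo root and that torchvision's standard release URL still serves this checkpoint):

```python
# Sketch of the workflow above. Assumes the repo's checkpoints/ layout and
# the standard torchvision download URL for vgg16-397923af.pth.
import os
import torch
from torch.utils import model_zoo

os.makedirs('checkpoints', exist_ok=True)
# vgg16-397923af.pth is torchvision's PyTorch-pretrained VGG16, so no Caffe
# conversion step (misc/convert_caffe_pretrain.py) is needed.
state_dict = model_zoo.load_url(
    'https://download.pytorch.org/models/vgg16-397923af.pth')
# Save under the file name --caffe-pretrain expects (this replaces the "mv" step).
torch.save(state_dict, 'checkpoints/vgg16_caffe.pth')
# Then train directly from the shell:
#   python train.py train --env='fasterrcnn-caffe' --plot-every=100 --caffe-pretrain
```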

@shida666

shida666 commented Apr 4, 2020

(Quoting @greenyin's comment above.)

Run python train.py train --env='fasterrcnn-caffe' --plot-every=100 --caffe-pretrain directly; don't run python misc/convert_caffe_pretrain.py.

@chenaifang

I want to use vgg16_bn, but when I directly use vgg16_bn-6c64b313.pth from PyTorch I can only get 58 mAP. I would like to ask whether there are other pretrained models for vgg16_bn? @shangjianan2 @chenyuntc @greenyin @ZichengDuan @Stephenfang51

@ZichengDuan

@chenaifang Hi, Aifang. Unfortunately, I didn't explore other backbone networks for this project.

@chenaifang

Excuse me, do you know how to get a pretrained vgg16_bn model in general? @ZichengDuan
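
For what it's worth, the usual way to get a pretrained vgg16_bn is through torchvision itself (a minimal sketch using the torchvision API of that era; the file it downloads is the vgg16_bn-6c64b313.pth mentioned above):

```python
# Minimal sketch: torchvision downloads and caches vgg16_bn-6c64b313.pth.
import torchvision

model = torchvision.models.vgg16_bn(pretrained=True)
model.eval()
# The state dict can then be saved or adapted for this repo's trainer:
state_dict = model.state_dict()
```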

@yuanzhenhuan

Thank you, yes, that's right!

@DanZhang123

(Quoting @chenaifang's comment above.)

Hello, have you solved it?
