Issues about size of model #258
Comments
My dataset contains photos at 1024×1024 pixels.
It's okay. For example, 756×512 with 10 global blocks is about 700 MB, and 512×512 with 13 global blocks is about 1 GB. If you want to decrease the size for some reason (it will affect quality), you can decrease `--n_blocks_global`. The other parameters are not so heavily weighted.
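The per-block cost is easy to estimate. A minimal sketch, assuming each global block is a pix2pixHD-style ResNet block with two 3×3 convolutions at the bottleneck width (1024 channels, which is what the default `ngf=64` with 4 downsamplings would give; these defaults are assumptions):

```python
def conv_params(c_in, c_out, k):
    """Parameter count of a k x k convolution with bias."""
    return c_in * c_out * k * k + c_out

def resnet_block_params(channels):
    """Two 3x3 convolutions at the same width, as in a ResNet block."""
    return 2 * conv_params(channels, channels, 3)

per_block = resnet_block_params(1024)
per_block_mb = per_block * 4 / 1024**2  # fp32 = 4 bytes per parameter
print(per_block, round(per_block_mb, 1))  # ~18.9M params, ~72 MB per block
```

So each extra global block costs roughly 72 MB on disk in fp32, which is why `--n_blocks_global` dominates the checkpoint size while most other options barely move it.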
Got it. Thanks very much!
Hi @1999kevin and @kex243, I didn't find the `--n_blocks_global` option in the train or base options. How can I use it, and what does it do? I'd appreciate any input!
@alelordelo Hi, you can find this option at line 48 in base_options.py.
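For reference, it is a plain argparse flag. A minimal sketch of how such an option is typically declared in base_options.py (the default of 9 blocks and the help text are assumptions; check your copy of the file):

```python
import argparse

parser = argparse.ArgumentParser()
# Number of residual blocks in the global generator; fewer blocks -> smaller model.
parser.add_argument('--n_blocks_global', type=int, default=9,
                    help='number of residual blocks in the global generator network')

opt = parser.parse_args(['--n_blocks_global', '6'])
print(opt.n_blocks_global)  # 6
```

So passing e.g. `--n_blocks_global 6` on the train.py command line shrinks the global generator at some cost in quality.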
Thanks a lot for helping! : )
Do you know what this does in terms of quality loss?
Not sure about it.
I use the following command to train my model: `python train.py --name myname --dataroot ./myroot/ --netG local --label_nc 0 --input_nc 5`. It works, and I get a good model. However, the model seems very large: 2.9 GB. Is this normal, or have I done something wrong?
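A back-of-envelope parameter count makes sizes in this range plausible. The sketch below assumes the default global-generator layout (a 7×7 stem, 4 downsampling and 4 upsampling 3×3 convolutions, 9 ResNet blocks at 1024 channels, `ngf=64`); all of these defaults are assumptions, and `--netG local` adds enhancer layers on top, while saved discriminators and optimizer state inflate checkpoints further:

```python
def conv_params(c_in, c_out, k):
    """Parameter count of a k x k convolution with bias."""
    return c_in * c_out * k * k + c_out

ngf, n_down, n_blocks = 64, 4, 9
widths = [ngf * 2**i for i in range(n_down + 1)]  # [64, 128, 256, 512, 1024]

total = conv_params(3, ngf, 7)                                          # input stem
total += sum(conv_params(a, b, 3) for a, b in zip(widths, widths[1:]))  # downsampling
total += n_blocks * 2 * conv_params(widths[-1], widths[-1], 3)          # ResNet bottleneck
total += sum(conv_params(b, a, 3) for a, b in zip(widths, widths[1:]))  # upsampling
total += conv_params(ngf, 3, 7)                                         # output layer
print(total, round(total * 4 / 1024**3, 2))  # ~182M params, ~0.7 GB in fp32
```

That is roughly 0.7 GB for the global generator alone, so a 2.9 GB checkpoint that also stores the local enhancer and training state is not obviously wrong.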