
About the crop size #9

Closed
yinhanxi opened this issue Nov 10, 2021 · 3 comments

Comments

@yinhanxi

When training, I set the crop size to 512×512 by default, but I have a problem with the pooling layer:

_lib/backbone/PSMNet/submodule.py_
_output_branch1 = self.branch1(output_skip)_
_RuntimeError: Given input size: (128x32x32). Calculated output size: (128x0x0). Output size is too small_

Did you set a larger size when training, or is something else wrong?
Thank you

@fabiotosi92
Owner

Hi, the crop size should be set considering the aspect_ratio flag. In fact, crops are computed directly on the high-resolution images. Thus, if you want 512x512 images as input to the network during training, you should set (for example) crop_width=2048, crop_height=2048, and aspect_ratio=0.25. You can find some training scripts in the script folder.
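To make the relationship above concrete, here is a minimal sketch (not the repository's actual code) of how a high-resolution crop size and the aspect_ratio rescale factor determine the effective network input size; the function name is hypothetical:

```python
# Hypothetical helper: crops are taken on the full-resolution image,
# then rescaled by aspect_ratio before being fed to the network.
def network_input_size(crop_width, crop_height, aspect_ratio):
    return int(crop_width * aspect_ratio), int(crop_height * aspect_ratio)

# The settings suggested above yield the desired 512x512 input:
print(network_input_size(2048, 2048, 0.25))  # -> (512, 512)
```

With crop_width=512 and aspect_ratio=0.25 instead, the input would shrink to 128×128, which is too small for PSMNet's spatial pyramid pooling branches and triggers the "Output size is too small" error reported above.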

@yinhanxi
Author

yinhanxi commented Dec 2, 2021

Thank you for sharing such nice work ^_^

@yinhanxi
Author

yinhanxi commented Dec 3, 2021

Another question: why a Laplace distribution rather than a Gaussian?
Thank you
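For context on the question, here is a hedged sketch (not from this repository) of the two negative log-likelihoods being compared when a regression model predicts a distribution over depth; the Laplace NLL contains an L1 residual term, while the Gaussian NLL contains an L2 term, which is one common reason the Laplace choice is considered more robust to outliers:

```python
import math

# Hypothetical illustration: per-pixel NLL losses for a prediction with
# an associated uncertainty parameter (scale b for Laplace, sigma for Gaussian).
def laplace_nll(pred, target, b):
    # L1 residual scaled by b, plus the log normalizer of the Laplace density
    return abs(pred - target) / b + math.log(2.0 * b)

def gaussian_nll(pred, target, sigma):
    # L2 residual scaled by sigma^2, plus the Gaussian log normalizer
    return (pred - target) ** 2 / (2.0 * sigma ** 2) \
        + math.log(sigma * math.sqrt(2.0 * math.pi))
```

Whether this matches the authors' actual motivation is for them to confirm; the sketch only shows what the two options look like as losses.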
