Swapped width and height #3
Comments
Hi, just wanted to let you know that I've seen your message. Having a quick look, it does seem like I've accidentally swapped the two dimensions.
Hi, I tried out your implementation with w and h restored to the natural order back when I opened this issue, and it has worked perfectly ever since. I suspect you were working with square-shaped inputs when you wrote this, so the accidental swap never posed a problem for you. Anyway, your implementation saves me a lot of time and makes my code run quite a bit faster!
You were right about the issue. Thank you very much for pointing that out. Swapping H and W indeed did not matter for square inputs, but it crashed when the input was not square. It's fixed now! Happy Holidays 😄
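The fix makes sense given PyTorch's NCHW layout, where `size(2)` is the height and `size(3)` is the width. A minimal sketch of why the swap is invisible for square inputs but breaks otherwise (the tensor shape here is illustrative, not taken from the repository):

```python
import torch

# NCHW layout: dim 2 is height, dim 3 is width.
padded = torch.zeros(4, 3, 32, 48)  # batch of 4, H=32, W=48 (illustrative)

# Natural order of assignment:
h, w = padded.size(2), padded.size(3)
assert (h, w) == (32, 48)

# The swapped assignment binds the names the wrong way round:
w_bad, h_bad = padded.size(2), padded.size(3)
# w_bad is really the height (32) and h_bad is really the width (48).
# For a square input (H == W) the mix-up is invisible; here, any crop
# that asks for h_bad == 48 rows from a 32-row image indexes out of range.
```

For square inputs both assignments yield identical values, which is why the bug survived the original tests.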
Hi Gabriele, huge respect for what you managed to achieve here. Your knowledge of and expertise in tensor indexing truly impress me; previously I couldn't imagine how random crop operations could be done in batches.
And after studying your code for a while I'm gradually coming to understand how such complex indexing works.
But one small thing here on line 123 of batchnorms.py really baffles me:
w, h = padded.size(2), padded.size(3)
I can't understand why the height and width variables are swapped. If this is indeed your intention, could you kindly shed some light on why it should be this way?
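For readers puzzling over the same indexing, the general idea behind cropping a whole batch at once can be sketched with advanced indexing. This is a hypothetical illustration of the technique, not the repository's actual code; the function name and signature are invented:

```python
import torch

def batched_random_crop(x: torch.Tensor, crop_h: int, crop_w: int) -> torch.Tensor:
    """Crop every image in an NCHW batch at an independent random offset,
    using broadcast advanced indexing instead of a Python loop.
    Hypothetical sketch, not the repository's implementation."""
    n, c, h, w = x.shape
    # One random top-left corner per image.
    top = torch.randint(0, h - crop_h + 1, (n,))
    left = torch.randint(0, w - crop_w + 1, (n,))
    # Per-image row and column indices: shapes (N, crop_h) and (N, crop_w).
    rows = top[:, None] + torch.arange(crop_h)
    cols = left[:, None] + torch.arange(crop_w)
    # Broadcast four index tensors to (N, C, crop_h, crop_w) and gather.
    batch = torch.arange(n)[:, None, None, None]
    chan = torch.arange(c)[None, :, None, None]
    return x[batch, chan, rows[:, None, :, None], cols[:, None, None, :]]
```

The four index tensors broadcast against each other, so a single fancy-indexing expression extracts a differently positioned window from each image without any loop.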