Simple Example does not work #1
Hi Markus, Thank you very much! :) What is the size of the input signal?
Hey, thanks for the fast response. See above: `inn = torch.randn((1, 3, 28, 28))`, but I also tested it with other dimensions.
Oh sorry, I missed that. I found the error; it was on our side. Sorry about that. The problem was that the bias in the Conv2d was being overwritten during the uniform initialization of the biases. I had checked this only for the 1D case, and it leads to errors in 2D. If you pull the repo again, it should work now. Regarding your code, please change the horizon here: That being said, I would not expect the results with the MLP kernels to be very good. Normal MLPs actually perform pretty poorly; this is why we need implicit neural representations (e.g., MAGNets). I would try those if the results with the MLP are not good. Please let me know if this solves the problem for now :) Cheers,
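The bias-overwrite bug described here is easy to reproduce in spirit. A minimal sketch (names are illustrative, not the repo's actual ones) of how a blanket uniform initialization, intended for the kernel-generator network, can clobber a Conv2d bias that was set deliberately:

```python
import torch
import torch.nn as nn

# Hypothetical reproduction of the described bug, not the repo's code:
# a Conv2d bias is set on purpose, then a blanket init pass overwrites it.
conv = nn.Conv2d(3, 16, kernel_size=3, padding=1)
nn.init.zeros_(conv.bias)  # suppose this bias was set deliberately

def init_biases_uniform(module: nn.Module) -> None:
    # Meant only for the kernel-generator MLP, but applied to every submodule:
    if getattr(module, "bias", None) is not None:
        nn.init.uniform_(module.bias, -1.0, 1.0)

conv.apply(init_biases_uniform)  # the deliberately zeroed bias is overwritten
print(bool((conv.bias == 0).all()))  # almost surely False after uniform init
```

The fix is to restrict the initialization pass to the kernel-generator layers (e.g., by checking the module type) rather than applying it to every submodule with a `bias` attribute.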
Perfect, thanks! And yes, I will try MAGNets; that was just a first small test :)
Hey there!
Thanks for the great work and open source code.
I tried a very simple example but couldn't get it to work:
(you can ignore everything after the first conv, borrowed from pytorch examples)
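Since the original snippet is not shown here, a plain-PyTorch shape check of the kind described, with the input `(1, 3, 28, 28)` and a first conv borrowed from the PyTorch examples (the layer sizes are assumptions):

```python
import torch
import torch.nn as nn

# Assumed setup: input of shape (1, 3, 28, 28) as reported in the thread,
# first layer taken from the standard PyTorch tutorial (Conv2d(3, 6, 5)).
inn = torch.randn((1, 3, 28, 28))
conv = nn.Conv2d(in_channels=3, out_channels=6, kernel_size=5)
out = conv(inn)
print(out.shape)  # torch.Size([1, 6, 24, 24]): 28 - 5 + 1 = 24 per spatial dim
```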
I tried different configurations (the above is only one example).
Thanks for any help :)