Wip issue74 poisson encoder #357
Conversation
Nice work @theLamentingGirl !
I would appreciate hearing your opinion on two suggestions below.
The first is that the current `torch.manual_seed` call sets the global seed. That may be fine, but in some cases it may not be what the user wants. What if they, for instance, just want to set the seed in that function, but not globally? One way around that is to use a `Generator`, which acts as the seed for the random sampling. This is local, so we won't clobber whatever the user did outside this code. How about something like this?
```python
def poisson_encode(..., generator: Optional[torch.Generator] = None) -> torch.Tensor:
    ...
    return (torch.rand(..., generator=generator) ...)
```
This way, the user can either 1) set the global seed outside the function and not pass a generator (in which case `torch.rand` gets a `generator=None` argument, which is fine), or 2) pass a generator to the function, which is local and device-specific (you can do `torch.Generator(device="cuda")`, for example). Let me know your thoughts.
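To make the generator suggestion concrete, here is a minimal, self-contained sketch of how it might look. The Bernoulli-style rate computation and the output shape are illustrative assumptions for this thread, not Norse's actual implementation:

```python
from typing import Optional

import torch


def poisson_encode(
    input_values: torch.Tensor,
    seq_length: int,
    f_max: float = 100,
    dt: float = 0.001,
    generator: Optional[torch.Generator] = None,
) -> torch.Tensor:
    # Illustrative Bernoulli approximation of a Poisson process:
    # each timestep spikes with probability proportional to the input.
    probabilities = (input_values * f_max * dt).clamp(0, 1)
    # The generator is passed straight through to torch.rand; when it is
    # None, torch.rand falls back to the global RNG state.
    random_values = torch.rand(
        seq_length, *input_values.shape, generator=generator
    )
    return (random_values < probabilities).float()


# Two generators seeded identically produce identical spike trains,
# without touching the global RNG state.
g1 = torch.Generator().manual_seed(0)
g2 = torch.Generator().manual_seed(0)
x = torch.tensor([0.5, 1.0])
spikes_a = poisson_encode(x, seq_length=10, generator=g1)
spikes_b = poisson_encode(x, seq_length=10, generator=g2)
```

The key point is that reproducibility becomes opt-in and local: callers who don't care simply omit the argument and the global seed (if any) still applies.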
Regarding the second change, I would appreciate it if you could remove the `rough.py` file so it doesn't get pushed to the repo, either by adding it to `.gitignore` or by removing it entirely.
Finally, if you feel like nitpicking, it's good practice to write `seed is not None` instead of `seed != None`, because `!=` uses equality, which can be overridden. In our case that's bad, because we don't want the user to accidentally shoot themselves in the foot by using an overridden (hacked) `torch.Tensor` class. This is so hypothetical it likely won't happen in our lifetime, though :P
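For the curious, here is a tiny (deliberately pathological and hypothetical) example of why identity checks are safer than equality checks against `None`:

```python
class Hacked:
    """A contrived class whose equality always succeeds."""

    def __eq__(self, other):
        # Pathological __eq__: claims equality with everything,
        # including None.
        return True


h = Hacked()

# `!=` falls back to the (inverted) overridden __eq__, so this is False:
equality_check = h != None  # noqa: E711 -- shown for illustration

# `is not` compares object identity, which cannot be overridden:
identity_check = h is not None
```

Here `equality_check` is `False` even though `h` clearly exists, while `identity_check` is `True`. That's the foot-gun `is not None` avoids.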
Nicely done. I'm pretty happy with this, with the one exception that I think we should call the parameter `generator` instead of `seed_generator`, to follow the PyTorch convention.
Otherwise, I think we should merge! 👍
norse/torch/functional/encode.py
Outdated
```diff
@@ -134,6 +134,8 @@ def poisson_encode(
     seq_length: int,
     f_max: float = 100,
     dt: float = 0.001,
+    seed_generator: torch.Generator = None,
```
Could we call this parameter `generator`, to follow the PyTorch conventions?
Nice work! I'll leave the checks and merge it later. Thanks a lot @theLamentingGirl 🙏
Thanks a lot @theLamentingGirl 🥳
`rough.py` is a dummy Python file for my reference that can be ignored.