
Wip issue74 poisson encoder #357

Merged: 7 commits merged on Jan 6, 2023

Conversation

harini-sudha (Collaborator)

rough.py is a dummy Python file for my own reference; it can be ignored.

@Jegp (Member) left a comment

Nice work @theLamentingGirl!

I would appreciate hearing your opinion on two suggestions below.

The first is that the current torch.manual_seed call sets the global seed. That may be fine, but in some cases it may not be what the user wants. What if they, for instance, want to set the seed only within this function, not globally? One way around that is to use a Generator, which acts as the source of randomness for the sampling. This is local, so we won't clobber whatever the user did outside this code. How about something like this?

def poisson_encode(..., generator: Optional[torch.Generator] = None) -> torch.Tensor:
    ...
    return (torch.rand(..., generator=generator) ...)

This way, the user can either (1) set the global seed outside the function and not pass a generator (in which case it would default to None, and torch.rand would receive generator=None, which is fine), or (2) pass a generator explicitly to the function, which is local and device-specific (you can do torch.Generator(device="cuda"), for example). Let me know your thoughts.
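To make the suggestion concrete, here is a minimal sketch of a Poisson encoder with an optional local generator. The parameter names `input_values` and the Bernoulli-style sampling are assumptions for illustration; only `seq_length`, `f_max`, and `dt` appear in the actual diff, and this is not Norse's real implementation.

```python
from typing import Optional

import torch


def poisson_encode(
    input_values: torch.Tensor,
    seq_length: int,
    f_max: float = 100,
    dt: float = 0.001,
    generator: Optional[torch.Generator] = None,
) -> torch.Tensor:
    # Bernoulli approximation of a Poisson process: at each of the
    # seq_length timesteps, spike with probability f_max * dt * input.
    probs = (input_values * f_max * dt).clamp(0, 1)
    # generator=None falls back to the global RNG state; a passed-in
    # Generator keeps the sampling local and reproducible.
    rand = torch.rand(seq_length, *input_values.shape, generator=generator)
    return (rand < probs).to(input_values.dtype)
```

With two generators seeded identically, the function returns identical spike trains without touching the global seed.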

Regarding the second change, I would appreciate it if you could remove the rough.py file so it doesn't get pushed to the repo, either by adding it to .gitignore or by deleting it entirely.

Finally, if you feel like nitpicking, it's good practice to write seed is not None instead of seed != None, because != uses the equality operator, which can be overridden. In our case that's bad, because we don't want the user to accidentally shoot themselves in the foot via an overridden (hacked) class. This is so hypothetical it likely won't happen in our lifetime, though :P
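A toy illustration of the hypothetical failure mode (the class here is invented purely for this example):

```python
# A class that (pathologically) overrides equality, claiming it is
# "equal" to everything -- including None.
class Weird:
    def __eq__(self, other):
        return True

    def __ne__(self, other):
        return False


w = Weird()
print(w != None)      # False: the != check wrongly concludes "no seed given"
print(w is not None)  # True: identity comparison cannot be overridden
```

The `is not None` form always checks object identity, so it is immune to any `__eq__`/`__ne__` overrides.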

@Jegp (Member) left a comment

Nicely done. I'm pretty happy with this, with the one exception that I think we should call the parameter generator instead of seed_generator to follow the PyTorch convention.

Otherwise, I think we should merge! 👍

@@ -134,6 +134,8 @@ def poisson_encode(
seq_length: int,
f_max: float = 100,
dt: float = 0.001,
seed_generator: torch.Generator = None,
@Jegp (Member):

Could we call this parameter generator to follow the PyTorch conventions?

@Jegp (Member) commented Jan 5, 2023

Nice work! I'll leave the checks and merge it later. Thanks a lot @theLamentingGirl 🙏

@Jegp Jegp merged commit e147394 into norse:main Jan 6, 2023
@Jegp (Member) commented Jan 6, 2023

Thanks a lot @theLamentingGirl 🥳
