
Generator objects should not always use the same seed #25071

Open
ssnl opened this issue Aug 23, 2019 · 1 comment
Labels
module: random Related to random number generation in PyTorch (rng generator) · triaged This issue has been looked at by a team member, and triaged and prioritized into an appropriate module

Comments

ssnl (Collaborator) commented Aug 23, 2019

I don't think this is the intended behavior. Sharing a fixed seed is usually not what you want when you have multiple generators.

The initial seed doesn't seem to come from the current RNG either, which breaks reproducibility. It would make sense to grab a seed from std::random_device, urandom, the current time, or something similar. But currently it is just a fixed constant.

In [39]: torch.Generator().initial_seed()
Out[39]: 67280421310721

In [40]: torch.Generator().initial_seed()
Out[40]: 67280421310721

In [41]: torch.Generator().initial_seed()
Out[41]: 67280421310721

In [42]: torch.rand(2, 3, 4, generator=torch.Generator())
Out[42]:
tensor([[[0.2673, 0.8725, 0.3353, 0.4030],
         [0.7871, 0.4576, 0.0719, 0.9715],
         [0.7147, 0.4275, 0.0846, 0.4904]],

        [[0.1662, 0.1538, 0.3638, 0.5145],
         [0.7122, 0.8787, 0.8431, 0.9879],
         [0.5595, 0.4359, 0.1276, 0.2768]]])

In [43]: torch.rand(2, 3, 4, generator=torch.Generator())
Out[43]:
tensor([[[0.2673, 0.8725, 0.3353, 0.4030],
         [0.7871, 0.4576, 0.0719, 0.9715],
         [0.7147, 0.4275, 0.0846, 0.4904]],

        [[0.1662, 0.1538, 0.3638, 0.5145],
         [0.7122, 0.8787, 0.8431, 0.9879],
         [0.5595, 0.4359, 0.1276, 0.2768]]])

In [44]: torch.rand(2, 3, 4, generator=torch.Generator())
Out[44]:
tensor([[[0.2673, 0.8725, 0.3353, 0.4030],
         [0.7871, 0.4576, 0.0719, 0.9715],
         [0.7147, 0.4275, 0.0846, 0.4904]],

        [[0.1662, 0.1538, 0.3638, 0.5145],
         [0.7122, 0.8787, 0.8431, 0.9879],
         [0.5595, 0.4359, 0.1276, 0.2768]]])
In [53]: torch.Generator().initial_seed()
Out[53]: 67280421310721

In [54]: torch.rand(2, 3, 4)
Out[54]:
tensor([[[0.2384, 0.0965, 0.9460, 0.3316],
         [0.5213, 0.7518, 0.1692, 0.3696],
         [0.6316, 0.8950, 0.2026, 0.6782]],

        [[0.9342, 0.9530, 0.8290, 0.5884],
         [0.4184, 0.0874, 0.5385, 0.2840],
         [0.1021, 0.2164, 0.3872, 0.6371]]])

In [55]: torch.Generator().initial_seed()
Out[55]: 67280421310721
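Since `torch.Generator()` always starts from the same default seed, one possible workaround is to seed each new generator explicitly from OS entropy. This is only a sketch of that idea, not a fix proposed in this issue; the helper name `fresh_generator` is made up for illustration:

```python
import os

import torch


def fresh_generator() -> torch.Generator:
    # Draw 8 bytes of OS entropy and mask to 63 bits so the seed fits
    # comfortably in torch's 64-bit seed range.
    seed = int.from_bytes(os.urandom(8), "little") & ((1 << 63) - 1)
    g = torch.Generator()
    g.manual_seed(seed)
    return g


g1, g2 = fresh_generator(), fresh_generator()
# With overwhelming probability the two generators now produce
# different streams, unlike two default-constructed Generators.
a = torch.rand(4, generator=g1)
b = torch.rand(4, generator=g2)
```

This mirrors what NumPy's `RandomState()` does when constructed without a seed: it pulls fresh entropy from the OS rather than reusing a fixed constant.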

Compare to NumPy, where each new RandomState instance gets a different seed:

In [48]: numpy.random.RandomState().rand()
Out[48]: 0.897157178998516

In [49]: numpy.random.RandomState().rand()
Out[49]: 0.9620486919176156

In [50]: numpy.random.RandomState().rand()
Out[50]: 0.9457612837975016

In [51]: numpy.random.RandomState().rand()
Out[51]: 0.3025858768725779

cc @syed-ahmed who probably has comments :)

@ssnl ssnl changed the title Generator objects always use the same seed Generator objects should not always use the same seed Aug 23, 2019
vishwakftw (Contributor) commented:

A related use-case that had to be fixed by seeding the Generator instance: #24881.

@vishwakftw vishwakftw added the module: random Related to random number generation in PyTorch (rng generator) label Aug 23, 2019
@izdeby izdeby added the triaged This issue has been looked at by a team member, and triaged and prioritized into an appropriate module label Aug 23, 2019