Generator objects should not always use the same seed #25071
Labels
module: random
Related to random number generation in PyTorch (RNG generator)
triaged
This issue has been looked at by a team member, and triaged and prioritized into an appropriate module
I don't think this is the intended behavior; most of the time it is not what you want when you have multiple generators. The initial seed doesn't seem to come from the current RNG either, which would at least not break reproducibility. It would also make sense to grab it from `std::random_device`, `/dev/urandom`, the current time, or something similar, but right now it is simply fixed.

Compare numpy, where each new RNG state gets a different seed:
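For illustration, a minimal sketch of both behaviors (assuming a PyTorch build where the default generator seed is fixed, as described above; the exact seed value may differ):

```python
import numpy as np
import torch

# Two fresh torch Generators report the same initial seed and therefore
# produce identical random streams.
g1, g2 = torch.Generator(), torch.Generator()
print(g1.initial_seed() == g2.initial_seed())  # True: same fixed seed
print(torch.rand(3, generator=g1))
print(torch.rand(3, generator=g2))             # same values as above

# NumPy pulls fresh OS entropy for each unseeded RandomState, so two
# new states (almost surely) differ.
r1, r2 = np.random.RandomState(), np.random.RandomState()
print(np.array_equal(r1.get_state()[1], r2.get_state()[1]))  # False
```

As a per-object workaround, `Generator.seed()` re-seeds a generator from a non-deterministic source (`std::random_device` or the current time); the question here is about the default behavior of a freshly constructed `torch.Generator`.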
cc @syed-ahmed who probably has comments :)