Seeding in Transform, make the same transform all the time #191
Maybe it was intended this way, and my understanding of seeding is simply different... I thought it would produce a different number on each call (but if I ran the script again, I would get the same sequence).
Hi @romainVala, The point of setting the seed is to get the same results every time. Maybe a better name for the argument would be
Here's some more information about reproducibility in PyTorch: https://pytorch.org/docs/stable/notes/randomness.html
Hmm, but if you get the same result on every call, it is no longer a random transform...
AFAIK random numbers in the libraries we use are generated using pseudorandom number generators. I suppose they're actually pseudorandom transforms :)

```python
In [1]: import torch

In [3]: torch.rand(1)
Out[3]: tensor([0.0833])

In [4]: torch.rand(1)
Out[4]: tensor([0.7058])

In [5]: torch.rand(1)
Out[5]: tensor([0.8763])

In [6]: torch.manual_seed(42)
Out[6]: <torch._C.Generator at 0x7f01b881f3d0>

In [7]: torch.rand(1)
Out[7]: tensor([0.8823])

In [8]: torch.rand(1)
Out[8]: tensor([0.9150])

In [9]: torch.rand(1)
Out[9]: tensor([0.3829])

In [10]: torch.manual_seed(42)
Out[10]: <torch._C.Generator at 0x7f01b881f3d0>

In [11]: torch.rand(1)
Out[11]: tensor([0.8823])

In [12]: torch.rand(1)
Out[12]: tensor([0.9150])

In [13]: torch.rand(1)
Out[13]: tensor([0.3829])
```

You can set a global seed with `torch.manual_seed`, as above. In this answer there is some more info.
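The same behaviour can be reproduced without PyTorch, using Python's standard `random` module (a minimal stdlib sketch of the session above; the variable names are mine):

```python
import random

# Seeding once makes the whole sequence reproducible across runs,
# while successive calls within a run still return different values.
random.seed(42)
first = [random.random() for _ in range(3)]

# Re-seeding with the same value restarts the pseudorandom sequence,
# so the exact same numbers come out again.
random.seed(42)
second = [random.random() for _ in range(3)]

print(first == second)       # the two sequences are identical
print(len(set(first)) == 3)  # but the values within one run differ
```

This is the sense in which the numbers are pseudorandom: different within a run, but fully determined by the seed.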
OK, then it was my understanding of the seed keyword in the transform that confused me (you should document this behavior more explicitly). I do not really see any case where this may be useful... Anyway, as you want: thanks
🐛 Bug
I noticed it at least for RandomAffine, RandomElastic and RandomNoise.
To reproduce
Now change seed=10 to seed=None and you get a different number each time.
I think it is due to this line:
torchio/torchio/transforms/augmentation/random_transform.py
Line 32 in 0e8172c
It should not be called here, just once in `__init__()`.
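To illustrate the point, here is a toy sketch (not the actual torchio code; class and method names are illustrative) of why re-seeding on every call freezes the transform, whereas seeding once at construction keeps calls random but runs reproducible:

```python
import random

class SeedEveryCall:
    """Buggy pattern: re-seeding inside __call__ restarts the
    pseudorandom sequence, so every call draws the same number."""
    def __init__(self, seed=10):
        self.seed = seed

    def __call__(self):
        random.seed(self.seed)  # re-seeds on each call: same draw every time
        return random.random()

class SeedOnce:
    """Intended pattern: seed once at construction; successive calls
    then draw different (but reproducible) numbers."""
    def __init__(self, seed=10):
        self.rng = random.Random(seed)  # local generator, seeded once

    def __call__(self):
        return self.rng.random()

buggy = SeedEveryCall(seed=10)
fixed = SeedOnce(seed=10)
print(buggy() == buggy())  # True: identical parameters every call
print(fixed() == fixed())  # False: varies per call, same across runs
```

With `seed=None`, neither class would seed at all, which is why the numbers then differ on each call, as described above.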