I am trying to train a model on two different operating systems (ubuntu:18.04 and macOS 11.6.5) and get the same result. I use `seed_everything` as well as `Trainer(deterministic=True, ...)`.
Both models are initialized identically, and both train on the CPU.
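For context, here is a minimal standard-library sketch of the kind of global seeding that `seed_everything` performs; the torch- and numpy-specific calls are shown only as comments, since they are not executed in this snippet:

```python
import os
import random

def seed_everything_sketch(seed: int) -> None:
    """Rough illustration of global seeding. The real seed_everything
    also seeds numpy and torch (shown as comments below)."""
    random.seed(seed)                         # Python's built-in RNG
    os.environ["PYTHONHASHSEED"] = str(seed)  # hash randomization
    # np.random.seed(seed)                    # numpy RNG
    # torch.manual_seed(seed)                 # torch CPU/CUDA RNGs

seed_everything_sketch(42)
a = random.random()
seed_everything_sketch(42)
b = random.random()
print(a == b)  # same seed -> same draw, so True
```

Seeding pins down random initialization and data shuffling, but it does not control the order of floating-point operations chosen by each platform's math libraries.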
Training with data that has nice continuous values, I get identical models at the end. However, if I use data with many one-hot features, I get similar models on both OSes, but as the epochs go up they slowly diverge, probably because small differences in floating-point precision add up.
Does anyone have any ideas about what could cause this issue, or how to fix it?
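The slow divergence described above is consistent with floating-point rounding: addition is not associative, so if the two platforms' math libraries reduce sums (e.g. over sparse one-hot columns or gradients) in a different order, the results differ in the last bits, and that gap compounds over epochs. A tiny self-contained illustration:

```python
# Floating-point addition is not associative:
# regrouping the same three terms changes the result.
left = (0.1 + 0.2) + 0.3
right = 0.1 + (0.2 + 0.3)
print(left)           # 0.6000000000000001
print(right)          # 0.6
print(left == right)  # False
```

A per-step difference this small is invisible at first, which would explain why the models start out similar and only drift apart as training proceeds.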