Do you mean while training, or after smoothing? In the simple case the importance is proportional to the loss: the importance of each sample in the dataset is simply that sample's loss value.
If you look into the implementation, we have developed more importance metrics (not yet in the paper).
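A minimal sketch of the simple case described above, where each sample's importance is just its unreduced loss. The function name and the toy cross-entropy setup are illustrative, not from the repository:

```python
import math

def per_sample_importance(probs, labels):
    """Return one importance weight per sample: its cross-entropy loss.

    probs  : list of per-class probability lists, one per sample
    labels : list of true class indices, one per sample
    """
    # No reduction over the batch: one loss value == one importance weight.
    return [-math.log(p[y]) for p, y in zip(probs, labels)]

# Toy batch of three samples over two classes (hypothetical numbers).
probs = [
    [0.9, 0.1],  # confident and correct  -> low loss, low importance
    [0.5, 0.5],  # uncertain              -> medium loss
    [0.1, 0.9],  # confident but wrong    -> high loss, high importance
]
labels = [0, 0, 0]

weights = per_sample_importance(probs, labels)
```

In a PyTorch training loop, the equivalent is passing `reduction='none'` to the loss function (e.g. `F.cross_entropy(logits, labels, reduction='none')`), which returns the per-sample loss as a tensor instead of averaging over the batch.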
Hello, what if I want to get the importance weight of each sample as a tensor?