Hi @tkipf:
I have questions about 'pos_weight' on line 77 and 'norm' on line 78 of https://github.com/tkipf/gae/blob/master/gae/train.py.
For unbalanced tasks, we generally only need to rebalance the training samples. In the usual setup, negative sampling already yields the same number of negative and positive samples, so I don't see why the positive samples should get extra weight (pos_weight). Can you explain this?
Another question: 'norm' seems to just scale the loss, so I would expect that removing it wouldn't change the result.
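For reference, here is a minimal sketch of how the two constants on lines 77-78 are derived, paraphrasing train.py; the toy `adj` below is a hypothetical stand-in for the adjacency matrix that the dataset loader actually produces:

```python
import numpy as np

# Hypothetical toy adjacency matrix standing in for the `adj` used in
# gae/train.py (the real one comes from the dataset loader).
adj = np.array([[0, 1, 0],
                [1, 0, 1],
                [0, 1, 0]], dtype=np.float32)

N = adj.shape[0]
num_pos = adj.sum()          # existing edges (positive entries)
num_neg = N * N - num_pos    # absent edges (negative entries)

# pos_weight (line 77): the reconstruction loss runs over ALL N*N entries
# with no negative sampling, so positives are heavily outnumbered; this
# factor makes the positive class contribute as much as the negative class.
pos_weight = num_neg / num_pos

# norm (line 78): a constant rescaling of the averaged loss; it changes the
# gradient magnitude (i.e. the effective learning rate) but not the optimum.
norm = N * N / (2.0 * num_neg)

print(pos_weight, norm)  # 1.25 and 0.9 for this toy graph
```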
This is only necessary since we don't perform any negative sampling in this implementation (but instead take into account all negative edges). In practice, I recommend training GAEs with negative sampling -- we will release an optimized implementation for this case in the coming weeks, stay tuned!
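For concreteness, a minimal sketch of what such a negative-sampling reconstruction loss could look like in PyTorch; this is an illustration under assumptions, not the promised optimized implementation, and the helper name `recon_loss_with_neg_sampling` is made up for this example:

```python
import torch
import torch.nn.functional as F

def recon_loss_with_neg_sampling(z, pos_edge_index, num_nodes):
    # z: node embeddings from the encoder, shape [num_nodes, dim]
    # pos_edge_index: observed edges as a [2, E] index tensor
    src, dst = pos_edge_index
    pos_logits = (z[src] * z[dst]).sum(dim=-1)  # inner-product decoder

    # Sample as many random node pairs as there are positive edges
    # (for simplicity this may occasionally hit a true edge).
    neg_src = torch.randint(0, num_nodes, (src.size(0),))
    neg_dst = torch.randint(0, num_nodes, (dst.size(0),))
    neg_logits = (z[neg_src] * z[neg_dst]).sum(dim=-1)

    # Classes are balanced by construction, so plain binary cross-entropy
    # suffices: no pos_weight, no norm.
    pos_loss = F.binary_cross_entropy_with_logits(
        pos_logits, torch.ones_like(pos_logits))
    neg_loss = F.binary_cross_entropy_with_logits(
        neg_logits, torch.zeros_like(neg_logits))
    return pos_loss + neg_loss

# Example usage with random embeddings and a toy edge list:
z = torch.randn(4, 16, requires_grad=True)
pos_edge_index = torch.tensor([[0, 1, 2],
                               [1, 2, 3]])
loss = recon_loss_with_neg_sampling(z, pos_edge_index, num_nodes=4)
loss.backward()
```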