
Does NeuralKG provide a TransE model identical to the one in the original paper? #17

Closed
FujiwaraKeine opened this issue Dec 5, 2022 · 1 comment

Comments

@FujiwaraKeine

While reading the code, I noticed that the default TransE model provided in NeuralKG is not exactly the same as the model described in the original TransE paper. For example, the loss function uses a logsigmoid-based formulation, which differs from the margin-based loss in the original TransE.

How much do these changes to the original model affect performance? Also, does NeuralKG provide a TransE model identical to the one in the original paper?
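For reference, the loss in the original TransE paper is a margin ranking loss over a positive triple and a corrupted negative triple. A minimal sketch (function names and the example margin are illustrative, not NeuralKG's API):

```python
import math

def l2_distance(h, r, t):
    # TransE score: d(h, r, t) = || h + r - t ||_2
    return math.sqrt(sum((hi + ri - ti) ** 2 for hi, ri, ti in zip(h, r, t)))

def margin_ranking_loss(pos_dist, neg_dist, margin=1.0):
    # Original TransE objective for one pair of triples:
    #   max(0, gamma + d(pos) - d(neg))
    # The loss is zero once the positive triple beats the
    # negative one by at least the margin gamma.
    return max(0.0, margin + pos_dist - neg_dist)
```

Here a perfectly-translated positive triple (h + r == t) contributes distance 0, and any negative triple farther than the margin contributes no loss.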

@Modberge
Collaborator

Modberge commented Dec 8, 2022

Thank you very much for the suggestion. We have updated the GitHub repository with support for MarginRankingLoss, and it will be included in an upcoming official release.

We have found that for KGE models, the self-adversarial loss generally performs better than the margin ranking loss. The following papers also support this observation:
CAKE: A Scalable Commonsense-Aware Framework For Multi-View Knowledge Graph Completion
KGTuner: Efficient Hyper-parameter Search for Knowledge Graph Learning
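For context, the self-adversarial loss referred to here is the logsigmoid-based objective popularized by RotaE-style training, where negative samples are weighted by a softmax over their own scores. A minimal sketch, assuming an L2 distance score and illustrative hyperparameters (not NeuralKG's actual defaults):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def self_adversarial_loss(pos_dist, neg_dists, margin=6.0, alpha=1.0):
    # Positive term: -log sigma(gamma - d(pos))
    loss = -math.log(sigmoid(margin - pos_dist))

    # Self-adversarial weights: softmax (temperature alpha) over the
    # negatives' scores, so harder negatives get larger weight.
    scores = [alpha * (margin - d) for d in neg_dists]
    m = max(scores)  # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    weights = [e / z for e in exps]

    # Negative term: weighted -log sigma(d(neg) - gamma)
    loss += -sum(w * math.log(sigmoid(d - margin))
                 for w, d in zip(weights, neg_dists))
    return loss
```

Unlike the margin ranking loss, this objective is smooth everywhere and focuses gradient on the most plausible (hardest) negatives, which is one common explanation for its stronger empirical results.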

We hope this helps.
