
About Title Generation #17

Closed · Hanlard opened this issue Oct 15, 2020 · 0 comments

Hanlard commented Oct 15, 2020

Hello, I have read your paper and I am very impressed by your work.

Appendix B of the paper states: "won't leak the information to the autoregressive generative objective ..."

However, while reading the source code, I noticed something puzzling. For text generation on the OAG data, you decode with an LSTM, and the input is in fact the full sequence of ground-truth characters of the paper title (their embeddings). In other words, every time the model predicts the next character i, it is conditioned on the first i-1 ground-truth characters. This differs from conventional autoregressive text generation. Was this choice made to ease training?

Or is it because the downstream tasks do not require text generation, and text generation here only serves as a pre-training objective to improve the model's learning of the graph structure and node-attribute text, so strictly autoregressive decoding is not needed?
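For clarity, here is a minimal sketch of the two decoding schemes I am contrasting, assuming a PyTorch LSTM decoder; the module names (`emb`, `lstm`, `out`) are hypothetical and not taken from this repository:

```python
import torch
import torch.nn as nn

# Hypothetical decoder components, for illustration only.
vocab_size, emb_dim, hidden_dim = 1000, 128, 256
emb = nn.Embedding(vocab_size, emb_dim)
lstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True)
out = nn.Linear(hidden_dim, vocab_size)

def teacher_forcing(title_ids):
    # title_ids: (batch, seq_len) ground-truth title tokens.
    # The whole ground-truth prefix is fed in at once: step i sees
    # the true tokens 0..i-1, which is what the source code does.
    inputs = emb(title_ids[:, :-1])     # shift right by one position
    hidden, _ = lstm(inputs)            # (batch, seq_len-1, hidden_dim)
    logits = out(hidden)                # predict token i from tokens < i
    return logits                       # loss target: title_ids[:, 1:]

def free_running(start_ids, steps):
    # Strictly autoregressive decoding: each step consumes the
    # model's own previous prediction instead of the ground truth.
    tok, state, preds = start_ids, None, []
    for _ in range(steps):
        hidden, state = lstm(emb(tok).unsqueeze(1), state)
        tok = out(hidden.squeeze(1)).argmax(dim=-1)
        preds.append(tok)
    return torch.stack(preds, dim=1)    # (batch, steps)
```

The first function is standard teacher forcing; the second feeds the model's own predictions back in at every step, which is what I understood "autoregressive" to mean in the paper.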

Hanlard closed this as completed Oct 15, 2020