
Transformer-related questions #57

Closed
logolemon opened this issue Nov 23, 2022 · 2 comments
Labels: discussion (Discussion on DocEE and SentEE)

Comments

@logolemon

Hello, I read your paper and saw that you replaced all of the BiLSTM layers with Transformers. How should this be implemented in your code?

logolemon added the discussion label on Nov 23, 2022
@Spico197
Owner

Thank you for your interest in this project. The PTPCG model is implemented in dee/models/trigger_aware.py/TriggerAwarePrunedCompleteGraph, which mainly inherits from dee/models/lstmmtl2complete_graph.py/LSTMMTL2CompleteGraphModel. For the span_lstm and mention_lstm modules used for entity re-encoding, you can replace them with the encoder module in dee/modules/transformer.py. For argument extraction and event representation, the concatenation of the bidirectional LSTM's head and tail hidden states is currently used as the sentence representation; you only need to apply max-pooling over the whole-sentence representation instead, as Doc2EDAG does.
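A minimal sketch of the idea above, in case it helps: the class and helper below are hypothetical illustrations built on the standard PyTorch nn.TransformerEncoder, not the actual encoder API in dee/modules/transformer.py, and the tensor shapes are assumptions.

```python
import torch
import torch.nn as nn


class TransformerReEncoder(nn.Module):
    """Hypothetical drop-in for span_lstm / mention_lstm style re-encoding."""

    def __init__(self, hidden_size: int = 768, num_layers: int = 2, num_heads: int = 8):
        super().__init__()
        layer = nn.TransformerEncoderLayer(
            d_model=hidden_size,
            nhead=num_heads,
            batch_first=True,
        )
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)

    def forward(self, token_states: torch.Tensor, pad_mask: torch.Tensor) -> torch.Tensor:
        # token_states: [batch, seq_len, hidden]; pad_mask: [batch, seq_len], True marks padding
        return self.encoder(token_states, src_key_padding_mask=pad_mask)


def sentence_rep_by_max_pooling(token_states: torch.Tensor, pad_mask: torch.Tensor) -> torch.Tensor:
    """Max-pool over non-padding tokens to get one vector per sentence,
    replacing the BiLSTM head/tail concatenation."""
    masked = token_states.masked_fill(pad_mask.unsqueeze(-1), float("-inf"))
    return masked.max(dim=1).values  # [batch, hidden]


if __name__ == "__main__":
    B, T, H = 2, 16, 768
    states = torch.randn(B, T, H)
    mask = torch.zeros(B, T, dtype=torch.bool)  # no padding in this toy batch
    reencoder = TransformerReEncoder(hidden_size=H)
    sent_rep = sentence_rep_by_max_pooling(reencoder(states, mask), mask)
    print(sent_rep.shape)  # torch.Size([2, 768])
```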

@logolemon
Author

Got it, thank you for the explanation.
