On Nov 5, 2018, at 5:01 PM, dzpdzpdzp ***@***.***> wrote:
Hello, author. Is the model described in the "Attentive Module" section of your paper really exactly the same as the Transformer model proposed in "Attention Is All You Need"?
If so, where in your code are the Transformer model's parameters configured?
Thanks for clarifying.
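For reference, the core of the Transformer's attentive module is scaled dot-product attention, Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V. The sketch below is a minimal NumPy illustration of that published formula only; it is not the repository's actual implementation, and where hyperparameters such as the model dimension or number of heads are set depends on how this particular codebase is configured.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V,
    as defined in "Attention Is All You Need" (Vaswani et al., 2017)."""
    d_k = Q.shape[-1]
    # Similarity scores between queries and keys, scaled by sqrt(d_k).
    scores = Q @ K.transpose(0, 2, 1) / np.sqrt(d_k)        # (batch, seq_q, seq_k)
    # Numerically stable row-wise softmax over the key dimension.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ V                                      # (batch, seq_q, d_v)

# Toy example: batch of 1, sequence length 3, d_k = d_v = 4 (hypothetical sizes).
rng = np.random.default_rng(0)
Q = rng.normal(size=(1, 3, 4))
K = rng.normal(size=(1, 3, 4))
V = rng.normal(size=(1, 3, 4))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (1, 3, 4)
```

A full Transformer block would wrap this in multi-head projections, residual connections, and layer normalization; whether this repo does so identically to the paper is exactly the question raised above.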