
Fusion module alpha parameter #43

Open
LambdaLi opened this issue Dec 25, 2023 · 0 comments

@LambdaLi

Hello, your work is amazing. I have a question about the alpha parameter of the cross-attention and self-attention fusion module in the decoder. It was 0.5 in version one and in the paper, but it became 0.3 in version two. Does this mean that the network pays more attention to the features from the encoder?
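If the fusion is a simple convex combination of the two attention outputs (an assumption here, since the repository's actual fusion code is not shown in this issue), then the interpretation of lowering alpha depends on which branch alpha multiplies. A minimal sketch with a hypothetical `fuse` function:

```python
import numpy as np

def fuse(self_attn_out, cross_attn_out, alpha=0.3):
    """Hypothetical fusion sketch: convex combination of the two branches.

    Here alpha weights the cross-attention output (encoder features) and
    (1 - alpha) weights the self-attention output (decoder features).
    The real module may assign alpha to the other branch.
    """
    return alpha * cross_attn_out + (1.0 - alpha) * self_attn_out

# Toy check: with self-attention output = 1 and cross-attention output = 0,
# alpha = 0.3 keeps 0.7 of the self-attention branch.
self_out = np.ones((2, 4))    # stand-in self-attention output
cross_out = np.zeros((2, 4))  # stand-in cross-attention output
print(fuse(self_out, cross_out, alpha=0.3))  # all entries 0.7
```

Under this convention, dropping alpha from 0.5 to 0.3 would weight the cross-attention (encoder) branch *less*, not more; if the code instead applies alpha to the self-attention branch, the opposite holds. Checking which output alpha multiplies in the version-two code would settle the question.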
