
What does the value of the transform module mean? #32

Open
wen0618 opened this issue Dec 10, 2019 · 3 comments

Comments

@wen0618

wen0618 commented Dec 10, 2019

I used GCNet in my model and it works very well. But I want to know: what does the value of the transform module mean? Does it suggest the importance of each channel, i.e. the lower the value, the less important the channel? Hoping for your answer, thanks!

@xvjiarui
Owner

Sorry for the late reply.

The value is a linear transformation that transforms the feature before the skip connection. Please refer to g in Fig. 2 of Non-local Neural Networks, V in Fig. 2 (left) of Attention Is All You Need, or W_V in Fig. 2 (right) of Relation Networks for more details.

Btw, it's good to hear that GCNet works. Thanks for your interest.
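[Editor's note: a minimal numpy sketch of the point above. The "value" transform is just a per-position linear map (equivalently a 1x1 convolution) applied to the channel vector at every spatial location before the skip connection. All names and shapes here are illustrative, not GCNet's actual code.]

```python
import numpy as np

def value_transform(x, W_v):
    """Apply a linear 'value' transform W_v at every spatial position.

    x:   (C, H, W) feature map
    W_v: (C_out, C) weight matrix (the linear transform / 1x1 conv)
    """
    C, H, W = x.shape
    flat = x.reshape(C, H * W)      # flatten spatial dims: (C, HW)
    out = W_v @ flat                # same linear map at every position
    return out.reshape(-1, H, W)    # (C_out, H, W)

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 3, 3))
W_v = rng.standard_normal((4, 4))
y = value_transform(x, W_v)
print(y.shape)  # (4, 3, 3)
```

Because the same W_v acts at every position, its entries are learned mixing weights between channels, not per-channel importance scores.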

@wen0618
Author

wen0618 commented Dec 13, 2019


Thank you very much for your answer! According to my understanding, we can think of its value as the importance of the channel, can't we? By the way, I think GCNet is a kind of channel attention mechanism, right?

@xvjiarui
Owner

By value of transform module, are you referring to the output of the Global Context Block before the residual connection?
In that case, we are not explicitly using channel-wise attention, so the output doesn't necessarily imply channel importance.
Btw, SENet would be one example of a channel attention mechanism.
Moreover, the main idea of GCNet is to explicitly capture the global context, as the name indicates.
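[Editor's note: a hedged numpy sketch of that idea. A global context block pools the whole feature map into a single context vector with one query-independent attention map, transforms it, and broadcast-adds it to every position. The bottleneck transform here omits LayerNorm, and all names (`gc_block`, `w_k`, `W_v1`, `W_v2`) are illustrative assumptions, not the paper's code.]

```python
import numpy as np

def softmax(z):
    z = z - z.max()                     # numerical stability
    e = np.exp(z)
    return e / e.sum()

def gc_block(x, w_k, W_v1, W_v2):
    """Simplified global context block.

    x:    (C, H, W) feature map
    w_k:  (C,)   weights producing one global attention map
    W_v1: (C, C) bottleneck transform, first layer
    W_v2: (C, C) bottleneck transform, second layer
    """
    C, H, W = x.shape
    flat = x.reshape(C, H * W)
    attn = softmax(w_k @ flat)          # (HW,) single spatial attention map
    context = flat @ attn               # (C,) pooled global context vector
    # two-layer transform with ReLU (LayerNorm omitted for brevity)
    delta = W_v2 @ np.maximum(W_v1 @ context, 0.0)
    return x + delta[:, None, None]     # residual add, same at every position

rng = np.random.default_rng(1)
x = rng.standard_normal((4, 3, 3))
y = gc_block(x,
             rng.standard_normal(4),
             rng.standard_normal((4, 4)),
             rng.standard_normal((4, 4)))
print(y.shape)  # (4, 3, 3)
```

Note the contrast with SENet: here the block adds one shared context vector to all positions rather than rescaling each channel by a learned importance gate.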
