Question regarding attention-module #11

Open
andreped opened this issue Jan 11, 2023 · 0 comments

@andreped

Hello! Great project. Always nice to see implementations for challenging problems made public :]

I have been implementing various solutions to multiple instance learning (MIL) problems, but have recently started looking into multi-class MIL.

I therefore came across your implementation, which seems similar to what I was thinking of doing.

I was just wondering whether you have made any other attempts at the same problem that yielded better results.

Also, your multi-attention approach, where you apply attention to each class separately and combine the results through concatenation, seems extremely similar to the multi-head attention commonly used in ViTs (for example, see here). Any thoughts? A rough sketch of how I read the idea is below.
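
To be concrete about the analogy, here is a minimal PyTorch sketch of the per-class-attention-plus-concatenation idea as I understand it. This is not taken from your code: the class name, layer sizes, and the tanh-gated attention scorer are all my own assumptions for illustration.

```python
import torch
import torch.nn as nn


class MultiClassAttentionMIL(nn.Module):
    """One attention branch per class; per-class bag embeddings
    are concatenated before the final classifier (hypothetical sketch)."""

    def __init__(self, in_dim: int, hidden_dim: int, n_classes: int):
        super().__init__()
        # One attention scorer per class, analogous to one "head" per class.
        self.attention = nn.ModuleList([
            nn.Sequential(
                nn.Linear(in_dim, hidden_dim),
                nn.Tanh(),
                nn.Linear(hidden_dim, 1),
            )
            for _ in range(n_classes)
        ])
        # Classifier acting on the concatenated per-class bag embeddings.
        self.classifier = nn.Linear(in_dim * n_classes, n_classes)

    def forward(self, bag: torch.Tensor) -> torch.Tensor:
        # bag: (n_instances, in_dim) -- all instances of a single bag.
        pooled = []
        for attn in self.attention:
            # Softmax over instances so each branch's weights sum to 1.
            weights = torch.softmax(attn(bag), dim=0)   # (n_instances, 1)
            pooled.append((weights * bag).sum(dim=0))   # (in_dim,)
        z = torch.cat(pooled, dim=-1)                   # (in_dim * n_classes,)
        return self.classifier(z)                       # (n_classes,)
```

The structural similarity to multi-head attention is that each branch learns its own scoring of the same inputs and the branch outputs are concatenated; the difference is that MIL attention pools instances within a bag (softmax over instances, no query/key/value interaction), whereas multi-head self-attention mixes tokens with one another.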

Thanks in advance!
