Hello! Great project. Always nice to see implementations for challenging problems made public :]
I have been implementing various solutions to MIL problems, and have recently started looking into multi-class MIL.
That is how I came across your implementation, which seems similar to what I was thinking of doing.
I was just wondering whether you have made any other attempts at the same problem that have yielded better results.
Also, your multi-attention approach, where you apply attention for each class separately and combine the results through concatenation, seems very similar to the multi-head attention commonly used in ViTs (for example, see here). Any ideas?
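To make the comparison concrete, here is a minimal NumPy sketch of the per-class attention pooling I have in mind; the function name, the single linear scoring layer, and the parameter shapes are my own assumptions for illustration, not necessarily how your code does it:

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def per_class_attention_pool(H, W_att):
    """Per-class attention-based MIL pooling (hypothetical sketch).

    H:     (n_instances, d) bag of instance embeddings.
    W_att: (n_classes, d) one attention query vector per class.
    Returns a (n_classes * d,) vector: the per-class attention-pooled
    bag embeddings, concatenated -- structurally like concatenating
    the heads in multi-head attention.
    """
    scores = H @ W_att.T             # (n_instances, n_classes)
    alpha = softmax(scores, axis=0)  # attention weights over instances, per class
    pooled = alpha.T @ H             # (n_classes, d): one bag embedding per class
    return pooled.reshape(-1)        # concatenate across classes

n_instances, d, n_classes = 5, 8, 3
H = rng.standard_normal((n_instances, d))
W = rng.standard_normal((n_classes, d))
z = per_class_attention_pool(H, W)
print(z.shape)  # (24,)
```

Viewed this way, each class plays the role of one "head" with its own query, and the concatenation step matches the head-concatenation in multi-head attention; the main difference is that the softmax here runs over instances in the bag rather than over token-token pairs.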
Thanks in advance!