
Channel/Spatial/Element wise Attention Modules #44

Closed · 15 tasks · innat opened this issue Jan 12, 2022 · 7 comments

innat (Contributor) commented Jan 12, 2022

The timm package provides some soft attention modules for building network blocks, and I think they would be a good fit here; for example:

and many others.
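
For illustration, here is a minimal sketch of one such module (Squeeze-and-Excitation, i.e. channel-wise attention) as a standalone Keras layer; the class name and defaults are placeholders of mine, not the timm or KerasCV API:

```python
import tensorflow as tf

class SqueezeExcite(tf.keras.layers.Layer):
    """Channel-wise attention (Squeeze-and-Excitation), sketch only."""

    def __init__(self, ratio=16, **kwargs):
        super().__init__(**kwargs)
        self.ratio = ratio

    def build(self, input_shape):
        channels = input_shape[-1]
        self.pool = tf.keras.layers.GlobalAveragePooling2D()
        self.reduce = tf.keras.layers.Dense(channels // self.ratio, activation="relu")
        self.expand = tf.keras.layers.Dense(channels, activation="sigmoid")

    def call(self, inputs):
        # Squeeze: collapse spatial dims into one descriptor per channel.
        w = self.pool(inputs)            # (batch, channels)
        # Excite: bottleneck MLP produces per-channel gates in [0, 1].
        w = self.expand(self.reduce(w))  # (batch, channels)
        # Scale: reweight the input feature map channel by channel.
        return inputs * w[:, tf.newaxis, tf.newaxis, :]
```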

bhack (Contributor) commented Jan 12, 2022

I have not checked these extensively against the Model Garden components, but just picking the first one in the list:

https://github.com/tensorflow/models/blob/master/official/vision/beta/projects/yolo/modeling/layers/nn_blocks.py#L1170

@LukeWood (Contributor)

Currently we are prioritizing components that are required to achieve state-of-the-art results on specific tasks, e.g. ImageNet-1k classification, COCO object detection, etc. Any chance I could get some guidance as to where these components excel?

Thanks

innat (Contributor, Author) commented Jan 13, 2022

@LukeWood
The above list isn't meant to replace the current priority list. The point is whether these CV components fit in KerasCV or not.

"KerasCV is a repository of modular building blocks (layers, ...)"

It can be considered a to-do list; an interested contributor can get references from it in the coming days. Let me know if I missed something. Also, I think a few components should already be on the current priority list, for example Squeeze-and-Excitation or CBAM.

bhack (Contributor) commented Jan 14, 2022

I think that in this specific field you always need to find a balance between popularity, state of the art, sedimentation, and the human and computing resources available in your community at a specific point in time.
So you need to define a policy that strikes an equilibrium over all these tensions, especially if you want to grow your code-contributor community and not just your user base.

E.g., we are still investing resources in resnext-rs and waiting for the citation/popularity threshold for the next SOTA:
https://arxiv.org/abs/2201.03545

Often a high-popularity component/model needs to be maintained just because it is a recurrent baseline for new academic work.

An additional dimension/tension is the computing resources required by a model and its components.

@old-school-kid

@innat @LukeWood
I have a two-part question about this:

  1. Are we adding modules such as the Spatial Attention Module and the Channel Attention Module, or should we add the whole model (e.g. CBAM as a whole) as an application, with parameters to configure the model? (See the sketch at the end of this comment.)
  2. As @bhack pointed out, we cannot rely solely on state of the art; we also have to take popularity and "promising" prospects into account. So if you could throw some light on that area too.

Could you post these separately so people could be assigned if interested, and maybe some of the models could be prioritized?
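
To make (1) concrete: CBAM is essentially a channel-attention module chained with a spatial-attention module, so either packaging would build on the same pieces. A rough sketch (simplified, with placeholder class names of mine, not an existing API):

```python
import tensorflow as tf

class ChannelAttention(tf.keras.layers.Layer):
    """CBAM-style channel gate from avg- and max-pooled descriptors (simplified)."""

    def __init__(self, ratio=8, **kwargs):
        super().__init__(**kwargs)
        self.ratio = ratio

    def build(self, input_shape):
        channels = input_shape[-1]
        # Shared bottleneck MLP applied to both pooled descriptors.
        self.mlp = tf.keras.Sequential([
            tf.keras.layers.Dense(channels // self.ratio, activation="relu"),
            tf.keras.layers.Dense(channels),
        ])

    def call(self, x):
        avg = tf.reduce_mean(x, axis=[1, 2])             # (batch, C)
        mx = tf.reduce_max(x, axis=[1, 2])               # (batch, C)
        gate = tf.sigmoid(self.mlp(avg) + self.mlp(mx))  # per-channel gates
        return x * gate[:, tf.newaxis, tf.newaxis, :]

class SpatialAttention(tf.keras.layers.Layer):
    """CBAM-style spatial gate from channel-pooled maps."""

    def build(self, input_shape):
        self.conv = tf.keras.layers.Conv2D(1, 7, padding="same", activation="sigmoid")

    def call(self, x):
        avg = tf.reduce_mean(x, axis=-1, keepdims=True)  # (batch, H, W, 1)
        mx = tf.reduce_max(x, axis=-1, keepdims=True)    # (batch, H, W, 1)
        return x * self.conv(tf.concat([avg, mx], axis=-1))

# "CBAM as a whole" is then just the two modules applied in sequence:
def cbam(x):
    return SpatialAttention()(ChannelAttention()(x))
```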

@old-school-kid

Ah, also, can the Swin Transformer be added to the list?

innat (Contributor, Author) commented Jan 31, 2022

@old-school-kid
About (1): I would prefer these as modules. For example, if I want to experiment with a channel-wise attention module in my custom model, I should be able to do that with a single API call (see the sketch below).
About (2): I agree with those points, but it's not clear to me how to define "promising" here, or what role tfa.layers can serve.
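
Something like the following, where channel_attention is a stand-in helper I wrote for illustration, not a real keras_cv symbol:

```python
import tensorflow as tf

def channel_attention(x, ratio=16):
    # Minimal SE-style channel gate, standing in for a would-be keras_cv layer.
    channels = x.shape[-1]
    w = tf.keras.layers.GlobalAveragePooling2D()(x)
    w = tf.keras.layers.Dense(channels // ratio, activation="relu")(w)
    w = tf.keras.layers.Dense(channels, activation="sigmoid")(w)
    return x * tf.keras.layers.Reshape((1, 1, channels))(w)

# Dropping the module into an arbitrary custom model is then one call:
inputs = tf.keras.Input(shape=(32, 32, 3))
x = tf.keras.layers.Conv2D(64, 3, padding="same", activation="relu")(inputs)
x = channel_attention(x)
x = tf.keras.layers.GlobalAveragePooling2D()(x)
outputs = tf.keras.layers.Dense(10, activation="softmax")(x)
model = tf.keras.Model(inputs, outputs)
```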

I didn't make separate posts for the above modules because I wasn't sure (and still am not) whether they are welcome, or whether we need some discussion before proceeding. The discussions board hadn't been created when I posted this; maybe that's the right place. But if they are welcome, I think interested contributors could pick from the above list and send a PR.

For Swin Transformer, it has already been asked: #15545

innat mentioned this issue Feb 4, 2022 (5 tasks)
keras-team locked and limited conversation to collaborators Feb 4, 2022
LukeWood converted this issue into discussion #102 Feb 4, 2022
