add SAM model #1847
Conversation
@divyashreepathihalli I think you need to move this to the keras-hub directory now after the rename
done
mattdangerw left a comment
Thanks! Left some comments; it looks like they might show up in the old location, sadly.
self.bottom_right_corner_embed = keras.layers.Embedding(
    1, hidden_size, name="bottom_right_corner_embed"
)
self.not_a_point_embed = keras.layers.Embedding(
What is "not_a_point_embed"?
Here is the original implementation of this: self.not_a_point_embed = nn.Embedding(1, embed_dim)
When you use SAM without providing explicit point prompts, the model still needs some input to represent the "absence" of point information. This is where not_a_point_embed is used: it acts as a placeholder or default value when you don't specify point prompts, allowing the model to function correctly even without explicit point guidance.
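For illustration, a minimal sketch of how such a placeholder embedding is typically applied, assuming the original SAM convention that a point label of -1 marks a padding/absent point (the helper and names below are hypothetical, not the code in this PR):

import keras
from keras import ops

hidden_size = 256
not_a_point_embed = keras.layers.Embedding(1, hidden_size, name="not_a_point_embed")

def embed_points(point_embeddings, labels):
    # labels == -1 marks padding / absent points; substitute the learned
    # "not a point" vector so the model still receives a well-defined
    # input when no point prompt is provided.
    is_not_a_point = ops.expand_dims(ops.equal(labels, -1), axis=-1)
    placeholder = not_a_point_embed(ops.zeros_like(labels))
    return ops.where(is_not_a_point, placeholder, point_embeddings)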
SamanehSaadat left a comment
Thanks, Divya!
Left some nit comments!
        (batch_size, 0, image_size, image_size, 1)
    ),
}
# todo: update preset name
Is this for later? Or are the presets already uploaded so this can be updated?
This is for later. The weights are yet to be added. I will update the name once we have them uploaded.
    query=queries, value=queries, key=queries
)
else:
    queries_with_pe = queries + query_pe
What does pe stand for here?
Positional embeddings. I updated the names wherever we have _pe.
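For context, a rough sketch of the pattern the snippet above comes from (adding positional embeddings to the queries before self-attention, as in SAM's two-way transformer block); the layer construction and function below are illustrative, not the exact code in this PR:

import keras

self_attention = keras.layers.MultiHeadAttention(num_heads=8, key_dim=32)

def attention_block(queries, query_pe, skip_first_layer_pe=False):
    if skip_first_layer_pe:
        # In the first layer the queries are the prompt tokens themselves,
        # so attention runs on them directly without positional embeddings.
        return self_attention(query=queries, value=queries, key=queries)
    else:
        # Later layers add the positional embeddings (the "_pe" suffix) to
        # the queries/keys so attention knows where each token sits, while
        # the values are left without positional information.
        queries_with_pe = queries + query_pe
        attention_output = self_attention(
            query=queries_with_pe, value=queries, key=queries_with_pe
        )
        return queries + attention_output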