
Abstracting prompting transformer for use in L2P and S-Prompt #420

Merged · 3 commits · Sep 19, 2023

Conversation

prabhuteja12
Contributor

The previous version of L2P had the "prompting" mechanism tightly coupled to the model. This PR separates the prompting mechanism from the prompt-selection strategy.

By submitting this pull request, I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of your choice.
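The decoupling described above can be illustrated with a minimal L2P-style sketch: a `PromptPool` holds learnable prompts and keys, and the selection strategy (key matching against a query feature) is a separate, swappable step. All names and shapes here are hypothetical, not Renate's actual API.

```python
import torch
import torch.nn as nn


class PromptPool(nn.Module):
    """A learnable pool of prompts; a selection strategy picks which to prepend."""

    def __init__(self, pool_size: int, prompt_length: int, embedding_dim: int) -> None:
        super().__init__()
        # Pool of prompt token sequences and one matching key per prompt.
        self.prompts = nn.Parameter(torch.randn(pool_size, prompt_length, embedding_dim))
        self.keys = nn.Parameter(torch.randn(pool_size, embedding_dim))

    def forward(self, query: torch.Tensor, top_k: int = 5) -> torch.Tensor:
        # L2P-style selection: cosine-match query features against prompt keys.
        scores = torch.nn.functional.cosine_similarity(
            query.unsqueeze(1), self.keys.unsqueeze(0), dim=-1
        )  # shape: (batch, pool_size)
        indices = scores.topk(top_k, dim=1).indices  # (batch, top_k)
        selected = self.prompts[indices]             # (batch, top_k, prompt_length, dim)
        return selected.flatten(1, 2)                # (batch, top_k * prompt_length, dim)
```

Because selection lives in `forward` rather than inside the transformer, a different strategy (e.g. S-Prompt's per-domain lookup) can replace it without touching the backbone.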

@github-actions

Coverage report

Note

Coverage evolution disabled because this PR targets a different branch
than the default branch, for which coverage data is not available.

The coverage rate is 85.43%.

95.23% of new lines are covered.

Diff Coverage details (click to unfold)

src/renate/benchmark/models/l2p.py

95.12% of new lines are covered (91.8% of the complete file).
Missing lines: 206, 214

src/renate/updaters/experimental/l2p.py

100% of new lines are covered (54.79% of the complete file).

)
# text transformers don't support cls_feat.
else:
    if self._is_text_transformer:
Contributor
We can flatten the nested if/else(if/else) into if/elif/else.

Contributor Author

Fixed in d8ef1c8
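The suggested flattening can be shown on a branch of the same shape as the reviewed code. The condition and return names below are hypothetical stand-ins, not the actual Renate logic.

```python
def select_features(use_cls_feat: bool, is_text_transformer: bool) -> str:
    """Flattened form: one level of branching instead of an if/else nested
    inside the else arm."""
    if use_cls_feat:
        return "cls_feat"
    elif is_text_transformer:
        # Text transformers don't support cls_feat, so they take this branch.
        return "text_pooling"
    else:
        return "image_pooling"
```

The behavior is identical to the nested version; only the indentation depth and readability change.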

  prompter = PromptPool(
-     embedding_dim=transformer._embedding_size,
+     embedding_dim=transformer.transformer._embedding_size,
Contributor
In the following, we access a lot of protected attributes of the transformer. Do we want to keep it that way, or rather make them public?

Contributor Author
Fixed in 3896135
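One conventional way to resolve the protected-attribute concern is a read-only property, so call sites stop reaching into `_`-prefixed internals. This is a generic sketch, assuming a hypothetical `TransformerBackbone`; it is not the change made in the referenced commit.

```python
import torch.nn as nn


class TransformerBackbone(nn.Module):
    def __init__(self, embedding_size: int) -> None:
        super().__init__()
        self._embedding_size = embedding_size  # internal detail

    @property
    def embedding_size(self) -> int:
        """Public, read-only view of the embedding width."""
        return self._embedding_size
```

A call site such as the reviewed one could then read `embedding_dim=transformer.embedding_size` instead of touching `transformer._embedding_size` directly.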

class PromptedTransformer(nn.Module):
"""This generic module is the basic prompted transformer. It takes in a model string and creates
the appropriate transformer. If not prompted, it returns features, and if prompted, it returns
the full feature using those prompts and the input image/text.
Contributor
Is this description still accurate? I don't see the input being returned. Maybe clarify the difference between "features" and "full feature"; without context, it might not even be clear that this is the model output.

Contributor Author
Fixed in 3896135.
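For readers without the surrounding code, the behavior the docstring is trying to describe might look like the following sketch: without prompts the wrapper returns the backbone's features unchanged; with prompts, the prompt tokens are prepended to the input sequence first. This is a hypothetical illustration, not the actual Renate implementation.

```python
from typing import Optional

import torch
import torch.nn as nn


class PromptedTransformer(nn.Module):
    """Wraps a transformer backbone. Returns plain features when no prompts
    are given, and prompt-conditioned features when prompts are supplied."""

    def __init__(self, backbone: nn.Module) -> None:
        super().__init__()
        self.backbone = backbone

    def forward(
        self, embeddings: torch.Tensor, prompts: Optional[torch.Tensor] = None
    ) -> torch.Tensor:
        if prompts is not None:
            # Prepend prompt tokens along the sequence dimension.
            embeddings = torch.cat([prompts, embeddings], dim=1)
        return self.backbone(embeddings)
```

In both cases the return value is the model output (features), which is the ambiguity the review comment asks the docstring to spell out.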

@wistuba wistuba self-assigned this Sep 18, 2023
@wistuba wistuba merged commit 864c1a4 into dev Sep 19, 2023
18 checks passed
@wistuba wistuba deleted the pt_prompt_transformer branch September 19, 2023 14:10