NeVa token fusion #9245
Conversation
Signed-off-by: paul-gibbons <paul@gibbonspaul.com>
…nsample height and weight Signed-off-by: paul-gibbons <paul@gibbonspaul.com>
Signed-off-by: paul-gibbons <paul-gibbons@users.noreply.github.com>
LGTM.
No impact on my side.
nemo/collections/nlp/modules/common/text_generation_strategy.py
@@ -265,6 +263,9 @@ def preprocess_multimodal(sources: dict, multimodal_cfg: dict, cur_token_len: int):
     if media_type == 'video':
         num_patches *= multimodal_cfg['num_frames']

     if multimodal_cfg['mm_mlp_adapter_type'] == 'mlp_downsample':
         num_patches //= 4
Did you handle padding somewhere else, for odd-numbered dimensions?
Added in the latest commit. I now check whether the media tensor height or width divided by the patch size is odd; if so, I pad patch_dim by 1. The same logic is added to text_generation_strategy.
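The padding rule described above can be sketched as follows. This is a hypothetical illustration, not the PR's actual code; the helper name `padded_patch_dim` and the example crop/patch sizes are assumptions for the sake of the example:

```python
def padded_patch_dim(side: int, patch_size: int) -> int:
    """Number of patches along one side, padded to an even count.

    If the crop side divided by the patch size is odd, pad by one
    patch so a 2x2 mlp_downsample fusion divides the grid evenly.
    """
    n = side // patch_size
    return n + 1 if n % 2 else n


print(padded_patch_dim(336, 14))  # 24: already even, unchanged
print(padded_patch_dim(224, 32))  # 7 is odd, padded to 8
```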
        image_processor.crop_size['width'],
    )

    self.num_media_latents = (self.multimodal_cfg['crop_size'][0] // self.multimodal_cfg['patch_dim']) * (
Consider odd-numbered crop sizes.
I've not seen odd-numbered heights or widths for crop_size in CLIP or SigLIP.
* token fusion via mlp downsampling + media_type default fix
* inference update
* adapter fix
* config refactor, remove image_token_len dependency, transpose mlp_downsample height and width
* Apply isort and black reformatting
* removing image_token_len in text generation strategy
* fix patch_dim text generation
* crop-size fix
* fixing RGB reversal bug
* crop_size default -> None in text_generation_strategy
* patch_dim padding for mlp_downsample
* patch_dim padding update
* updating h/w patch_dim naming convention

Signed-off-by: paul-gibbons <paul@gibbonspaul.com>
Signed-off-by: paul-gibbons <paul-gibbons@users.noreply.github.com>
Co-authored-by: paul-gibbons <paul-gibbons@users.noreply.github.com>
Signed-off-by: Boxiang Wang <boxiangw@nvidia.com>
What does this PR do?
Adds token fusion via an mlp_downsample adapter, following the VILA paper.
Collection: [Note which collection this PR will affect]
Changelog
Usage
# Add a code snippet demonstrating how to use this
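As a rough illustration of the mlp_downsample idea (the usage snippet above was left as a template placeholder), here is a minimal NumPy sketch of 2x2 token fusion. It is an assumption-laden example, not the PR's implementation: the function name, the use of NumPy instead of the model's tensors, and the example grid sizes are all hypothetical, and the final MLP projection from 4*d back to d is omitted:

```python
import numpy as np


def mlp_downsample_tokens(tokens: np.ndarray, h: int, w: int) -> np.ndarray:
    """Fuse each 2x2 block of vision tokens into one token by
    concatenating features, quartering the token count (VILA-style).

    tokens: (h*w, d) array of patch embeddings laid out row-major.
    Returns roughly (h*w // 4, 4*d); an MLP would then project 4*d -> d.
    """
    d = tokens.shape[1]
    grid = tokens.reshape(h, w, d)
    # Pad odd grids by one patch row/column, mirroring the PR's
    # patch_dim padding for odd-numbered dimensions.
    ph, pw = h % 2, w % 2
    if ph or pw:
        grid = np.pad(grid, ((0, ph), (0, pw), (0, 0)))
        h, w = h + ph, w + pw
    # Gather each 2x2 spatial neighborhood into the feature dimension.
    grid = grid.reshape(h // 2, 2, w // 2, 2, d)
    grid = grid.transpose(0, 2, 1, 3, 4)
    return grid.reshape((h // 2) * (w // 2), 4 * d)


# A 24x24 patch grid (e.g. 336px crop / 14px patches) with d=1024:
fused = mlp_downsample_tokens(np.ones((24 * 24, 1024)), 24, 24)
print(fused.shape)  # (144, 4096): num_patches // 4, features * 4
```

This matches the `num_patches //= 4` bookkeeping in `preprocess_multimodal`: the sequence length seen by the LLM drops by 4x, while the adapter's input width grows by 4x before projection.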
GitHub Actions CI
The Jenkins CI system has been replaced by GitHub Actions self-hosted runners.
The GitHub Actions CI will run automatically when the "Run CICD" label is added to the PR.
To re-run CI remove and add the label again.
To run CI on an untrusted fork, a NeMo user with write access must first click "Approve and run".
Before your PR is "Ready for review"
Pre checks:
PR Type:
If you haven't finished some of the above items, you can still open a "Draft" PR.
Who can review?
Anyone in the community is free to review the PR once the checks have passed.
The contributor guidelines list specific people who can review PRs to various areas.
Additional Information