
LoRA for MoE Layer #9396

Merged: 5 commits into main, Jun 11, 2024

Conversation

@cuichenx (Collaborator) commented Jun 6, 2024

What does this PR do?

Adds LoRA support for MoE layers (e.g., the Mixtral model).
This does not necessarily improve the convergence of LoRA on Mixtral. However, it is important for QLoRA: the expert weights account for the bulk of Mixtral's parameters, so covering them with adapters is needed to achieve the best memory savings.

Collection: NLP

Changelog

  • Add an initial LoRA implementation for MoE layers
  • Fix a dangling adapter
  • Update to the newest mcore code

Usage

  • The following flag now works for MoE models:
model.peft.lora_tuning.target_modules=[all]
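
For context, the sketch below illustrates the mechanism this flag enables: trainable low-rank LoRA adapters are attached to the linear projections inside each routed expert, while the expert's base weights stay frozen (and can therefore be quantized under QLoRA). This is a minimal, self-contained toy in plain PyTorch, not NeMo's implementation; the class names, top-1 routing, and dimensions are illustrative assumptions.

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Frozen base linear plus a trainable low-rank update: y = Wx + (alpha/r) * B(A(x))."""
    def __init__(self, base: nn.Linear, r: int = 8, alpha: int = 16):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad_(False)            # base weights frozen (quantizable under QLoRA)
        self.lora_a = nn.Linear(base.in_features, r, bias=False)
        self.lora_b = nn.Linear(r, base.out_features, bias=False)
        nn.init.zeros_(self.lora_b.weight)     # adapter starts as an exact no-op
        self.scaling = alpha / r

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.base(x) + self.lora_b(self.lora_a(x)) * self.scaling

class ToyMoELayer(nn.Module):
    """Toy top-1-routed MoE block whose expert projections are LoRA-wrapped."""
    def __init__(self, hidden: int, ffn: int, num_experts: int):
        super().__init__()
        self.router = nn.Linear(hidden, num_experts, bias=False)
        self.experts = nn.ModuleList(
            nn.Sequential(
                LoRALinear(nn.Linear(hidden, ffn)),  # expert up-projection
                nn.GELU(),
                LoRALinear(nn.Linear(ffn, hidden)),  # expert down-projection
            )
            for _ in range(num_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: [num_tokens, hidden]; route each token to its highest-scoring expert
        top1 = self.router(x).argmax(dim=-1)
        out = torch.zeros_like(x)
        for idx, expert in enumerate(self.experts):
            mask = top1 == idx
            if mask.any():
                out[mask] = expert(x[mask])
        return out

moe = ToyMoELayer(hidden=64, ffn=256, num_experts=4)
y = moe(torch.randn(10, 64))
# Only the small A/B matrices are trainable; the frozen expert weights dominate the
# parameter count, which is why covering them matters for QLoRA memory savings.
trainable = sum(p.numel() for p in moe.parameters() if p.requires_grad)
```

Because lora_b is zero-initialized, training starts exactly from the base model's behavior, and only the adapter matrices receive gradients.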

GitHub Actions CI

The Jenkins CI system has been replaced by GitHub Actions self-hosted runners.

The GitHub Actions CI will run automatically when the "Run CICD" label is added to the PR.
To re-run CI remove and add the label again.
To run CI on an untrusted fork, a NeMo user with write access must first click "Approve and run".

Before your PR is "Ready for review"

Pre-checks:

  • Make sure you read and followed Contributor guidelines
  • Did you write any new necessary tests?
  • Did you add or update any necessary documentation?
  • Does the PR affect components that are optional to install? (e.g., Numba, Pynini, Apex, etc.)
    • Reviewer: Does the PR have correct import guards for all optional libraries?

PR Type:

  • New Feature
  • Bugfix
  • Documentation

If you haven't finished some of the above items, you can still open a "Draft" PR.

Who can review?

Anyone in the community is free to review the PR once the checks have passed.
The Contributor guidelines list specific people who can review PRs to various areas.

Additional Information

  • Related to # (issue)

Signed-off-by: Chen Cui <chcui@nvidia.com>
@github-actions bot added the NLP label Jun 6, 2024
Signed-off-by: cuichenx <cuichenx@users.noreply.github.com>
@cuichenx marked this pull request as ready for review June 6, 2024 17:07
Signed-off-by: Chen Cui <chcui@nvidia.com>
Signed-off-by: Chen Cui <chcui@nvidia.com>
akoumpa (Member) previously approved these changes Jun 11, 2024

@akoumpa left a comment:

LGTM, thanks.

Signed-off-by: Chen Cui <chcui@nvidia.com>
@cuichenx merged commit c51cdbb into main Jun 11, 2024
112 checks passed
@cuichenx deleted the chcui/moe_lora branch June 11, 2024 19:55
janekl pushed a commit that referenced this pull request Jun 12, 2024
* initial moe lora impl

Signed-off-by: Chen Cui <chcui@nvidia.com>

* Apply isort and black reformatting

Signed-off-by: cuichenx <cuichenx@users.noreply.github.com>

* fix dangling adapter

Signed-off-by: Chen Cui <chcui@nvidia.com>

* update to newest mcore code

Signed-off-by: Chen Cui <chcui@nvidia.com>

---------

Signed-off-by: Chen Cui <chcui@nvidia.com>
Signed-off-by: cuichenx <cuichenx@users.noreply.github.com>
Co-authored-by: cuichenx <cuichenx@users.noreply.github.com>
Signed-off-by: Jan Lasek <janek.lasek@gmail.com>
galv pushed a commit to galv/NeMo that referenced this pull request Jun 13, 2024
JesusPaz pushed a commit to JesusPaz/NeMo that referenced this pull request Jun 18, 2024
rohitrango pushed a commit to rohitrango/NeMo that referenced this pull request Jun 25, 2024
@ko3n1g mentioned this pull request Jul 18, 2024