
Add skills: automodel-expert-lora and megatron-bridge-lora-sft #47

Closed

Doondi-Ashlesh wants to merge 3 commits into NVIDIA:main from Doondi-Ashlesh:feat/add-automodel-megatron-lora-skills


Conversation

@Doondi-Ashlesh

Onboarding type

  • New product onboarding (new components.d/automodel.yml for NeMo-AutoModel)
  • Other: adding megatron-bridge-lora-sft skill to existing Megatron-Bridge product

For new product onboarding — author affirmations

  • Skills cleared for open source release — both skills document public APIs in NVIDIA-owned open-source repos (NVIDIA-NeMo/Automodel, NVIDIA-NeMo/Megatron-Bridge)
  • License: Dual (Apache 2.0 + CC-BY 4.0)
  • No new license or third-party component introduced beyond what the source repos already carry
  • Source repos are public and under NVIDIA-owned GitHub org (NVIDIA-NeMo)
  • skills/ path used for new entries

What this PR adds

skills/NeMo-AutoModel/automodel-expert-lora
Covers applying LoRA to fused MoE expert layers in NeMo AutoModel via PeftConfig with target_modules and moe_rank_scaling. Documents the GroupedExpertsTE limitation and apply_lora_to_linear_modules API. Confirmed from nemo_automodel/components/_peft/lora.py and tests/unit_tests/_peft/test_lora_experts.py.
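
For orientation, a minimal sketch of the pattern this skill documents, assuming the `PeftConfig` and `apply_lora_to_linear_modules` names confirmed above; the rank/alpha field names, the wildcard target patterns, and the `moe_rank_scaling` type are illustrative assumptions, not confirmed API:

```python
# Sketch only: PeftConfig, target_modules, moe_rank_scaling, and
# apply_lora_to_linear_modules are named in this PR; every other
# identifier below is an illustrative assumption.
from nemo_automodel.components._peft.lora import (  # module path per the PR text
    PeftConfig,
    apply_lora_to_linear_modules,
)

def add_expert_lora(model):
    peft_cfg = PeftConfig(
        # Wildcard patterns matching fused expert projections (illustrative).
        target_modules=["*.mlp.experts.gate_and_up_projs", "*.mlp.experts.down_projs"],
        dim=16,                 # LoRA rank; field name assumed
        alpha=32,               # LoRA scaling; field name assumed
        moe_rank_scaling=True,  # PR-named option; boolean type assumed
    )
    # Swaps matched linear / grouped-expert modules for LoRA-wrapped ones.
    # The skill also documents a GroupedExpertsTE limitation (details in SKILL.md).
    apply_lora_to_linear_modules(model, peft_cfg)
    return model
```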

skills/Megatron-Bridge/megatron-bridge-lora-sft
Covers LoRA, DoRA, and adapter export in Megatron-Bridge via the LoRA/DoRA dataclasses, normalize_moe_lora for MoE rank normalization, and AutoBridge.export_adapter_ckpt for HuggingFace PEFT export. Confirmed from src/megatron/bridge/peft/lora.py, dora.py, and examples/conversion/adapter/export_adapter.py.
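
A comparable sketch for this skill's flow, again hedged: `LoRA`, `normalize_moe_lora`, and `AutoBridge.export_adapter_ckpt` are the names documented here, while import paths, dataclass fields, and call signatures are assumptions:

```python
# Sketch only; see src/megatron/bridge/peft/lora.py and
# examples/conversion/adapter/export_adapter.py for the real API.
from megatron.bridge import AutoBridge                      # import path assumed
from megatron.bridge.peft.lora import LoRA                  # module per the PR text
from megatron.bridge.peft.lora import normalize_moe_lora    # location assumed

def build_adapter_config():
    # Field names (target_modules, dim, alpha) are illustrative guesses.
    peft = LoRA(target_modules=["linear_qkv", "linear_proj"], dim=16, alpha=32)
    # PR-named helper for MoE rank normalization; exact inputs/outputs assumed.
    return normalize_moe_lora(peft)

def export_for_hf_peft(bridge, adapter_ckpt, out_dir):
    # Converts a Megatron-format adapter checkpoint into HuggingFace PEFT
    # layout; signature assumed from the PR's description of the example.
    bridge.export_adapter_ckpt(adapter_ckpt, out_dir)
```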

All PRs

  • All commits signed off with DCO (Signed-off-by: Doondi-Ashlesh <doondiashlesh@gmail.com>)

Dtammineedi and others added 2 commits May 3, 2026 16:40
- Add automodel-expert-lora to skills/NeMo-AutoModel/ with component entry
- Add megatron-bridge-lora-sft to skills/Megatron-Bridge/
- Each skill includes SKILL.md, card.yaml, and evals/evals.json

Signed-off-by: Doondi-Ashlesh <doondiashlesh@gmail.com>
Signed-off-by: Doondi-Ashlesh <doondiashlesh@gmail.com>
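
The commit bullets above correspond to this on-disk layout (nesting inferred from the paths named in the bullets; the components.d/automodel.yml component entry from the onboarding checklist lives outside skills/):

```
skills/
├── NeMo-AutoModel/
│   └── automodel-expert-lora/
│       ├── SKILL.md
│       ├── card.yaml
│       └── evals/
│           └── evals.json
└── Megatron-Bridge/
    └── megatron-bridge-lora-sft/
        ├── SKILL.md
        ├── card.yaml
        └── evals/
            └── evals.json
```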
sayalinvidia pushed a commit that referenced this pull request May 5, 2026
The DCO check (#38) verifies every commit carries a Signed-off-by
trailer but doesn't validate that the author email matches an
NVIDIA-affiliated address. Catalog content is published externally
under NVIDIA's name — accepting commits from arbitrary personal/
external email addresses creates IP-traceability gaps that are
hard to clean up after the fact.

PR #47 surfaced this gap concretely: an external contributor opened
a catalog onboarding PR with commits authored from gmail.com and
eduquencher.com addresses. Detection happened during human review
only; this workflow makes it an automated gate.

The check walks every non-merge commit between base and head, and
fails if any commit's author OR committer email isn't @nvidia.com or
@users.noreply.github.com (github-noreply covers NVIDIA-org members
who hide their personal email).

The automated/sync-skills branch is exempt — same rationale as the
DCO check, it's the bot mirror, not a contributor.

Companion change: catalog-pr-reviewer skill updated with the same
check inline so reviewers see the violation locally before opening
the PR rather than after CI fails.

Signed-off-by: Moshe Abramovitch <moshea@nvidia.com>
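
For readers who want the mechanics, here is a standalone Python sketch of the gate described above. The shipped check is a GitHub Actions workflow whose source isn't shown in this thread, so treat the script as illustrative only; it assumes a local checkout where both refs resolve:

```python
# Fail if any non-merge commit in base..head has an author or committer
# email outside the allowed domains. Illustrative reimplementation, not
# the actual workflow. (The real check also exempts the
# automated/sync-skills bot branch.)
import subprocess
import sys

ALLOWED_SUFFIXES = ("@nvidia.com", "@users.noreply.github.com")

def iter_commit_emails(base, head):
    """Yield (sha, author_email, committer_email) for non-merge commits in base..head."""
    log = subprocess.run(
        ["git", "log", "--no-merges", "--format=%H %ae %ce", f"{base}..{head}"],
        check=True, capture_output=True, text=True,
    ).stdout
    for line in log.splitlines():
        sha, author_email, committer_email = line.split()
        yield sha, author_email, committer_email

def main(base, head):
    failures = [
        (sha, email)
        for sha, author, committer in iter_commit_emails(base, head)
        for email in (author, committer)
        if not email.endswith(ALLOWED_SUFFIXES)  # str.endswith accepts a tuple
    ]
    for sha, email in failures:
        print(f"disallowed email on {sha[:12]}: {email}")
    return 1 if failures else 0

if __name__ == "__main__":
    sys.exit(main(sys.argv[1], sys.argv[2]))
```
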
@Doondi-Ashlesh
Author

Hi @mosheabr and @sayalinvidia, since PR #49 references this PR as a test case for the new author check, I wanted to ask directly: is there a way for external contributors to submit skills, or is this catalog limited to NVIDIA employees?

Happy to go through whatever review process is needed. Thanks for your time.

@sayalinvidia
Collaborator

Thanks for the contribution, @Doondi-Ashlesh! At the moment we're only accepting NVIDIA-authored skills, but stay tuned for updates on this in the future.
