
[Feature] Add explicit hidden_size parameter support for FusedMoE #7361

Merged
chang-wenbin merged 2 commits into PaddlePaddle:develop from
zhoutianzi666:add_hidden_size_moe
Apr 13, 2026
Conversation

@zhoutianzi666
Collaborator

@zhoutianzi666 zhoutianzi666 commented Apr 13, 2026

Motivation
Decouple FusedMoE from its hard dependency on fd_config.model_config.hidden_size, making the class design more flexible and easier to extend...

Modifications
  1. Add a hidden_size parameter to FusedMoE.__init__, with a default value of -1
  2. Remove the old logic that read hidden_size from fd_config.model_config.hidden_size
  3. Update all callers to explicitly pass hidden_size=fd_config.model_config.hidden_size
  4. Update the related test files
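A minimal sketch of the call-site change described above, using a pure-Python stand-in (the real FusedMoE constructor takes many more parameters, and fd_config here is a hypothetical stub, not the actual FDConfig class):

```python
from types import SimpleNamespace

# Hypothetical stand-in for the real FusedMoE; only the hidden_size wiring
# described in this PR is sketched here.
class FusedMoE:
    def __init__(self, fd_config, hidden_size: int = -1):
        # hidden_size is now an explicit argument; the old code read it from
        # fd_config.model_config.hidden_size inside the constructor.
        self.fd_config = fd_config
        self.hidden_size = hidden_size

# Callers were updated to pass the value explicitly:
fd_config = SimpleNamespace(model_config=SimpleNamespace(hidden_size=4096))
moe = FusedMoE(fd_config, hidden_size=fd_config.model_config.hidden_size)
print(moe.hidden_size)  # 4096
```

This keeps the constructor usable with config objects that do not expose model_config.hidden_size, which is the decoupling the Motivation section describes.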

Usage or Command

Accuracy Tests

Checklist

  • Add at least a tag in the PR title.
    • Tag list: [[FDConfig],[APIServer],[Engine], [Scheduler], [PD Disaggregation], [Executor], [Graph Optimization], [Speculative Decoding], [RL], [Models], [Quantization], [Loader], [OP], [KVCache], [DataProcessor], [BugFix], [Docs], [CI], [Optimization], [Feature], [Benchmark], [Others], [XPU], [HPU], [GCU], [DCU], [Iluvatar], [Metax]]
    • You can add new tags based on the PR content, but the semantics must be clear.
  • Format your code, run pre-commit before commit.
  • Add unit tests. Please write the reason in this PR if no unit tests.
  • Provide accuracy results.
  • If the current PR is submitting to the release branch, make sure the PR has been submitted to the develop branch, then cherry-pick it to the release branch with the [Cherry-Pick] PR tag.

@paddle-bot

paddle-bot bot commented Apr 13, 2026

Thanks for your contribution!

@CLAassistant

CLA assistant check
Thank you for your submission! We really appreciate it. Like many open source projects, we ask that you sign our Contributor License Agreement before we can accept your contribution.


“liuruian” seems not to be a GitHub user. You need a GitHub account to be able to sign the CLA. If you have already a GitHub account, please add the email address used for this commit to your account.
You have signed the CLA already but the status is still pending? Let us recheck it.

@zhoutianzi666 zhoutianzi666 changed the title from "commit" to "add hidden_size as a arg of moe" Apr 13, 2026
Collaborator

@chang-wenbin chang-wenbin left a comment


LGTM

PaddlePaddle-bot

This comment was marked as outdated.


@PaddlePaddle-bot PaddlePaddle-bot left a comment


🤖 AI Code Review | 2026-04-13

📋 Review Summary

PR overview: adds a hidden_size parameter to the FusedMoE class so that callers can specify the hidden_size value explicitly, instead of having it read automatically from fd_config.model_config.hidden_size.

Scope of changes: model_executor/layers/moe/moe.py, 6 MoE model files under model_executor/models/, and 4 test files

Impact tags: [Models] [OP]

📝 PR Convention Check

Issues

  1. The PR title is missing an official tag
  2. The PR description does not fill in Motivation and Modifications

Title suggestions (ready to copy):

  • [Feature] Add explicit hidden_size parameter support for FusedMoE
  • [OP] Modify the FusedMoE constructor to accept a hidden_size parameter

Description template (ready to copy):

Motivation

[Fill in the reason for the change, e.g.: decouple FusedMoE from its hard dependency on fd_config.model_config.hidden_size, making the class design more flexible and easier to extend...]

Modifications

  1. Add a hidden_size parameter to FusedMoE.__init__, with a default value of -1
  2. Remove the old logic that read hidden_size from fd_config.model_config.hidden_size
  3. Update all callers to explicitly pass hidden_size=fd_config.model_config.hidden_size
  4. Update the related test files

Issues

Level File Summary
🔴 Bug moe.py:156 The hidden_size parameter defaults to -1 (a magic number) and lacks validation logic

Overall Assessment

The intent of this PR is reasonable: changing hidden_size from being auto-derived to being passed explicitly improves the flexibility of the interface. All existing callers have been updated correctly, and the test files were modified accordingly.

However, there is a potential safety issue: the hidden_size parameter defaults to -1 with no validation logic. If future code forgets to pass this parameter, self.hidden_size will be -1, which may cause errors or unexpected behavior in downstream code (e.g. paddle.empty in forward_chunked_moe).

def __init__(
    self,
    fd_config,
    hidden_size: int = -1,
    ...
):

🔴 Bug: the hidden_size parameter defaults to -1 (a magic number) and has no validation logic.

If future code forgets to pass the hidden_size parameter, self.hidden_size will be -1, which may cause errors or unexpected behavior in downstream code (e.g. paddle.empty(shape=[0, self.hidden_size], ...) in forward_chunked_moe).

Suggested fixes:

Option 1 (recommended): remove the default value and force all callers to pass it explicitly

def __init__(
    self,
    fd_config,
    hidden_size: int,  # default value removed
    ...
):

Option 2: add validation logic

def __init__(
    self,
    fd_config,
    hidden_size: int = -1,
    ...
):
    super().__init__()
    ...
    assert hidden_size > 0, f"hidden_size must be positive, got {hidden_size}"
    self.hidden_size = hidden_size
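The two options fail differently when a caller forgets the argument: option 1 fails at the call site with a TypeError, while option 2 fails inside the constructor with a clear message. A pure-Python sketch of the option-2 check (hypothetical helper name, no paddle dependency):

```python
def validate_hidden_size(hidden_size: int = -1) -> int:
    # Option 2 from the review: reject the -1 sentinel (and any other
    # non-positive value) at construction time, instead of failing later
    # in downstream code such as paddle.empty(shape=[0, hidden_size], ...).
    assert hidden_size > 0, f"hidden_size must be positive, got {hidden_size}"
    return hidden_size

print(validate_hidden_size(4096))  # 4096
try:
    validate_hidden_size()  # forgotten argument: fails fast with a clear message
except AssertionError as e:
    print(e)  # hidden_size must be positive, got -1
```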

@codecov-commenter

Codecov Report

✅ All modified and coverable lines are covered by tests.
⚠️ Please upload report for BASE (develop@225fc8d). Learn more about missing BASE report.

Additional details and impacted files
@@            Coverage Diff             @@
##             develop    #7361   +/-   ##
==========================================
  Coverage           ?   74.10%           
==========================================
  Files              ?      383           
  Lines              ?    53591           
  Branches           ?     8406           
==========================================
  Hits               ?    39713           
  Misses             ?    11181           
  Partials           ?     2697           
Flag Coverage Δ
GPU 74.10% <100.00%> (?)

Flags with carried forward coverage won't be shown.

☔ View full report in Codecov by Sentry.

@chang-wenbin chang-wenbin changed the title from "add hidden_size as a arg of moe" to "[Feature] Add explicit hidden_size parameter support for FusedMoE" Apr 13, 2026
@chang-wenbin chang-wenbin merged commit 73bd4ab into PaddlePaddle:develop Apr 13, 2026
35 of 38 checks passed


5 participants