
[RL] support moe-topk use topk_reduce_func #7218

Open
zoooo0820 wants to merge 4 commits into PaddlePaddle:develop from zoooo0820:align_moe_topk

Conversation

@zoooo0820
Collaborator

@zoooo0820 zoooo0820 commented Apr 7, 2026

Motivation

  • Because different models have slightly different moe-topk computation logic in their network-definition code, support passing a topk_reduce_func at network-construction time to guarantee numerical accuracy. When FD_USE_PHI_MOE_TOPK is enabled, normalize and scaling are no longer computed inside the noaux_ac op; instead, the provided topk_reduce_func computes normalize and scaling outside the op.

  • Since this combination was verified to preserve numerical accuracy, the previous numerically aligned moe_topk_select standalone-OP implementation has been removed.
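The external normalize-and-scaling step described above can be sketched as follows. This is a minimal illustration using NumPy arrays in place of paddle tensors; the helper name apply_topk_reduce and the routed_scaling_factor parameter are hypothetical, not FastDeploy's actual API.

```python
# Minimal sketch (NumPy stands in for paddle tensors) of applying a
# caller-supplied topk_reduce_func outside the topk op. The reduce function
# returns the per-token denominator used to normalize the gate values.
import numpy as np

def apply_topk_reduce(topk_values, topk_reduce_func=None, routed_scaling_factor=1.0):
    """Normalize and scale topk gate values outside the op (hypothetical helper)."""
    if topk_reduce_func is not None:
        topk_values = topk_values / topk_reduce_func(topk_values)
    return topk_values * routed_scaling_factor

# Example: sum-normalization, matching the default suggested in the reviews
# (lambda x: x.sum(axis=-1, keepdim=True) + 1e-20; NumPy spells it keepdims).
vals = np.array([[0.5, 0.3, 0.2]])
out = apply_topk_reduce(
    vals,
    topk_reduce_func=lambda x: x.sum(axis=-1, keepdims=True) + 1e-20,
    routed_scaling_factor=2.0,
)
# out → [[1.0, 0.6, 0.4]]: values normalized to sum 1, then scaled by 2
```

With topk_reduce_func=None the values pass through unnormalized, which is exactly the silent-skip behavior the bot reviews below call out.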

Modifications

Usage or Command

Accuracy Tests

Checklist

  • Add at least one tag in the PR title.
    • Tag list: [[FDConfig],[APIServer],[Engine], [Scheduler], [PD Disaggregation], [Executor], [Graph Optimization], [Speculative Decoding], [RL], [Models], [Quantization], [Loader], [OP], [KVCache], [DataProcessor], [BugFix], [Docs], [CI], [Optimization], [Feature], [Benchmark], [Others], [XPU], [HPU], [GCU], [DCU], [Iluvatar], [Metax]]
    • You can add new tags based on the PR content, but their semantics must be clear.
  • Format your code; run pre-commit before committing.
  • Add unit tests. If there are no unit tests, please explain the reason in this PR.
  • Provide accuracy results.
  • If the current PR targets a release branch, make sure it has first been submitted to the develop branch, then cherry-pick it to the release branch with the [Cherry-Pick] PR tag.

@paddle-bot

paddle-bot bot commented Apr 7, 2026

Thanks for your contribution!

@zoooo0820 zoooo0820 changed the title support moe-topk use topk_reduce_func [RL] support moe-topk use topk_reduce_func Apr 7, 2026
@codecov-commenter

codecov-commenter commented Apr 7, 2026

Codecov Report

❌ Patch coverage is 23.07692% with 10 lines in your changes missing coverage. Please review.
⚠️ Please upload report for BASE (develop@8cb417e). Learn more about missing BASE report.

Files with missing lines                     Patch %   Lines
fastdeploy/model_executor/layers/moe/moe.py  9.09%     8 Missing and 2 partials ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##             develop    #7218   +/-   ##
==========================================
  Coverage           ?   72.82%           
==========================================
  Files              ?      377           
  Lines              ?    53217           
  Branches           ?     8311           
==========================================
  Hits               ?    38755           
  Misses             ?    11739           
  Partials           ?     2723           
Flag Coverage Δ
GPU 72.82% <23.07%> (?)

Flags with carried forward coverage won't be shown.

☔ View full report in Codecov by Sentry.


@fastdeploy-bot fastdeploy-bot left a comment


📋 Review Summary

PR overview: supports custom normalization for MoE TopK via topk_reduce_func. When FD_USE_PHI_MOE_TOPK is enabled, normalize and scaling are computed outside the op, and the moe_topk_select function previously used for numerical alignment has been removed.

Scope of changes: fastdeploy/model_executor/layers/moe/, fastdeploy/model_executor/models/

Impact tags: [RL] [OP]

📝 PR Convention Check

The PR title includes the [RL] tag and complies with the convention. The PR description fills in Motivation and Modifications, but Accuracy Tests is not checked in the Checklist.

Issues

Level          File                                              Summary
🟡 Suggestion  fastdeploy/model_executor/layers/moe/moe.py:136   When FD_USE_PHI_MOE_TOPK=True and renormalize=True, normalization does not run if topk_reduce_func=None
🟡 Suggestion  fastdeploy/model_executor/models/glm4_moe.py:185  Other models do not pass topk_reduce_func and may hit numerical issues in certain environments

Overall Assessment

The change logic is clear: the topk_reduce_func parameter implements a custom normalization mechanism for MoE TopK. However, the following issues deserve attention:

  1. Behavior-inconsistency risk: when FD_USE_PHI_MOE_TOPK=True and renormalize=True, normalization only runs if topk_reduce_func is not None. If the caller does not pass this parameter, normalization is silently skipped and the numerical result may differ from expectations.

  2. Model-compatibility issue: currently only glm4_moe.py passes topk_reduce_func; other models (e.g. deepseek_v3.py, gpt_oss.py, ernie4_5_moe.py) do not pass it when using FusedMoE. If those models run with FD_USE_PHI_MOE_TOPK=True and renormalize=True, they may hit numerical issues.

Recommendation: when the FD_USE_PHI_MOE_TOPK environment variable is enabled, add an explicit warning or error for the topk_reduce_func=None case, or provide a default normalization behavior.
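One way to realize this recommendation is a small resolver that warns and falls back to sum-normalization when the function is missing. The names here (resolve_topk_reduce_func, DEFAULT_TOPK_REDUCE) are illustrative, not FastDeploy's actual code:

```python
# Illustrative guard (hypothetical names, not FastDeploy's API): warn and fall
# back to a default sum-normalization when FD_USE_PHI_MOE_TOPK is on and
# renormalize=True, but the caller did not pass topk_reduce_func.
import warnings

def DEFAULT_TOPK_REDUCE(x):
    # Default suggested in the reviews: per-token sum as the denominator
    return x.sum(axis=-1, keepdims=True) + 1e-20

def resolve_topk_reduce_func(topk_reduce_func, renormalize, use_phi_moe_topk):
    if use_phi_moe_topk and renormalize and topk_reduce_func is None:
        warnings.warn(
            "FD_USE_PHI_MOE_TOPK is enabled with renormalize=True but no "
            "topk_reduce_func was provided; falling back to sum-normalization."
        )
        return DEFAULT_TOPK_REDUCE
    return topk_reduce_func
```

Raising an error instead of warning would also satisfy the recommendation; the fallback merely preserves the pre-PR renormalize behavior for callers that have not been updated.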


@fastdeploy-bot fastdeploy-bot left a comment


🤖 AI Code Review | 2026-04-08

📋 Review Summary

PR overview: supports moe-topk via a topk_reduce_func parameter. When FD_USE_PHI_MOE_TOPK is enabled, the normalize and scaling computations move outside the op, and the previous numerically aligned moe_topk_select standalone-OP implementation is removed.

Scope of changes: model_executor/layers/moe/, model_executor/models/, tests/

Impact tags: [RL] [OP]

📝 PR Convention Check

Both the PR title and description comply with the conventions: the title includes the [RL] tag, and the description clearly explains the Motivation and Modifications.

Issues

Level   File        Summary
🔴 Bug  moe.py:137  When FD_USE_PHI_MOE_TOPK=1 and the caller does not pass topk_reduce_func, it defaults to None, so the call topk_reduce_func(topk_values) raises a TypeError

Overall Assessment

The overall design is reasonable: moving the normalize and scaling computations outside the op accommodates differences between models. However, the get_moe_scores function has a potential runtime error: when FD_USE_PHI_MOE_TOPK=1 and topk_reduce_func is None, the code attempts to call a None object.
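The flagged failure mode is easy to reproduce in isolation; flagged_call below is a hypothetical stand-in for the call site the review describes, not the actual get_moe_scores body:

```python
# Reproduces the reviewer's concern: with topk_reduce_func left at its
# default of None, the flagged call raises TypeError at runtime.
def flagged_call(topk_values, topk_reduce_func=None):
    # Stand-in for the call the review flags inside get_moe_scores
    return topk_reduce_func(topk_values)

try:
    flagged_call([0.5, 0.3])
except TypeError as exc:
    print(f"TypeError: {exc}")  # 'NoneType' object is not callable
```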


@fastdeploy-bot fastdeploy-bot left a comment


📋 Review Summary

PR overview: adds a topk_reduce_func parameter to support custom normalization logic after MoE topk selection in FD_USE_PHI_MOE_TOPK mode, and removes the old moe_topk_select Python implementation.

Scope of changes: model_executor/layers/moe/, model_executor/models/glm4_moe.py

Impact tags: [RL] [OP]

📝 PR Convention Check

Title suggestion (can be copied directly):

  • [RL] support moe-topk use topk_reduce_func

Description suggestions:

  • The Accuracy Tests section should be supplemented with actual accuracy test results, or an explanation of why accuracy tests are unnecessary

Issues

Level          File                                             Summary
🟡 Suggestion  fastdeploy/model_executor/layers/moe/moe.py:136  The behavior of topk_reduce_func is not documented in the docstring
🟡 Suggestion  fastdeploy/model_executor/layers/moe/moe.py:93   The default value of topk_reduce_func is inconsistent with the logic
🟡 Suggestion  PR description                                   The Accuracy Tests section provides no accuracy test results

Overall Assessment

The code changes are generally reasonable: the topk_reduce_func parameter is correctly added and used for external normalization in FD_USE_PHI_MOE_TOPK mode. The test files have been updated correctly, removing references to the deleted function. It is recommended to supplement the documentation and accuracy test results.


Inline comments

See the specific suggestions below.

    redundant_ep_rank_num_plus_one,
)
if envs.FD_USE_PHI_MOE_TOPK:
    if original_renormalize:


🟡 Suggestion: the docstring of get_moe_scores has not been updated with documentation for the topk_reduce_func parameter.

Suggested parameter documentation:

"""
compute moe scores using e_score_correction_bias.

Args:
    ...
    topk_reduce_func: Callable function for normalizing topk_values.
        Only used when FD_USE_PHI_MOE_TOPK=1 and renormalize=True.
        Default: lambda x: x.sum(axis=-1, keepdim=True) + 1e-20
"""

3 participants