Conversation

@ys950902
Contributor

Starting with PyTorch 2.8, XCCL will be used as the collective backend for the XPU device. With this support we remove the torch-ccl dependency on XPU, while the old torch-CCL path is preserved as a fallback.
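The fallback behavior described above can be sketched as a small selection helper. This is an illustrative function, not DeepSpeed's actual implementation; it assumes PyTorch's `xccl` process-group backend (PyTorch 2.8+) and torch-ccl's legacy `ccl` backend name.

```python
def select_xpu_backend(xccl_available: bool, torch_ccl_installed: bool) -> str:
    """Choose the collective-communication backend for XPU devices.

    Prefer PyTorch's built-in XCCL backend (PyTorch >= 2.8); fall back
    to the legacy torch-ccl ('ccl') path when it is installed.
    """
    if xccl_available:
        return "xccl"
    if torch_ccl_installed:
        return "ccl"
    raise RuntimeError("No collective backend is available for XPU devices")
```

In practice, the returned string would be passed as the `backend` argument to `torch.distributed.init_process_group`.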

@ys950902
Contributor Author

Hi @loadams, sorry for the late response. Official XCCL support lands in PyTorch 2.8, so this fix targets that release. #7113 is too old relative to the current DeepSpeed codebase, so I opened this new PR for XCCL support.

@ys950902 ys950902 force-pushed the xccl_enable branch 6 times, most recently from a2b9990 to 4877579 Compare May 21, 2025 05:02
ys950902 added 3 commits May 21, 2025 14:14
Signed-off-by: yisheng <yi.sheng@intel.com>
@loadams loadams added this pull request to the merge queue May 22, 2025
Merged via the queue into deepspeedai:master with commit bdba823 May 22, 2025
12 checks passed
@ys950902 ys950902 deleted the xccl_enable branch May 23, 2025 08:24
deepcharm pushed a commit to deepcharm/DeepSpeed that referenced this pull request Jun 16, 2025
Starting with PyTorch 2.8, XCCL will be used as the collective backend for the XPU device. With this support we remove the torch-ccl dependency on XPU, while the old torch-CCL path is preserved as a fallback.

---------

Signed-off-by: yisheng <yi.sheng@intel.com>
Co-authored-by: Ma, Guokai <guokai.ma@gmail.com>
Signed-off-by: Max Kovalenko <mkovalenko@habana.ai>
mauryaavinash95 pushed a commit to DataStates/DeepSpeed that referenced this pull request Oct 4, 2025
Starting with PyTorch 2.8, XCCL will be used as the collective backend for the XPU device. With this support we remove the torch-ccl dependency on XPU, while the old torch-CCL path is preserved as a fallback.

---------

Signed-off-by: yisheng <yi.sheng@intel.com>
Co-authored-by: Ma, Guokai <guokai.ma@gmail.com>