
Conversation

@shiyang-weng
Contributor

We want to use torchrec on CPU. On CPU, torchrec depends on the CPU build of fbgemm_gpu, and that build does not include fbgemm_gpu.experimental.
Similar issue: #2591
Environment:

pip3 install --pre torch torchvision --index-url https://download.pytorch.org/whl/nightly/cpu
pip3 install --pre torchao --index-url https://download.pytorch.org/whl/nightly/cpu
pip3 install --pre fbgemm_gpu --index-url https://download.pytorch.org/whl/nightly/cpu

Reproduce: import torchao
Error:

  File "/home/wengshiy/ao/torchao/quantization/quantize_/workflows/int4/int4_preshuffled_tensor.py", line 25, in <module>
    if not _is_fbgemm_gpu_genai_available():
           ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^
  File "/home/wengshiy/ao/torchao/utils.py", line 1162, in _is_fbgemm_gpu_genai_available
    import fbgemm_gpu.experimental.gen_ai  # noqa: F401
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
ModuleNotFoundError: No module named 'fbgemm_gpu.experimental'

@pytorch-bot

pytorch-bot bot commented Nov 5, 2025

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/ao/3292

Note: Links to docs will display an error until the docs builds have been completed.

✅ No Failures

As of commit f8e50ab with merge base 01374eb:
💚 Looks good so far! There are no failures yet. 💚

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@meta-cla meta-cla bot added the CLA Signed This label is managed by the Facebook bot. Authors need to sign the CLA before a PR can be reviewed. label Nov 5, 2025
@shiyang-weng shiyang-weng marked this pull request as draft November 5, 2025 03:26
@shiyang-weng
Contributor Author

@Xia-Weiwen @mingfeima Could you help review this PR?

  -    if importlib.util.find_spec("fbgemm_gpu") is None:
  +    if (
  +        importlib.util.find_spec("fbgemm_gpu") is None
  +        or importlib.util.find_spec("fbgemm_gpu.experimental") is None
Contributor

oh maybe just do importlib.util.find_spec("fbgemm_gpu.experimental.gen_ai") is None?

Contributor Author

We need to check fbgemm_gpu first; otherwise the following error occurs.

  File "/home/wengshiy/ao/torchao/quantization/quantize_/workflows/int4/int4_preshuffled_tensor.py", line 25, in <module>
    if not _is_fbgemm_gpu_genai_available():
           ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^
  File "/home/wengshiy/ao/torchao/utils.py", line 1159, in _is_fbgemm_gpu_genai_available
    importlib.util.find_spec("fbgemm_gpu.experimental") is None
    ~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "<frozen importlib.util>", line 91, in find_spec
ModuleNotFoundError: No module named 'fbgemm_gpu'

@shiyang-weng shiyang-weng marked this pull request as ready for review November 5, 2025 05:05
@shiyang-weng
Contributor Author

@pytorchbot label "topic: not user facing"

@pytorch-bot pytorch-bot bot added the topic: not user facing Use this tag if you don't want this PR to show up in release notes label Nov 6, 2025
@shiyang-weng
Contributor Author

@pytorchbot merge

@pytorchmergebot
Collaborator

Merge failed

Reason: 1 mandatory check(s) are pending/not yet run. The first few are:

  • Facebook CLA Check

Dig deeper by viewing the pending checks on hud

Details for Dev Infra team Raised by workflow job

Failing merge rule: superuser

@Xia-Weiwen Xia-Weiwen merged commit d8c1f02 into pytorch:main Nov 6, 2025
20 of 22 checks passed