
Error about causal_product_cpu.cpython-38-darwin.so on Mac #124

Open
XiaoqZhang opened this issue Jul 17, 2023 · 2 comments
Comments

@XiaoqZhang

Hi, I am installing the package on a Mac with an M1 chip, using python=3.8.10 and torch=2.0.1. I tried installing pytorch-fast-transformers with pip install --user pytorch-fast-transformers and also building it from source. However, when I try from fast_transformers.attention import AttentionLayer, it always gives me this error:

Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "/Users/xiaoqi/Documents/projects/proj_mol2mof/MoLFORMER/fast-transformers/fast_transformers/attention/__init__.py", line 13, in <module>
    from .causal_linear_attention import CausalLinearAttention
  File "/Users/xiaoqi/Documents/projects/proj_mol2mof/MoLFORMER/fast-transformers/fast_transformers/attention/causal_linear_attention.py", line 15, in <module>
    from ..causal_product import causal_dot_product
  File "/Users/xiaoqi/Documents/projects/proj_mol2mof/MoLFORMER/fast-transformers/fast_transformers/causal_product/__init__.py", line 9, in <module>
    from .causal_product_cpu import causal_dot_product as causal_dot_product_cpu, \
ImportError: dlopen(/Users/xiaoqi/Documents/projects/proj_mol2mof/MoLFORMER/fast-transformers/fast_transformers/causal_product/causal_product_cpu.cpython-38-darwin.so, 0x0002): symbol not found in flat namespace '___kmpc_for_static_fini'

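For what it's worth, the missing symbol ___kmpc_for_static_fini appears to come from the OpenMP runtime (the __kmpc_* functions are provided by libomp/libiomp), so my guess is that the extension was compiled with OpenMP support but the runtime is not being resolved when the .so is loaded. To narrow it down I import the compiled extension directly; a minimal sketch, assuming the package was built from source in the current environment:

import importlib

try:
    # Load the compiled CPU extension directly, bypassing the rest of the package.
    importlib.import_module("fast_transformers.causal_product.causal_product_cpu")
    print("causal_product_cpu loaded OK")
except ImportError as err:
    # This reproduces the same missing ___kmpc_for_static_fini error, i.e. the
    # OpenMP runtime is not found when the shared library is dlopen'ed.
    print("failed to load causal_product_cpu:", err)
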
Could you please have a look at what the problem might be? Thank you in advance!

@rezacopol

Same issue.

@indra-ipd

Any update on whether this issue was resolved? I get the same error when I try
from fast_transformers.attention import AttentionLayer
