
ImportError: cannot import name 'neuron_xla_compile' from 'libneuronxla' #33

Closed
yogendra-yatnalkar opened this issue Sep 1, 2023 · 3 comments

@yogendra-yatnalkar
Hi team, for every model I am getting the error below when importing transformers_neuronx.{MODEL_NAME}.model:

>>> from transformers_neuronx.gptj.model import GPTJForSampling
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/ssm-user/aws_neuron_venv_pytorch/lib64/python3.7/site-packages/transformers_neuronx/gptj/model.py", line 16, in <module>
    from transformers_neuronx import compiler
  File "/home/ssm-user/aws_neuron_venv_pytorch/lib64/python3.7/site-packages/transformers_neuronx/compiler.py", line 31, in <module>
    from libneuronxla import neuron_xla_compile
ImportError: cannot import name 'neuron_xla_compile' from 'libneuronxla' (/home/ssm-user/aws_neuron_venv_pytorch/lib64/python3.7/site-packages/libneuronxla/__init__.py)

The transformers_neuronx version is: 0.6.x
The torch_neuronx version is: 1.13.1.1.9.0
OS Used: Amazon Linux 2
Kernel: kernel-devel-5.10.167-147.601

Please help me resolve this.


dacorvo commented Sep 1, 2023

You need to update to torch-neuronx 1.13.1.1.10.0.
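Before or after upgrading, a quick sanity check is to test whether the installed libneuronxla exposes the missing symbol at all. This is an illustrative sketch, not something from the Neuron documentation; only the package and symbol names come from the traceback above.

```python
# Diagnostic sketch (assumption, not an official Neuron utility):
# check whether the installed libneuronxla exports neuron_xla_compile
# before attempting to import transformers_neuronx.
import importlib.util


def has_neuron_xla_compile() -> bool:
    """True only if libneuronxla is installed and exposes neuron_xla_compile."""
    if importlib.util.find_spec("libneuronxla") is None:
        return False  # package is not installed in this environment
    import libneuronxla
    return hasattr(libneuronxla, "neuron_xla_compile")


print(has_neuron_xla_compile())
```

If this still prints False after the upgrade, the interpreter is likely resolving an older libneuronxla from a different site-packages directory than the one you upgraded.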

@jeffhataws (Contributor)

Thanks @yogendra-yatnalkar. @dacorvo is correct. For the latest transformers-neuronx 0.6 in release 2.13, please use the release 2.13 torch-neuronx version 1.13.1.1.10.1. As mentioned in the transformers-neuronx release notes, transformers-neuronx 0.6 added the Neuron Persistent Cache for compilation, which automatically loads pre-compiled model artifacts. We will update the documentation to make this clearer.

@yogendra-yatnalkar (Author)

Hi @dacorvo, thanks, this helped a lot. @jeffhataws understood, thanks for the explanation; adding this to the documentation will surely help.


For future readers facing this, just adding some extra information:

I was using the PyTorch 1.11 (AL2) DLAMI earlier, but on that image I was never able to update torch-neuronx to the required version.
After switching to the PyTorch 1.13 (AL2) DLAMI, I was able to update the drivers easily and things worked out of the box.
