
Segmentation fault after changing CUDA-based Python code to XPU-based Python code with the codegeex2-6b model #10296

Closed
ganghe opened this issue Mar 1, 2024 · 3 comments


ganghe commented Mar 1, 2024

Hi Team,

I tried to change the codegeex2 CUDA-based Python example code into XPU-based Python code, using the codegeex2-6b model file, but the program always crashes.

The cuda-based python file is here,
https://github.com/THUDM/CodeGeeX2/blob/main/demo/fastapicpu.py

I can run it on the CPU, e.g. "python fastapicpu.py --model-path codegeex2-6b --cpu".
But after changing the CUDA-based Python code to XPU-based code, I ran it on the GPU
with the following commands:
source /opt/intel/oneapi/setvars.sh --force
python -X faulthandler fastapicpu.py --model-path codegeex2-6b

The program then crashed at line 207:
"model = model.to('xpu')"

The related files are attached.
fastapicpu.old.txt - CUDA-based Python file
fastapicpu.py.txt - XPU-based Python file
cmd.txt - startup command script
output.txt - program output log
model file - https://hf-mirror.com/THUDM/codegeex2-6b

cmd.txt
fastapicpu.old.txt
fastapicpu.py.txt
output.txt


NovTi commented Mar 5, 2024

Thank you for your question. This problem has been reproduced, and I am currently working on it.


NovTi commented Mar 5, 2024

This issue is caused by the module import order. We suggest placing import torch above both from transformers import AutoTokenizer and from bigdl.llm.transformers import AutoModel, like this:

import torch
from transformers import AutoTokenizer
from bigdl.llm.transformers import AutoModel
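
In the context of the attached script, the fix would look roughly like the sketch below. This is a hedged illustration, not the exact contents of fastapicpu.py: the use of load_in_4bit and trust_remote_code reflects common bigdl-llm usage and may differ from the attachment, and the final line is the one reported to crash at line 207.

```python
import torch  # imported first, per the suggested fix for the segfault
from transformers import AutoTokenizer
from bigdl.llm.transformers import AutoModel

model_path = "codegeex2-6b"  # local path to the downloaded model files

# trust_remote_code is required for ChatGLM-family models such as codegeex2;
# load_in_4bit enables bigdl-llm low-bit loading (assumed here, not confirmed
# against the attached script).
tokenizer = AutoTokenizer.from_pretrained(model_path, trust_remote_code=True)
model = AutoModel.from_pretrained(model_path,
                                  load_in_4bit=True,
                                  trust_remote_code=True)

# Move the model to the Intel GPU; this is the line that previously
# segfaulted when torch was imported after the bigdl/transformers imports.
model = model.to('xpu')
```

Note that this sketch requires bigdl-llm and an Intel GPU runtime (oneAPI sourced, as in cmd.txt) to actually run.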


ganghe commented Mar 8, 2024

Hi NovTi,

Thanks for your help.
The fix works for me.

-Gang
