
[Question]: Error when exporting a static model after UIE fine-tuning #10421


Open
Chinesejunzai opened this issue Apr 16, 2025 · 1 comment
Assignees
Labels
question Further information is requested

Comments

@Chinesejunzai

Please describe your question

files may be required. You can download and install ccache from: https://github.com/ccache/ccache/blob/master/doc/INSTALL.md
warnings.warn(warning_message)
Traceback (most recent call last):
File "/mnt/hdd/houxiaojun/workspace/project/PaddleNLP_dev_20250407/llm/predict/export_model.py", line 22, in
from llm.predict.predictor import ModelArgument, PredictorArgument, create_predictor
ModuleNotFoundError: No module named 'llm'

The working directory is the llm folder.
The command that was run is: python predict/export_model.py --model_name_or_path ./checkpoints/laojun_data/uie_ckpts --output_path ./static_models/uie_export_model --dtype bfloat16 --inference_model 1 --append_attn 1
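A likely cause of the ModuleNotFoundError above: when a script is run as `python predict/export_model.py`, Python puts the script's own directory (predict/), not the current working directory, at the front of sys.path, so the top-level package llm is never resolvable even when running from inside the llm folder. A minimal self-contained sketch of that behavior (the directory names here are invented stand-ins, not the PaddleNLP tree):

```python
import subprocess
import sys
import tempfile
from pathlib import Path

# Hypothetical layout standing in for llm/predict/: a subdirectory
# containing a script that reports the first sys.path entry.
root = Path(tempfile.mkdtemp())
sub = root / "predict"
sub.mkdir()
(sub / "show_path.py").write_text("import sys; print(sys.path[0])\n")

# Invoke it the same way as the failing command: from the parent directory.
out = subprocess.run(
    [sys.executable, "predict/show_path.py"],
    cwd=root, capture_output=True, text=True,
).stdout.strip()

# sys.path[0] is the script's directory, not the working directory,
# so a package that lives in the working directory is not importable.
print(out.endswith("predict"))  # True
```

This is why the error persists regardless of which directory the command is launched from.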

@Chinesejunzai Chinesejunzai added the question Further information is requested label Apr 16, 2025
@Chinesejunzai
Author

In export_model.py, change "from llm.predict.predictor import ModelArgument, PredictorArgument, create_predictor" to "from predict.predictor import ModelArgument, PredictorArgument, create_predictor", then run "export PYTHONPATH=.:$PYTHONPATH" in the llm directory, and then run the export command again; the export completes normally.
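The PYTHONPATH step in the workaround above works because it puts the current directory on the import search path. A small runnable sketch of the same mechanism, using an invented package name ("llmdemo") in a temporary directory rather than the real PaddleNLP tree:

```python
import importlib
import sys
import tempfile
from pathlib import Path

# Build a throwaway package tree mimicking llm/predict/predictor.py;
# "llmdemo" is a made-up name used purely for illustration.
root = Path(tempfile.mkdtemp())
pkg = root / "llmdemo" / "predict"
pkg.mkdir(parents=True)
(root / "llmdemo" / "__init__.py").write_text("")
(pkg / "__init__.py").write_text("")
(pkg / "predictor.py").write_text("VALUE = 42\n")

def importable(name):
    """Return True if `name` resolves on the current sys.path."""
    try:
        importlib.import_module(name)
        return True
    except ModuleNotFoundError:
        return False

before = importable("llmdemo.predict.predictor")  # root not on sys.path yet

# Equivalent of running `export PYTHONPATH=.:$PYTHONPATH` in the repo root:
sys.path.insert(0, str(root))
importlib.invalidate_caches()
after = importable("llmdemo.predict.predictor")

print(before, after)  # False True
```

Rewriting the import to "from predict.predictor import ..." has the same effect from the other direction: it makes the import resolvable relative to the script's own parent directory instead of requiring the repository root on sys.path.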
