Can't set model alias #520
Comments
@hongyin163 thanks for reporting this, it was an unintended breaking change when updating to pydantic v2, which reserves any field names starting with `model_`.
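The clash described above can be reproduced with a minimal sketch (illustrative only, not the actual llama-cpp-python settings class): defining any pydantic v2 model with a field in the reserved `model_` namespace emits a `UserWarning` at class-creation time.

```python
# Minimal reproduction of the pydantic v2 protected-namespace warning.
# The Settings class here is illustrative, not the real server settings.
import warnings

from pydantic import BaseModel

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")

    class Settings(BaseModel):
        # Any field starting with "model_" collides with pydantic's
        # protected "model" namespace (model_dump, model_validate, ...).
        model_alias: str = "gpt-4"

# At least one captured warning mentions the protected namespace.
print(any("protected namespace" in str(w.message) for w in caught))
```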
Hi, am I running into the same issue? I am on a Windows environment and can run `python ./examples/low_level_api/Chat.py` fine. Running `python -m llama_cpp.server --model ./../../llama.cpp/models/v1/7B/ggml-model-q4_0.bin --port 7777 --host 192.168.0.1 --n_gpu_layers 30 --n_threads 4 --n_ctx 2048` prints `ggml_init_cublas: found 1 CUDA devices:` followed by the warning "You may be able to resolve this warning by setting"
I was able to change `model_alias` to `alias` in app.py and the warning went away, but the error is still there.
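Rather than renaming the field, pydantic v2 also allows opting out of the protected namespace entirely. A hedged sketch, assuming a settings class along these lines (the class and field names here are illustrative, not the actual app.py code):

```python
# One way to silence the namespace clash without renaming the field.
from typing import Optional

from pydantic import BaseModel, ConfigDict

class Settings(BaseModel):
    # An empty tuple disables pydantic v2's protected "model_" namespace,
    # so a field named model_alias no longer triggers the UserWarning.
    model_config = ConfigDict(protected_namespaces=())

    model: str = "models/ggml-model-q4_0.bin"
    model_alias: Optional[str] = None

s = Settings(model_alias="gpt-4")
print(s.model_alias)  # -> gpt-4
```

The trade-off is that disabling the namespace check also silences warnings about genuinely conflicting names like `model_dump`, so renaming the field is the more conservative fix.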
Same issue, any update?
I have the same problem with: `docker run --rm -it -p 8001:8001 -v ./modellollama:/models -e MODEL=/models/llama-2-7b-chat.ggmlv3.q2_K.bin ghcr.io/abetlen/llama-cpp-python:latest`
Same issue when I add it, then got the error.
bump
Still running into the same issue. What is the location of the `model_alias` variable to change? `2023-11-16 17:37:06 /usr/local/lib/python3.10/dist-packages/pydantic/_internal/_fields.py:128: UserWarning: Field "model_alias" has conflict with protected namespace "model".`
@kawnah are you running the latest version of `llama-cpp-python`?
Still running into the same issue. |
Prerequisites
Please answer the following questions for yourself before submitting an issue.
Expected Behavior
I run the following command to start the server. I can run it if I don't set the `--model_alias` parameter, but if I set `--model_alias` as above, I get the error below.
Current Behavior
The execution result is as follows:
Environment and Context
Please provide detailed information about your computer setup. This is important in case the issue is not reproducible except for under certain specific conditions.
$ lscpu
$ uname -a
Darwin honyindeMacBook-Pro.local 22.3.0 Darwin Kernel Version 22.3.0: Mon Jan 30 20:39:46 PST 2023; root:xnu-8792.81.3~2/RELEASE_ARM64_T6020 arm64
Failure Information (for bugs)
Steps to Reproduce
python -m llama_cpp.server --model models/airoboros-7b-gpt4/airoboros-7b-gpt4-1.4.ggmlv3.q4_0.bin --model_alias gpt-4
Failure Logs
I tried to debug the code to work around this problem: if the `type` of `add_argument` isn't set, it works.
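That observation is consistent with argparse's contract for `type`: it must be a plain callable such as `str`, not a typing annotation. A sketch of the failure mode, assuming the server was forwarding a field annotation like `Optional[str]` into `add_argument` (the flag name matches this report, but the snippet is illustrative):

```python
# Omitting type= keeps the raw string and works fine; passing a typing
# construct as type= makes argparse's conversion call blow up at parse time.
import argparse
from typing import Optional

ok = argparse.ArgumentParser()
ok.add_argument("--model_alias")  # no type= -> value stays a str
args = ok.parse_args(["--model_alias", "gpt-4"])
print(args.model_alias)  # -> gpt-4

bad = argparse.ArgumentParser()
# Optional[str] is not instantiable; argparse calls it on the argument
# string, the TypeError is converted to a parse error, and the parser exits.
bad.add_argument("--model_alias", type=Optional[str])
try:
    bad.parse_args(["--model_alias", "gpt-4"])
except SystemExit:
    print("parse failed")
```

Exact behavior can vary slightly across Python versions, but passing `type=str` (or omitting `type`) instead of the annotation avoids the crash either way.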