Download llama_cpp_python-0.2.75.tar.gz failure #1516
Comments
You can try the Tsinghua mirror: https://mirrors.tuna.tsinghua.edu.cn/help/pypi/
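For reference, installing through that mirror looks roughly like this (the index URL below is the one documented on the linked help page, not taken from this thread):

    pip install -i https://pypi.tuna.tsinghua.edu.cn/simple "xinference[all]"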
Collecting vllm-nccl-cu12<2.19,>=2.18 (from vllm<0.4.2,>=0.2.6->xinference[all])
This download link works fine for me, so your network may be the issue. You can try another PyPI mirror, e.g. http://mirrors.aliyun.com/pypi/simple/
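Since that mirror is served over plain http, pip will normally refuse it unless the host is marked trusted; a sketch of the full command:

    pip install -i http://mirrors.aliyun.com/pypi/simple/ --trusted-host mirrors.aliyun.com "xinference[all]"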
This issue is stale because it has been open for 7 days with no activity.
This issue was closed because it has been inactive for 5 days since being marked as stale. |
Describe the bug
pip install "xinference[all]" , Downloading llama_cpp_python-0.2.75.tar.gz fail
To Reproduce
Run pip install "xinference[all]" and wait for pip to download llama_cpp_python-0.2.75.tar.gz.
Expected behavior
The llama_cpp_python-0.2.75.tar.gz download completes and the installation finishes successfully.
Additional context
When I tried to download this file, the download failed with a network timeout, and the download speed was very slow. It is most likely a problem with my campus network, but is there a workaround?
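If the campus network is merely slow rather than blocking the host, one possible workaround is to raise pip's network timeout (--default-timeout is a standard pip option), optionally combined with one of the mirrors suggested above:

    pip install --default-timeout=120 "xinference[all]"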