Groq-compatible OpenAI endpoint is not supported [Bug] #4599
Comments
Try removing the v1; the URL is mistyped.
It shouldn't be wrong; the documentation is here: https://console.groq.com/docs/openai. But I'll try it later as well.
Removing it works, thank you! (The model was added via the environment-variable configuration.)
I have the same problem, but deleting v1 still doesn't work.
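To illustrate why removing the trailing `/v1` helps: the client appends its own OpenAI-style API path to whatever base URL is configured, so a base URL that already ends in `/v1` produces the doubled `/v1/v1/` segment seen in the error. A minimal sketch (the `build_url` helper and the exact appended path are assumptions for illustration, not the project's actual code):

```python
# Sketch of how the request URL gets assembled from the configured base URL.
API_PATH = "/v1/chat/completions"  # path the client appends (assumption)

def build_url(base_url: str) -> str:
    """Join the configured base URL with the client's API path."""
    return base_url.rstrip("/") + API_PATH

# Base URL configured WITH a trailing /v1 -> doubled segment, which Groq rejects:
print(build_url("https://api.groq.com/openai/v1"))
# -> https://api.groq.com/openai/v1/v1/chat/completions

# Base URL WITHOUT /v1 -> the URL Groq expects:
print(build_url("https://api.groq.com/openai"))
# -> https://api.groq.com/openai/v1/chat/completions
```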
Bug Description
I want to use Groq's llama3 model and configured it in docker-compose.
After selecting the model, the following error is returned:
{
"error": {
"message": "Unknown request URL: POST /openai/v1/v1/chat/completions. Please check the URL for typos, or see the docs at https://console.groq.com/docs/",
"type": "invalid_request_error",
"code": "unknown_url"
}
}
The Gemini model works fine in next-web, and the Groq API works fine in open-webui.
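For reference, the fix discussed in the comments amounts to configuring the base URL without the trailing `/v1`. A hypothetical docker-compose fragment (service name, image, and model name are assumptions; only `BASE_URL` is the point):

```yaml
# Hypothetical docker-compose snippet; adjust names to your setup.
services:
  chatgpt-next-web:
    image: yidadaa/chatgpt-next-web
    environment:
      - OPENAI_API_KEY=your-groq-key
      - BASE_URL=https://api.groq.com/openai  # no trailing /v1
      - CUSTOM_MODELS=+llama3-8b-8192         # model added via variables (assumed name)
```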
Steps to Reproduce
{
"error": {
"message": "Unknown request URL: POST /openai/v1/v1/chat/completions. Please check the URL for typos, or see the docs at https://console.groq.com/docs/",
"type": "invalid_request_error",
"code": "unknown_url"
}
}
Expected Behavior
I would like to be able to use Groq's llama3 model.
Screenshots
No response
Deployment Method
Desktop OS
No response
Desktop Browser
No response
Desktop Browser Version
No response
Smartphone Device
No response
Smartphone OS
No response
Smartphone Browser
No response
Smartphone Browser Version
No response
Additional Logs
No response