Groq-compatible OpenAI endpoint is not supported [Bug] #4599

Closed
GitSarp opened this issue Apr 30, 2024 · 9 comments
Labels
bug Something isn't working

Comments

GitSarp commented Apr 30, 2024

Bug Description

I want to use Groq's llama3 model, so I configured docker-compose with:

- "BASE_URL=https://api.groq.com/openai/v1"
- "OPENAI_API_KEY=xxxxx"

After selecting the model, requests return this error:

{
  "error": {
    "message": "Unknown request URL: POST /openai/v1/v1/chat/completions. Please check the URL for typos, or see the docs at https://console.groq.com/docs/",
    "type": "invalid_request_error",
    "code": "unknown_url"
  }
}

The Gemini model works fine in next-web, and the Groq API works fine in open-webui.
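
From the error it looks like the client appends the OpenAI path /v1/chat/completions to BASE_URL on its own (an inference from the error message, not confirmed against the code), so a BASE_URL that already ends in /v1 produces a doubled segment:

    https://api.groq.com/openai/v1 + /v1/chat/completions
    -> https://api.groq.com/openai/v1/v1/chat/completions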

Steps to Reproduce

  1. Configure docker-compose with:
     - "BASE_URL=https://api.groq.com/openai/v1"
     - "OPENAI_API_KEY=xxxxx"
  2. Start the container; requests fail with:
     {
       "error": {
         "message": "Unknown request URL: POST /openai/v1/v1/chat/completions. Please check the URL for typos, or see the docs at https://console.groq.com/docs/",
         "type": "invalid_request_error",
         "code": "unknown_url"
       }
     }

Expected Behavior

I expect to be able to use Groq's llama3 model.

Screenshots

No response

Deployment Method

  • Docker
  • Vercel
  • Server

Desktop OS

No response

Desktop Browser

No response

Desktop Browser Version

No response

Smartphone Device

No response

Smartphone OS

No response

Smartphone Browser

No response

Smartphone Browser Version

No response

Additional Logs

No response

GitSarp added the bug label Apr 30, 2024

Kosette (Contributor) commented Apr 30, 2024

Try removing the v1; the URL is misspelled.

GitSarp (Author) commented Apr 30, 2024

> Try removing the v1; the URL is misspelled.

It shouldn't be misspelled; the docs are here: https://console.groq.com/docs/openai. But I'll try it later anyway.

GitSarp (Author) commented Apr 30, 2024

> Try removing the v1; the URL is misspelled.

Removing it works now, thanks! (The model was added through the environment-variable configuration.)
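
For anyone else hitting this, the working environment looks roughly like this (a sketch; the CUSTOM_MODELS entry and the llama3-8b-8192 model id are illustrative assumptions about how the model gets added via a variable, not confirmed in this thread):

- "BASE_URL=https://api.groq.com/openai"
- "OPENAI_API_KEY=xxxxx"
- "CUSTOM_MODELS=+llama3-8b-8192"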

GitSarp closed this as completed Apr 30, 2024

@LuyaoZhuang

I have the same problem, but removing v1 still doesn't work.
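
One way to narrow this down is to call the Groq endpoint directly and compare it with what the container sends (a sketch; the model id and the $GROQ_API_KEY placeholder are illustrative):

    curl https://api.groq.com/openai/v1/chat/completions \
      -H "Authorization: Bearer $GROQ_API_KEY" \
      -H "Content-Type: application/json" \
      -d '{"model": "llama3-8b-8192", "messages": [{"role": "user", "content": "hello"}]}'

If that works, the remaining problem is likely in how BASE_URL reaches the container (for example a stale value or an extra path segment), not in the Groq API itself.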

