Title: [Bug] Local ollama cannot chat
Change the API endpoint back to /api/openai, add a .env file to the project, and set BASE_URL=http://localhost:11434
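A minimal sketch of the suggested fix, assuming the project loads a `.env` file from its root directory at startup (the file location is an assumption; the `BASE_URL` variable and port come from the comment above):

```shell
# .env — place in the project root
# Point the app's OpenAI-compatible /api/openai route at the local Ollama server
BASE_URL=http://localhost:11434
```

Before restarting the app, it may help to confirm Ollama is actually reachable at that address, e.g. by opening http://localhost:11434 in a browser or running `curl http://localhost:11434/api/tags` to list the locally available models.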
Have you solved it? My problem is similar to yours.
Bug Description
Local Ollama cannot chat. I don't know which setting is wrong; please help me!
Steps to Reproduce
Local Ollama cannot chat.
Expected Behavior
Local Ollama cannot chat, but other applications can use it.
Screenshots
Deployment Method
Desktop OS
Windows 11
Desktop Browser
Edge
Desktop Browser Version
No response
Smartphone Device
No response
Smartphone OS
No response
Smartphone Browser
No response
Smartphone Browser Version
No response
Additional Logs
No response