
[Bug] Semantic search broken using ollama #6948

Open
ianks opened this issue Mar 14, 2025 · 2 comments
Labels
🐛 Bug (Something isn't working | 缺陷) · ollama (Relative to Ollama Provider and ollama models)

Comments

ianks commented Mar 14, 2025

📦 Platform

Self hosting Docker

📦 Deployment mode

server db(lobe-chat-database image)

📌 Version

1.70.11

💻 Operating System

Other Linux

🌐 Browser

Chrome

🐛 Bug Description

Requests using semantic search recently started failing, even though they previously worked with the same DEFAULT_FILES_CONFIG=embedding_model=ollama/bge-m3. Something along the way seems to be returning a 404, but the logs don't help very much:

Getting text from response
2025/03/13 10:54PM 30 pid=28 hostname=lobe msg=Error in tRPC handler (lambda) on path: chunk.semanticSearchForChat, type: mutation
Error [TRPCError]: 
    at o (.next/server/chunks/54689.js:1:1504)
    at f (.next/server/chunks/54689.js:4:286)
    at async f (.next/server/chunks/54689.js:4:68)
    at async f (.next/server/chunks/54689.js:4:68)
    at async f (.next/server/chunks/54689.js:4:68)
    at async f (.next/server/chunks/54689.js:4:68)
    at async r (.next/server/chunks/54689.js:1:8297)
    at async (.next/server/chunks/78604.js:1:25947) {
  code: 'INTERNAL_SERVER_ERROR',
  [cause]: Error: 
      at <unknown> (.next/server/chunks/54689.js:1:1815)
      at new a (.next/server/chunks/54689.js:1:1857)
      at o (.next/server/chunks/54689.js:1:1504)
      at f (.next/server/chunks/54689.js:4:286)
      at async f (.next/server/chunks/54689.js:4:68)
      at async f (.next/server/chunks/54689.js:4:68)
      at async f (.next/server/chunks/54689.js:4:68)
      at async f (.next/server/chunks/54689.js:4:68)
      at async r (.next/server/chunks/54689.js:1:8297) {
    error: {
      message: '<html>\r\n' +
        '<head><title>404 Not Found</title></head>\r\n' +
        '<body>\r\n' +
        '<center><h1>404 Not Found</h1></center>\r\n' +
        '<hr><center>nginx</center>\r\n' +
        '</body>\r\n' +
        '</html>\r\n',
      name: 'ResponseError',
      status_code: 404
    },
    errorType: 'OllamaBizError',
    provider: 'ollama'
  }
}

I've confirmed that the bge-m3 model is downloaded, and that ollama chat still works. This is a newly introduced bug.
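For anyone trying to narrow this down, a quick sanity check (a sketch only, assuming Ollama is exposed at http://localhost:11434; adjust the host/port to your setup) is to list the models and call the embeddings endpoint directly, bypassing LobeChat:

```sh
# List installed models; bge-m3 should show up here.
curl http://localhost:11434/api/tags

# Call the embeddings endpoint directly. A JSON vector back means Ollama
# itself is fine and the 404 is introduced somewhere in between; an nginx
# 404 here would mean the reverse proxy isn't forwarding /api/embeddings.
curl http://localhost:11434/api/embeddings \
  -d '{"model": "bge-m3", "prompt": "hello world"}'
```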

📷 Recurrence Steps

No response

🚦 Expected Behavior

No response

📝 Additional Information

No response

@ianks ianks added the 🐛 Bug Something isn't working | 缺陷 label Mar 14, 2025
@github-project-automation github-project-automation bot moved this to Roadmap - Chat 1.x in Lobe Chat Routine Mar 14, 2025
lobehubbot (Member) commented
👀 @ianks

Thank you for raising an issue. We will investigate into the matter and get back to you as soon as possible.
Please make sure you have given us as much context as possible.

@dosubot dosubot bot added the ollama Relative to Ollama Provider and ollama models label Mar 14, 2025
dosubot bot commented Mar 14, 2025

Hey @ianks! I'm here to help you with any bugs, questions, or contributions. Let's tackle this issue together!

A similar issue was previously reported and closed: "[Bug] Can't use Ollama custom model" [1]. The issue was resolved by a pull request that fully supported Ollama with browser request mode [2].

For your current issue, a 404 error in the context of semantic search using the Ollama model might be caused by an invalid proxy URL configuration. This error can occur if Ollama is configured with an HTTP proxy URL, leading to URL construction errors. Removing the Ollama proxy URL configuration might resolve the issue [3].

Ensure that your configuration settings, especially related to the proxy, are correct. If the problem persists, reviewing the changes made in the pull request that addressed the previous issue might provide additional insights [2].
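For reference, a minimal sketch of the environment variables involved (the values are illustrative, not taken from this report; OLLAMA_PROXY_URL is the base URL LobeChat uses to reach Ollama, and the exact host and port depend on your deployment):

```sh
# Illustrative .env snippet for the lobe-chat-database deployment.
# If OLLAMA_PROXY_URL points at an nginx location that does not forward
# /api/embeddings, embedding requests can 404 while chat keeps working.
OLLAMA_PROXY_URL=http://ollama:11434

# Embedding model used for file chunking / semantic search (from this report)
DEFAULT_FILES_CONFIG=embedding_model=ollama/bge-m3
```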

To continue talking to Dosu, mention @dosu.


