Feel free to adjust the title of this bug report, I know it's a terrible one.
Bug Report
Description
Bug Summary: The RAG feature loses context on model response regeneration when using Groq models via the API integration into Open WebUI.
Expected Behavior:
The RAG feature should maintain context and provide consistent responses throughout the conversation, even on model response regeneration.
Actual Behavior:
The RAG feature does not maintain context and provides inconsistent responses throughout the conversation when using Groq models via the API integration into Open WebUI. However, when using a locally installed LLM, the RAG feature is able to maintain context and provide consistent responses throughout the conversation.
Environment
Operating System:
Windows 11 Pro Insider Preview
Version 22H2
Installed on 2/4/2024
OS build 23620.1000
Experience Windows Feature Experience Pack 1000.23620.1000.0
Browser: Firefox v123.0.1 (64-bit)
Reproduction Details
Steps to Reproduce:
Integrate Groq models via the API into Open WebUI
Interact with the RAG feature
Observe the initial response
Regenerate the response
Observe the loss of context in the regenerated response
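The steps above can be sketched in Python. This is a minimal illustration of the suspected failure mode, assuming the OpenAI-compatible chat-completions payload shape that Groq's API accepts: the retrieved RAG chunks must be resent in the `messages` list on every request, including regeneration. The `build_messages` helper and the context string are hypothetical, not Open WebUI internals.

```python
# Sketch of how RAG context should be carried through a regeneration.
# build_messages() is a hypothetical helper; the payload shape follows the
# OpenAI-compatible chat-completions format (role/content message dicts).

RAG_CONTEXT = "Retrieved document chunk: ..."  # placeholder retrieved text


def build_messages(user_prompt, rag_context):
    """Assemble the messages list for one chat-completion request."""
    messages = [{"role": "system", "content": "You are a helpful assistant."}]
    if rag_context is not None:
        # Retrieved chunks are injected as extra context before the prompt.
        messages.append({"role": "system", "content": rag_context})
    messages.append({"role": "user", "content": user_prompt})
    return messages


# Initial request: RAG context is present, so the model can answer from it.
first = build_messages("Summarize the document.", RAG_CONTEXT)

# Buggy regeneration: the client rebuilds the request but drops the retrieved
# context, which matches the loss of context reported here.
regen_buggy = build_messages("Summarize the document.", None)

# Correct regeneration: the same retrieved context is resent unchanged.
regen_fixed = build_messages("Summarize the document.", RAG_CONTEXT)
```

If the regeneration path rebuilds the request without re-attaching the retrieved chunks (as in `regen_buggy`), the API-backed model answers from the bare prompt alone, which would explain why a local model served through a different code path is unaffected.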
Confirmation:
I have read and followed all the instructions provided in the README.md.
I am on the latest version of both Open WebUI and Ollama.
Additional Information
Note: This issue does not occur when using a locally installed LLM. The RAG feature is able to maintain context and provide consistent responses throughout the conversation when using a locally installed LLM. As far as I am aware, this issue is specific to the integration of Groq models via the API in Open WebUI, but it could potentially affect other API integrations that I don't have access to.
Edit: If anyone could replicate this issue and verify that it is indeed a bug, please share down below in the comments.
tjbck linked a pull request on Mar 9, 2024 that will close this issue.