
BUG: Multi-Model Sequential Response Generation #2209

Closed
4 tasks done
cybersholt opened this issue May 12, 2024 · 1 comment
Bug Report

Description

Bug Summary:

When multiple models are selected in OpenWebUI and a question is submitted, the response generation incorrectly starts with the last selected model rather than the first. This behavior is counterintuitive, as users expect the response sequence to begin with the first model selected.

Steps to Reproduce:

Launch OpenWebUI.
Select multiple models (e.g., five different models).
Submit a question or input to these models.
Observe the order in which the responses are generated.

Expected Behavior:

The response generation should initiate from the first selected model and proceed sequentially through to the last model selected.

Actual Behavior:

The response generation initiates from the last model selected and proceeds in reverse order.
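The ordering described above can be illustrated with a minimal sketch (hypothetical names; this is not the actual Open WebUI code). If the selected models are iterated in reversed order before generation starts, responses begin with the last model selected:

```python
# Minimal sketch of the suspected ordering bug. Names are hypothetical;
# the real Open WebUI dispatch code differs.

def generate_responses(selected_models):
    """Yield (model, response) pairs in the order responses are produced."""
    for model in selected_models:  # should iterate in selection order
        yield model, f"response from {model}"

models = ["model-1", "model-2", "model-3", "model-4", "model-5"]

# Expected behavior: the first selected model responds first.
expected_order = [m for m, _ in generate_responses(models)]

# Actual (buggy) behavior: generation effectively starts from the last
# selected model, as if the list were reversed somewhere upstream.
buggy_order = [m for m, _ in generate_responses(list(reversed(models)))]
```

Here `expected_order` preserves the selection order, while `buggy_order` matches the reversed order reported in this issue.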

Environment

  • Open WebUI Version: 0.1.124

  • Ollama (if applicable): 0.1.36

  • Operating System: Windows 10

  • Browser (if applicable): Brave 1.65.132, Chromium 124.0.6367.202

Reproduction Details

Confirmation:

  • I have read and followed all the instructions provided in the README.md.
  • I am on the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.
  • I have included the Docker container logs.

Logs and Screenshots

Browser Console Logs:

Docker Container Logs:

Screenshots (if applicable):

Installation Method

Running with Docker on Windows 10 with docker compose

Additional Information

Seems like a pretty straightforward fix; I think it should jump to whichever model is outputting first.

@cybersholt cybersholt changed the title Multi-Model Sequential Response Generation BUG: Multi-Model Sequential Response Generation May 12, 2024
@tjbck tjbck self-assigned this May 13, 2024
@tjbck tjbck mentioned this issue May 19, 2024
tjbck (Contributor) commented May 19, 2024

Closing in favour of #2237

@tjbck tjbck closed this as completed May 19, 2024
Labels: none yet
Projects: none yet
Development: no branches or pull requests
2 participants