Prerequisites
- I am running the latest code. Mention the version if possible as well.
- I carefully followed the README.md.
- I searched using keywords relevant to my issue to make sure that I am creating a new issue that is not already open (or closed).
- I reviewed the Discussions, and have a new and useful enhancement to share.
Feature Description
Support the q URL parameter in the webui of llama-server to provide an initial prompt. With the current version of llama-server (commit 19a5a3e) the parameter is ignored.
Motivation
The Firefox AI chatbot tool seems to rely on the q parameter to send its prompts (for page summaries, explaining the selected text, ...) when the about:config flag to use a local LLM server is set.
The integration also used to work with the previous React webui (I tested with version b6318).
Possible Implementation
Here’s a page describing the integration with open-webui, which also mentions other URL parameters, but AFAIK q alone would already enable most of the contextual functionality in Firefox. I’m not even sure the other URL parameters are currently used by Firefox.
https://docs.openwebui.com/tutorials/integrations/firefox-sidebar/
As a way to test without Firefox, opening http://localhost:8080/?q=<prompt> should start a conversation with the given prompt.
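
A minimal sketch of what the webui could do on load, using standard browser APIs. The sendUserMessage helper is hypothetical and stands in for whatever function the current webui uses to start a new conversation with a user message:

```ts
// Read the q parameter on page load and use it as the initial prompt.
// sendUserMessage is a hypothetical stand-in for the webui's own
// "submit a user message / start a conversation" function.
function handleInitialQuery(sendUserMessage: (prompt: string) => void): void {
  const params = new URLSearchParams(window.location.search);
  const prompt = params.get("q");
  if (prompt && prompt.trim().length > 0) {
    // Start a new conversation with the prompt supplied via the URL.
    sendUserMessage(prompt.trim());

    // Optionally strip q from the address bar so a reload does not
    // resubmit the same prompt.
    params.delete("q");
    const rest = params.toString();
    const newUrl = window.location.pathname + (rest ? "?" + rest : "");
    window.history.replaceState(null, "", newUrl);
  }
}
```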