Variable Gets Reset After Change Event #8217
Could you modify the reproduction so we can run it locally, please? We need to be able to run the reproduction in order to debug the issue.
@pngwn Thanks for the reply. The code is very, very long and I wanted to provide a MWE. Are you sure you want me to modify it? By the way, I like your profile picture :)
Do you have a way of passing the dropdown value to the predict function? Have you tried using the `additional_inputs` parameter of `gr.ChatInterface`?
@yvrjsharma Thanks for the pointer. Will definitely look into it. @pngwn Here's the full code. Let me know if you need anything else. I initially didn't share it because I thought a minimal working example would be better.

```python
import gradio as gr

supportedLLMs = {'a': 1, "b": 2}
NAME = "a"

def updateName(newName):
    NAME = newName
    print("Changed model to " + NAME)

def getModelResponse(message, history):
    return NAME

Chatbot = gr.ChatInterface(
    getModelResponse,
    chatbot=gr.Chatbot(likeable=True, show_share_button=True, show_copy_button=True, bubble_full_width=False),
    examples=["What is the capital of France?", "What was John Lennon's first album?", "Write a rhetorical analysis of Hamlet"],
)

with gr.Blocks() as interface:
    with gr.Row():
        with gr.Column():
            Chatbot.render()
        with gr.Column():
            with gr.Accordion("Chatbot Configuration", open=True):
                LLMChoice = gr.Dropdown(
                    choices=list(supportedLLMs.keys()),
                    label="Chatbot name",
                    value="Llama7b",
                    interactive=True,
                    info="Cumulative chooses the best (most correct) response from all the LLM responses to a prompt.",
                )
                LLMChoice.change(updateName, inputs=[LLMChoice], outputs=[])

interface.launch(share=True, debug=True)
```
I meant a minimal reproduction that we can run, not your full code. Could you simplify the example code please, at least to remove any external dependencies other than gradio?
Will close for now; can reopen if we get a minimal repro.
Sorry for the delay. Here is the minimal example, which I've also edited into the previous comments:

```python
import gradio as gr

supportedLLMs = {'a': 1, "b": 2}
NAME = "a"

def updateName(newName):
    NAME = newName
    print("Changed model to " + NAME)

def getModelResponse(message, history):
    return NAME

Chatbot = gr.ChatInterface(
    getModelResponse,
    chatbot=gr.Chatbot(likeable=True, show_share_button=True, show_copy_button=True, bubble_full_width=False),
    examples=["What is the capital of France?", "What was John Lennon's first album?", "Write a rhetorical analysis of Hamlet"],
)

with gr.Blocks() as interface:
    with gr.Row():
        with gr.Column():
            Chatbot.render()
        with gr.Column():
            with gr.Accordion("Chatbot Configuration", open=True):
                LLMChoice = gr.Dropdown(
                    choices=list(supportedLLMs.keys()),
                    label="Chatbot name",
                    value="Llama7b",
                    interactive=True,
                    info="Cumulative chooses the best (most correct) response from all the LLM responses to a prompt.",
                )
                LLMChoice.change(updateName, inputs=[LLMChoice], outputs=[])

interface.launch(share=True, debug=True)
```
You need to pass the dropdown component as an entry in `additional_inputs` of `gr.ChatInterface`:

```python
import gradio as gr

supportedLLMs = {'a': 1, "b": 2}
NAME = "a"

def updateName(newName):
    NAME = newName
    print("Changed model to " + NAME)

def getModelResponse(message, history, llmchoice):
    NAME = llmchoice
    print("Changed model to " + NAME)
    return NAME

chatbot = gr.Chatbot(
    likeable=True,
    show_share_button=True,
    show_copy_button=True,
    bubble_full_width=False,
)

with gr.Blocks() as interface:
    with gr.Row():
        with gr.Column():
            with gr.Accordion("Chatbot Configuration", open=True):
                LLMChoice = gr.Dropdown(
                    choices=list(supportedLLMs.keys()),
                    label="Chatbot name",
                    value="a",
                    interactive=True,
                    info="Cumulative chooses the best (most correct) response from all the LLM responses to a prompt.",
                )
                LLMChoice.change(updateName, inputs=[LLMChoice], outputs=[])
        with gr.Column():
            bot = gr.ChatInterface(
                getModelResponse,
                chatbot=chatbot,
                additional_inputs=[LLMChoice],
                examples=[["What is the capital of France?"], ["What was John Lennon's first album?"], ["Write a rhetorical analysis of Hamlet"]],
            )

interface.launch(share=True, debug=True)
```
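The pattern behind this fix generalizes: each component listed in `additional_inputs` has its current value passed to the handler as an extra positional argument on every call, so the handler never needs to read shared mutable state. A gradio-free sketch of the same idea (names are illustrative):

```python
NAME = "a"

def respond_via_global(message, history):
    # Fragile: the result depends on whoever last mutated NAME.
    return NAME

def respond_via_argument(message, history, llm_choice):
    # Robust: the current dropdown value arrives as an argument per call,
    # which is how additional_inputs delivers it.
    return llm_choice

print(respond_via_global("hi", []))          # -> "a"
print(respond_via_argument("hi", [], "b"))   # -> "b", regardless of NAME
```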
Thanks @yvrjsharma!
@yvrjsharma Thanks, this works. I wasn't using `additional_inputs` earlier because I was experiencing some problems with it (there was a bug report filed about it by someone else, if I'm not mistaken), but it looks like that problem has been fixed.
Hello @yvrjsharma, just a quick follow-up: how do you make the additional-inputs accordion sit in a separate column from the chat interface (i.e., side by side)?
Describe the bug
I am creating an app that allows the user to switch between different LLMs for responses through a chat interface. A dropdown lets the user choose which LLM to use, and a variable keeps track of the chosen LLM.
When the user makes a selection via the dropdown, I have verified that this variable is correctly updated. However, when my getModelResponse() function accesses the same variable, it seems to use the old, unchanged value.
If this is not a bug, and instead a mistake on my part, please let me know what I can do to rectify it. I am brand new to gradio.
Have you searched existing issues? 🔎
Reproduction
A lot of code that was irrelevant to this issue has been deleted, so you might see references to a function that is not defined; assume that function works properly.
Screenshot
Logs
System Info
Severity
Blocking usage of gradio