
Variable Gets Reset After Change Event #8217

Closed
1 task done
11301858 opened this issue May 5, 2024 · 11 comments
Labels
bug (Something isn't working), needs repro (Awaiting full reproduction)

Comments

11301858 commented May 5, 2024

Describe the bug

I am creating an app that allows the user to switch between different LLMs for responses through a chat interface. I have a dropdown that lets the user choose which LLM to use, and a variable that keeps track of the chosen LLM.

When the user makes a selection via the dropdown, I have verified that this variable is correctly updated. However, when my getModelResponse() function accesses the same variable, it seems to be using the old, unchanged value.

If this is not a bug, and instead a mistake on my part, please let me know what I can do to rectify it. I am brand new to gradio.

Have you searched existing issues? 🔎

  • I have searched and found no existing issues

Reproduction

A lot of code that was irrelevant to this issue has been deleted, so you might see references to a function that is not defined. Assume that any such function works properly.

import gradio as gr

supportedLLMs = {'a': 1, "b": 2}

NAME = "a"

def updateName(newName):
  NAME = newName;
  print("Changed model to " + NAME)

def getModelResponse(message, history):
      return NAME;

Chatbot = gr.ChatInterface(getModelResponse, chatbot = gr.Chatbot(likeable=True, show_share_button=True, show_copy_button=True, bubble_full_width = False), examples = ["What is the capital of France?", "What was John Lennon's first album?", "Write a rhetorical analysis of Hamlet"])
with gr.Blocks() as interface:



  with gr.Row():
    with gr.Column():
      Chatbot.render();
    with gr.Column():

      with gr.Accordion("Chatbot Configuration", open=True):
        LLMChoice = gr.Dropdown(choices = list(supportedLLMs.keys()), label = "Chatbot name", value = "Llama7b", interactive = True, info = "Cumulative chooses the best (most correct) reponse from all the LLM responses to a prompt.")
        LLMChoice.change(updateName, inputs = [LLMChoice], outputs = [])


interface.launch(share=True, debug = True);

Screenshot

Screen Shot 2024-05-05 at 1 20 41 PM

Logs

No errors are thrown

System Info

Gradio Environment Information:
------------------------------
Operating System: Linux
gradio version: 4.29.0
gradio_client version: 0.16.1

Severity

Blocking usage of gradio

11301858 added the bug (Something isn't working) label May 5, 2024
pngwn (Member) commented May 5, 2024

Could you modify the reproduction so we can run it locally please? We need to be able to run the reproduction in order to debug the issue.

pngwn added the needs repro (Awaiting full reproduction) label May 5, 2024
11301858 (Author) commented May 5, 2024

@pngwn Thanks for the reply. The code is very, very long and I wanted to provide a MWE. Are you sure you want me to modify it?

By the way, I like your profile picture :)

yvrjsharma (Collaborator) commented

Do you have a way of passing the dropdown value to the predict function? Have you tried using the additional_inputs parameter in your original code? If you haven't, that might be why you're seeing the value change but not seeing any effect.
You can learn more about this in our Guides here: https://www.gradio.app/guides/creating-a-chatbot-fast#additional-inputs.
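To illustrate the calling convention (a plain-Python sketch, not the full app; the function and parameter names here are illustrative): when components are listed in additional_inputs, gr.ChatInterface passes their current values as extra positional arguments after message and history.

```python
# Sketch of the predict-function signature gr.ChatInterface expects when
# additional_inputs=[dropdown] is supplied: the dropdown's current value
# is appended as a positional argument after message and history.
def get_model_response(message, history, llm_choice):
    # llm_choice holds whatever the dropdown was set to at submit time
    return f"[{llm_choice}] answering: {message}"

# Simulating a submit with the dropdown set to "b":
print(get_model_response("What is the capital of France?", [], "b"))
# -> [b] answering: What is the capital of France?
```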

11301858 (Author) commented May 6, 2024

@yvrjsharma Thanks for the pointer. Will definitely look into it.

@pngwn Here's the full code. Let me know if you need anything else. I initially didn't share it because I thought a minimal working example would be better.

import gradio as gr

supportedLLMs = {'a': 1, "b": 2}

NAME = "a"

def updateName(newName):
  NAME = newName;
  print("Changed model to " + NAME)

def getModelResponse(message, history):
      return NAME;

Chatbot = gr.ChatInterface(getModelResponse, chatbot = gr.Chatbot(likeable=True, show_share_button=True, show_copy_button=True, bubble_full_width = False), examples = ["What is the capital of France?", "What was John Lennon's first album?", "Write a rhetorical analysis of Hamlet"])
with gr.Blocks() as interface:



  with gr.Row():
    with gr.Column():
      Chatbot.render();
    with gr.Column():

      with gr.Accordion("Chatbot Configuration", open=True):
        LLMChoice = gr.Dropdown(choices = list(supportedLLMs.keys()), label = "Chatbot name", value = "Llama7b", interactive = True, info = "Cumulative chooses the best (most correct) reponse from all the LLM responses to a prompt.")
        LLMChoice.change(updateName, inputs = [LLMChoice], outputs = [])


interface.launch(share=True, debug = True);

pngwn (Member) commented May 6, 2024

I meant a minimal reproduction that we can run, not your full code.

Could you please simplify the example code, at least to remove any external dependencies other than gradio?

abidlabs (Member) commented

Will close for now, can reopen if we get a minimal repro.

abidlabs closed this as not planned (won't fix / can't repro / duplicate / stale) May 12, 2024
11301858 (Author) commented May 13, 2024

@abidlabs @pngwn
Hello,

Sorry for the delay. Here is the minimal example, which I've also edited into the previous comments:

import gradio as gr

supportedLLMs = {'a': 1, "b": 2}

NAME = "a"

def updateName(newName):
  NAME = newName;
  print("Changed model to " + NAME)

def getModelResponse(message, history):
      return NAME;

Chatbot = gr.ChatInterface(getModelResponse, chatbot = gr.Chatbot(likeable=True, show_share_button=True, show_copy_button=True, bubble_full_width = False), examples = ["What is the capital of France?", "What was John Lennon's first album?", "Write a rhetorical analysis of Hamlet"])
with gr.Blocks() as interface:



  with gr.Row():
    with gr.Column():
      Chatbot.render();
    with gr.Column():

      with gr.Accordion("Chatbot Configuration", open=True):
        LLMChoice = gr.Dropdown(choices = list(supportedLLMs.keys()), label = "Chatbot name", value = "Llama7b", interactive = True, info = "Cumulative chooses the best (most correct) reponse from all the LLM responses to a prompt.")
        LLMChoice.change(updateName, inputs = [LLMChoice], outputs = [])


interface.launch(share=True, debug = True);

pngwn reopened this May 13, 2024
yvrjsharma (Collaborator) commented

You need to pass the dropdown component as additional_inputs to your predict function, getModelResponse.
A couple of tweaks to your code from above got it working for me:

import gradio as gr

supportedLLMs = {'a': 1, "b": 2}

NAME = "a"

def updateName(newName):
  NAME = newName
  print("Changed model to " + NAME)

def getModelResponse(message, history, llmchoice):
    NAME = llmchoice
    print("Changed model to " + NAME)
    return NAME

chatbot = gr.Chatbot(likeable=True,
                     show_share_button=True, 
                     show_copy_button=True, 
                     bubble_full_width = False,
                     )
      
with gr.Blocks() as interface:
  with gr.Row():
    with gr.Column():
      with gr.Accordion("Chatbot Configuration", open=True):
        LLMChoice = gr.Dropdown(choices = list(supportedLLMs.keys()), label = "Chatbot name", value = "a", interactive = True, info = "Cumulative chooses the best (most correct) reponse from all the LLM responses to a prompt.")
        LLMChoice.change(updateName, inputs = [LLMChoice], outputs = [])

    with gr.Column():
      bot = gr.ChatInterface(getModelResponse,
                             chatbot=chatbot,
                             additional_inputs=[LLMChoice], 
                             examples = [["What is the capital of France?"], ["What was John Lennon's first album?"], ["Write a rhetorical analysis of Hamlet"]])

interface.launch(share=True, debug = True);


abidlabs (Member) commented

Thanks @yvrjsharma!

11301858 (Author) commented

@yvrjsharma Thanks. This works. I wasn't using additional_inputs earlier because I was experiencing some problems with it (there was a bug report filed about it by someone else, if I'm not mistaken), but it looks like that problem has since been fixed.

11301858 (Author) commented May 22, 2024

Hello @yvrjsharma ,

Just a quick follow-up here: how do you make the additional-inputs accordion sit in a separate column from the chat interface (i.e., side by side)?
