Add AI Debate App Demo #2

Merged
merged 4 commits into from May 7, 2024
37 changes: 37 additions & 0 deletions AI Debate app.md
@@ -0,0 +1,37 @@
# AI Debate App

## Introduction
Debate-App is a web application that stages a back-and-forth conversation between two large language models (LLMs) on a topic chosen by the user. Users can select any two models, enter a query, and watch the dialogue between the LLMs unfold in real time. Built with Unify and deployed on Streamlit, the app lets users witness AI-generated debates and compare the capabilities of different language models.
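
The full implementation appears in the diff below; as a conceptual summary, the alternating reply-response cycle the app runs can be sketched with stub models standing in for real LLM endpoints (`make_stub_model` and `run_debate` are illustrative names, not part of the app):

```python
# Minimal sketch of the app's turn-taking scheme. Each debater receives
# the opponent's last reply as a user message and answers in turn; the
# stubs below just echo their input instead of calling a real model.

def make_stub_model(name):
    def generate(messages):
        # A real LLM would produce an argument; the stub echoes the prompt.
        last = messages[-1]["content"]
        return f"{name} responds to: {last!r}"
    return generate

def run_debate(model_for, model_against, topic, rounds=2):
    transcript = []
    reply = f"start debate on: {topic}"
    for _ in range(rounds):
        # The "for" side speaks first, then the "against" side answers it.
        reply = model_for([{"role": "user", "content": reply}])
        transcript.append(("for", reply))
        reply = model_against([{"role": "user", "content": reply}])
        transcript.append(("against", reply))
    return transcript

transcript = run_debate(make_stub_model("A"), make_stub_model("B"), "tabs vs spaces")
```

In the real app each turn additionally carries a system prompt fixing the model's side and persona, and responses are streamed to the Streamlit chat UI.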

## Quick Demo
https://github.com/Sanjay8602/Debate-App/assets/121057369/a81d4b3b-bf6f-44b6-bdba-f316b8054a77

## Repository and Deployment
Repo Link:
```commandline
https://github.com/Sanjay8602/Debate-App
```
Clone the repository:
```commandline
git clone https://github.com/Sanjay8602/Debate-App.git
```
Install the dependencies:
```commandline
pip install -r requirements.txt
```
Run the app locally:
```commandline
streamlit run app.py
```
Deployed app link:
```commandline
https://sanjay8602-debate-app-app-kt5o9f.streamlit.app/
```

## Contributors
Contributors to the project:

| Name | GitHub Profile |
|---------------|------------------------------------------------|
| Sanjay Suthar | [Sanjay8602](https://github.com/Sanjay8602) |
| Ogban Ugot | [ogbanugot](https://github.com/ogbanugot) |
1 change: 1 addition & 0 deletions Debate-App
Submodule Debate-App added at 3f1386
31 changes: 31 additions & 0 deletions debate/README.md
@@ -0,0 +1,31 @@
# LLM Debate App built using Unify and Streamlit

## Overview

This application enables a back-and-forth conversation between two LLMs on a topic chosen by the user.
Users can:
- choose any two models
- input a query
- visualize the dialogue between the LLMs

Built using [Unify](https://unify.ai/docs/index.html).

## Requirements for running locally
```commandline
git clone https://github.com/Sanjay8602/Debate-App.git
```

```commandline
pip install -r requirements.txt
```

```commandline
streamlit run app.py
```
## Contributors
Contributors to the project:

| Name | GitHub Profile |
|---------------|------------------------------------------------|
| Sanjay Suthar | [Sanjay8602](https://github.com/Sanjay8602) |
| Ogban Ugot | [ogbanugot](https://github.com/ogbanugot) |
143 changes: 143 additions & 0 deletions debate/app.py
@@ -0,0 +1,143 @@
import streamlit as st
from unify import Unify

st.set_page_config(page_title="Debate App built with Unify")


def start_interaction():
    st.session_state.continue_interaction = True


def stop_interaction():
    st.session_state.continue_interaction = False


def clear_history():
    st.session_state.continue_interaction = False
    st.session_state.model1_messages = []
    st.session_state.model2_messages = []


endpoints = ["llama-2-13b-chat@anyscale", "mistral-7b-instruct-v0.1@deepinfra", "gpt-4@deepinfra", "codellama-7b-instruct@octoai",
             "gpt-3.5-turbo@openai", "pplx-70b-chat@perplexity-ai", "llama-3-8b-chat@together-ai", "gemma-2b-it@together-ai", "gpt-4-turbo@openai",
             "deepseek-coder-33b-instruct@together-ai", "mistral-large@mistral-ai", "llama-3-8b-chat@fireworks-ai"]


def input_fields():
    with st.sidebar:
        st.session_state.unify_key = st.text_input("UNIFY KEY", type="password")

        st.image("robot_icon_green.png", width=20)
        st.session_state.llm_1 = st.selectbox(
            "Select LLM to debate supporting the topic",
            endpoints,
            key="endpoints_1_llm",
        )
        personas = ["factual", "funny", "silly", "serious", "angry"]
        st.session_state.llm_1_persona = st.selectbox(
            "Select the LLM's persona",
            personas,
            key="llm1_persona",
        )

        st.image("robot_icon_yellow.png", width=20)
        st.session_state.llm_2 = st.selectbox(
            "Select LLM to debate opposing the topic",
            endpoints,
            key="endpoints_2_llm",
        )
        st.session_state.llm_2_persona = st.selectbox(
            "Select the LLM's persona",
            personas,
            key="llm2_persona",
        )

        # Stop button: halts the debate at the last complete reply-response cycle
        if st.button("Stop debate", help="stop the debate at the last complete reply-response cycle"):
            stop_interaction()

        # Clear the stored chat history for both models
        if st.button("Clear chat history"):
            clear_history()


def initialize_model(llm_endpoint, unify_key):
    return Unify(
        api_key=unify_key,
        endpoint=llm_endpoint,
    )


# Generate a streamed response from a model, given the debate topic, the side
# it argues ("for"/"against"), its persona, and the conversation so far.
def generate_response(model, topic, position, persona, prompt):
    messages = [
        {"role": "system", "content": f"You are debating {position} the following topic: {topic}. "
                                      f"Consider the opposing points and provide a response. "
                                      f"Adopt a {persona} persona when responding."},
    ]
    messages.extend(prompt)
    return model.generate(messages=messages, stream=True)


def main():
    st.title("Debate App built with Unify")
    st.text("Choose two LLMs to debate each other on a given topic.")

    input_fields()

    if "continue_interaction" not in st.session_state:
        st.session_state.continue_interaction = True

    if "model1_messages" not in st.session_state:
        st.session_state.model1_messages = []

    if "model2_messages" not in st.session_state:
        st.session_state.model2_messages = []

    with st.form(key="my_form"):
        topic = st.text_input(label="Enter the debate topic here:")
        submit = st.form_submit_button(label="Start debate")

    # Replay any previous exchanges stored in session state
    if st.session_state.model1_messages and st.session_state.model2_messages:
        for model1_message, model2_message in zip(
                st.session_state.model1_messages, st.session_state.model2_messages):
            with st.chat_message(name="model1", avatar="robot_icon_green.png"):
                st.write(model1_message)
            with st.chat_message(name="model2", avatar="robot_icon_yellow.png"):
                st.write(model2_message)

    model1 = initialize_model(st.session_state.llm_1, st.session_state.unify_key)
    model2 = initialize_model(st.session_state.llm_2, st.session_state.unify_key)
    if submit:
        st.session_state.continue_interaction = True
        model1_messages = []
        model2_messages = []
        while st.session_state.continue_interaction:
            with st.chat_message(name="model1", avatar="robot_icon_green.png"):
                if len(model1_messages) == 0:
                    stream = generate_response(model1, topic, "for", st.session_state.llm_1_persona,
                                               [{"role": "user", "content": "start debate."}])
                else:
                    model1_messages.append({"role": "user", "content": model2_response})
                    stream = generate_response(model1, topic, "for", st.session_state.llm_1_persona, model1_messages)
                model1_response = st.write_stream(stream)
                model1_messages.append({"role": "assistant", "content": model1_response})
                st.session_state.model1_messages.append(model1_response)

            with st.chat_message(name="model2", avatar="robot_icon_yellow.png"):
                model2_messages.append({"role": "user", "content": model1_response})
                stream = generate_response(model2, topic, "against", st.session_state.llm_2_persona, model2_messages)
                model2_response = st.write_stream(stream)
                model2_messages.append({"role": "assistant", "content": model2_response})
                st.session_state.model2_messages.append(model2_response)


if __name__ == "__main__":
    main()
2 changes: 2 additions & 0 deletions debate/requirements.txt
@@ -0,0 +1,2 @@
streamlit~=1.33.0
unifyai~=0.8.2
Binary file added debate/robot_icon_green.png
Binary file added debate/robot_icon_yellow.png