
Add serialize methods for ChatMessage + ChatFeed #5764

Merged

philippjfr merged 9 commits into main from add_serialize_methods on Nov 15, 2023

Conversation


@ahuang11 ahuang11 commented Oct 27, 2023

I'm trying to make ChatInterface easy to use for fine-tuning LLMs. The first step is making the conversation easy to export, in a form ready for use with transformers.

This kind of addresses #5567, but specifically for use with fine-tuning transformers (https://huggingface.co/docs/transformers/main/chat_templating), not ChatInterface.

I was initially thinking of combining these like in the ChatBox export method (#5178), but decided against it since the logic got long.

There are a lot of ways to serialize the widgets / panes / etc., so I added format_func as an escape hatch for users who don't agree with the default.

Would like some early thoughts before I add tests.

```python
import panel as pn
import numpy as np
import pandas as pd

pn.extension()

# A ChatMessage whose contents mix widgets, panes, and a DataFrame
msg = pn.chat.ChatMessage(
    pn.Row(
        pn.widgets.FloatSlider(value=10),
        pn.widgets.Checkbox(name="Check me!"),
        pn.pane.HTML("Hello World!"),
        pn.pane.DataFrame(
            pd.DataFrame(np.random.randn(10, 3), columns=list("ABC"))
        ),
    ),
)
print(str(msg))
```
  • Add tests
  • Use user from ChatInterface in user_names
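For context, the transformers chat-templating docs linked above expect a conversation as a list of role/content dicts, so a serialize method essentially has to flatten each message into that shape. A minimal standalone sketch of that target format (function and parameter names here are illustrative, not the PR's actual API):

```python
# Illustrative sketch only: transformers' chat templating expects a list of
# {"role": ..., "content": ...} dicts. format_func plays the role of the
# escape hatch described above, turning arbitrary contents into a string.

def serialize_messages(messages, default_role="assistant", user_names=None, format_func=str):
    """Convert (user, contents) pairs into transformers-style chat dicts."""
    user_names = user_names or {"User": "user", "Assistant": "assistant"}
    return [
        {"role": user_names.get(user, default_role), "content": format_func(contents)}
        for user, contents in messages
    ]

conversation = serialize_messages([("User", "Hi!"), ("Assistant", "Hello!")])
# → [{"role": "user", "content": "Hi!"}, {"role": "assistant", "content": "Hello!"}]
```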

@ahuang11 ahuang11 changed the title Add serialize methods Add serialize methods for ChatMessage + ChatFeed Oct 27, 2023
codecov bot commented Oct 28, 2023

Codecov Report

Attention: 4 lines in your changes are missing coverage. Please review.

Comparison is base (7eb9c71) 84.00% compared to head (b7ad0d5) 82.33%.
Report is 5 commits behind head on main.

| Files | Patch % | Lines |
| --- | --- | --- |
| panel/chat/interface.py | 40.00% | 3 Missing ⚠️ |
| panel/chat/message.py | 96.87% | 1 Missing ⚠️ |
Additional details and impacted files
```
@@            Coverage Diff             @@
##             main    #5764      +/-   ##
==========================================
- Coverage   84.00%   82.33%   -1.67%
==========================================
  Files         290      290
  Lines       42391    42622     +231
==========================================
- Hits        35609    35093     -516
- Misses       6782     7529     +747
```

| Flag | Coverage Δ |
| --- | --- |
| ui-tests | 38.33% <24.00%> (-2.57%) ⬇️ |
| unitexamples-tests | 72.27% <97.71%> (-0.09%) ⬇️ |

Flags with carried forward coverage won't be shown.


@ahuang11 ahuang11 marked this pull request as ready for review October 31, 2023 20:15
ahuang11 commented Nov 1, 2023

Just tried this out and it works well.

```python
import panel as pn
from transformers import AutoTokenizer
from ctransformers import AutoConfig, AutoModelForCausalLM, Config

pn.extension(design="material")

BASE_MODEL = "mistralai/Mistral-7B-Instruct-v0.1"
QUANTIZED_MODEL = "TheBloke/Mistral-7B-Instruct-v0.1-GGUF"
QUANTIZED_FILE = "mistral-7b-instruct-v0.1.Q4_K_M.gguf"


async def callback(contents: str, user: str, instance: pn.chat.ChatInterface):
    # Serialize the conversation into transformers-compatible dicts,
    # then render it through the tokenizer's chat template
    conversation = instance.serialize_for_transformers()
    formatted_conversation = tokenizer.apply_chat_template(conversation, tokenize=False)
    response = llm(formatted_conversation, stream=True)
    message = ""
    for token in response:
        message += token
        yield message


config = AutoConfig(
    config=Config(
        temperature=0.5, max_new_tokens=2048, context_length=2048, gpu_layers=1
    ),
)
llm = AutoModelForCausalLM.from_pretrained(
    QUANTIZED_MODEL,
    model_file=QUANTIZED_FILE,
    config=config,
)
tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)

chat_interface = pn.chat.ChatInterface(
    callback=callback,
    show_send=False,
    show_rerun=False,
    show_undo=False,
    show_clear=False,
    show_button_name=False,
    callback_exception="verbose",
)
pn.template.FastListTemplate(
    main=[chat_interface],
    sidebar_width=400,
).show()
```
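For reference, `apply_chat_template(conversation, tokenize=False)` renders the message dicts through the template stored in the tokenizer's config; for Mistral-Instruct the result is roughly the `[INST] ... [/INST]` wrapping. A rough hand-rolled approximation, illustrative only (in practice the tokenizer's real template is authoritative):

```python
def mistral_instruct_template(conversation):
    """Rough approximation of Mistral-Instruct's chat template:
    user turns are wrapped in [INST] ... [/INST], assistant turns
    follow and are closed with </s>. Illustrative only -- use
    tokenizer.apply_chat_template in real code."""
    out = "<s>"
    for msg in conversation:
        if msg["role"] == "user":
            out += f"[INST] {msg['content']} [/INST]"
        else:
            out += f" {msg['content']}</s>"
    return out

print(mistral_instruct_template([
    {"role": "user", "content": "What is Panel?"},
    {"role": "assistant", "content": "A Python dashboarding library."},
]))
```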

@philippjfr philippjfr merged commit ea2f8af into main Nov 15, 2023
10 of 13 checks passed
@philippjfr philippjfr deleted the add_serialize_methods branch November 15, 2023 14:42