
Integration with a Large Language Model like LLama #372

Open
AmmarkoV opened this issue Jul 28, 2023 · 2 comments

AmmarkoV commented Jul 28, 2023

Hello,

First of all thanks for the great work developing this bot!
I would like to make a ChatGPT-like "feature" where the bot will be able to "respond" to regular chat input, using llama.cpp as a back-end:
https://github.com/ggerganov/llama.cpp

To set up and test llama.cpp:

git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
make
cd models
wget https://huggingface.co/eachadea/ggml-vicuna-7b-1.1/resolve/main/ggml-vic7b-uncensored-q5_1.bin
cd ..
./main -m ./models/ggml-vic7b-uncensored-q5_1.bin -n 256 --repeat_penalty 1.0 --color -i -r "User:" -f prompts/chat-with-bob.txt

Where in your codebase would be the right place to implement a binding for such behaviour?

I would expect a function like newChatMessage(username, message).
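A rough sketch of what I have in mind, purely for illustration (the function name, the paths, and the way llama.cpp is invoked here are my own assumptions; none of this exists in the bot yet):

# Illustrative only: shells out to the llama.cpp "main" binary built above.
# Paths and generation flags are examples and would need to be configurable.
import subprocess

LLAMA_BIN = "./llama.cpp/main"
LLAMA_MODEL = "./llama.cpp/models/ggml-vic7b-uncensored-q5_1.bin"

def newChatMessage(username, message):
    # Build a minimal single-turn prompt and run llama.cpp non-interactively.
    prompt = "User (%s): %s\nAssistant:" % (username, message)
    result = subprocess.run(
        [LLAMA_BIN, "-m", LLAMA_MODEL, "-n", "256", "--repeat_penalty", "1.0",
         "-p", prompt],
        capture_output=True, text=True, timeout=120)
    # llama.cpp normally echoes the prompt back, so strip it before returning
    # the generated text (good enough for a first test).
    return result.stdout[len(prompt):].strip()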

Thank you very much

luca0N (Contributor) commented Aug 1, 2023

I personally feel that adding an LLM to this bot is beyond the scope of this project. To answer your question: you're looking for the message_received function in mumbleBot.py. You should create a function that takes the input, processes it with the LLM, and then sends the message back (assuming the user is not trying to run a command, which you can check by testing whether the message starts with the command character, ! by default).

def message_received(self, text):
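Very roughly, the hook could look like the sketch below. This is only an outline: I'm assuming the pymumble event exposes the raw string as text.message and the sender session as text.actor, that ! is the configured command character, that the bot has a send_msg()-style reply helper, and that newChatMessage() is the llama.cpp wrapper sketched above. All of that should be verified against the actual code.

def message_received(self, text):
    message = text.message.strip()
    if message.startswith("!"):
        # ... keep the existing command handling untouched ...
        return
    # Not a command: hand the plain chat message to the LLM and reply.
    username = self.mumble.users[text.actor]["name"]  # assumes pymumble's users dict
    reply = newChatMessage(username, message)
    self.send_msg(reply, text)                        # assumes the bot's reply helper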

Hope this helps!

luca0N (Contributor) commented Aug 3, 2023

I should also note that this would be a good candidate for a 3rd-party optional library (PR #228).
