Added vectorstore and vectorizer functionality #17
fakhirali merged 9 commits into Finity-Alpha:master from
Conversation
Some questions:
Also, can the base class implement anything? It would make it more logical to have one. When you integrate it with an LLM (in main or something), I'll test it out.
Set the OpenAI API key in an environment variable and you are good to go.
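For reference, a minimal sketch of what "set the key in an environment variable" looks like in practice. This assumes the conventional `OPENAI_API_KEY` variable name that the OpenAI client library reads; the key value is a placeholder.

```python
import os

# Set the key for the current process only; in practice you would
# export OPENAI_API_KEY in your shell profile or .env file instead.
os.environ["OPENAI_API_KEY"] = "sk-..."  # placeholder, not a real key

# Client libraries such as openai pick this variable up automatically.
print("OPENAI_API_KEY" in os.environ)  # → True
```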
def __init__(self, sys_prompt='',
             Model='gpt-3.5-turbo',
`sys_prompt` and `Model` are not used anywhere.
def post_process(self, response):
    return response
Is this correct? In `llm_gpt`, `post_process` appends the response to the messages list. https://github.com/fakhirali/OpenVoiceChat/blob/e7125a91bb5edca505a3af8857d947c0b5c22058/openvoicechat/llm/llm_gpt.py#L27
We didn't need the history because LangChain takes care of it, and I cannot remove this function because I am inheriting from the BaseChatbot class, so this class stays aligned with your other LLM classes.
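To make the difference being discussed concrete, here is a sketch of the two `post_process` behaviors side by side. Class and attribute names are illustrative stand-ins, not the actual OpenVoiceChat API.

```python
class GptStyleChatbot:
    """Sketch of the llm_gpt behavior: post_process records the reply."""

    def __init__(self):
        # running conversation history, OpenAI-style message dicts
        self.messages = []

    def post_process(self, response: str) -> str:
        # record the assistant's reply so the next turn sees it
        self.messages.append({"role": "assistant", "content": response})
        return response


class LangchainStyleChatbot:
    """Sketch of the RAG class: LangChain's memory keeps history,
    so post_process is an intentional no-op passthrough."""

    def post_process(self, response: str) -> str:
        return response
```

The override is kept only so the class satisfies the `BaseChatbot` interface; the actual bookkeeping happens inside LangChain's memory.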
Instead of making it an LLM, this would be an example of how RAG would be used with OVC. Just fix a couple of things and you're good to go.
main_rag.py (Outdated)
template = """You are a helpful assistant. Give answers using the following pieces of context, given inside ```, to answer the question at the end. If you don't know the answer, don't try to make up an answer, but be nice in conversation.
{context}
Question: {question}
Helpful Answer:"""
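The template is a plain Python format string, so it is easy to sanity-check how it gets filled. The snippet below uses a shortened stand-in for the template (to keep the example compact) with made-up context and question values.

```python
# Shortened stand-in for the RAG template above; the real one also
# includes the "don't make up an answer" instruction.
template = ("You are a helpful assistant. Use the pieces of context below "
            "to answer the question at the end.\n"
            "{context}\n"
            "Question: {question}\n"
            "Helpful Answer:")

# Fill the slots the way a RAG chain would: retrieved context plus the
# user's question. Both values here are invented for illustration.
prompt = template.format(
    context="OpenVoiceChat lets you have a voice conversation with an LLM.",
    question="What is OpenVoiceChat?",
)
```

The filled prompt ends with "Helpful Answer:", which is the cue position where the model begins generating.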
if (self.sys_prompt != ''):
    template += self.sys_prompt
Is this the right way to incorporate the sys prompt? After the template like this?
Extremely sorry. I am wondering why it was even working when I checked it. I will fix it.
Fixed it. Now it is working as expected.
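The commit with the actual fix is not shown in this thread, but the problem the reviewer points at has a common resolution: the system prompt should be placed before the task template, not appended after the "Helpful Answer:" cue. A sketch of that idea (the function name is hypothetical, not from the repo):

```python
def build_prompt(sys_prompt: str, base_template: str) -> str:
    # Prepend the system prompt so its instructions come first.
    # Appending it after the template would drop it right after
    # "Helpful Answer:", exactly where the model is supposed to
    # begin its reply.
    if sys_prompt != '':
        return sys_prompt.strip() + "\n" + base_template
    return base_template
```

With an empty `sys_prompt` the template passes through unchanged, matching the guard in the original `if (self.sys_prompt != '')` check.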
No description provided.