
Added vectorstore and vectorizer functionality #17

Merged
fakhirali merged 9 commits into Finity-Alpha:master from Harras3:vectorstore
May 19, 2024

Conversation

@Harras3 (Contributor) commented Apr 25, 2024

No description provided.

@fakhirali (Contributor) commented Apr 25, 2024

Some questions:

  • Do we really need a getVectorizer?
  • Docs and texts are handled differently; can they be handled the same way?

Also, can the base class implement anything? That would make having one more logical. Ideally the vectorizer is passed in the init of the vectorstore instead of via getVectorizer, so that one can mix and match vectorizers and stores.

When you integrate it with an LLM (in main or similar), I'll test it out.
Also, thank you for doing this. (You do need to make abstract_vectordb.)
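The design suggested above (inject the vectorizer into the vector store's init so stores and vectorizers can be mixed and matched) could be sketched as follows. This is a hypothetical illustration only; the class and method names (AbstractVectorizer, AbstractVectorDB, embed, add_texts, search) are assumptions, not the actual ovc API.

```python
# Hypothetical sketch: vectorizer injected via __init__, per the review suggestion.
from abc import ABC, abstractmethod
import math

class AbstractVectorizer(ABC):
    @abstractmethod
    def embed(self, texts):
        """Return one embedding vector per input text."""

class AbstractVectorDB(ABC):
    def __init__(self, vectorizer):
        # The vectorizer is passed in, not created internally, so any
        # store works with any vectorizer.
        self.vectorizer = vectorizer

    @abstractmethod
    def add_texts(self, texts): ...

    @abstractmethod
    def search(self, query, k=5): ...

class ToyVectorizer(AbstractVectorizer):
    """Toy character-count embedding, only to exercise the interface."""
    def embed(self, texts):
        return [[t.count("a"), t.count("b"), len(t)] for t in texts]

class InMemoryVectorDB(AbstractVectorDB):
    """Brute-force cosine-similarity store kept entirely in memory."""
    def __init__(self, vectorizer):
        super().__init__(vectorizer)
        self._texts, self._vecs = [], []

    def add_texts(self, texts):
        self._texts.extend(texts)
        self._vecs.extend(self.vectorizer.embed(texts))

    def search(self, query, k=5):
        qv = self.vectorizer.embed([query])[0]
        def cos(a, b):
            dot = sum(x * y for x, y in zip(a, b))
            na = math.sqrt(sum(x * x for x in a)) or 1.0
            nb = math.sqrt(sum(y * y for y in b)) or 1.0
            return dot / (na * nb)
        # Rank stored texts by similarity to the query embedding.
        ranked = sorted(zip(self._texts, self._vecs), key=lambda tv: -cos(qv, tv[1]))
        return [t for t, _ in ranked[:k]]

db = InMemoryVectorDB(ToyVectorizer())
db.add_texts(["aaa", "abc", "bbb"])
print(db.search("aaa", k=1))  # the exact-match text ranks first
```

Because the store only depends on the abstract embed method, swapping in a LangChain or OpenAI embedder would not require touching the store class.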

@fakhirali added the "bounty (PR solving a bounty problem)" label on Apr 25, 2024
@fakhirali mentioned this pull request on Apr 25, 2024
@Harras3 (Contributor, Author) commented Apr 29, 2024

  1. getVectorizer is only needed if someone is using LangChain chains.
  2. The docs functions are not needed, so I have removed them in the next PR.

Comment on lines +22 to +23
def __init__(self, sys_prompt='',
             Model='gpt-3.5-turbo',
Contributor:

sys_prompt and Model not used anywhere.

Contributor Author:

Fixed it.

Comment on lines +78 to +79
def post_process(self, response):
return response

Contributor Author:

We didn't need the history because LangChain takes care of it, and I cannot remove this function because I am inheriting the BaseChatbot class, so this class stays aligned with your other LLM classes.
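The point above can be illustrated with a minimal sketch: a subclass must still provide post_process to satisfy the shared interface, even when it is a pass-through because LangChain already returns a finished string. The BaseChatbot signature below is an assumption for illustration, not the actual ovc definition.

```python
# Hedged sketch: why an identity post_process stays on an inherited interface.
from abc import ABC, abstractmethod

class BaseChatbot(ABC):
    @abstractmethod
    def run(self, user_input):
        """Produce a raw model response."""

    @abstractmethod
    def post_process(self, response):
        """Clean up a raw response before it is used downstream."""

class RagChatbot(BaseChatbot):
    def __init__(self, chain):
        # e.g. a LangChain chain that manages conversation history itself
        self.chain = chain

    def run(self, user_input):
        return self.chain(user_input)

    def post_process(self, response):
        # Identity: the chain's output needs no extra cleanup, but the method
        # must exist so RagChatbot is interchangeable with other LLM classes.
        return response

class EchoChain:
    """Stand-in chain for demonstration."""
    def __call__(self, q):
        return q.upper()

bot = RagChatbot(EchoChain())
print(bot.post_process(bot.run("hi")))  # -> "HI"
```

Callers can then treat every chatbot class uniformly, always calling post_process on whatever run returns.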

@fakhirali (Contributor) commented

Instead of making it an LLM, this would be an example of how RAG would be used with ovc. Just fix a couple of things and you're good to go.

main_rag.py Outdated
Comment on lines +63 to +69
template = """You are a helpful assistant Give answers using following pieces of context given inside ``` to answer the question at the end. If you don't know the answer , don't try to make up an answer, but be nice in conversation.
{context}
Question: {question}
Helpful Answer:"""

if (self.sys_prompt != ''):
    template += self.sys_prompt
Contributor:

Is this the right way to incorporate the sys prompt? After the template like this?

Contributor Author:

Extremely sorry. I am wondering why it was even working when I checked it. I will fix it.
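The problem being discussed is that appending self.sys_prompt after the template places it behind the "Helpful Answer:" cue, where the model is expected to start its continuation. A plausible fix (a hypothetical sketch, not the actual commit; the sys_prompt value is illustrative) prepends it instead:

```python
# Hypothetical sketch of the fix: prepend the system prompt so the
# "Helpful Answer:" cue stays at the very end of the prompt.
template = """Use the pieces of context given inside ``` to answer the question at the end. If you don't know the answer, don't try to make up an answer, but be nice in conversation.
{context}
Question: {question}
Helpful Answer:"""

sys_prompt = "You are a helpful, polite assistant."  # illustrative value

if sys_prompt != '':
    template = sys_prompt + "\n" + template  # prepend, don't append
```

With this ordering, any custom instructions come first and the template still ends exactly at "Helpful Answer:", so the model completes the answer rather than the system prompt.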

Contributor:

Ez 💯

Contributor Author:

Fixed it. Now it is working as expected.

@fakhirali fakhirali merged commit 41c39ec into Finity-Alpha:master May 19, 2024

Labels

bounty (PR solving a bounty problem)

Projects

None yet

2 participants