Collection : enhancement of the module #398

@nishika26

Description

Currently, when we give document IDs as input to the collection module, a vector store gets created. That vector store, together with the given model name and instructions, is then used to create an assistant, and the assistant ID is what we get as output, with the output variable named "llm_service_id".

Since OpenAI's Assistants API is going to be deprecated on their end, we should only be creating a vector store for the given documents in this collection module. Additionally, the module's code should be LLM-agnostic rather than tied so specifically to OpenAI.
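One way to make the module LLM-agnostic is to put the vector store creation behind a small provider interface, so the collection code never imports a vendor SDK directly. A minimal sketch (all names here — `VectorStoreProvider`, `create_collection`, the in-memory provider — are illustrative, not the actual module's API; a real provider would wrap the OpenAI SDK behind the same method):

```python
from typing import Protocol


class VectorStoreProvider(Protocol):
    """Provider-agnostic interface the collection module depends on."""

    def create_vector_store(self, name: str, document_ids: list[str]) -> str: ...


class InMemoryVectorStoreProvider:
    """Stand-in implementation; a real one would call a vendor SDK
    (OpenAI or otherwise) behind the same method signature."""

    def __init__(self) -> None:
        self._stores: dict[str, list[str]] = {}
        self._counter = 0

    def create_vector_store(self, name: str, document_ids: list[str]) -> str:
        self._counter += 1
        store_id = f"vs_{self._counter}"
        self._stores[store_id] = list(document_ids)
        return store_id


def create_collection(provider: VectorStoreProvider,
                      name: str,
                      document_ids: list[str]) -> dict:
    # v2 behaviour: only a vector store is created -- no assistant --
    # so the output carries a vector store ID, not an "llm_service_id".
    vector_store_id = provider.create_vector_store(name, document_ids)
    return {"vector_store_id": vector_store_id}
```

Swapping vendors then means writing one new provider class; the collection module itself stays untouched.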

The following should also be covered in v2 of the module:

  • DB session stays open during long external calls (e.g., creating OpenAI vector stores), holding connections/locks unnecessarily. more context : here and here

  • Need to take care of the response payload, logs, and read
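For the first bullet above, the usual fix is to read what you need, release the DB session before the slow external call, and open a fresh, short-lived session afterwards to persist the result. A sketch of the pattern (the `FakeSession` stub and function names are hypothetical, standing in for a real SQLAlchemy session and the module's actual code):

```python
class FakeSession:
    """Stub standing in for a real DB session, just to show the pattern."""

    def __init__(self) -> None:
        self.open = True
        self.saved: list[str] = []

    def close(self) -> None:
        self.open = False


def create_collection_v2(session_factory, external_create, document_ids):
    # 1) Read everything needed from the DB, then release the session
    #    BEFORE the slow external call so no connection/lock is held.
    session = session_factory()
    docs = list(document_ids)  # stands in for the real query
    session.close()

    # 2) The long-running external call (e.g. vector store creation)
    #    happens with no DB session open.
    vector_store_id = external_create(docs)

    # 3) Open a fresh, short-lived session only to persist the result.
    session = session_factory()
    session.saved.append(vector_store_id)
    session.close()
    return vector_store_id
```

The key invariant is that every session is closed before `external_create` runs, so connection-pool slots and row locks are never held across the network call.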

Metadata

Labels

enhancement (New feature or request)

Projects

Status

In Progress
