remove llama index plugin and update examples #1857
Conversation
❌ Invalid Changeset Format Detected — one or more changeset files in this PR have an invalid format.
I'm wondering if we should delete livekit-plugins-llama-index and instead convert our example to use their API directly.
I have no preference. Only the
Yes, but the tradeoffs are too big: with llama-index, if you use the ChatEngine, we don't support function calls. So IMO we should just do an example using llama-index inside the llm_node.
Sounds good to me.
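The llm_node approach discussed above can be illustrated with a minimal, stdlib-only sketch. `FakeChatEngine` below is a stand-in for a third-party chat engine (such as llama-index's ChatEngine), and `llm_node` is a stand-in for the agent hook; none of these names are the real livekit or llama-index APIs.

```python
import asyncio

class FakeChatEngine:
    """Stand-in for a third-party chat engine (e.g. llama-index's ChatEngine).
    A real engine would stream tokens from an LLM; this stub yields a canned
    response word by word so the streaming control flow is visible."""

    async def stream_chat(self, message: str):
        for token in f"echo: {message}".split():
            yield token

async def llm_node(chat_ctx: list):
    """Sketch of a custom LLM node: forward the latest user message to the
    chat engine and re-stream its tokens to the voice pipeline."""
    engine = FakeChatEngine()
    last_user_message = chat_ctx[-1]
    async for token in engine.stream_chat(last_user_message):
        yield token

async def main() -> str:
    # Collect the streamed tokens as the pipeline would consume them.
    tokens = [t async for t in llm_node(["hello world"])]
    return " ".join(tokens)

print(asyncio.run(main()))  # echo: hello world
```

With this pattern, the agent never needs a llama-index plugin at all: any library that can stream text can be adapted inside the node the same way.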
@theomonnom send a ping if you do end up deleting the plugin (or any other niche/one-of-a-kind plugins like rag, nltk, etc.), since we have links to them in the docs (very happy to remove them; they're kind of awkward to list out since they don't fit into a category).
@theomonnom updated all examples to use the original llama-index; feel free to remove the plugin.
    ),
    vad=silero.VAD.load(),
    stt=deepgram.STT(),
    llm=DummyLLM(),  # use a dummy LLM to enable the pipeline reply
Should we add a mock LLM to the API, like llm=llm.NOT_REQUIRE, to enable the pipeline reply task for people who want a fully customized llm_node, for example to run multiple LLMs inside the node, or to use other chat-completion libraries like the llama-index chat engine or LangChain?
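The sentinel idea suggested here can be sketched without any livekit code. Everything below is hypothetical: `LLM_NOT_REQUIRED` and `resolve_llm` are illustrative names for the proposed `llm.NOT_REQUIRE` behavior, not an existing API.

```python
class _NotRequired:
    """Hypothetical sentinel: signals that reply generation is handled
    entirely by a user-supplied llm_node, so no LLM client is needed."""

    def __repr__(self) -> str:
        return "LLM_NOT_REQUIRED"

LLM_NOT_REQUIRED = _NotRequired()

def resolve_llm(llm):
    # The pipeline can branch on the sentinel instead of forcing callers
    # to write a DummyLLM subclass themselves (as the diff above does).
    if llm is LLM_NOT_REQUIRED:
        return None  # skip constructing a default LLM client
    return llm

print(resolve_llm(LLM_NOT_REQUIRED))  # None
print(resolve_llm("some-real-llm-client"))
```

A dedicated sentinel makes the intent explicit at the call site (`llm=LLM_NOT_REQUIRED`) and lets the pipeline validate early that a custom llm_node is actually provided.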
Hi all,
@biouser-abiomix does this example work for you? Without the llama-index plugin you can still use it in the agent's llm_node.
Can confirm it works @longcw |
theomonnom
left a comment
this lg! feel free to remove the llama-index plugin directly inside this PR
1767666 to e0f4a72
No description provided.