Conversation

@longcw
Contributor

@longcw longcw commented Apr 2, 2025

No description provided.

@longcw longcw requested a review from a team April 2, 2025 09:11
@github-actions
Contributor

github-actions bot commented Apr 2, 2025

❌ Invalid Changeset Format Detected

One or more changeset files in this PR have an invalid format. Please ensure they adhere to:

  • Start with --- and include a closing --- on its own line.
  • Each package line must be in the format:
    "package-name": patch|minor|major
  • No duplicate package entries allowed.
  • A non-empty change description must follow the front matter.
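
For reference, a minimal valid changeset looks roughly like this (the package name and description below are only illustrative):

```md
---
"livekit-agents": patch
---

Remove the llama-index plugin and update the examples to use llama-index directly.
```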

Error details:
.github/next-release/changeset-0ac9b53f.md: Failed to read file from git branch 'pr_head'.
.github/next-release/changeset-27d8268e.md: Failed to read file from git branch 'pr_head'.
.github/next-release/changeset-65b3292c.md: Failed to read file from git branch 'pr_head'.
.github/next-release/changeset-7116e2a3.md: Failed to read file from git branch 'pr_head'.
.github/next-release/changeset-73e5be2c.md: Failed to read file from git branch 'pr_head'.
.github/next-release/changeset-79b2e16a.md: Failed to read file from git branch 'pr_head'.
.github/next-release/changeset-7aed4d48.md: Failed to read file from git branch 'pr_head'.
.github/next-release/changeset-a4f99981.md: Failed to read file from git branch 'pr_head'.
.github/next-release/changeset-aee68111.md: Failed to read file from git branch 'pr_head'.
.github/next-release/changeset-bdfd50f8.md: Failed to read file from git branch 'pr_head'.
.github/next-release/changeset-be147682.md: Failed to read file from git branch 'pr_head'.
.github/next-release/changeset-c6cc5df2.md: Failed to read file from git branch 'pr_head'.
.github/next-release/changeset-f0294eb1.md: Failed to read file from git branch 'pr_head'.

@theomonnom
Member

I'm wondering if we should delete livekit-plugins-llama-index and instead convert our examples to use the llama-index API directly.

@longcw
Contributor Author

longcw commented Apr 2, 2025

I'm wondering if we should delete livekit-plugins-llama-index and instead convert our examples to use the llama-index API directly.

I have no preference. Only the chat_engine example uses the llama-index plugin; the other two already use llama-index directly. But for chat_engine it seems reasonable to keep llama-index as an LLM plugin, otherwise how would we specify the llm for the agent?

@theomonnom
Member

I have no preference. Only the chat_engine example uses the llama-index plugin; the other two already use llama-index directly. But for chat_engine it seems reasonable to keep llama-index as an LLM plugin, otherwise how would we specify the llm for the agent?

Yes, but the tradeoffs are too big: with llama-index, if you use the ChatEngine, we don't support function calls. So IMO we should just do an example that uses llama-index inside the llm_node.
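
Roughly what I have in mind, as an untested sketch (it assumes the 1.x Agent.llm_node override and llama-index's astream_chat streaming API; the index loading and paths are illustrative):

```python
from llama_index.core import StorageContext, load_index_from_storage
from llama_index.core.chat_engine.types import ChatMode

from livekit.agents import Agent, llm

# Load a persisted llama-index index and build a context chat engine
# (the persist_dir is just a placeholder).
storage_context = StorageContext.from_defaults(persist_dir="./chat-engine-storage")
index = load_index_from_storage(storage_context)
chat_engine = index.as_chat_engine(chat_mode=ChatMode.CONTEXT)


class LlamaIndexAgent(Agent):
    async def llm_node(self, chat_ctx, tools, model_settings):
        # Hand the latest user message to the llama-index chat engine and
        # stream its text deltas back into the agent's pipeline.
        user_msg = chat_ctx.items[-1]
        assert isinstance(user_msg, llm.ChatMessage) and user_msg.role == "user"
        stream = await chat_engine.astream_chat(user_msg.text_content)
        async for delta in stream.async_response_gen():
            yield delta
```

The function tools passed to llm_node are simply ignored here, which is the tradeoff mentioned above.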

@longcw
Contributor Author

longcw commented Apr 3, 2025

Yes, but the tradeoffs are too big: with llama-index, if you use the ChatEngine, we don't support function calls. So IMO we should just do an example that uses llama-index inside the llm_node.

Sounds good to me.

@bcherry
Contributor

bcherry commented Apr 3, 2025

@theomonnom send a ping if you do end up deleting the plugin (or any other niche/one-of-a-kind plugins like rag, nltk, etc.), since we have links to them in the docs (very happy to remove them; they're kind of awkward to list out since they don't fit into a category).

@longcw
Contributor Author

longcw commented Apr 7, 2025

@theomonnom updated all examples to use llama-index directly, feel free to remove the plugin.

),
vad=silero.VAD.load(),
stt=deepgram.STT(),
llm=DummyLLM(), # use a dummy LLM to enable the pipeline reply
Contributor Author


Should we add a mock LLM to the API, like llm=llm.NOT_REQUIRE, to enable the pipeline reply task for people who want a fully customized llm_node, for example to run multiple LLMs inside the node or to use other chat completion libraries like the llama-index chat engine or langchain?
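
Hypothetical usage, just to illustrate the idea (llm.NOT_REQUIRE does not exist today, and the other components mirror the example above):

```python
from livekit.agents import AgentSession, llm
from livekit.plugins import deepgram, silero

session = AgentSession(
    vad=silero.VAD.load(),
    stt=deepgram.STT(),
    # hypothetical sentinel: keep the pipeline reply task enabled without
    # wiring up a real LLM, since a custom llm_node generates the responses
    llm=llm.NOT_REQUIRE,
)
```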

@biouser-abiomix

I have no preference. Only the chat_engine example uses the llama-index plugin; the other two already use llama-index directly. But for chat_engine it seems reasonable to keep llama-index as an LLM plugin, otherwise how would we specify the llm for the agent?

Yes, but the tradeoffs are too big: with llama-index, if you use the ChatEngine, we don't support function calls. So IMO we should just do an example that uses llama-index inside the llm_node.

Hi all,
if I may contribute to this discussion, I wanted to add that for RAG applications the llama-index chat engine produces much better results than just providing a manual RAG tool. This matters for (multi-)agent workflows where a RAG component is essential, and it is the main reason I am sticking to the 0.x branch for now.
Best wishes

@longcw
Contributor Author

longcw commented Apr 8, 2025

@biouser-abiomix does this example work for you? Without a llama-index plugin you can still use it in the llm_node of the agent.

@biouser-abiomix

@biouser-abiomix does this example work for you? Without a llama-index plugin you can still use it in the llm_node of the agent.

Can confirm it works @longcw.
Thank you very much!

Member

@theomonnom theomonnom left a comment


This looks good! Feel free to remove the llama-index plugin directly inside this PR.

@longcw longcw changed the title from "fix llama index plugin" to "remove llama index plugin and update examples" Apr 9, 2025
@longcw longcw force-pushed the longc/llama-index-example branch from 1767666 to e0f4a72 April 9, 2025 07:17
@longcw longcw merged commit bbcc82e into main Apr 9, 2025
5 of 6 checks passed
@longcw longcw deleted the longc/llama-index-example branch April 9, 2025 07:24
jayesh-mivi pushed a commit to mivi-dev-org/custom-livekit-agents that referenced this pull request Jun 4, 2025