
Keyframe Labs Plugin#4950

Merged
tinalenguyen merged 4 commits into livekit:main from keyframelabs:krad/kfl-plugin
Feb 27, 2026

Conversation

@kradkfl (Contributor) commented Feb 25, 2026

@CLAassistant commented Feb 25, 2026

CLA assistant check
All committers have signed the CLA.

@kradkfl kradkfl force-pushed the krad/kfl-plugin branch 4 times, most recently from 1c061bb to 0efcbe5 Compare February 25, 2026 21:05
@kradkfl kradkfl marked this pull request as ready for review February 25, 2026 21:15
@devin-ai-integration (bot) left a comment

✅ Devin Review: No Issues Found

Devin Review analyzed this PR and found no potential bugs to report.


@kradkfl kradkfl force-pushed the krad/kfl-plugin branch 2 times, most recently from a281561 to 172a2d3 Compare February 26, 2026 16:36
@tinalenguyen (Member) left a comment

thank you for the PR! could you also add the plugin to this pyproject file as well: https://github.com/livekit/agents/blob/main/livekit-agents/pyproject.toml
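A sketch of what that pyproject change might look like, following the optional-dependency pattern the livekit-agents package uses for its other plugins. The group name and version pin below are assumptions, not the actual entry:

```toml
# Hypothetical addition to livekit-agents/pyproject.toml
# (group name and version pin are assumed for illustration)
[project.optional-dependencies]
keyframelabs = ["livekit-plugins-keyframelabs>=1.0"]
```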

Inline review comment on this diff excerpt:

logger.warning("set_emotion() called before start()")
return

await self._room.local_participant.publish_data(
Member commented:

just wondering, are there no plans to support setting the emotion via an API call? i think that would be more ideal if possible

@kradkfl (Contributor, Author) commented Feb 27, 2026

Do you mean by way of a REST call to api.keyframelabs.com?

If so, the latency would be pretty high. Additionally, the avatar session on the backend isn't listening for any central-server calls once it's connected to the room; it only listens on data channels (which is what we use here).

Are you imagining that the user of the avatar plugin wouldn't want to access the underlying avatar object and call this function themselves? Or that accessing the underlying avatar isn't ergonomic?

Member replied:

ah yes, i was just curious how users would update settings mid-session in other scenarios/use cases, but the latency using the data channels is much better as you said. iirc most of our other providers don't allow for dynamic updates like this, so this would be a new feature (very cool to see the changes in real-time!)
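The data-channel pattern discussed above can be sketched as follows. This is a hypothetical illustration rather than the plugin's actual protocol: the topic name and payload fields are assumptions, and only the payload encoding is shown as runnable code, since publishing requires a connected LiveKit room.

```python
# Hypothetical sketch of the data-channel emotion update discussed above.
# The topic name and payload schema are assumptions, not the plugin's
# actual wire format.
import json

EMOTION_TOPIC = "keyframelabs.emotion"  # assumed topic name


def build_emotion_payload(emotion: str) -> bytes:
    """Encode an emotion update as UTF-8 JSON suitable for publish_data()."""
    return json.dumps({"type": "set_emotion", "emotion": emotion}).encode("utf-8")


# Inside the plugin, the payload would then be published on the room's data
# channel (avoiding a round trip to api.keyframelabs.com), roughly:
#
#     await self._room.local_participant.publish_data(
#         build_emotion_payload("happy"),
#         topic=EMOTION_TOPIC,
#     )
```

Because the avatar backend is already subscribed to the room's data channels, an update like this reaches it with in-room latency rather than a REST round trip, which is the trade-off kradkfl describes above.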

@tinalenguyen (Member) left a comment

looks good to me! small ask - could you update the examples to use this setup instead for the AgentSession:

session = AgentSession(
    stt=inference.STT("deepgram/nova-3"),
    llm=inference.LLM("google/gemini-2.5-flash"),
    tts=inference.TTS("cartesia/sonic-3"),
    resume_false_interruption=False,
)

i noticed that this setup called set_emotion more often as well, so it really seems like the avatar is reacting throughout the conversation. i didn't notice much of a difference in latency either

@tinalenguyen tinalenguyen merged commit bd81b23 into livekit:main Feb 27, 2026
8 checks passed
3 participants