
Feature: Add handleEvent callback #2025

Merged · 14 commits · Aug 25, 2023

Conversation

mgce
Contributor

@mgce mgce commented Jul 20, 2023

Following up on the earlier pull request #1964, I introduced a single generic handleEvent callback. It is responsible for catching both tokens and function calls.
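A minimal sketch of what such a unified callback could look like. All names here (`StreamEvent`, `handleEvent`, `EventCollector`) are illustrative assumptions for this sketch, not the actual LangChain.js API:

```typescript
// Hypothetical sketch: one generic event callback that receives both
// plain text tokens and function-call argument deltas from a stream,
// instead of separate per-kind hooks.
type StreamEvent =
  | { type: "token"; token: string }
  | { type: "function_call"; name?: string; argumentsDelta: string };

class EventCollector {
  text = "";
  functionArgs = "";

  // A single callback dispatches on the event's discriminant.
  handleEvent(event: StreamEvent): void {
    if (event.type === "token") {
      this.text += event.token;
    } else {
      this.functionArgs += event.argumentsDelta;
    }
  }
}

const collector = new EventCollector();
collector.handleEvent({ type: "token", token: "Hello" });
collector.handleEvent({ type: "token", token: " world" });
collector.handleEvent({ type: "function_call", argumentsDelta: '{"city":' });
collector.handleEvent({ type: "function_call", argumentsDelta: '"Paris"}' });
```

The appeal of this shape is that streaming consumers only register one handler regardless of whether the model emits text or a function call.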

@vercel
vercel bot commented Jul 20, 2023

langchainjs-docs deployment: ✅ Ready (updated Aug 25, 2023 10:22pm UTC)

@dosubot dosubot bot added the auto:improvement Medium size change to existing code to handle new use-cases label Jul 20, 2023
@vladholubiev

@jacoblee93, Let me know if there's anything you need to increase the chances of this getting merged! Be it reviewing, testing, etc.

If introducing this change requires changing the upstream python library first, we can do that as well, if it helps.

@jacoblee93
Collaborator

Yeah, the biggest blocker is coordination with Python. If you open a PR there, it would definitely speed things up.

@vladholubiev

@jacoblee93 here is the PR for the Python: langchain-ai/langchain#9263

@andrewBatutin

@jacoblee93, can you please advise which of the Python maintainers can help with the Python PR review?

@vladholubiev

@jacoblee93 we have an identical Python PR opened. Would you be able to review the proposed small changes anytime soon? Could we do anything else to facilitate this process?

@hinthornw
Collaborator

Hi @vladholubiev and @andrewBatutin, thanks for the PR and suggestion! While there is room for a generic onEvent callback, it would have to be completely generic and scoped to the base callback manager rather than the LLM run manager.

I would prefer to extend the LLM's on-new-token callback to take an optional chunk arg that each handler can decide to use.
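The suggested alternative can be sketched as follows. The type and method names (`GenerationChunk`, `TokenHandler`, the two-argument `handleLLMNewToken` signature) are assumptions for illustration, not the exact signatures that were merged:

```typescript
// Sketch of the suggested shape: keep the on-new-token callback
// backwards compatible, while letting handlers opt into a richer
// optional `chunk` argument carrying e.g. function-call deltas.
interface GenerationChunk {
  text: string;
  functionCallDelta?: { name?: string; arguments: string };
}

class TokenHandler {
  tokens: string[] = [];
  fnArgs = "";

  // Legacy handlers that only read `token` keep working unchanged;
  // newer handlers may inspect the optional `chunk`.
  handleLLMNewToken(token: string, chunk?: GenerationChunk): void {
    this.tokens.push(token);
    if (chunk?.functionCallDelta) {
      this.fnArgs += chunk.functionCallDelta.arguments;
    }
  }
}

const handler = new TokenHandler();
handler.handleLLMNewToken("Hi"); // legacy call site, no chunk
handler.handleLLMNewToken("", {
  text: "",
  functionCallDelta: { arguments: '{"a":1}' },
});
```

Because the new parameter is optional, existing call sites and handler implementations compile and behave exactly as before, which matches the backwards-compatibility concern raised later in the thread.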

hinthornw pushed a commit to langchain-ai/langchain that referenced this pull request Aug 25, 2023
# Description

Main motivation for this PR is to sync with JS langchain
langchain-ai/langchainjs#2025

Added an `on_event` callback that works for both tokens and OpenAI function calls in streaming mode

Twitter: [@shelfdev](https://twitter.com/ShelfDev)
@mgce
Contributor Author

mgce commented Aug 25, 2023

Hey @hinthornw, @jacoblee93
Thanks for your help. I've added a new commit to adhere to the changes in the Python lib. Please take a look.

@jacoblee93
Collaborator

Not super pretty, but it should be backwards compatible and handle any additional arguments. When we're ready to make breaking changes, we can roll everything into one arg.

Final comments @hinthornw?

@jacoblee93 jacoblee93 merged commit 1c0bdae into langchain-ai:main Aug 25, 2023
9 checks passed
Labels: auto:improvement (Medium size change to existing code to handle new use-cases)

5 participants