
OpenAI Functions support and demo example #97

Open
votkon opened this issue Jun 23, 2023 · 8 comments
Labels: feature (New feature or request)


votkon commented Jun 23, 2023

Hi,
I was wondering how we could plug the OpenAI Functions into Lanarky.

It takes the conversation chain as input, and Functions seem to be called by Agents. Do you think it's possible to add support for that, and maybe some examples as well?
Thanks!

@votkon votkon added the feature New feature or request label Jun 23, 2023
ajndkr (Owner) commented Jun 23, 2023

@votkon sure! do you maybe have a langchain example or a use case?

ajndkr (Owner) commented Jun 26, 2023

@votkon I think you can use OpenAI functions directly via initialize_agent() (https://python.langchain.com/docs/modules/agents/agent_types/openai_functions_agent). The function creates an AgentExecutor instance, which is already supported. See example: https://github.com/ajndkr/lanarky/blob/main/examples/app/zero_shot_agent.py

votkon (Author) commented Jun 26, 2023

Thanks for the prompt reply!

I somehow missed this example.

After giving it a try, it looks like it doesn't communicate correctly with the frontend app.

In the zero_shot_agent.py example, when I change

```python
return initialize_agent(
    tools, llm, agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION, verbose=True
)
```

to

```python
return initialize_agent(
    tools, llm, agent=AgentType.OPENAI_FUNCTIONS, verbose=True
)
```

it doesn't stream responses back to the chat, although it shows them in the Python output.

Also, it doesn't look like initialize_agent supports the memory parameter:
https://python.langchain.com/docs/modules/memory/how_to/agent_with_memory

I am doing a simple web chat with conversation memory and the ability to call different tools and/or functions.

Thanks!
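For context on what "calling different tools and/or functions" involves: with OpenAI functions, the model returns a `function_call` payload (a tool name plus JSON-encoded arguments), the application runs the matching local tool, and the result is fed back into the conversation. A minimal, dependency-free sketch of the dispatch step (the `get_weather` tool and the fake model payload are hypothetical, for illustration only):

```python
import json

# Hypothetical local tool the agent can call.
def get_weather(city: str) -> str:
    return f"Sunny in {city}"

TOOLS = {"get_weather": get_weather}

def dispatch(function_call: dict) -> str:
    """Route an OpenAI-style function_call payload to a local tool."""
    name = function_call["name"]
    # Arguments arrive as a JSON string, not a dict.
    args = json.loads(function_call["arguments"])
    return TOOLS[name](**args)

# A fake model response in the OpenAI functions format.
fake_call = {"name": "get_weather", "arguments": '{"city": "Berlin"}'}
print(dispatch(fake_call))  # -> Sunny in Berlin
```

In a real agent loop, the tool result would be appended to the message history and the model called again to produce the final answer.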

ajndkr (Owner) commented Jun 27, 2023

> it doesn't stream responses back to the chat, although it shows them in the Python output.

I see! It's likely due to this:

```python
answer_prefix_tokens: list[str] = ["Final", " Answer", ":"]
```

I need to update this to handle all agent types. I think currently only zero shot agents are supported.
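To make the failure mode concrete: the streaming callback buffers tokens and only starts forwarding them to the client once it has seen the answer prefix ("Final Answer:", which zero-shot ReAct agents emit before their reply). An OpenAI functions agent never produces that prefix, so the gate never opens and nothing is streamed, even though the tokens appear in the Python output. A self-contained sketch of the gating idea (class and method names are illustrative, not Lanarky's actual API):

```python
class PrefixGatedStreamer:
    """Forward tokens only after an answer prefix has been seen."""

    def __init__(self, answer_prefix_tokens):
        self.prefix = answer_prefix_tokens
        self.window = []       # rolling window of recent tokens
        self.streaming = False
        self.sent = []         # tokens forwarded to the client

    def on_token(self, token):
        if self.streaming:
            self.sent.append(token)
            return
        # Keep only the last len(prefix) tokens and compare.
        self.window = (self.window + [token])[-len(self.prefix):]
        if self.window == self.prefix:
            self.streaming = True  # start forwarding from the next token

# A ReAct-style agent emits the prefix, so streaming starts:
react = PrefixGatedStreamer(["Final", " Answer", ":"])
for t in ["Thought", ":", " done", "Final", " Answer", ":", " 42"]:
    react.on_token(t)
print(react.sent)  # -> [' 42']

# An OpenAI functions agent never emits the prefix, so nothing streams:
fn_agent = PrefixGatedStreamer(["Final", " Answer", ":"])
for t in ["The", " answer", " is", " 42"]:
    fn_agent.on_token(t)
print(fn_agent.sent)  # -> []
```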

votkon (Author) commented Jul 6, 2023

Thanks! Have you had a chance to look into it?

ajndkr (Owner) commented Jul 8, 2023

> Thanks! Have you had a chance to look into it?

Nope! I have been busy these past few weeks. I will try to find more time to work on this repo.

GuillaumeBeal commented

Hi @votkon,

Using this custom class:

```python
class CustomAsyncAgentsStreamingCallback(AsyncAgentsLanarkyCallback, AsyncLLMChainStreamingCallback):
    """AsyncStreamingResponseCallback handler for AgentExecutor."""

    answer_prefix_tokens: list[str] = [""]
```

and passing it as the callback to your StreamingResponse may be a quick fix for you:

```python
StreamingResponse.from_chain(
    agent_executor,
    {"input": query},
    media_type="text/event-stream",
    callback=CustomAsyncAgentsStreamingCallback,
)
```

ajndkr (Owner) commented Nov 29, 2023

I will add native support for OpenAI functions as part of v0.9 roadmap.

@ajndkr ajndkr self-assigned this Nov 29, 2023
@ajndkr ajndkr added this to the v0.9 milestone Nov 29, 2023
Status: Todo · 3 participants · No branches or pull requests