
How to support function calls for non-GPT models? #331

Closed
peterz3g opened this issue Oct 21, 2023 · 6 comments
Labels: llm (issues related to LLM)

Comments

@peterz3g

For example, claude-2 does not support functions, but I want to use Claude to solve tasks via function calls.
I have tried one method, but it does not work very well. Is there any other approach?

===========【my method steps】===========
###1. Use litellm to set up a proxy so claude-2 acts as GPT (supported by litellm)
###2. Use litellm to adapt the function config to GPT's format (already supported by litellm via the parameter add_function_to_prompt)

litellm --model claude-2 --port 7003 -f --add_function_to_prompt

###3. Use autogen to execute the function call task (a minimal setup sketch follows)
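
A minimal sketch of step 3, assuming the litellm proxy from steps 1-2 is running on localhost:7003. The function schema and the dummy function_map are illustrative, and the config key may be "api_base" or "base_url" depending on your autogen version:

import autogen

# Point autogen at the litellm proxy started in step 2.
config_list = [{
    "model": "claude-2",
    "api_base": "http://localhost:7003",  # "base_url" on newer autogen releases
    "api_key": "none",  # the proxy holds the real credentials
}]

# OpenAI-style function schema; the proxy injects it into claude-2's prompt
# via add_function_to_prompt.
llm_config = {
    "config_list": config_list,
    "functions": [{
        "name": "get_current_weather",
        "description": "Get the current weather in a given location",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {"type": "string"},
                "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
            },
            "required": ["location"],
        },
    }],
}

assistant = autogen.AssistantAgent("assistant", llm_config=llm_config)
user_proxy = autogen.UserProxyAgent(
    "user_proxy",
    human_input_mode="NEVER",
    # Dummy implementation so a suggested call can actually be executed.
    function_map={"get_current_weather": lambda location, unit="fahrenheit": "72F, sunny"},
)
user_proxy.initiate_chat(assistant, message="What is the weather like in Boston now")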

================ 【exec result by claude-2 】===================
user_proxy (to assistant):

What is the weather like in Boston now


assistant (to user_proxy):

To get the current weather in Boston, I would call the provided function get_current_weather like this:

get_current_weather({
  'location': 'Boston, MA',
  'unit': 'fahrenheit' 
})

This would return the current weather in Boston in degrees Fahrenheit. Since I don't have access to the actual function implementation, I can't provide the exact weather conditions. But calling the function in this way would retrieve the current weather in Boston using the provided API.


Provide feedback to assistant. Press enter to skip and use auto-reply, or type 'exit' to end the conversation:

================ 【exec result by gpt just for compare】 ===================
user_proxy (to assistant):

What is the weather like in Boston now


assistant (to user_proxy):

*** Suggested function Call: get_current_weather *****
Arguments:
{
"location": "Boston, MA"
}


===========================【vs result】==============================
From the results we can see that the GPT model directly emits
*** Suggested function Call: get_current_weather *****

but claude-2 just writes Python-style code, which autogen cannot call directly:

get_current_weather({
  'location': 'Boston, MA',
  'unit': 'fahrenheit'
})

@peterz3g (Author)

My problem was solved. I modified the litellm code and adapted different LLMs to support GPT-style function calling.
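
(peterz3g did not share the patch itself. Purely as a sketch of the general idea, not his actual litellm change, one could intercept the model's text reply and translate a pseudo-call like the one above into the OpenAI-style function_call dict that autogen expects; all names below are assumptions:)

import ast
import json
import re

# Matches a pseudo-call such as: get_current_weather({'location': 'Boston, MA'})
CALL_RE = re.compile(r"(\w+)\s*\(\s*(\{.*?\})\s*\)", re.DOTALL)

def text_to_function_call(content):
    """Convert a textual pseudo-call into an OpenAI-style function_call dict."""
    match = CALL_RE.search(content)
    if not match:
        return None  # the model answered in plain prose
    name, args_src = match.groups()
    try:
        args = ast.literal_eval(args_src)  # the snippet is a Python dict literal
    except (ValueError, SyntaxError):
        return None
    return {"name": name, "arguments": json.dumps(args)}

print(text_to_function_call(
    "get_current_weather({'location': 'Boston, MA', 'unit': 'fahrenheit'})"
))
# -> {'name': 'get_current_weather', 'arguments': '{"location": "Boston, MA", "unit": "fahrenheit"}'}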

@sonichi added the llm (issues related to LLM) label on Oct 22, 2023
@tjr214 commented Oct 24, 2023

@peterz3g how'd you get functions to work with other LLMs? What did you change in litellm?

@peterz3g closed this as completed on Nov 4, 2023
@abc-africa (Collaborator)

Hello @peterz3g, hope you are well. Could you kindly share with us the trick that got litellm working with other models to support function calling?

@ssifood commented Feb 1, 2024

I want to know how you solved this problem too, @peterz3g!

@ChristianWeyer (Collaborator)

> My problem was solved. I modified the litellm code and adapted different LLMs to support GPT-style function calling.

Paging... Mr. @peterz3g :-) - please share your wisdom with us.
Thanks!

@stevensu1977

I tried using a pure prompt to support claude2 function calling. You can find the details in my repo:
https://github.com/stevensu1977/bedrock-claude-functioncalling.git
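
The pure-prompt approach generally works like this (a minimal sketch; the prompt wording and helper name below are assumptions, not taken from the repo above): describe the tool in the system prompt, instruct the model to reply with a single JSON object, and parse that object back into a call.

import json

SYSTEM_PROMPT = """You can call this function:
get_current_weather(location: str, unit: "celsius" | "fahrenheit")

When you need it, reply with ONLY a JSON object and no prose, e.g.:
{"function": "get_current_weather", "arguments": {"location": "Boston, MA", "unit": "fahrenheit"}}"""

def parse_tool_reply(reply):
    """Extract the JSON function call from the model's reply, if present."""
    start, end = reply.find("{"), reply.rfind("}")
    if start == -1 or end <= start:
        return None  # the model answered in prose instead
    try:
        call = json.loads(reply[start:end + 1])
        return call["function"], call["arguments"]
    except (json.JSONDecodeError, KeyError):
        return None

# Hypothetical claude-2 reply run through the parser:
reply = '{"function": "get_current_weather", "arguments": {"location": "Boston, MA"}}'
print(parse_tool_reply(reply))  # -> ('get_current_weather', {'location': 'Boston, MA'})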
