Will llama 3 have function calling support in future? #88
Comments
Waiting for an update.
Yeah, it should, since LLMs without function calling are not enterprise-ready.
Others do see reasonable results with Llama 3 8B function calling: https://www.reddit.com/r/LocalLLaMA/comments/1c7jtwh/function_calling_template_for_llama_3/
This would be highly beneficial. I would love to see a model that is fine-tuned for this, including parallel function calling.
Maybe look at guidance? I have used Llama 3 with it, and it supports function calling.
With Llama 3 70B I get good structured outputs. I haven't tried function calling yet, but it should work in some scenarios, I guess. I'll report back once I have tested it.
I did try function calling by making a function-calling ReAct template for Llama 3 70B (see https://www.reddit.com/r/LocalLLaMA/comments/1c7jtwh/comment/l1dksmx/?utm_source=share&utm_medium=web3x&utm_name=web3xcss&utm_term=1&utm_content=share_button), and it seems to be working reasonably well. I haven't tried it with a large number of functions yet, but with 2-3 functions and 2-3 parameters per function it works well. I haven't reached the point where it starts breaking down.
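For anyone wanting to try this pattern, here is a minimal sketch of the parsing/dispatch side of a ReAct-style function-calling loop. The tool name (`get_weather`), the `Action:` label, and the `{"name": ..., "arguments": ...}` JSON shape are assumptions for illustration, not an official Llama 3 format; you would plug in whatever template your prompt instructs the model to emit.

```python
import json

# Hypothetical tool registry; get_weather is a stand-in for a real function.
TOOLS = {
    "get_weather": lambda city: f"Sunny in {city}",
}

def dispatch(model_output: str) -> str:
    """Extract a JSON function call of the assumed form
    {"name": ..., "arguments": {...}} from the model's raw text
    and invoke the matching registered tool."""
    # Grab the outermost JSON object in the response text.
    start = model_output.find("{")
    end = model_output.rfind("}") + 1
    call = json.loads(model_output[start:end])
    fn = TOOLS[call["name"]]
    return fn(**call["arguments"])

# Example model response following the assumed ReAct-style template:
response = 'Thought: I need the weather.\nAction: {"name": "get_weather", "arguments": {"city": "Paris"}}'
print(dispatch(response))  # Sunny in Paris
```

In a real loop you would feed the tool's return value back to the model as an `Observation:` turn and repeat until the model emits a final answer instead of an action.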
In #78 it is stated that this is not currently supported, so my question is: will it be supported in the future / is it on the Llama 3 roadmap? If yes, is there an approximate date by which we can expect it? If no, why not? :-( It would help with building our own tools and using them with autogen.