Will llama 3 have function calling support in future? #88

Open
HakaishinShwet opened this issue Apr 20, 2024 · 7 comments

Comments

@HakaishinShwet

In #78 it is stated that function calling is not currently supported, so my question is: will it be supported in the future / is it on the Llama 3 roadmap? If yes, is there an approximate date by which we can expect it? If no, why not :-( when it could help with building our own tools and using them with AutoGen?

@Mandeep0001

Waiting for an update.

@kushagradeep

Yeah, it should, since LLMs without function calling are not enterprise-ready.

@zoltan-fedor

Others do see reasonable results with Llama 3 8B function calling: https://www.reddit.com/r/LocalLLaMA/comments/1c7jtwh/function_calling_template_for_llama_3/
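For context on the template-based approach, here is a minimal prompt-only sketch (not the template from the linked Reddit post): it describes a single hypothetical `get_weather` function in the system message and asks the model to reply with a JSON tool call. The special tokens follow Meta's documented Llama 3 chat format; everything else is an illustrative assumption.

```python
# Sketch of prompt-based function calling with Llama 3 Instruct.
# The get_weather schema and the JSON reply convention are assumptions for
# illustration; only the special tokens come from Meta's documented chat format.
system = (
    "You can call this function:\n"
    '{"name": "get_weather", "parameters": {"city": "string"}}\n'
    "If the request needs it, reply ONLY with JSON: "
    '{"function": "<name>", "arguments": {...}}'
)
user = "What's the weather in Berlin?"

prompt = (
    "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n"
    f"{system}<|eot_id|>"
    "<|start_header_id|>user<|end_header_id|>\n\n"
    f"{user}<|eot_id|>"
    "<|start_header_id|>assistant<|end_header_id|>\n\n"
)
# Feed `prompt` to any Llama 3 8B Instruct runtime (llama.cpp, vLLM, Transformers)
# and parse the JSON reply before executing the named function.
```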

@aronbrand

This would be highly beneficial. I would love to see a model that is fine-tuned for this, including parallel function calling.

@xrd

xrd commented Apr 21, 2024

Maybe look at guidance? I have used Llama 3 with it, and it supports function calling.

https://github.com/guidance-ai/guidance/?tab=readme-ov-file#automatic-call-grammar-for-guidance-functions
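For a sense of what that looks like, a rough sketch following the tool-use example in the guidance README linked above. The exact API (models.LlamaCpp, the @guidance decorator, gen(tools=...)) may differ between guidance versions, and the GGUF path is a placeholder, not a recommendation.

```python
import guidance
from guidance import models, gen

# Placeholder GGUF path -- swap in your own local Llama 3 8B Instruct weights.
llama3 = models.LlamaCpp("Meta-Llama-3-8B-Instruct.Q4_K_M.gguf", n_ctx=4096)

# A @guidance-decorated function can be handed to gen(tools=[...]); the library
# then allows the model to emit calls like add(3, 4) and splices the result back
# into the generation.
@guidance
def add(lm, input1, input2):
    lm += f" = {int(input1) + int(input2)}"
    return lm

# The few-shot line teaches the call syntax; the model finishes the second sum
# by invoking the tool.
lm = llama3 + "1 + 1 = add(1, 1) = 2\n3 + 4 = " + gen(max_tokens=15, tools=[add], stop="\n")
print(str(lm))
```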

@teis-e

teis-e commented Apr 30, 2024

With Llama 3 70B I get good structured outputs. I didn't try function calling yet, but that should work in some scenarios, I guess. I will let you know once I have tested it.

@zoltan-fedor

zoltan-fedor commented Apr 30, 2024

I did try function calling by making a function-calling ReAct template for Llama 3 70B (see https://www.reddit.com/r/LocalLLaMA/comments/1c7jtwh/comment/l1dksmx/?utm_source=share&utm_medium=web3x&utm_name=web3xcss&utm_term=1&utm_content=share_button)

It seems to be working reasonably well.

I haven't tried it with a large number of functions yet, but in my testing so far it works well with 2-3 functions and 2-3 parameters per function. I haven't reached the limit where it starts breaking down.
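To make the loop concrete, a small sketch of the dispatch side that such a ReAct-style template needs. The {"function": ..., "arguments": ...} shape and the toy tools are assumptions for illustration, not the exact format used in the linked comment.

```python
import json
import re

def run_tool_call(model_output: str, tools: dict):
    """Pull the first JSON object out of the model's reply and dispatch it.
    The {"function": ..., "arguments": ...} shape is an illustrative assumption."""
    match = re.search(r"\{.*\}", model_output, re.DOTALL)
    if match is None:
        return None  # plain-text answer, no tool call
    call = json.loads(match.group(0))
    return tools[call["function"]](**call.get("arguments", {}))

# Two small tools with a couple of parameters each -- the scale at which the
# comment above reports reliable behaviour.
tools = {
    "add": lambda a, b: a + b,
    "get_time": lambda tz="UTC": f"12:00 {tz}",
}
print(run_tool_call('{"function": "add", "arguments": {"a": 2, "b": 3}}', tools))  # -> 5
```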
