Thank you for your feature request. I'm a bot, an experiment of @mudler, auditing the tickets that get opened in this Github project.
We appreciate your interest in improving LocalAI and welcome all feature requests.
In this case, you want to request the integration of llama3 support in LocalAI with function calling capabilities, using the gguf format from Hugging Face. You provided the following links as examples:
As an alternative solution, you mentioned file support.
Please provide any additional information or context to help us better understand your request. We'll hide this comment when you post the issue, but you can delete it if you wish.
Is your feature request related to a problem? Please describe.
I need a better LLM to process chat completions.
Describe the solution you'd like
Add llama3 support in LocalAI, including function calling.
Here's the gguf format:
https://huggingface.co/lmstudio-community/Meta-Llama-3-70B-Instruct-GGUF
https://huggingface.co/lmstudio-community/Meta-Llama-3-8B-Instruct-GGUF
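To make the request concrete, here is a sketch of what a function-calling chat completion could look like once one of the GGUF models above is loaded. This assumes LocalAI's OpenAI-compatible API; the model alias "llama-3-8b-instruct" and the "get_weather" tool are hypothetical examples, not names defined anywhere in LocalAI.

```python
import json

# Hypothetical OpenAI-style request body; the model alias is whatever
# name the GGUF model is registered under locally.
payload = {
    "model": "llama-3-8b-instruct",  # hypothetical local alias
    "messages": [
        {"role": "user", "content": "What is the weather in Berlin?"}
    ],
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_weather",  # example tool, not a real API
                "description": "Look up the current weather for a city",
                "parameters": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            },
        }
    ],
    "tool_choice": "auto",
}

# This body would be POSTed to the OpenAI-compatible endpoint, e.g.
#   curl http://localhost:8080/v1/chat/completions -d @payload.json
print(json.dumps(payload, indent=2))
```

With function calling supported, the model's reply would contain a tool call naming `get_weather` with a `city` argument, instead of free-form text.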
Describe alternatives you've considered
Assistant file support.
Additional context