
Add submodules to the LLM Client to use other LLMs #86

Open
este6an13 opened this issue Apr 29, 2024 · 0 comments
Labels
exploring Exploring new ideas

Comments

@este6an13
Contributor

Goal

Split the LLM Client implementation into submodules so that we can consume other state-of-the-art models, such as the following (see the sketch after this list):

  • Mixtral 8x7B via Azure or HuggingFace
  • Phi-3 via HuggingFace
  • Llama 3 via HuggingFace
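
One possible shape for the split is sketched below: a small abstract client interface, one submodule per provider, and a factory that picks between them. All class, module, and parameter names here are illustrative assumptions rather than the repo's actual API; the HuggingFace branch assumes the `huggingface_hub` package's `InferenceClient`, and the Azure branch assumes a generic HTTP endpoint whose request/response shape would need to match the real deployment.

```python
from abc import ABC, abstractmethod
from typing import Optional


class LLMClient(ABC):
    """Common interface that each provider submodule implements."""

    @abstractmethod
    def complete(self, prompt: str, **kwargs) -> str:
        ...


class HuggingFaceClient(LLMClient):
    """Submodule for HuggingFace-hosted models (e.g. Mixtral 8x7B, Phi-3, Llama 3)."""

    def __init__(self, model_id: str, token: Optional[str] = None):
        # huggingface_hub's InferenceClient wraps the hosted Inference API.
        from huggingface_hub import InferenceClient
        self._client = InferenceClient(model=model_id, token=token)

    def complete(self, prompt: str, **kwargs) -> str:
        return self._client.text_generation(prompt, **kwargs)


class AzureClient(LLMClient):
    """Submodule for models served through Azure (e.g. Mixtral 8x7B).

    The endpoint and payload shape are placeholders; the real details
    depend on how the model is deployed in Azure.
    """

    def __init__(self, endpoint: str, api_key: str):
        self._endpoint = endpoint
        self._api_key = api_key

    def complete(self, prompt: str, **kwargs) -> str:
        import requests
        resp = requests.post(
            self._endpoint,
            headers={"Authorization": f"Bearer {self._api_key}"},
            json={"prompt": prompt, **kwargs},
            timeout=60,
        )
        resp.raise_for_status()
        return resp.json()["output"]


def get_client(provider: str, **kwargs) -> LLMClient:
    """Factory that selects the provider submodule by name."""
    clients = {"huggingface": HuggingFaceClient, "azure": AzureClient}
    return clients[provider](**kwargs)
```

With a factory like `get_client`, call sites stay provider-agnostic, so supporting another model later would only mean registering one more submodule.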
este6an13 added the exploring (Exploring new ideas) label on Apr 29, 2024