
Conversation

@PeriniM (Contributor) commented Apr 30, 2024

Added the possibility to use the LLM models provided by Groq.
Since Groq does not provide any embedding models, it should be used in combination with a separate embedding provider (OpenAI or a local one).
You can find an example in the examples/mixed_models folder.
For example:

import os

# Groq API key; read from the environment here as an example
groq_key = os.getenv("GROQ_API_KEY")

graph_config = {
    "llm": {
        "model": "groq/gemma-7b-it",
        "api_key": groq_key,
        "temperature": 0
    },
    "embeddings": {
        "model": "ollama/nomic-embed-text",
        "temperature": 0,
        "base_url": "http://localhost:11434",  # set the Ollama URL explicitly
    }
}

The available models are listed in the Groq documentation.
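
For reference, here is a minimal sketch of how this config can be consumed, assuming ScrapeGraphAI's SmartScraperGraph entry point (the prompt and source URL below are illustrative, not from this PR):

from scrapegraphai.graphs import SmartScraperGraph

# Build a scraping pipeline that uses Groq for the LLM
# and a local Ollama model for embeddings
smart_scraper_graph = SmartScraperGraph(
    prompt="List me all the projects with their descriptions",  # illustrative prompt
    source="https://example.com/projects",  # illustrative source URL
    config=graph_config,
)

result = smart_scraper_graph.run()
print(result)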

@PeriniM merged commit ae2971c into pre/beta on Apr 30, 2024
@PeriniM deleted the groq-implementation branch on April 30, 2024 00:59
@github-actions

🎉 This PR is included in version 0.5.0-beta.1 🎉

The release is available on:

Your semantic-release bot 📦🚀

@github-actions

🎉 This PR is included in version 0.5.0 🎉

The release is available on:

Your semantic-release bot 📦🚀
