
Conversation

@Levyathanus (Contributor)

If the model provider is not specified in the graph configuration, the abstract_graph tries to infer it from models_tokens.py, and the user is warned with an info message.
E.g., the following configuration:

graph_config = {
    "llm": {
        "api_key": YOUR_GEMINI_API_KEY,
        "model": "gemini-pro",
    },
    "verbose": True,
    "headless": False,
}

is treated the same way as:

graph_config = {
    "llm": {
        "api_key": YOUR_GEMINI_API_KEY,
        "model": "google_genai/gemini-pro",
    },
    "verbose": True,
    "headless": False,
}
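The inference step described above can be sketched roughly as follows. This is a toy illustration, not ScrapeGraphAI's actual code: the real lookup tables live in models_tokens.py, and `MODELS_TOKENS` and `infer_provider` here are hypothetical stand-ins.

```python
import logging

# Toy stand-in for the provider -> model tables in models_tokens.py.
MODELS_TOKENS = {
    "openai": {"gpt-4o": 128000},
    "google_genai": {"gemini-pro": 32000},
}

def infer_provider(model: str) -> str:
    """Return 'provider/model'. If no provider prefix is present,
    scan the known model tables and warn via an info message."""
    if "/" in model:
        # Provider explicitly given, e.g. "google_genai/gemini-pro".
        return model
    for provider, models in MODELS_TOKENS.items():
        if model in models:
            logging.info(
                "Provider not specified; inferred '%s' for model '%s'",
                provider, model,
            )
            return f"{provider}/{model}"
    raise ValueError(f"Unknown model: {model}")

print(infer_provider("gemini-pro"))         # -> google_genai/gemini-pro
print(infer_provider("google_genai/gemini-pro"))  # unchanged
```

So a bare `"gemini-pro"` resolves to `"google_genai/gemini-pro"`, which is why the two configurations above are equivalent.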

@VinciGit00 (Collaborator)

Ok, please show me what I should write for OpenAI.
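Following the same convention shown for Gemini, an OpenAI configuration would presumably look like the sketch below. The `"openai/"` prefix and the `gpt-4o` model name are assumptions based on the naming pattern in the PR description, and `YOUR_OPENAI_API_KEY` is a placeholder.

```python
# Implicit form: the provider should be inferred as "openai".
graph_config = {
    "llm": {
        "api_key": "YOUR_OPENAI_API_KEY",  # placeholder
        "model": "gpt-4o",
    },
    "verbose": True,
    "headless": False,
}

# Equivalent explicit form with the provider prefix spelled out.
graph_config_explicit = {
    "llm": {
        "api_key": "YOUR_OPENAI_API_KEY",  # placeholder
        "model": "openai/gpt-4o",
    },
    "verbose": True,
    "headless": False,
}
```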

@VinciGit00 (Collaborator) left a comment:

Thank you, I tried it and it works.

@VinciGit00 VinciGit00 merged commit 8fd7b24 into ScrapeGraphAI:pre/beta Nov 18, 2024
1 check passed
@github-actions

🎉 This PR is included in version 1.30.0-beta.5 🎉

The release is available on:

Your semantic-release bot 📦🚀

@github-actions

🎉 This PR is included in version 1.31.0 🎉

The release is available on:

Your semantic-release bot 📦🚀

@github-actions

🎉 This PR is included in version 1.31.0-beta.1 🎉

The release is available on:

Your semantic-release bot 📦🚀
