chore(provider): add Google AI Studio support#2805

Merged
amitksingh1490 merged 4 commits into tailcallhq:main from tivris:feat/google-ai-studio-provider
Apr 3, 2026

Conversation

@tivris
Contributor

@tivris tivris commented Apr 2, 2026

Add Google AI Studio as a first-class provider, giving users direct access to Gemini models using their GEMINI_API_KEY without routing through OpenRouter.

Changes

Provider registration (provider.json):

  • New google_ai_studio entry using the native Google API endpoint (https://generativelanguage.googleapis.com/v1beta)
  • Auth via GEMINI_API_KEY (standard API key flow)
  • response_type: "Google" to use the native Google response format rather than the OpenAI-compatible shim
  • Curated model list: Gemini 3.1 Pro Preview, 3.1 Flash Lite Preview, 3 Pro Preview, 3 Flash Preview, 2.5 Pro, 2.5 Flash, and 2.0 Flash — all with 1M context window, tool support, and parallel tool calls; reasoning enabled on all except 2.0 Flash
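
Taken together, the new provider.json entry might look roughly like the sketch below. The field names (`id`, `api_key_vars`, `url_param_vars`, `response_type`, `url`) are taken from the fragments quoted later in this thread; the shape of the `models` array is an assumption for illustration and the real schema may differ:

```json
{
  "id": "google_ai_studio",
  "url": "https://generativelanguage.googleapis.com/v1beta",
  "api_key_vars": "GEMINI_API_KEY",
  "url_param_vars": [],
  "response_type": "Google",
  "models": [
    {
      "id": "gemini-2.5-pro",
      "context_window": 1000000,
      "tools": true,
      "parallel_tool_calls": true,
      "reasoning": true
    }
  ]
}
```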

Domain registration (provider.rs):

  • Added ProviderId::GOOGLE_AI_STUDIO constant
  • Registered in all_built_in() list
  • Added display name mapping ("google_ai_studio" → "GoogleAIStudio")
  • Added FromStr parsing for the "google_ai_studio" string
  • Tests for display name, built-in inclusion, and FromStr parsing
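
The provider.rs changes above could be sketched roughly as follows. The type and function names (`ProviderId`, `GOOGLE_AI_STUDIO`, `all_built_in`, the display-name mapping, and the `FromStr` impl) come from the bullet list; the actual forge_domain internals almost certainly differ in structure:

```rust
// Minimal sketch, assuming ProviderId wraps a static string id.
use std::str::FromStr;

#[derive(Debug, Clone, Copy, PartialEq, Eq)]
struct ProviderId(&'static str);

impl ProviderId {
    // New constant for the Google AI Studio provider.
    const GOOGLE_AI_STUDIO: ProviderId = ProviderId("google_ai_studio");

    // Display name mapping: "google_ai_studio" -> "GoogleAIStudio".
    fn display_name(&self) -> &'static str {
        match self.0 {
            "google_ai_studio" => "GoogleAIStudio",
            other => other,
        }
    }

    // Built-in provider list now includes Google AI Studio.
    fn all_built_in() -> Vec<ProviderId> {
        vec![ProviderId::GOOGLE_AI_STUDIO /* ...other providers */]
    }
}

impl FromStr for ProviderId {
    type Err = String;
    fn from_str(s: &str) -> Result<Self, Self::Err> {
        match s {
            "google_ai_studio" => Ok(ProviderId::GOOGLE_AI_STUDIO),
            other => Err(format!("unknown provider: {other}")),
        }
    }
}

fn main() {
    let id: ProviderId = "google_ai_studio".parse().unwrap();
    assert_eq!(id, ProviderId::GOOGLE_AI_STUDIO);
    assert_eq!(id.display_name(), "GoogleAIStudio");
    assert!(ProviderId::all_built_in().contains(&id));
}
```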

Testing

# Set your Gemini API key
export GEMINI_API_KEY=your_key_here

# Run with a Gemini model via Google AI Studio provider
forge --model google_ai_studio/gemini-2.5-pro

# Run domain tests
cargo insta test --accept -p forge_domain

Register google_ai_studio as a first-class provider using the
OpenAI-compatible endpoint at generativelanguage.googleapis.com,
allowing users to authenticate with GEMINI_API_KEY and access
Gemini models directly without an OpenRouter intermediary.

Closes tailcallhq#703
@github-actions github-actions bot added the type: feature Brand new functionality, features, pages, workflows, endpoints, etc. label Apr 2, 2026

@mcgamer48ft mcgamer48ft left a comment


Bump

"id": "google_ai_studio",
"api_key_vars": "GEMINI_API_KEY",
"url_param_vars": [],
"response_type": "OpenAI",
Contributor


Suggested change
"response_type": "OpenAI",
"response_type": "Google",

"api_key_vars": "GEMINI_API_KEY",
"url_param_vars": [],
"response_type": "OpenAI",
"url": "https://generativelanguage.googleapis.com/v1beta/openai/chat/completions",
Contributor


@amitksingh1490
Contributor

We have optimised the Google response type for Google models; it will give you better and faster results.

@amitksingh1490 amitksingh1490 merged commit 6b3e16f into tailcallhq:main Apr 3, 2026
11 checks passed
@amitksingh1490 amitksingh1490 changed the title feat(provider): add Google AI Studio support chore(provider): add Google AI Studio support Apr 3, 2026
@amitksingh1490 amitksingh1490 added type: chore Routine tasks like conversions, reorganization, and maintenance work. and removed type: feature Brand new functionality, features, pages, workflows, endpoints, etc. labels Apr 3, 2026
@tivris tivris deleted the feat/google-ai-studio-provider branch April 3, 2026 16:08
@mslinn

mslinn commented Apr 10, 2026

I could not find any documentation on how to configure Gemini 3.1 models in forge.yaml.

@tivris
Contributor Author

tivris commented Apr 10, 2026

I could not find any documentation on how to configure Gemini 3.1 models in forge.yaml.

Gemini 3.1 models don't require any special forge.yaml configuration; they're used just like any other provider's models. Here's the quickest way:

  1. Run forge provider login google_ai_studio
  2. Enter your Gemini API key (get one from https://aistudio.google.com/app/apikey)
  3. Once inside forge, you can switch models with :model gemini-3.1-pro-preview (or :config-model to persist it)

Alternatively, you can set it in your .forge.toml / forge.yaml:

[session]
provider_id = "google_ai_studio"
model_id = "gemini-3.1-pro-preview"
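
A forge.yaml equivalent, assuming the same keys map directly to YAML (an untested sketch; check the forge documentation for the authoritative schema):

```yaml
session:
  provider_id: google_ai_studio
  model_id: gemini-3.1-pro-preview
```
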
Screenshot

@mslinn

mslinn commented Apr 10, 2026

$ forge provider login google_ai_studio
? Enter your GoogleAIStudio API key: AIza...ezWf5uE
● [12:11:25] GoogleAIStudio configured successfully
● [12:11:25] ERROR: Bearer token is required in API key field

@tivris
Contributor Author

tivris commented Apr 10, 2026

This seems to be odd behaviour caused by a previously configured Bedrock provider. After a successful login, forge tries to fetch models from all configured providers; if Bedrock has stale or invalid credentials, it throws that error.

You can fix it by logging out of Bedrock:

forge provider logout bedrock

Then retry forge provider login google_ai_studio.

@mslinn

mslinn commented Apr 10, 2026

That worked, which is odd because I have never used Bedrock. It looks like additional error handling would help users here. Thanks!

@tivris
Contributor Author

tivris commented Apr 10, 2026

No worries! Yeah that's definitely a bug in how forge handles multiple configured providers. I just submitted a fix for it in #2936. Thanks for identifying it, that helped track down the root cause.
