Please provide the following information
Fedora 43 x64. Goose Desktop v1.23.0 installed via Flatpak.
To Reproduce
1. Install a fresh instance of Goose Desktop.
2. Open Goose; the "Welcome to Goose" screen with the "Quick Setup with API Key" field is shown.
3. Insert a Groq API key (gsk_***). Goose shows "Detected groq" with a "(20 models available)" subcaption.
4. The "Choose Model" modal appears, but **only 5 models are shown**: gpt-oss-120b, llama-3.1-8b-instant, llama-3.3-70b-versatile, llama-guard-4-12b, and gpt-oss-20b.
Expected behavior
All 20 models should be shown, including Qwen3 and Kimi K2 Instruct
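For reference, the full list of models the key can see may be checked directly against Groq's OpenAI-compatible `/models` endpoint. A minimal Python sketch (assumes `GROQ_API_KEY` is set in the environment; the endpoint path comes from Groq's OpenAI-compatible API, not from Goose itself):

```python
import json
import os
import urllib.request

# Sketch: list the models Groq currently exposes for this API key,
# assuming the OpenAI-compatible /models endpoint.
API_URL = "https://api.groq.com/openai/v1/models"
api_key = os.environ["GROQ_API_KEY"]

req = urllib.request.Request(API_URL, headers={"Authorization": f"Bearer {api_key}"})
with urllib.request.urlopen(req) as resp:
    data = json.load(resp)

# Print the model IDs, one per line.
for model_id in sorted(m["id"] for m in data["data"]):
    print(model_id)
```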
Additional context: The Culprit
goose/crates/goose/src/providers/declarative/groq.json
Lines 1 to 31 in 44ac5dc
```json
{
  "name": "groq",
  "engine": "openai",
  "display_name": "Groq (d)",
  "description": "Fast inference with Groq hardware",
  "api_key_env": "GROQ_API_KEY",
  "base_url": "https://api.groq.com/openai/v1/chat/completions",
  "models": [
    {
      "name": "openai/gpt-oss-120b",
      "context_limit": 131072
    },
    {
      "name": "llama-3.1-8b-instant",
      "context_limit": 131072
    },
    {
      "name": "llama-3.3-70b-versatile",
      "context_limit": 131072
    },
    {
      "name": "meta-llama/llama-guard-4-12b",
      "context_limit": 131072
    },
    {
      "name": "openai/gpt-oss-20b",
      "context_limit": 131072
    }
  ],
  "supports_streaming": true
}
```
This matches the options available in the dropdown, so editing groq.json should be sufficient to fix this bug. (I will do this and submit a PR.)
Reference: bbcfe7f - edited the declarative mistral.json file to make the Devstral 2 & Devstral Small 2 models available in the model dropdown list.
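To double-check which entries are missing before editing the file, here is a minimal Python sketch that diffs the declarative list against the live API. It assumes the repo is checked out locally, `GROQ_API_KEY` is set, and it uses 131072 only as a placeholder `context_limit`; the real limit should be verified per model before adding an entry:

```python
import json
import os
import urllib.request

# Sketch: compare the declarative model list against the live Groq API and
# print JSON entries for any missing models, ready to paste into "models".
DECLARATIVE_PATH = "crates/goose/src/providers/declarative/groq.json"
API_URL = "https://api.groq.com/openai/v1/models"

# Model names already declared in groq.json.
with open(DECLARATIVE_PATH) as f:
    declared = {m["name"] for m in json.load(f)["models"]}

# Model IDs reported by the Groq API for this key.
req = urllib.request.Request(
    API_URL, headers={"Authorization": f"Bearer {os.environ['GROQ_API_KEY']}"}
)
with urllib.request.urlopen(req) as resp:
    live = {m["id"] for m in json.load(resp)["data"]}

# Emit candidate entries; 131072 is a placeholder context_limit.
missing = sorted(live - declared)
print(json.dumps([{"name": name, "context_limit": 131072} for name in missing], indent=2))
```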