The LLM provider modules (Claude, Groq, Gemini, ChatGPT, Grok, and Ollama) contain hardcoded magic numbers and string literals that should be extracted into named constants. These magic values — model names, token limits, temperature settings, and API endpoints — are duplicated across modules, making the code difficult to maintain and prone to drift when one provider's settings change and the others do not.
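One possible shape for the fix, sketched below under assumed names and values (the actual model identifiers, limits, and endpoints would come from the existing provider modules): group each provider's defaults into a single frozen dataclass in a shared constants module, so every magic value has one authoritative definition.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class ProviderDefaults:
    """One immutable record per provider; replaces scattered literals."""
    model: str          # default model identifier
    max_tokens: int     # response token limit
    temperature: float  # sampling temperature
    base_url: str       # API endpoint

# Illustrative values only -- substitute the literals currently
# hardcoded in each provider module.
CLAUDE_DEFAULTS = ProviderDefaults(
    model="claude-example-model",
    max_tokens=4096,
    temperature=0.7,
    base_url="https://api.anthropic.com/v1",
)

OLLAMA_DEFAULTS = ProviderDefaults(
    model="local-example-model",
    max_tokens=2048,
    temperature=0.7,
    base_url="http://localhost:11434",
)
```

Because the dataclass is frozen, accidental mutation at call sites raises an error, and a provider module shrinks to importing its `*_DEFAULTS` record instead of repeating literals.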
