Conversation

@olasunkanmi-SE
Owner

No description provided.

olasunkanmi-SE and others added 5 commits September 2, 2024 19:31

…uration parameters

* Update GROQ_CONFIG to increase max_tokens to 5024
* Adjust the max_tokens value for the different AI models in EventGenerator
* Improve chat-history management for the Anthropic, Gemini, and Groq providers
* Enhance error handling in the Gemini and Groq providers
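The commits above describe raising the Groq token cap and giving each provider its own max_tokens value. As a rough illustration only, the sketch below shows one way such per-provider configuration could look; the `ProviderConfig` shape, the model names, and every cap except Groq's 5024 (which the commit message states) are hypothetical assumptions, not the repository's actual code.

```typescript
// Hypothetical sketch of per-provider generation config.
// Only max_tokens = 5024 for Groq comes from the commit message;
// all other fields and values are illustrative assumptions.
interface ProviderConfig {
  provider: string;
  maxTokens: number;
}

const GROQ_CONFIG: ProviderConfig = {
  provider: "groq",
  maxTokens: 5024, // raised per the commit note
};

// Assumed helper: look up a per-model cap, falling back to a default.
function maxTokensFor(provider: string): number {
  const caps: Record<string, number> = {
    groq: 5024,      // from the commit message
    gemini: 8192,    // assumed value
    anthropic: 4096, // assumed value
  };
  return caps[provider] ?? 2048; // assumed default
}

console.log(maxTokensFor("groq"));
```

A table-plus-fallback like this keeps the per-model adjustment in one place, which matches the spirit of adjusting max_tokens per model inside a single generator class.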
@olasunkanmi-SE olasunkanmi-SE merged commit a1667f5 into development Sep 15, 2024

3 participants