Fix Ollama tool calling crash when system prompt is None #41
Merged
colinfrisch merged 1 commit into mesa:main on Dec 17, 2025
Conversation
colinfrisch approved these changes on Dec 16, 2025
Member
colinfrisch
left a comment
Thanks for spotting this! I'll give the other maintainers a bit of time to review, and merge in a few days.
Hi, when I was trying to write a beginner-friendly tutorial, I wanted users to be able to experiment with ModuleLLM freely. However, the default Gemini free-tier rate limit (RPD = 20) is often not sufficient for experimentation, so I decided to switch to a local Ollama setup with a lightweight model (qwen3:8b). During this process, I encountered the issue described below.
Summary
Fixes a crash / infinite retry issue when using ModuleLLM with Ollama models (e.g. ollama/qwen3:8b) and tool calling enabled.
Bug / Issue
When using ModuleLLM with an Ollama model and tool calling enabled, the application may enter an infinite retry loop or fail with a validation error.

The root cause is that when no system_prompt is provided, ModuleLLM initializes the system message with content=None. Ollama (via litellm) strictly requires content to be a string. This becomes problematic during tool calling, where the system prompt is dynamically modified to inject tool schemas, causing validation failures or retries.

Expected Behavior
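A minimal sketch of the failure mode described above; the function names and message-building logic are illustrative, not the actual ModuleLLM internals:

```python
def build_messages(system_prompt=None):
    # Pre-fix behavior: content is None when no system prompt is given.
    return [{"role": "system", "content": system_prompt}]

def inject_tool_schema(messages, schema_text):
    # Tool calling appends schema text to the system message. Concatenating
    # a str onto None raises TypeError; a retry loop wrapped around the
    # provider call would then repeat the same failure indefinitely.
    messages[0]["content"] += "\n" + schema_text
    return messages
```

Even if the concatenation were guarded, passing content=None through litellm to Ollama would still fail its string validation.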
ModuleLLM should handle tool calling with Ollama models correctly without crashing. The system message content should default to an empty string ("") instead of None to ensure compatibility with stricter API contracts such as Ollama's.

Implementation
The system message initialization logic was updated to default the content field to an empty string ("") when no system prompt is provided. This ensures safe concatenation and modification of the system prompt during tool calling and maintains compatibility across providers.

Testing
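The fix pattern can be sketched as follows (again with illustrative names, not the actual ModuleLLM code):

```python
def build_messages(system_prompt=None):
    # Post-fix behavior: default to an empty string so that (a) later string
    # concatenation during tool-schema injection is always safe, and
    # (b) strict providers that require content to be a str accept the message.
    content = system_prompt if system_prompt is not None else ""
    return [{"role": "system", "content": content}]
```

With this default, appending a tool schema to the system message is a plain string concatenation regardless of whether the caller supplied a system prompt.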
Verified locally with qwen3:8b.

Additional Notes
N/A