
Add ollama-local support to Microbots #73

Merged
0xba1a merged 41 commits into main from bala/add_ollama_local_model
Dec 3, 2025

Conversation

Member

@0xba1a 0xba1a commented Dec 1, 2025

No description provided.


codecov-commenter commented Dec 1, 2025

Codecov Report

✅ All modified and coverable lines are covered by tests.
✅ Project coverage is 91.34%. Comparing base (1c87a9d) to head (dddff42).

Additional details and impacted files


@@            Coverage Diff             @@
##             main      #73      +/-   ##
==========================================
+ Coverage   90.54%   91.34%   +0.80%     
==========================================
  Files          19       20       +1     
  Lines         698      763      +65     
==========================================
+ Hits          632      697      +65     
  Misses         66       66              
Flag           Coverage Δ
integration    79.94% <36.11%> (-7.45%) ⬇️
ollama_local   63.04% <87.50%> (?)
unit           56.09% <94.44%> (+6.23%) ⬆️

Flags with carried forward coverage won't be shown. Click here to find out more.

Files with missing lines            Coverage Δ
src/microbots/MicroBot.py           100.00% <100.00%> (ø)
src/microbots/constants.py          100.00% <100.00%> (ø)
src/microbots/llm/llm.py            100.00% <100.00%> (ø)
src/microbots/llm/ollama_local.py   100.00% <100.00%> (ø)
src/microbots/llm/openai_api.py     100.00% <100.00%> (ø)

Contributor

Copilot AI left a comment

Pull request overview

This pull request adds support for Ollama Local as an LLM provider to Microbots, enabling the use of locally-hosted language models alongside the existing Azure OpenAI provider. The PR includes a refactoring of the LLM response schema, replacing the result field with thoughts to better represent the LLM's reasoning process.

Key Changes:

  • New OllamaLocal LLM provider implementation with comprehensive test fixtures for integration testing
  • Schema change in LLMAskResponse: renamed result field to thoughts to better capture the LLM's reasoning
  • Added pytest fixtures and documentation for running tests with optional Ollama server dependencies
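The schema change above can be sketched with a minimal dataclass. Only the `thoughts` field (renamed from `result`) comes from this PR; the `command` field is a hypothetical companion added here purely for illustration, not the PR's actual schema.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class LLMAskResponse:
    # Renamed from `result` in this PR: free-form reasoning text
    # returned by the model alongside any structured output.
    thoughts: str
    # Hypothetical companion field, for illustration only.
    command: Optional[str] = None

resp = LLMAskResponse(thoughts="List the directory first.", command="ls")
print(resp.thoughts)  # → List the directory first.
```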

Reviewed changes

Copilot reviewed 14 out of 14 changed files in this pull request and generated 21 comments.

Summary per file:

src/microbots/llm/ollama_local.py: New OllamaLocal class implementing LLMInterface for local Ollama server communication
src/microbots/llm/llm.py: Updated LLMAskResponse schema: replaced result with thoughts field
src/microbots/MicroBot.py: Added OllamaLocal provider integration and updated system prompt with clearer instructions
src/microbots/constants.py: Added OLLAMA_LOCAL model provider constant
test/llm/test_ollama_local.py: Comprehensive unit and integration tests for OllamaLocal implementation
test/llm/conftest.py: New pytest fixtures for Ollama server setup, model management, and test isolation
test/llm/test_llm.py: Updated assertions to use thoughts instead of result
test/llm/test_openai_api.py: Updated assertions to use thoughts field
test/bot/test_writing_bot.py: Added test for WritingBot with Ollama Local provider
test/bot/test_reading_bot.py: Updated to use issue_1 fixture and added result validation
test/bot/test_microbot.py: Updated mock responses to use thoughts field
test/conftest.py: Registered Ollama fixtures for global test availability
test/llm/README_OLLAMA_TESTING.md: Comprehensive documentation for Ollama testing setup and usage
.vscode/settings.json: Added "ollama" and "qwen" to spell checker dictionary


Comment on lines +104 to +106
# However instructed ollama is not only providing the response only in json
# It adds some extra text above or below the json sometimes.
# So, this hack to extract the json part from the response.

Copilot AI Dec 1, 2025

Capitalization: "json" should be capitalized as "JSON" for consistency with technical terminology.

Suggested change
# However instructed ollama is not only providing the response only in json
# It adds some extra text above or below the json sometimes.
# So, this hack to extract the json part from the response.
# However instructed ollama is not only providing the response only in JSON
# It adds some extra text above or below the JSON sometimes.
# So, this hack to extract the JSON part from the response.

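The workaround those comment lines describe (pulling the JSON object out of an Ollama response padded with extra prose) can be sketched as follows. The function name and regex are illustrative, not the PR's actual code, and the greedy regex assumes at most one JSON object per response.

```python
import json
import re

def extract_json(text: str) -> dict:
    """Pull the first JSON object out of a response that may carry
    extra text above or below it (a common quirk of small local models)."""
    # Greedy match from the first '{' to the last '}'; fine when the
    # response contains a single JSON object, naive when it has several.
    match = re.search(r"\{.*\}", text, re.DOTALL)
    if match is None:
        raise ValueError("no JSON object found in response")
    return json.loads(match.group(0))

raw = 'Sure! Here is the answer:\n{"thoughts": "list files", "command": "ls"}\nHope that helps.'
print(extract_json(raw)["command"])  # → ls
```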
}
mock_post.return_value = mock_response

result = ollama.ask("List files")

Copilot AI Dec 1, 2025

Variable result is not used.

Suggested change
result = ollama.ask("List files")
ollama.ask("List files")

import time
import requests
import shutil
from pathlib import Path

Copilot AI Dec 1, 2025

Import of 'Path' is not used.

Suggested change
from pathlib import Path

import pytest
import sys
import os
from unittest.mock import Mock, patch, MagicMock

Copilot AI Dec 1, 2025

Import of 'MagicMock' is not used.

Suggested change
from unittest.mock import Mock, patch, MagicMock
from unittest.mock import Mock, patch

import sys
import os
from unittest.mock import Mock, patch, MagicMock
import json

Copilot AI Dec 1, 2025

Import of 'json' is not used.

Suggested change
import json

if response.status_code == 200:
    server_already_running = True
    print(f"\nOllama server already running on port {ollama_model_port}")
except requests.exceptions.RequestException:

Copilot AI Dec 1, 2025

'except' clause does nothing but pass and there is no explanatory comment.

Suggested change
except requests.exceptions.RequestException:
except requests.exceptions.RequestException:
# If the request fails, assume the server is not running and proceed to start it.

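The probe pattern under review (check the port first, start a server only if nothing answers) can be sketched with the standard library, although the fixture itself uses requests. Ollama's default port 11434 and its /api/tags endpoint are real; the helper name is illustrative.

```python
from urllib.error import URLError
from urllib.request import urlopen

def ollama_server_running(port: int = 11434, timeout: float = 2.0) -> bool:
    """Return True if an Ollama server answers on the given port."""
    try:
        # /api/tags lists installed models and is a cheap liveness probe.
        with urlopen(f"http://localhost:{port}/api/tags", timeout=timeout) as resp:
            return resp.status == 200
    except (URLError, OSError):
        # Connection refused or timeout: assume no server is running
        # and let the caller start one.
        return False
```

A fixture would call this once at session start and skip the expensive server spawn when it returns True.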
Member Author

0xba1a commented Dec 1, 2025

@copilot Investigate why the integration test gets cancelled and add a comment

Contributor

Copilot AI commented Dec 1, 2025

@0xba1a I've opened a new pull request, #74, to work on those changes. Once the pull request is ready, I'll request review from you.

Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
@0xba1a 0xba1a marked this pull request as draft December 1, 2025 14:23
0xba1a and others added 3 commits December 1, 2025 19:53
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Member Author

0xba1a commented Dec 1, 2025

@copilot Review the PR now and give all your suggestions in a single commit

@0xba1a 0xba1a marked this pull request as ready for review December 1, 2025 14:26
Member Author

0xba1a commented Dec 1, 2025

@copilot Review the PR now and give all your suggestions in a single commit

Contributor

Copilot AI commented Dec 1, 2025

@0xba1a I've opened a new pull request, #75, to work on those changes. Once the pull request is ready, I'll request review from you.

Contributor

Copilot AI commented Dec 1, 2025

@0xba1a I've opened a new pull request, #76, to work on those changes. Once the pull request is ready, I'll request review from you.

Copilot AI and others added 5 commits December 1, 2025 20:10
* Initial plan

* Address all PR review comments in a single commit

Co-authored-by: 0xba1a <2942888+0xba1a@users.noreply.github.com>

---------

Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: 0xba1a <2942888+0xba1a@users.noreply.github.com>
Member Author

0xba1a commented Dec 2, 2025

@copilot Check the test failure and root cause it

Contributor

Copilot AI commented Dec 2, 2025

@0xba1a I've opened a new pull request, #77, to work on those changes. Once the pull request is ready, I'll request review from you.

@0xba1a 0xba1a merged commit 8f923ff into main Dec 3, 2025
4 checks passed
@0xba1a 0xba1a deleted the bala/add_ollama_local_model branch March 3, 2026 08:05

Labels: None yet
Projects: None yet
5 participants