Conversation
**Codecov Report** ✅ All modified and coverable lines are covered by tests.

Coverage diff (`main` vs. #73):

| | main | #73 | +/- |
|---|---|---|---|
| Coverage | 90.54% | 91.34% | +0.80% |
| Files | 19 | 20 | +1 |
| Lines | 698 | 763 | +65 |
| Hits | 632 | 697 | +65 |
| Misses | 66 | 66 | |
Pull request overview
This pull request adds support for Ollama Local as an LLM provider to Microbots, enabling the use of locally-hosted language models alongside the existing Azure OpenAI provider. The PR includes a refactoring of the LLM response schema, replacing the result field with thoughts to better represent the LLM's reasoning process.
Key Changes:
- New OllamaLocal LLM provider implementation with comprehensive test fixtures for integration testing
- Schema change in `LLMAskResponse`: renamed the `result` field to `thoughts` to better capture the LLM's reasoning
- Added pytest fixtures and documentation for running tests with optional Ollama server dependencies
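For context, the renamed schema could look roughly like the sketch below. This is illustrative only: the real `LLMAskResponse` in `src/microbots/llm/llm.py` may use a different base class and carry other fields, and the `command` field here is a hypothetical companion added for the example — only the `result` → `thoughts` rename comes from this PR.

```python
from dataclasses import dataclass

@dataclass
class LLMAskResponse:
    # Formerly named `result`; renamed because this field holds the
    # model's reasoning process rather than a final answer.
    thoughts: str
    # Hypothetical companion field for illustration, not taken from the PR.
    command: str

resp = LLMAskResponse(thoughts="The user wants a file listing.", command="ls")
print(resp.thoughts)  # The user wants a file listing.
```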
Reviewed changes
Copilot reviewed 14 out of 14 changed files in this pull request and generated 21 comments.
| File | Description |
|---|---|
| src/microbots/llm/ollama_local.py | New `OllamaLocal` class implementing `LLMInterface` for local Ollama server communication |
| src/microbots/llm/llm.py | Updated `LLMAskResponse` schema: replaced `result` with `thoughts` field |
| src/microbots/MicroBot.py | Added `OllamaLocal` provider integration and updated system prompt with clearer instructions |
| src/microbots/constants.py | Added `OLLAMA_LOCAL` model provider constant |
| test/llm/test_ollama_local.py | Comprehensive unit and integration tests for the `OllamaLocal` implementation |
| test/llm/conftest.py | New pytest fixtures for Ollama server setup, model management, and test isolation |
| test/llm/test_llm.py | Updated assertions to use `thoughts` instead of `result` |
| test/llm/test_openai_api.py | Updated assertions to use the `thoughts` field |
| test/bot/test_writing_bot.py | Added test for `WritingBot` with the Ollama Local provider |
| test/bot/test_reading_bot.py | Updated to use the `issue_1` fixture and added result validation |
| test/bot/test_microbot.py | Updated mock responses to use the `thoughts` field |
| test/conftest.py | Registered Ollama fixtures for global test availability |
| test/llm/README_OLLAMA_TESTING.md | Comprehensive documentation for Ollama testing setup and usage |
| .vscode/settings.json | Added "ollama" and "qwen" to the spell checker dictionary |
src/microbots/llm/ollama_local.py
Outdated
    # However instructed ollama is not only providing the response only in json
    # It adds some extra text above or below the json sometimes.
    # So, this hack to extract the json part from the response.
Capitalization: "json" should be capitalized as "JSON" for consistency with technical terminology.
Suggested change:

    - # However instructed ollama is not only providing the response only in json
    - # It adds some extra text above or below the json sometimes.
    - # So, this hack to extract the json part from the response.
    + # However instructed ollama is not only providing the response only in JSON
    + # It adds some extra text above or below the JSON sometimes.
    + # So, this hack to extract the JSON part from the response.
test/llm/test_ollama_local.py
Outdated
    }
    mock_post.return_value = mock_response

    result = ollama.ask("List files")
Variable result is not used.
Suggested change:

    - result = ollama.ask("List files")
    + ollama.ask("List files")
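The pattern the reviewer is simplifying looks like this in miniature. The `OllamaLocal` stand-in below is hypothetical (the real class has a different constructor and calls `requests.post` directly), but it shows why binding the return value to an unused `result` is unnecessary when the assertion targets the mock itself.

```python
from unittest.mock import Mock

class OllamaLocal:
    # Hypothetical stand-in for the class under test; the real
    # implementation posts to the Ollama HTTP API via requests.post.
    def __init__(self, post):
        self.post = post

    def ask(self, prompt: str) -> str:
        resp = self.post(
            "http://localhost:11434/api/generate",
            json={"prompt": prompt},
            timeout=30,
        )
        return resp.json()["response"]

mock_post = Mock()
mock_post.return_value = Mock(json=Mock(return_value={"response": "ok"}))

ollama = OllamaLocal(post=mock_post)
ollama.ask("List files")  # call for the side effect; no unused `result` binding
mock_post.assert_called_once()
```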
test/llm/conftest.py
Outdated
    import time
    import requests
    import shutil
    from pathlib import Path
Import of 'Path' is not used.
Suggested change:

    - from pathlib import Path
test/llm/test_ollama_local.py
Outdated
    import pytest
    import sys
    import os
    from unittest.mock import Mock, patch, MagicMock
Import of 'MagicMock' is not used.
Suggested change:

    - from unittest.mock import Mock, patch, MagicMock
    + from unittest.mock import Mock, patch
test/llm/test_ollama_local.py
Outdated
    import sys
    import os
    from unittest.mock import Mock, patch, MagicMock
    import json
Import of 'json' is not used.
Suggested change:

    - import json
        if response.status_code == 200:
            server_already_running = True
            print(f"\nOllama server already running on port {ollama_model_port}")
    except requests.exceptions.RequestException:
'except' clause does nothing but pass and there is no explanatory comment.
Suggested change:

      except requests.exceptions.RequestException:
    +     # If the request fails, assume the server is not running and proceed to start it.
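The health check this fixture performs can be sketched without any third-party dependency. A plain TCP probe stands in here for the `requests.get` call the real `conftest.py` issues; 11434 is Ollama's default port.

```python
import socket

def ollama_server_running(host: str = "localhost", port: int = 11434) -> bool:
    """Return True if something is listening on the Ollama port."""
    try:
        with socket.create_connection((host, port), timeout=1):
            return True
    except OSError:
        # Connection refused or timed out: assume the server is not
        # running and let the fixture start one (the explanatory
        # comment the reviewer asked for).
        return False
```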
@copilot Investigate why the integration test gets cancelled and add a comment |
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
@copilot Review the PR now and give all your suggestion in a single commit |
* Initial plan
* Address all PR review comments in a single commit

Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: 0xba1a <2942888+0xba1a@users.noreply.github.com>
@copilot Check the test failure and root cause it |
* Initial plan
* Fix Ollama test timeout: add timeout to requests.post and model warm-up fixture

Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: 0xba1a <2942888+0xba1a@users.noreply.github.com>
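The fix named in the commit message above — an explicit timeout on the HTTP call to the local Ollama server — can be sketched as below. `urllib` stands in for `requests` to keep the sketch dependency-free; the endpoint and payload shape follow the public Ollama `/api/generate` API, and the model name is a placeholder.

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama default endpoint

def build_payload(prompt: str, model: str = "qwen2.5") -> dict:
    # stream=False makes Ollama return one JSON object instead of NDJSON chunks.
    return {"model": model, "prompt": prompt, "stream": False}

def ask_with_timeout(prompt: str, timeout_s: float = 120.0) -> str:
    # The explicit timeout is the fix: without it, a cold model load can
    # hang the request (and the CI job) indefinitely instead of failing fast.
    data = json.dumps(build_payload(prompt)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req, timeout=timeout_s) as resp:
        return json.loads(resp.read())["response"]
```

The warm-up fixture mentioned in the same commit pairs with this: issuing one short generation before the timed tests run moves the slow model load out of the assertions' timeout budget.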