
Conversation

iceljc (Collaborator) commented Sep 2, 2025

PR Type

Other


Description

  • Replace dynamic model selection with hardcoded "gpt-5-mini"

  • Remove LLM provider service dependency

  • Simplify completion provider initialization


Diagram Walkthrough

```mermaid
flowchart LR
  A["Dynamic Model Selection"] --> B["Hardcoded Model"]
  C["LLM Provider Service"] --> D["Direct Provider String"]
  E["Multi-modal Model Query"] --> F["Simple Model Name"]
```

File Walkthrough

Relevant files
Configuration changes
ReadImageFn.cs
Hardcode model selection in image reading                               

src/Plugins/BotSharp.Plugin.FileHandler/Functions/ReadImageFn.cs

  • Remove LLM provider service dependency
  • Replace dynamic model selection with hardcoded "gpt-5-mini"
  • Simplify completion provider initialization
+3/-4     
ReadPdfFn.cs
Hardcode model selection in PDF reading                                   

src/Plugins/BotSharp.Plugin.FileHandler/Functions/ReadPdfFn.cs

  • Remove LLM provider service dependency
  • Replace dynamic model selection with hardcoded "gpt-5-mini"
  • Simplify completion provider initialization
+3/-4     

@iceljc iceljc merged commit c68e2ec into SciSharp:master Sep 2, 2025
0 of 4 checks passed

qodo-merge-pro bot commented Sep 2, 2025

PR Reviewer Guide 🔍

Here are some key observations to aid the review process:

⏱️ Estimated effort to review: 2 🔵🔵⚪⚪⚪
🧪 No relevant tests
🔒 No security concerns identified
⚡ Recommended focus areas for review

Hardcoded Model

Replacing dynamic provider/model resolution with hardcoded values reduces flexibility and may break multi-modal/image-specific behavior previously provided by 'gpt-4o'. Validate that 'gpt-5-mini' supports required capabilities and that environment-specific configuration is not needed.

```csharp
var provider = "openai";
var model = "gpt-5-mini";
var completion = CompletionProvider.GetChatCompletion(_services, provider: provider, model: model);
var response = await completion.GetChatCompletions(agent, dialogs);
return response.Content;
```
Provider Assumption

The provider is now hardcoded to 'openai'. Confirm this aligns with deployment environments and that fallbacks or configuration overrides are not required. Consider reading from configuration to avoid vendor lock-in.

```csharp
var provider = "openai";
var model = "gpt-5-mini";
var completion = CompletionProvider.GetChatCompletion(_services, provider: provider, model: model);
var response = await completion.GetChatCompletions(agent, dialogs);
return response.Content;
```
Error Handling Context

With initialization moved inline, ensure exceptions from provider/model mismatches are handled with actionable logs, since service-based validation is removed. Verify logging includes provider/model details.

```csharp
    var provider = "openai";
    var model = "gpt-5-mini";
    var completion = CompletionProvider.GetChatCompletion(_services, provider: provider, model: model);
    var response = await completion.GetChatCompletions(agent, dialogs);
    return response.Content;
}
catch (Exception ex)
```
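
The logging concern above can be illustrated with a small wrapper. This is a hypothetical sketch, not code from the PR: `WithModelContext` and the demo exception are invented for illustration, and in the real functions the context would go through the class's logger rather than the console.

```csharp
using System;

// Hypothetical wrapper (not in the PR): attach provider/model context to
// failures so logs stay actionable now that service-based validation is gone.
T WithModelContext<T>(string provider, string model, Func<T> call)
{
    try
    {
        return call();
    }
    catch (Exception ex)
    {
        // Wrap rather than swallow: the original exception stays reachable
        // via InnerException while the message names the provider/model pair.
        throw new InvalidOperationException(
            $"Completion failed (provider: {provider}, model: {model})", ex);
    }
}

// Demo: a provider failure now surfaces with the provider/model pair attached.
try
{
    WithModelContext<string>("openai", "gpt-5-mini",
        () => throw new TimeoutException("upstream timeout"));
}
catch (InvalidOperationException ex)
{
    Console.WriteLine(ex.Message); // Completion failed (provider: openai, model: gpt-5-mini)
}
```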


qodo-merge-pro bot commented Sep 2, 2025

PR Code Suggestions ✨

Explore these optional code suggestions:

Category: Possible issue
Use configurable multimodal model

Hardcoding a potentially non-multimodal model can break image analysis. Make the
provider/model configurable and default to a known multimodal-capable model to
prevent runtime failures. This preserves flexibility across environments.

src/Plugins/BotSharp.Plugin.FileHandler/Functions/ReadImageFn.cs [101-103]

```diff
-var provider = "openai";
-var model = "gpt-5-mini";
+var provider = Environment.GetEnvironmentVariable("LLM_PROVIDER") ?? "openai";
+var model = Environment.GetEnvironmentVariable("BOTSHARP_VISION_MODEL") ?? "gpt-4o";
 var completion = CompletionProvider.GetChatCompletion(_services, provider: provider, model: model);
```
Suggestion importance[1-10]: 9


Why: The suggestion correctly identifies that hardcoding gpt-5-mini in ReadImageFn is a critical issue, as this function requires a multimodal model which the hardcoded model may not be, potentially breaking the core functionality.

Impact: High
Default to configurable multimodal model

A hardcoded model that may lack multimodal support risks failing on PDFs with
embedded images. Make provider/model configurable and default to a proven
multimodal model to ensure reliable analysis. This avoids failures in varied
deployments.

src/Plugins/BotSharp.Plugin.FileHandler/Functions/ReadPdfFn.cs [92-94]

```diff
-var provider = "openai";
-var model = "gpt-5-mini";
+var provider = Environment.GetEnvironmentVariable("LLM_PROVIDER") ?? "openai";
+var model = Environment.GetEnvironmentVariable("BOTSHARP_VISION_MODEL") ?? "gpt-4o";
 var completion = CompletionProvider.GetChatCompletion(_services, provider: provider, model: model);
```
Suggestion importance[1-10]: 9


Why: The suggestion correctly identifies a critical risk, as ReadPdfFn may need to process images within PDFs, requiring a multimodal model. Hardcoding gpt-5-mini removes the guarantee of multimodal capability, potentially breaking functionality.

Impact: High
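
Both suggestions duplicate the same environment-variable lookup in ReadImageFn and ReadPdfFn. If adopted, the resolution could live in one shared helper. A minimal sketch, assuming the `LLM_PROVIDER` and `BOTSHARP_VISION_MODEL` variable names proposed above (neither is something BotSharp reads today):

```csharp
using System;

// Hypothetical shared helper: resolve the completion target from environment
// variables, falling back to a known multimodal-capable default, as the
// suggestions above propose.
(string Provider, string Model) ResolveVisionTarget()
{
    var provider = Environment.GetEnvironmentVariable("LLM_PROVIDER") ?? "openai";
    var model = Environment.GetEnvironmentVariable("BOTSHARP_VISION_MODEL") ?? "gpt-4o";
    return (provider, model);
}

var (provider, model) = ResolveVisionTarget();
Console.WriteLine($"provider={provider}, model={model}");
```

Each function would then pass the resolved pair to `CompletionProvider.GetChatCompletion(_services, provider: provider, model: model)` unchanged, keeping the two files in sync.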
