
fix: make llm_enabled a user-configurable field for MCP services#314

Merged
rshoemaker merged 3 commits into main from fix/PLAT-523/llm_enabled_flag
Mar 24, 2026

Conversation

@rshoemaker
Contributor

Summary

Adds llm_enabled as an optional boolean config field (default: false) for MCP services, replacing the previously hardcoded llm.enabled: true.

When llm_enabled is false (or omitted):

  • The llm: section is omitted entirely from the generated config.yaml
  • llm_provider, llm_model, API keys, llm_temperature, and llm_max_tokens must not be provided (validation error if present)
  • The MCP server starts without an LLM proxy — users connect via MCP clients (Claude Desktop, Cursor, etc.) that bring their own LLM
  • All MCP tools (query_database, get_schema_info, etc.) remain fully functional over HTTP/SSE

When llm_enabled is true:

  • llm_provider, llm_model, and the matching provider credential are required (unchanged from prior behavior)
  • The llm: section is written to config.yaml with enabled: true
  • The MCP server's LLM proxy is active for web client / curl usage
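Mechanically, the present-vs-omitted llm: section comes from the optional pointer noted in the review walkthrough (LLM *mcpLLMConfig). The sketch below illustrates that pattern only: it uses encoding/json instead of the YAML library so it stays stdlib-only, and the struct, field, and provider names ("openai", "gpt-4o") are illustrative, not the control plane's actual types:

```go
package main

import (
	"encoding/json"
	"fmt"
	"strings"
)

// llmConfig mirrors the idea of the generated llm: block.
type llmConfig struct {
	Enabled  bool   `json:"enabled"`
	Provider string `json:"provider"`
	Model    string `json:"model"`
}

// mcpConfig is a pared-down stand-in for the generated service config;
// a nil LLM pointer plus omitempty drops the section entirely.
type mcpConfig struct {
	Database string     `json:"database"`
	LLM      *llmConfig `json:"llm,omitempty"`
}

// render populates the LLM block only when the flag is explicitly true;
// nil (omitted) and false both leave the pointer nil, so no llm key is
// emitted at all.
func render(llmEnabled *bool, provider, model string) (string, error) {
	cfg := mcpConfig{Database: "mydb"}
	if llmEnabled != nil && *llmEnabled {
		cfg.LLM = &llmConfig{Enabled: true, Provider: provider, Model: model}
	}
	out, err := json.MarshalIndent(cfg, "", "  ")
	return string(out), err
}

func contains(s, sub string) bool { return strings.Contains(s, sub) }

func main() {
	off, _ := render(nil, "", "") // llm_enabled omitted: no llm key
	fmt.Println(off)

	on := true
	enabled, _ := render(&on, "openai", "gpt-4o") // llm block present
	fmt.Println(enabled)
}
```

gopkg.in/yaml.v3 treats omitempty on a nil pointer the same way, which is what lets the real generator leave the llm: section out of config.yaml.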

Additional fix: embedding_provider: ollama now correctly requires ollama_url — a pre-existing validation gap that was masked when LLM fields were always required.

Test plan

  • Unit tests pass: go test ./server/internal/database/... ./server/internal/orchestrator/swarm/...
  • E2E tests pass: make test-e2e E2E_FIXTURE=lima
  • Manual: deploy MCP service with empty config ({}), connect via Claude Desktop using mcp-remote
  • Manual: deploy MCP service with llm_enabled: true + provider config, verify LLM proxy works via curl

The control plane previously hardcoded llm.enabled: true in every MCP
server config.yaml, forcing users to provide llm_provider, llm_model,
and an API key even when they only needed MCP protocol access (e.g.,
via Claude Desktop or Cursor).

llm_enabled is now an optional boolean field (default: false) in
MCPServiceConfig. When false, the llm: section is omitted entirely
from the generated config.yaml. When true, llm_provider, llm_model,
and the provider credential are required (unchanged behavior).

LLM-specific fields are rejected when llm_enabled is false to prevent
silent misconfiguration. ollama_url is shared between LLM and
embedding providers, so it is only rejected when neither needs it.

Also fixes a pre-existing gap where embedding_provider: ollama did not
require ollama_url — this path is now validated.
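The validation rules described above can be sketched as a small standalone checker. This is a simplified illustration, not the actual ParseMCPServiceConfig: only a few config fields are modeled and the error messages are invented, but the branching mirrors the PR's rules (LLM fields required when enabled, rejected when disabled, ollama_url shared with embeddings):

```go
package main

import (
	"errors"
	"fmt"
)

// cfg is a pared-down view of the MCP service config fields relevant
// to LLM/embedding validation (API keys, temperature, etc. omitted).
type cfg struct {
	LLMEnabled        *bool
	LLMProvider       string
	LLMModel          string
	OllamaURL         string
	EmbeddingProvider string
}

func validate(c cfg) error {
	llmOn := c.LLMEnabled != nil && *c.LLMEnabled
	if llmOn {
		// Enabled: provider and model are required (unchanged behavior).
		if c.LLMProvider == "" || c.LLMModel == "" {
			return errors.New("llm_provider and llm_model are required when llm_enabled is true")
		}
	} else {
		// Disabled/omitted: reject LLM-only keys to prevent silent
		// misconfiguration.
		if c.LLMProvider != "" || c.LLMModel != "" {
			return errors.New("llm fields are not allowed when llm_enabled is false")
		}
		// ollama_url is shared with the embedding provider, so it is
		// only rejected when neither side needs it.
		if c.OllamaURL != "" && c.EmbeddingProvider != "ollama" {
			return errors.New("ollama_url is not allowed when llm_enabled is false and embeddings do not use ollama")
		}
	}
	// The pre-existing gap fixed by the PR: ollama embeddings need a URL.
	if c.EmbeddingProvider == "ollama" && c.OllamaURL == "" {
		return errors.New("ollama_url is required when embedding_provider is ollama")
	}
	return nil
}

func main() {
	fmt.Println(validate(cfg{}))                            // empty config is valid
	fmt.Println(validate(cfg{LLMProvider: "openai"}))       // rejected: flag not set
	fmt.Println(validate(cfg{EmbeddingProvider: "ollama"})) // rejected: missing ollama_url
}
```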
@coderabbitai

coderabbitai bot commented Mar 24, 2026

📝 Walkthrough


The changes introduce an optional llm_enabled flag to MCP service configuration, making LLM fields conditionally required when enabled. When disabled or unset, LLM fields are rejected and the generated YAML omits the llm section. Validation, tests, and YAML generation were updated to follow this behavior.

Changes

  • E2E Tests (e2e/service_provisioning_test.go): Added llm_enabled: true to the MCP service config in all provisioning and update test scenarios.
  • API Validation Tests (server/internal/api/apiv1/validate_test.go): Adjusted validation test inputs to include llm_enabled: true across service/database spec cases; updated the "missing llm_provider" case to retain llm_enabled.
  • MCP Service Configuration Core (server/internal/database/mcp_service_config.go): Added LLMEnabled *bool to MCPServiceConfig; ParseMCPServiceConfig now parses and validates LLM fields only when llm_enabled is true, rejects LLM-only keys when it is disabled, and special-cases ollama_url when it is used for embeddings.
  • MCP Service Configuration Tests (server/internal/database/mcp_service_config_test.go): Updated test builders to default to llm_enabled: true; added noLLMBase() plus tests for the minimal no-LLM and explicit llm_enabled: false cases; added negative tests asserting LLM keys are rejected when disabled, and update tests for toggling llm_enabled.
  • MCP YAML Generation (server/internal/orchestrator/swarm/mcp_config.go): Changed the generated LLM YAML to an optional pointer (LLM *mcpLLMConfig); the llm block is emitted and defaults applied only when LLMEnabled is true, and the section is omitted when disabled or unset.
  • MCP YAML Generation Tests (server/internal/orchestrator/swarm/mcp_config_test.go): Refactored tests to use utils.PointerTo(true/false) for LLMEnabled; added separate tests for the enabled, disabled, and default cases; asserted the llm section is absent when unset; adjusted embedding tests to cover embedding-without-LLM scenarios.

Poem

🐰 Hooray—no extra fluff in the config tonight,
Toggle a flag and the YAML is right.
If LLM's off, the block hops away,
If on, it hums with keys and play.
Carrots, code, and tidy delight! 🥕

🚥 Pre-merge checks (2 passed, 1 inconclusive)

❓ Inconclusive checks (1)

  • Description check (Inconclusive): the description covers the summary, changes, and test plan comprehensively, but is missing the explicit checklist section required by the template (Tests added, Documentation, Issue linked, Changelog entry, Breaking changes callout). Resolution: complete the PR checklist section and add explicit callouts for breaking changes, documentation updates, and changelog entry if applicable.

✅ Passed checks (2)

  • Title check (Passed): the title clearly and specifically describes the main change (making llm_enabled a user-configurable field for MCP services) and aligns with the primary objective of the changeset.
  • Docstring Coverage (Passed): no functions were found in the changed files, so the docstring coverage check was skipped.


@coderabbitai coderabbitai bot left a comment
Actionable comments posted: 1

🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@server/internal/orchestrator/swarm/mcp_config.go`:
- Around line 113-146: Existing MCP configs with llm_provider but no llm_enabled
will fail reconciliation because mcp_config.go only builds LLM when
cfg.LLMEnabled != nil && *cfg.LLMEnabled; fix by treating an explicit
llm_provider as implying enabled (or backfill the DB): either add a DB migration
that sets llm_enabled = true for rows where llm_provider IS NOT NULL, or change
the logic in the mcp LLM construction to treat cfg.LLMProvider (and non-nil/
non-empty) as enabling LLM when cfg.LLMEnabled == nil (i.e., if cfg.LLMEnabled
== nil && cfg.LLMProvider != "" { consider enabled }). Also ensure the
validation in database/mcp_service_config.go (the validation that rejects
LLM-only fields when llm_enabled is not true) is updated or kept consistent with
the migration so existing records are accepted during reconciliation.

ℹ️ Review info
⚙️ Run configuration

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

Run ID: db18264f-f1d8-4c47-a52c-e65fc6ea8014

📥 Commits

Reviewing files that changed from the base of the PR and between a356068 and 075e469.

📒 Files selected for processing (6)
  • e2e/service_provisioning_test.go
  • server/internal/api/apiv1/validate_test.go
  • server/internal/database/mcp_service_config.go
  • server/internal/database/mcp_service_config_test.go
  • server/internal/orchestrator/swarm/mcp_config.go
  • server/internal/orchestrator/swarm/mcp_config_test.go

@coderabbitai coderabbitai bot left a comment

🧹 Nitpick comments (1)
server/internal/orchestrator/swarm/mcp_config_test.go (1)

397-434: Add explicit llm_enabled=false coverage for embedding-only configs.

TestGenerateMCPConfig_EmbeddingWithoutLLM currently exercises only the omitted (nil) flag path. Adding an explicit LLMEnabled: false variant would better lock in the intended parity and prevent regressions.

Suggested test addition:

```go
func TestGenerateMCPConfig_EmbeddingWithoutLLM_DisabledExplicitly(t *testing.T) {
	embProvider := "voyage"
	embModel := "voyage-3"
	embAPIKey := "pa-voyage-key"

	params := &MCPConfigParams{
		Config: &database.MCPServiceConfig{
			LLMEnabled:        utils.PointerTo(false),
			EmbeddingProvider: &embProvider,
			EmbeddingModel:    &embModel,
			EmbeddingAPIKey:   &embAPIKey,
		},
		DatabaseName:  "mydb",
		DatabaseHosts: []database.ServiceHostEntry{{Host: "db-host", Port: 5432}},
		Username:      "appuser",
		Password:      "secret",
	}

	data, err := GenerateMCPConfig(params)
	if err != nil {
		t.Fatalf("GenerateMCPConfig() error = %v", err)
	}

	cfg := parseYAML(t, data)
	if cfg.LLM != nil {
		t.Errorf("llm section should be absent when llm_enabled is false, got %+v", cfg.LLM)
	}
	if cfg.Embedding == nil {
		t.Fatal("embedding section should be present")
	}
}
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@server/internal/orchestrator/swarm/mcp_config_test.go` around lines 397 -
434, Add a second sub-case to TestGenerateMCPConfig_EmbeddingWithoutLLM that
explicitly sets LLMEnabled: false on the MCPServiceConfig inside the
MCPConfigParams (rather than leaving it nil) to verify behavior when the flag is
present but disabled; call GenerateMCPConfig with that params object, parse the
YAML as in the existing test, and assert the same expectations (cfg.LLM is
nil/absent and embedding section exists and is enabled with provider "voyage"),
ensuring the test covers both the omitted-nil and explicit-false paths for
LLMEnabled.

ℹ️ Review info
⚙️ Run configuration

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

Run ID: 6b7b4565-cc36-43ee-99d4-0ce784f303c7

📥 Commits

Reviewing files that changed from the base of the PR and between 075e469 and 353edec.

📒 Files selected for processing (1)
  • server/internal/orchestrator/swarm/mcp_config_test.go

rshoemaker merged commit ada912a into main on Mar 24, 2026
3 checks passed
