
[gh-392] Enable and disable mcp servers #398

Merged
dwash96 merged 3 commits into cecli-dev:v0.96.0 from gopar:gh-392-enable-and-disable-mcp-servers
Jan 17, 2026
Conversation


@gopar gopar commented Jan 13, 2026

Two new commands are introduced:

  • /load-mcp - Load a disabled MCP server
  • /remove-mcp - Remove an active MCP server

Closes #332 and #392
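The intended behavior of the two commands could be sketched like this (a minimal sketch with assumed names such as `McpRegistry`, `cmd_load_mcp`, and `cmd_remove_mcp` -- not the PR's actual implementation, which lives in cecli's command layer):

```python
# Hypothetical registry: /remove-mcp moves a server from the active set to
# the disabled set (keeping its config), and /load-mcp moves it back.
class McpRegistry:
    def __init__(self, configured):
        self.active = dict(configured)  # name -> server config
        self.disabled = {}

    def cmd_remove_mcp(self, name):
        """Disable an active MCP server by name."""
        if name not in self.active:
            return f"No active MCP server named {name!r}"
        self.disabled[name] = self.active.pop(name)
        return f"Removed MCP server {name!r}"

    def cmd_load_mcp(self, name):
        """Re-enable a previously removed MCP server."""
        if name not in self.disabled:
            return f"No disabled MCP server named {name!r}"
        self.active[name] = self.disabled.pop(name)
        return f"Loaded MCP server {name!r}"
```

Keeping the disabled config around (rather than deleting it) is what makes /load-mcp possible without re-reading the config file.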

Comment thread cecli/coders/base_coder.py
for server_name, server_tools in tools:
    self.io.tool_output(f" - {server_name}")

@property
def mcp_tools(self):
Author

To reduce the size of the PR (and minimize breaking things), I'm creating a "wrapper" that keeps the original mcp_tools intent.
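The wrapper idea can be sketched as follows (class shapes assumed for illustration, not the PR's exact code): `mcp_tools` used to hold `(server_name, tools)` pairs directly, so exposing it as a property that reads from the manager keeps existing call sites working unchanged.

```python
# Assumed minimal manager: only the mapping the property needs.
class McpManager:
    def __init__(self):
        self._server_tools = {}  # server name -> list of tool dicts

    def get_all_tools(self):
        return list(self._server_tools.items())


class Coder:
    def __init__(self, mcp_manager):
        self.mcp_manager = mcp_manager

    @property
    def mcp_tools(self):
        # Same (server_name, server_tools) shape callers relied on before,
        # but now derived live from the manager's current state.
        return self.mcp_manager.get_all_tools()
```

Because the property is computed on access, enabling or disabling a server in the manager is immediately visible to every old call site.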

finally:
    from . import SwitchCoderSignal

    raise SwitchCoderSignal(
Author

Looks like raising signals is the only way to make internal state changes persistent? Otherwise, when I removed/loaded an MCP server, all my changes disappeared when switching to a different mode.
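The pattern being described can be sketched like this (function and field names assumed; only the exception-as-signal mechanism is taken from the diff): instead of mutating the current coder, the command raises an exception that the main loop catches and uses to rebuild the coder with the updated MCP state, so the change survives mode switches.

```python
# Signal exception carrying state into the next coder instance.
class SwitchCoderSignal(Exception):
    def __init__(self, **kwargs):
        super().__init__()
        self.kwargs = kwargs  # state the new coder should be built with


def run_command(mcp_servers):
    # e.g. after /remove-mcp succeeds, hand the updated server list upward
    raise SwitchCoderSignal(mcp_servers=mcp_servers)


def main_loop(mcp_servers):
    try:
        run_command(mcp_servers)
    except SwitchCoderSignal as sig:
        # rebuild the coder from the carried state instead of the old one
        return sig.kwargs["mcp_servers"]
```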

Comment thread cecli/mcp/manager.py
except StopIteration:
    return None

async def connect_all(self) -> None:
Author

This is replaced by the classmethod; it's not really needed and wasn't used before either.

Comment thread cecli/mcp/manager.py
session = await server.connect()
tools_result = await session.list_tools()
self._server_tools[server.name] = tools_result.tools
tools = await experimental_mcp_client.load_mcp_tools(session=session, format="openai")
Author

I don't have all the context, so I'm not sure what the intent is here, but perhaps we can update this to just use list_tools instead of this library? That would need to be another PR, since it's a bigger change.

Collaborator

@dwash96 dwash96 Jan 14, 2026

https://docs.litellm.ai/docs/mcp#2-list-and-call-mcp-tools

experimental_mcp_client formats the known list of MCP tools into the format the LLM prefers, per litellm's own documentation. Based on a lot of your own work with the manager, you'd probably still want to store the name -> tool list mapping from experimental_mcp_client on the manager instance, since we do need to look up the tool names and which server they belong to for agent mode's operations.

Author

> you'd probably still want to store the name -> tool list mapping from experimental_mcp_client off of the manager instance since we do need to look up the tool names and what server they belong to for agent mode's operations

I updated self.mcp_tools in the Coder class to point to a backwards-compatible version of what was originally there, and since AgentCoder inherits from Coder, nothing broke when I ran the agent. (I did make some small changes inside the AgentCoder class, but they're about handling the Local server.)

I thought that would be enough? or maybe I'm misunderstanding 🤔
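The name -> server lookup dwash96 describes could look roughly like this (illustrative only; `ToolIndex` and its methods are assumed names, and the tool dicts use the OpenAI function format that `experimental_mcp_client.load_mcp_tools(session=session, format="openai")` returns):

```python
# Hypothetical index: given a tool name from an agent tool call, resolve
# which MCP server owns it so the call can be routed to the right session.
class ToolIndex:
    def __init__(self):
        self._server_tools = {}  # server name -> list of OpenAI-format tools

    def add(self, server_name, tools):
        self._server_tools[server_name] = tools

    def server_for(self, tool_name):
        for server, tools in self._server_tools.items():
            if any(t.get("function", {}).get("name") == tool_name for t in tools):
                return server
        return None
```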

Comment thread cecli/mcp/manager.py
Comment on lines +265 to +280
for server, did_connect in results:
    if not did_connect and server.name not in ["unnamed-server", "Local"]:
        io.tool_warning(
            f"MCP tool initialization failed after multiple retries: {server.name}"
        )

if verbose:
    io.tool_output("MCP servers configured:")

    for server, _ in results:
        io.tool_output(f" - {server.name}")

        for tool in mcp_manager.get_server_tools(server.name):
            tool_name = tool.get("function", {}).get("name", "unknown")
            tool_desc = tool.get("function", {}).get("description", "").split("\n")[0]
            io.tool_output(f" - {tool_name}: {tool_desc}")
Author

I don't like that I'm handling IO operations in a classmethod, but it should be fine? This is only ever needed at initialization, so letting the user know the details should be okay?
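One reason the IO-in-a-classmethod concern is softened here (sketch with assumed names, not the PR's exact signature): the classmethod receives `io` as a parameter rather than owning it, so the caller stays in control of where the output goes and the method stays testable with a fake.

```python
# Hypothetical shape of the initialization classmethod discussed above.
class McpManager:
    @classmethod
    def initialize(cls, servers, io, verbose=False):
        manager = cls()
        if verbose:
            io.tool_output("MCP servers configured:")
            for name in servers:
                io.tool_output(f" - {name}")
        return manager
```

A test can pass a stub `io` object that records calls instead of printing, which is how one-time startup output like this is usually verified.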

Comment thread tests/basic/test_coder.py
io.confirm_ask.assert_called_once_with("Edit the files?", allow_tweak=False)
mock_create.assert_not_called()

@patch("cecli.coders.base_coder.experimental_mcp_client")
Author

Superseded by the tests for the manager class.

| **/read-only** | Add files to the chat that are for reference only, or turn added files to read-only |
| **/reasoning-effort** | Set the reasoning effort level (values: number or low/medium/high depending on model) |
| **/report** | Report a problem by opening a GitHub Issue |
| **/remove-mcp** | Remove an MCP server by name |
Author

The documentation scripts were broken when run; they need to be updated in a follow-up PR.

@gopar gopar marked this pull request as ready for review January 15, 2026 19:04
@dwash96 dwash96 changed the base branch from main to v0.96.0 January 17, 2026 16:30
@dwash96 dwash96 merged commit e8d0156 into cecli-dev:v0.96.0 Jan 17, 2026
8 checks passed
@dwash96 dwash96 mentioned this pull request Jan 17, 2026

Development

Successfully merging this pull request may close these issues.

Dynamic MCP Server Management
