Add detailed description of MCP integration as the primary development focus. MCP support will make Bishop extensible through the MCP ecosystem, allowing easy integration with external tools and data sources.
Review: ROADMAP.md — Mark MCP-native-shell as next priority
Deployability: No production risk. This is a documentation-only change with zero impact on the build, binary, or runtime behavior.
Summary: This is a well-structured roadmap update. The new "Next Priority" section is clear and gives the team a shared understanding of the architectural direction. The decision to expose MCP as a first-class host (rather than a thin wrapper or one-off tool integration) is sound — it positions Bishop to grow with the MCP ecosystem rather than accumulating bespoke integrations.
Strengths:
- The description explains what MCP is and why it fits Bishop's model, not just what will be built. That context is valuable for contributors picking up this work.
- Promoting the item in-place (with a cross-reference) rather than moving it avoids losing the sub-bullet use cases that belong with the original item.
- Small, focused PR — exactly the right batch size for a planning/direction change.
Minor observations (see inline comments):
- The "consistent interface across LLM providers" bullet may overstate the scope; worth clarifying whether provider-agnosticism is an explicit design goal.
- The sub-bullets under "MCP support" in the backlog are use cases, not acceptance criteria — consider marking them as such so the team has a clear definition of done for the initiative.
Neither observation blocks this change. The PR correctly keeps the system deployable and gives the team clear directional alignment.
> - Provide a consistent interface for tool calling across different LLM providers
>
> This makes Bishop extensible through the MCP ecosystem rather than requiring custom tool integrations for each data source.
This bullet — "Provide a consistent interface for tool calling across different LLM providers" — reads more like a design goal for the overall MCP ecosystem than a specific outcome Bishop will deliver. If Bishop's MCP host implementation is intentionally designed to be provider-agnostic, that's worth calling out explicitly (e.g. "abstracts tool calling so the same MCP servers work regardless of which LLM backend is configured"). If it's just a side-effect of MCP being an open protocol, it may be overstating what this initiative achieves and could set expectations that are hard to meet later.
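If provider-agnosticism is the intent, the natural shape is an adapter layer: the host normalizes each LLM backend's tool-call output into one structure and routes it to the same MCP servers regardless of which backend produced it. A minimal sketch of that idea (all names here, `ToolCall`, `MCPServer`, `Host`, are hypothetical and not Bishop's actual API; real MCP servers speak JSON-RPC over a transport, which is stubbed out with plain callables):

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class ToolCall:
    """Normalized tool call; each provider adapter would translate its
    backend's native tool-call format into this one structure."""
    name: str
    arguments: dict

class MCPServer:
    """Stand-in for a real MCP server connection: tools are plain callables
    instead of JSON-RPC endpoints."""
    def __init__(self, tools: dict[str, Callable[..., str]]):
        self.tools = tools

    def call(self, call: ToolCall) -> str:
        return self.tools[call.name](**call.arguments)

class Host:
    """Provider-agnostic host: any backend that emits a ToolCall gets
    routed to the same set of MCP servers."""
    def __init__(self, servers: list[MCPServer]):
        self.servers = servers

    def dispatch(self, call: ToolCall) -> str:
        for server in self.servers:
            if call.name in server.tools:
                return server.call(call)
        raise KeyError(f"no server exposes tool {call.name!r}")

# The same server serves every backend, because dispatch only sees ToolCall:
search = MCPServer({"web_search": lambda query: f"results for {query}"})
host = Host([search])
print(host.dispatch(ToolCall("web_search", {"query": "mcp"})))  # results for mcp
```

Under this design the provider-agnosticism is a property of the normalization step, not of MCP itself, which is the distinction the bullet could make explicit.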
> - support agent modifying memory
> - [set ollama context window](https://github.com/ollama/ollama/pull/6504)
> - MCP support
> - MCP support (→ **NEXT PRIORITY - see above**)
Good cross-reference. The (→ **NEXT PRIORITY - see above**) annotation keeps the original backlog item in context while making the promotion clear. This avoids readers wondering why the same item appears in two places.
One consideration: the sub-bullets ("allow agent to search the web", "allow agent to browse web urls") represent specific use cases that would be enabled by MCP support, not items to complete before moving it to "done". It might be worth making this parent/child relationship explicit in the new top-section description, so the team has concrete acceptance criteria for what "done" means — otherwise MCP support could be perpetually in-progress as new use cases are added.
> - MCP support (→ **NEXT PRIORITY - see above**)
>   - [example use case] allow agent to search the web
>   - [example use case] allow agent to browse web urls
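One way to make the parent/child relationship and the definition of done explicit in the roadmap entry itself. This is a hypothetical sketch of how the promoted item might read; the "Done when" criterion is illustrative wording, not an agreed-upon scope:

```markdown
- MCP support (→ **NEXT PRIORITY - see above**)
  - [example use case] allow agent to search the web
  - [example use case] allow agent to browse web urls
  - Done when: Bishop can connect to an external MCP server, list its
    tools, and route agent tool calls to it; the use cases above are
    examples enabled by this, not completion criteria
```

Spelling out a closed "done" condition while leaving the use cases open-ended addresses the risk noted above of the initiative staying perpetually in progress.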