Conversation

@ttommyth
Contributor

Adds the interactive-mcp server to the list of community servers.

Description

This PR adds interactive-mcp, a Node.js/TypeScript MCP Server designed to run locally, enabling interactive communication between LLMs and users through native OS notifications and command-line prompts.

Server Details

  • Server: interactive-mcp (New Addition)
  • Repository: https://github.com/ttommyth/interactive-mcp
  • Main Features/Tools (see the sketch after this list):
    • request_user_input: Asks the user a question via a command-line prompt, optionally with predefined options.
    • message_complete_notification: Sends a native OS notification.
    • start_intensive_chat, ask_intensive_chat, stop_intensive_chat: Manage a persistent command-line chat session for multiple interactions.
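
To make the interaction model concrete, here is a minimal sketch (not part of this PR) of how an MCP client built with the official TypeScript SDK could launch the server over stdio and call request_user_input. The tool argument names (message, predefinedOptions) and the npm package name are assumptions; the interactive-mcp README documents the actual tool schemas.

```typescript
// Minimal sketch: calling interactive-mcp's request_user_input tool from a
// client built with the official MCP TypeScript SDK. Argument names are
// assumptions; consult the interactive-mcp README for the real schema.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Launch the server locally via npx, as described in this PR.
  const transport = new StdioClientTransport({
    command: "npx",
    args: ["-y", "interactive-mcp"],
  });

  const client = new Client(
    { name: "example-client", version: "1.0.0" },
    { capabilities: {} }
  );
  await client.connect(transport);

  // Ask the user a question through the server's command-line prompt.
  const result = await client.callTool({
    name: "request_user_input",
    arguments: {
      message: "Should I overwrite the existing config file?",
      predefinedOptions: ["Yes", "No"],
    },
  });

  console.log(result.content); // The user's answer, returned as tool output.
  await client.close();
}

main().catch(console.error);
```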

Motivation and Context

LLMs often need to clarify instructions, confirm actions, or gather feedback directly from the user during complex tasks. Currently, this often involves guesswork or requires the user to anticipate the LLM's needs. interactive-mcp provides explicit tools for the LLM to request input or notify the user directly on their local machine, reducing ambiguity and improving the reliability of LLM-driven workflows. It allows the LLM to "stop guessing" and ask directly.

See the introductory blog post for more context: Stop Your AI Assistant From Guessing — Introducing interactive-mcp

How Has This Been Tested?

Tested with Cursor and VS Code Copilot on both Windows and macOS. All tools provided by the server (request_user_input, message_complete_notification, start_intensive_chat, ask_intensive_chat, stop_intensive_chat) were tested in various interaction scenarios.

Breaking Changes

None. This is a new server addition.

Types of changes

  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds functionality) - Adding a new server to the list.
  • Breaking change (fix or feature that would cause existing functionality to change)
  • Documentation update

Checklist

  • I have read the MCP Protocol Documentation
  • My changes follow MCP security best practices (Note: This server runs locally, accessing OS features like notifications and terminal prompts. Security relies on the user running it in a trusted environment.)
  • I have updated the server's README accordingly
  • I have tested this with an LLM client
  • My code follows the repository's style guidelines (Referring to the interactive-mcp repo)
  • New and existing tests pass locally (Referring to the interactive-mcp repo, if applicable)
  • I have added appropriate error handling (Referring to the interactive-mcp repo)
  • I have documented all environment variables and configuration options (Command-line options are documented in the README)

Additional context

This server requires local execution alongside the MCP client due to its need to interact with the host operating system's UI (notifications, terminal). It's configured easily using npx for installation and execution within client configurations like Claude Desktop or Cursor.
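
For example, an npx-based entry in a client's mcpServers configuration (Claude Desktop or Cursor style) might look like the snippet below. The package name is assumed to match the repository name; see the interactive-mcp README for the authoritative configuration.

```json
{
  "mcpServers": {
    "interactive-mcp": {
      "command": "npx",
      "args": ["-y", "interactive-mcp"]
    }
  }
}
```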

@ttommyth
Contributor Author

Hi @tadasant and @olaservo,

Could you please take some time to review this pull request? I've added interactive-mcp to the list of community servers. interactive-mcp is a Node.js/TypeScript MCP server designed to run locally, enabling direct communication between LLMs and users through native OS notifications and command-line prompts. It allows AI assistants to request input directly from users rather than having to guess, which significantly improves the reliability of LLM-driven workflows.

Thanks for your consideration!

@tadasant merged commit df91168 into modelcontextprotocol:main on May 21, 2025