Support LLM Discovery and Usage of Prompts and Resources #1495

@kartheekp-ms

Description

Is your feature request related to a problem? Please describe.

Currently, Prompts are explicitly user-controlled and Resources are application-driven, which limits an LLM's ability to discover and use them dynamically. This can lead to missed opportunities for automation, context-awareness, and intelligent suggestions, especially in complex workflows where the model could benefit from proactively surfacing relevant prompts or resources.

Reference https://modelcontextprotocol.io/docs/learn/server-concepts#core-server-features

Describe the solution you'd like

I’d like to propose making Prompts and Resources optionally LLM-controlled, similar to how Tools are handled. This would allow models to:

  • Discover and suggest relevant prompts based on context.
  • Dynamically retrieve and include resources to enrich responses.
  • Use parameter completion and metadata to guide users intelligently.
  • Maintain user oversight through approval mechanisms, visibility settings, and activity logs.

This hybrid control model would preserve user agency while unlocking more powerful and adaptive workflows.
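As a rough illustration of the hybrid control model described above, here is a minimal Python sketch. All names (`Prompt`, `PromptRegistry`, the `approve` callback) are hypothetical and not part of any MCP SDK; the point is that the model can see prompt metadata for discovery, while actually using a prompt passes through a user-approval gate and an activity log.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List

# Hypothetical sketch: Prompt and PromptRegistry are illustrative types,
# not part of the MCP SDK.

@dataclass
class Prompt:
    name: str
    description: str
    template: str

@dataclass
class PromptRegistry:
    """Lets a model discover prompts; usage is gated by user approval."""
    prompts: Dict[str, Prompt] = field(default_factory=dict)
    approve: Callable[[str], bool] = lambda name: False  # deny by default
    audit_log: List[str] = field(default_factory=list)

    def register(self, prompt: Prompt) -> None:
        self.prompts[prompt.name] = prompt

    def discover(self) -> List[dict]:
        # Metadata only -- safe to surface to the model for suggestions.
        return [{"name": p.name, "description": p.description}
                for p in self.prompts.values()]

    def use(self, name: str, **args: str) -> str:
        # The model may request a prompt, but the user stays in the loop.
        if not self.approve(name):
            raise PermissionError(f"user declined prompt {name!r}")
        self.audit_log.append(f"used prompt: {name}")
        return self.prompts[name].template.format(**args)

registry = PromptRegistry(approve=lambda name: True)  # auto-approve for demo
registry.register(Prompt("summarize", "Summarize a document",
                         "Summarize the following text:\n{text}"))
print(registry.discover())
print(registry.use("summarize", text="hello world"))
```

The same gate-and-log shape would apply to Resources: the model can list resource metadata freely, but reading a resource goes through the approval callback.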

Describe alternatives you've considered

  • None as of now.

Additional context

By extending LLM control to prompts and resources, we can enable richer interactions like:

  • Auto-suggesting prompts in context menus or command palettes.
  • Pre-fetching resources based on conversation history or inferred needs.
  • Creating adaptive workflows that evolve with user input and model reasoning.
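To make the "auto-suggesting prompts" idea concrete, here is a naive sketch of context-based suggestion. The prompt catalog and keyword-overlap scoring are purely illustrative assumptions; a real implementation would likely rank by embeddings or let the model itself score relevance.

```python
# Hypothetical sketch: naive keyword-overlap prompt suggestion.
# The catalog and scoring are illustrative, not from any SDK.

PROMPT_CATALOG = {
    "review-pr": "Review a pull request for style and correctness",
    "summarize-doc": "Summarize a long document",
    "write-tests": "Generate unit tests for a function",
}

def suggest_prompts(conversation: str, top_k: int = 2) -> list[str]:
    """Rank prompts by word overlap between context and description."""
    words = set(conversation.lower().split())
    scored = []
    for name, desc in PROMPT_CATALOG.items():
        score = len(words & set(desc.lower().split()))
        if score:
            scored.append((score, name))
    scored.sort(reverse=True)
    return [name for _, name in scored[:top_k]]

print(suggest_prompts("Can you summarize this document for me?"))
```

A suggestion list like this could feed a context menu or command palette entry, with the user still choosing whether to invoke the prompt.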

Similar ask: langchain-ai/langchain-mcp-adapters#62

Labels: enhancement (New feature or request)