From 1bde60174c56807a548cf8739c78dc9a2ed69bf3 Mon Sep 17 00:00:00 2001
From: "copilot-swe-agent[bot]" <198982749+Copilot@users.noreply.github.com>
Date: Fri, 31 Oct 2025 00:53:24 +0000
Subject: [PATCH 01/15] Initial plan

From 85509b73a9a1547d45a279cd75a01a9c9fd6842c Mon Sep 17 00:00:00 2001
From: "copilot-swe-agent[bot]" <198982749+Copilot@users.noreply.github.com>
Date: Fri, 31 Oct 2025 00:59:52 +0000
Subject: [PATCH 02/15] Populate package READMEs with documentation

Co-authored-by: heyitsaamir <48929123+heyitsaamir@users.noreply.github.com>
---
 packages/a2aprotocol/README.md |  63 +++++++++++++++++++-
 packages/ai/README.md          | 102 ++++++++++++++++++++++++++++++++-
 packages/devtools/README.md    |  69 ++++++++++++++++++++++
 packages/mcpplugin/README.md   |  73 +++++++++++++++++++++++
 packages/openai/README.md      |  92 ++++++++++++++++++++++++++++-
 5 files changed, 394 insertions(+), 5 deletions(-)

diff --git a/packages/a2aprotocol/README.md b/packages/a2aprotocol/README.md
index 2e7adbef..d772426f 100644
--- a/packages/a2aprotocol/README.md
+++ b/packages/a2aprotocol/README.md
@@ -1,4 +1,4 @@
-# Microsoft Teams A2A
+# Microsoft Teams A2A Protocol

@@ -8,6 +8,67 @@

+
+Agent-to-Agent (A2A) protocol support for Microsoft Teams AI applications.
+Enables Teams agents to communicate and collaborate with other AI agents using standardized protocols.
+
+
+## Features
+
+- **Agent Communication**: Enable Teams agents to communicate with other A2A-compatible agents
+- **HTTP Server Support**: Built-in HTTP server for A2A protocol endpoints
+- **Prompt Integration**: Seamless integration with Teams AI prompt system
+- **Standardized Protocol**: Uses the A2A SDK for standard agent communication
+
+## Installation
+
+```bash
+# Using uv (recommended)
+uv add microsoft-teams-a2a
+
+# Using pip
+pip install microsoft-teams-a2a
+```
+
+## Quick Start
+
+```python
+from microsoft.teams.apps import App
+from microsoft.teams.ai import PromptManager
+from microsoft.teams.a2a import A2AServerPlugin
+
+app = App()
+
+# Configure A2A server plugin
+a2a_plugin = A2AServerPlugin(
+    port=5000,
+    host="0.0.0.0"
+)
+
+# Register the plugin with your app
+app.use(a2a_plugin)
+
+# Your Teams agent is now accessible via A2A protocol
+```
+
+## Agent Integration
+
+```python
+# Your Teams agent can now receive requests from other A2A agents
+# and respond according to your configured prompts and actions
+
+@app.on_message
+async def handle_message(ctx: ActivityContext[MessageActivity]):
+    # Handle both Teams messages and A2A requests
+    await ctx.send(f"Received: {ctx.activity.text}")
+```
+
+## Use Cases
+
+- Multi-agent collaboration systems
+- Agent orchestration and delegation
+- Cross-platform agent communication
+- Distributed AI workflows
diff --git a/packages/ai/README.md b/packages/ai/README.md
index bff9bccf..04d69f50 100644
--- a/packages/ai/README.md
+++ b/packages/ai/README.md
@@ -9,8 +9,106 @@

-AI tools and utilities.
+AI-powered conversational experiences for Microsoft Teams applications.
+Provides prompt management, action planning, and model integration for building intelligent Teams bots.
-
\ No newline at end of file
+
+
+## Features
+
+- **Prompt Management**: Template-based prompt system with variable substitution
+- **Action Planning**: AI-driven action execution with validation
+- **Model Integration**: Compatible with OpenAI, Azure OpenAI, and custom models
+- **Memory Management**: Conversation history and state management
+- **Function Calling**: Structured actions and tool use
+
+## Installation
+
+```bash
+# Using uv (recommended)
+uv add microsoft-teams-ai
+
+# Using pip
+pip install microsoft-teams-ai
+```
+
+## Quick Start
+
+```python
+from microsoft.teams.ai import AIAgent, PromptManager
+from microsoft.teams.openai import OpenAIModel
+
+# Create prompt manager
+prompts = PromptManager()
+
+# Configure AI model
+model = OpenAIModel(
+    api_key="your-api-key",
+    model="gpt-4"
+)
+
+# Create AI agent
+agent = AIAgent(
+    model=model,
+    prompts=prompts
+)
+
+# Use in Teams app
+@app.on_message
+async def handle_message(ctx: ActivityContext[MessageActivity]):
+    response = await agent.run(ctx)
+    await ctx.send(response)
+```
+
+## Prompt Templates
+
+```python
+# Define a prompt template
+prompts.add_prompt("greeting", """
+You are a helpful assistant for {{company}}.
+Greet the user and ask how you can help them today.
+""")
+
+# Use the prompt with variables
+result = await agent.run(
+    ctx,
+    prompt_name="greeting",
+    variables={"company": "Contoso"}
+)
+```
+
+## Actions and Tools
+
+```python
+from microsoft.teams.ai import Action
+
+# Register custom actions
+@agent.action("get_weather")
+async def get_weather(context, parameters):
+    location = parameters.get("location")
+    # Fetch weather data
+    return {"temperature": 72, "conditions": "sunny"}
+
+# AI can now call this action when needed
+```
+
+## Memory and State
+
+```python
+# Configure memory for conversation history
+from microsoft.teams.ai import ConversationMemory
+
+memory = ConversationMemory(max_turns=10)
+agent = AIAgent(
+    model=model,
+    prompts=prompts,
+    memory=memory
+)
+
+# Access conversation state
+state = await memory.get_state(ctx)
+state["user_preference"] = "dark_mode"
+await memory.save_state(ctx, state)
+```
\ No newline at end of file
diff --git a/packages/devtools/README.md b/packages/devtools/README.md
index 7a6df11e..7893f45b 100644
--- a/packages/devtools/README.md
+++ b/packages/devtools/README.md
@@ -22,3 +22,72 @@ Developer tools for locally testing and debugging Teams applications. Streamline
 - **Local Testing**: Test Teams apps locally without deployment
 - **Bot Emulator**: Simulate Teams conversations and interactions
+- **Web Interface**: Browser-based UI for testing bot responses
+- **Activity Inspector**: View and inspect incoming/outgoing activities
+- **No Tunneling Required**: Works entirely locally without ngrok or similar tools
+
+## Installation
+
+```bash
+# Using uv (recommended)
+uv add microsoft-teams-devtools
+
+# Using pip
+pip install microsoft-teams-devtools
+```
+
+## Quick Start
+
+```python
+from microsoft.teams.apps import App
+from microsoft.teams.devtools import DevToolsPlugin
+
+app = App()
+
+# Add DevTools plugin (automatically enabled in development)
+app.use(DevToolsPlugin(port=3979))
+
+# Start your app
+await app.start()
+
+# Open http://localhost:3979/devtools in your browser
+```
+
+## Using the Web Interface
+
+Once your app is running with DevTools enabled:
+
+1. Navigate to `http://localhost:3979/devtools`
+2. Send messages to your bot through the interface
+3. View bot responses and activity logs in real-time
+4. Inspect activity payloads and debug issues
+
+## Configuration
+
+```python
+# Customize DevTools settings
+devtools = DevToolsPlugin(
+    port=3979,         # DevTools UI port
+    host="localhost",  # Bind address
+    auto_open=True     # Open browser automatically
+)
+
+app.use(devtools)
+```
+
+## Environment-Based Activation
+
+```python
+import os
+
+# Only enable DevTools in development
+if os.getenv("ENVIRONMENT") == "development":
+    app.use(DevToolsPlugin())
+```
+
+## Debugging Tips
+
+- Use the activity inspector to examine message payloads
+- Test different message types (text, cards, attachments)
+- Verify authentication flows locally
+- Debug action handlers without Teams client
diff --git a/packages/mcpplugin/README.md b/packages/mcpplugin/README.md
index e69de29b..2bc15c2b 100644
--- a/packages/mcpplugin/README.md
+++ b/packages/mcpplugin/README.md
@@ -0,0 +1,73 @@
+# Microsoft Teams MCP Plugin
+

+
+
+
+
+
+

+
+Model Context Protocol (MCP) integration for Microsoft Teams AI applications.
+Enables Teams bots to use MCP servers as tools and resources.
+
+
+
+
+
+## Features
+
+- **MCP Server Integration**: Connect to MCP servers for extended capabilities
+- **Tool Execution**: Execute MCP tools from within Teams conversations
+- **Resource Access**: Access MCP resources in your bot logic
+- **FastMCP Support**: Compatible with FastMCP server implementations
+
+## Installation
+
+```bash
+# Using uv (recommended)
+uv add microsoft-teams-mcpplugin
+
+# Using pip
+pip install microsoft-teams-mcpplugin
+```
+
+## Quick Start
+
+```python
+from microsoft.teams.apps import App
+from microsoft.teams.ai import PromptManager
+from microsoft.teams.mcpplugin import MCPPlugin
+
+app = App()
+
+# Configure MCP plugin
+mcp_plugin = MCPPlugin(
+    server_command="uvx",
+    server_args=["mcp-server-time"]
+)
+
+# Register the plugin
+app.use(mcp_plugin)
+
+# MCP tools are now available to your AI agent
+```
+
+## Using MCP Resources
+
+```python
+# Access MCP resources
+resources = await mcp_plugin.list_resources()
+
+for resource in resources:
+    content = await mcp_plugin.read_resource(resource.uri)
+    print(f"Resource: {resource.name}, Content: {content}")
+```
+
+## Supported MCP Servers
+
+This plugin works with any MCP-compliant server, including:
+- Built-in MCP servers (filesystem, git, etc.)
+- Custom MCP servers
+- FastMCP-based servers
diff --git a/packages/openai/README.md b/packages/openai/README.md
index 9d16f434..37e42cdb 100644
--- a/packages/openai/README.md
+++ b/packages/openai/README.md
@@ -9,8 +9,96 @@

-OpenAI model implementations to be used with @microsoft-teams-ai. Supports all OpenAI-like API models.
+OpenAI model implementations for Microsoft Teams AI applications.
+Supports OpenAI and OpenAI-compatible APIs for chat completions and embeddings.
-
\ No newline at end of file
+
+
+## Features
+
+- **Chat Completions**: GPT-3.5, GPT-4, and compatible models
+- **Streaming Support**: Real-time response streaming
+- **Function Calling**: Native support for OpenAI function calling
+- **OpenAI-Compatible APIs**: Works with Azure OpenAI, LM Studio, and other compatible services
+
+## Installation
+
+```bash
+# Using uv (recommended)
+uv add microsoft-teams-openai
+
+# Using pip
+pip install microsoft-teams-openai
+```
+
+## Quick Start
+
+```python
+from microsoft.teams.openai import OpenAIModel
+from microsoft.teams.ai import PromptManager
+
+# Configure OpenAI model
+model = OpenAIModel(
+    api_key="your-api-key",
+    model="gpt-4",
+    temperature=0.7
+)
+
+# Use with Teams AI
+prompt_manager = PromptManager()
+result = await model.complete(prompt_manager, "Hello!")
+```
+
+## Azure OpenAI Support
+
+```python
+from microsoft.teams.openai import AzureOpenAIModel
+
+# Configure Azure OpenAI
+model = AzureOpenAIModel(
+    api_key="your-azure-key",
+    endpoint="https://your-resource.openai.azure.com/",
+    deployment="your-deployment-name",
+    api_version="2024-02-01"
+)
+```
+
+## Streaming Responses
+
+```python
+# Enable streaming for real-time responses
+model = OpenAIModel(
+    api_key="your-api-key",
+    model="gpt-4",
+    stream=True
+)
+
+async for chunk in model.stream_complete(prompt_manager, "Tell me a story"):
+    print(chunk.content, end="", flush=True)
+```
+
+## Function Calling
+
+```python
+# Define functions for the model to call
+functions = [
+    {
+        "name": "get_weather",
+        "description": "Get the weather for a location",
+        "parameters": {
+            "type": "object",
+            "properties": {
+                "location": {"type": "string"}
+            }
+        }
+    }
+]
+
+model = OpenAIModel(
+    api_key="your-api-key",
+    model="gpt-4",
+    functions=functions
+)
+```
\ No newline at end of file

From efc82c833323ffddf7044462ba7c976f95e27071 Mon Sep 17 00:00:00 2001
From: "copilot-swe-agent[bot]" <198982749+Copilot@users.noreply.github.com>
Date: Fri, 31 Oct 2025 01:05:12 +0000
Subject: [PATCH 03/15] Fix class names in README examples to match actual API

Co-authored-by: heyitsaamir <48929123+heyitsaamir@users.noreply.github.com>
---
 packages/a2aprotocol/README.md | 10 +++--
 packages/ai/README.md          | 68 +++++++++++++++++-----------------
 packages/mcpplugin/README.md   | 17 ++++-----
 packages/openai/README.md      | 62 ++++++++++++++++---------------
 4 files changed, 81 insertions(+), 76 deletions(-)

diff --git a/packages/a2aprotocol/README.md b/packages/a2aprotocol/README.md
index d772426f..562799fb 100644
--- a/packages/a2aprotocol/README.md
+++ b/packages/a2aprotocol/README.md
@@ -37,13 +37,12 @@ pip install microsoft-teams-a2a
 ```python
 from microsoft.teams.apps import App
-from microsoft.teams.ai import PromptManager
-from microsoft.teams.a2a import A2AServerPlugin
+from microsoft.teams.a2a import A2APlugin
 app = App()
-# Configure A2A server plugin
-a2a_plugin = A2AServerPlugin(
+# Configure A2A plugin
+a2a_plugin = A2APlugin(
     port=5000,
     host="0.0.0.0"
 )
@@ -57,6 +56,9 @@ app.use(a2a_plugin)
 ## Agent Integration
 ```python
+from microsoft.teams.api import MessageActivity
+from microsoft.teams.apps import ActivityContext
+
 # Your Teams agent can now receive requests from other A2A agents
 # and respond according to your configured prompts and actions
diff --git a/packages/ai/README.md b/packages/ai/README.md
index 04d69f50..841357a4 100644
--- a/packages/ai/README.md
+++ b/packages/ai/README.md
@@ -37,23 +37,17 @@ pip install microsoft-teams-ai
 ## Quick Start
 ```python
-from microsoft.teams.ai import AIAgent, PromptManager
-from microsoft.teams.openai import OpenAIModel
+from microsoft.teams.ai import Agent, ChatPrompt
+from microsoft.teams.openai import OpenAICompletionsAIModel
-# Create prompt manager
-prompts = PromptManager()
-
-# Configure AI model
-model = OpenAIModel(
+# Create AI model
+model = OpenAICompletionsAIModel(
     api_key="your-api-key",
     model="gpt-4"
 )
-# Create AI agent
-agent = AIAgent(
-    model=model,
-    prompts=prompts
-)
+# Create agent
+agent = Agent(model=model)
 # Use in Teams app
 @app.on_message
@@ -65,16 +59,15 @@ async def handle_message(ctx: ActivityContext[MessageActivity]):
 ## Prompt Templates
 ```python
-# Define a prompt template
-prompts.add_prompt("greeting", """
-You are a helpful assistant for {{company}}.
-Greet the user and ask how you can help them today.
-""")
+# Define a chat prompt
+prompt = ChatPrompt(
+    instructions="You are a helpful assistant for {{company}}.",
+    functions=[]
+)
 # Use the prompt with variables
-result = await agent.run(
-    ctx,
-    prompt_name="greeting",
+result = await agent.chat(
+    prompt,
     variables={"company": "Contoso"}
 )
 ```
@@ -82,33 +75,40 @@ result = await agent.run(
 ## Actions and Tools
 ```python
-from microsoft.teams.ai import Action
+from microsoft.teams.ai import Function
+
+# Register custom functions
+weather_function = Function(
+    name="get_weather",
+    description="Get weather for a location",
+    parameters={
+        "type": "object",
+        "properties": {
+            "location": {"type": "string"}
+        }
+    }
+)
-# Register custom actions
-@agent.action("get_weather")
-async def get_weather(context, parameters):
-    location = parameters.get("location")
+# Add function handler
+@agent.function("get_weather")
+async def get_weather(location: str):
     # Fetch weather data
     return {"temperature": 72, "conditions": "sunny"}
-
-# AI can now call this action when needed
 ```
 ## Memory and State
 ```python
 # Configure memory for conversation history
-from microsoft.teams.ai import ConversationMemory
+from microsoft.teams.ai import ListMemory
-memory = ConversationMemory(max_turns=10)
-agent = AIAgent(
+memory = ListMemory(max_items=10)
+agent = Agent(
     model=model,
-    prompts=prompts,
     memory=memory
 )
 # Access conversation state
-state = await memory.get_state(ctx)
-state["user_preference"] = "dark_mode"
-await memory.save_state(ctx, state)
+messages = await memory.get()
+await memory.add(user_message)
 ```
\ No newline at end of file
diff --git a/packages/mcpplugin/README.md b/packages/mcpplugin/README.md
index 2bc15c2b..14afaef4 100644
--- a/packages/mcpplugin/README.md
+++ b/packages/mcpplugin/README.md
@@ -37,13 +37,12 @@ pip install microsoft-teams-mcpplugin
 ```python
 from microsoft.teams.apps import App
-from microsoft.teams.ai import PromptManager
-from microsoft.teams.mcpplugin import MCPPlugin
+from microsoft.teams.mcpplugin import McpClientPlugin
 app = App()
-# Configure MCP plugin
-mcp_plugin = MCPPlugin(
+# Configure MCP client plugin
+mcp_plugin = McpClientPlugin(
     server_command="uvx",
     server_args=["mcp-server-time"]
 )
@@ -57,12 +56,12 @@ app.use(mcp_plugin)
 ## Using MCP Resources
 ```python
-# Access MCP resources
-resources = await mcp_plugin.list_resources()
+# Access MCP resources through the plugin
+# The plugin integrates with your Teams AI agent
+# and makes MCP tools available as functions
-for resource in resources:
-    content = await mcp_plugin.read_resource(resource.uri)
-    print(f"Resource: {resource.name}, Content: {content}")
+# MCP tools can be called by the AI model
+# when using the Teams AI framework
 ```
 ## Supported MCP Servers
diff --git a/packages/openai/README.md b/packages/openai/README.md
index 37e42cdb..fbc1c127 100644
--- a/packages/openai/README.md
+++ b/packages/openai/README.md
@@ -36,32 +36,31 @@ pip install microsoft-teams-openai
 ## Quick Start
 ```python
-from microsoft.teams.openai import OpenAIModel
-from microsoft.teams.ai import PromptManager
+from microsoft.teams.openai import OpenAICompletionsAIModel
+from microsoft.teams.ai import ChatPrompt
 # Configure OpenAI model
-model = OpenAIModel(
+model = OpenAICompletionsAIModel(
     api_key="your-api-key",
     model="gpt-4",
     temperature=0.7
 )
 # Use with Teams AI
-prompt_manager = PromptManager()
-result = await model.complete(prompt_manager, "Hello!")
+prompt = ChatPrompt(instructions="You are a helpful assistant.")
+result = await model.chat(prompt)
 ```
 ## Azure OpenAI Support
 ```python
-from microsoft.teams.openai import AzureOpenAIModel
+from microsoft.teams.openai import OpenAICompletionsAIModel
 # Configure Azure OpenAI
-model = AzureOpenAIModel(
+model = OpenAICompletionsAIModel(
     api_key="your-azure-key",
-    endpoint="https://your-resource.openai.azure.com/",
-    deployment="your-deployment-name",
-    api_version="2024-02-01"
+    base_url="https://your-resource.openai.azure.com/openai/deployments/your-deployment-name",
+    model="gpt-4"
 )
 ```
@@ -69,36 +68,41 @@ model = AzureOpenAIModel(
 ## Streaming Responses
 ```python
 # Enable streaming for real-time responses
-model = OpenAIModel(
+model = OpenAIResponsesAIModel(
     api_key="your-api-key",
-    model="gpt-4",
-    stream=True
+    model="gpt-4"
 )
-async for chunk in model.stream_complete(prompt_manager, "Tell me a story"):
-    print(chunk.content, end="", flush=True)
+# Streaming is handled through the model's chat method
+result = await model.chat(prompt, stream=True)
 ```
 ## Function Calling
 ```python
+from microsoft.teams.ai import Function
+
 # Define functions for the model to call
-functions = [
-    {
-        "name": "get_weather",
-        "description": "Get the weather for a location",
-        "parameters": {
-            "type": "object",
-            "properties": {
-                "location": {"type": "string"}
-            }
-        }
+get_weather = Function(
+    name="get_weather",
+    description="Get the weather for a location",
+    parameters={
+        "type": "object",
+        "properties": {
+            "location": {"type": "string"}
+        },
+        "required": ["location"]
     }
-]
+)
-model = OpenAIModel(
+# Add functions to your prompt
+prompt = ChatPrompt(
+    instructions="You are a helpful assistant.",
+    functions=[get_weather]
+)
+
+model = OpenAICompletionsAIModel(
     api_key="your-api-key",
-    model="gpt-4",
-    functions=functions
+    model="gpt-4"
 )
 ```
\ No newline at end of file

From 5051e9220ac7d3d12ecb8818f418374f1c1eb8db Mon Sep 17 00:00:00 2001
From: "copilot-swe-agent[bot]" <198982749+Copilot@users.noreply.github.com>
Date: Fri, 31 Oct 2025 01:08:20 +0000
Subject: [PATCH 04/15] Fix API method calls in README examples

Co-authored-by: heyitsaamir <48929123+heyitsaamir@users.noreply.github.com>
---
 packages/ai/README.md     | 77 ++++++++++++++++++---------------
 packages/openai/README.md | 65 +++++++++++++++++++--------------
 2 files changed, 74 insertions(+), 68 deletions(-)

diff --git a/packages/ai/README.md b/packages/ai/README.md
index 841357a4..21843484 100644
--- a/packages/ai/README.md
+++ b/packages/ai/README.md
@@ -37,7 +37,7 @@ pip install microsoft-teams-ai
 ## Quick Start
 ```python
-from microsoft.teams.ai import Agent, ChatPrompt
+from microsoft.teams.ai import Agent
 from microsoft.teams.openai import OpenAICompletionsAIModel
 # Create AI model
@@ -52,63 +52,58 @@ agent = Agent(model=model)
 # Use in Teams app
 @app.on_message
 async def handle_message(ctx: ActivityContext[MessageActivity]):
-    response = await agent.run(ctx)
-    await ctx.send(response)
+    result = await agent.send(
+        input=ctx.activity.text,
+        instructions="You are a helpful assistant."
+    )
+    await ctx.send(result.response.content)
 ```
-## Prompt Templates
+## Prompt Instructions
 ```python
-# Define a chat prompt
-prompt = ChatPrompt(
-    instructions="You are a helpful assistant for {{company}}.",
-    functions=[]
-)
-
-# Use the prompt with variables
-result = await agent.chat(
-    prompt,
-    variables={"company": "Contoso"}
+# Define custom instructions for your agent
+result = await agent.send(
+    input="What can you help me with?",
+    instructions="You are a helpful assistant for {{company}}. Be professional and courteous."
 )
 ```
-## Actions and Tools
+## Function Calling
 ```python
 from microsoft.teams.ai import Function
+from pydantic import BaseModel
-# Register custom functions
-weather_function = Function(
-    name="get_weather",
-    description="Get weather for a location",
-    parameters={
-        "type": "object",
-        "properties": {
-            "location": {"type": "string"}
-        }
-    }
-)
+class GetWeatherParams(BaseModel):
+    location: str
-# Add function handler
-@agent.function("get_weather")
-async def get_weather(location: str):
+async def get_weather_handler(params: GetWeatherParams) -> str:
     # Fetch weather data
-    return {"temperature": 72, "conditions": "sunny"}
+    return f"Weather in {params.location}: sunny, 72°F"
+
+# Register function with agent
+agent.with_function(
+    Function(
+        name="get_weather",
+        description="Get weather for a location",
+        parameter_schema=GetWeatherParams,
+        handler=get_weather_handler
+    )
+)
 ```
-## Memory and State
+## Memory Management
 ```python
-from microsoft.teams.ai import ListMemory
+from microsoft.teams.ai import ListMemory, UserMessage
-# Configure memory for conversation history
-memory = ListMemory(max_items=10)
-agent = Agent(
-    model=model,
-    memory=memory
-)
+# Create memory for conversation history
+memory = ListMemory()
+
+# Add messages to memory
+await memory.push(UserMessage(content="Hello"))
-# Access conversation state
-messages = await memory.get()
-await memory.add(user_message)
+# Retrieve conversation history
+messages = await memory.get_all()
 ```
\ No newline at end of file
diff --git a/packages/openai/README.md b/packages/openai/README.md
index fbc1c127..892aaa97 100644
--- a/packages/openai/README.md
+++ b/packages/openai/README.md
@@ -37,7 +37,6 @@ pip install microsoft-teams-openai
 ```python
 from microsoft.teams.openai import OpenAICompletionsAIModel
-from microsoft.teams.ai import ChatPrompt
 # Configure OpenAI model
 model = OpenAICompletionsAIModel(
@@ -46,9 +45,14 @@ model = OpenAICompletionsAIModel(
     temperature=0.7
 )
-# Use with Teams AI
-prompt = ChatPrompt(instructions="You are a helpful assistant.")
-result = await model.chat(prompt)
+# Use with Teams AI Agent
+from microsoft.teams.ai import Agent
+
+agent = Agent(model=model)
+result = await agent.send(
+    input="Hello!",
+    instructions="You are a helpful assistant."
+)
 ```
 ## Azure OpenAI Support
 ```python
 from microsoft.teams.openai import OpenAICompletionsAIModel
-# Configure Azure OpenAI
+# Configure Azure OpenAI with base_url
 model = OpenAICompletionsAIModel(
     api_key="your-azure-key",
     base_url="https://your-resource.openai.azure.com/openai/deployments/your-deployment-name",
     model="gpt-4"
 )
 ```
 ## Streaming Responses
 ```python
-# Enable streaming for real-time responses
+from microsoft.teams.openai import OpenAIResponsesAIModel
+
+# Use OpenAIResponsesAIModel for streaming support
 model = OpenAIResponsesAIModel(
     api_key="your-api-key",
     model="gpt-4"
 )
-# Streaming is handled through the model's chat method
-result = await model.chat(prompt, stream=True)
+# Streaming is handled through callback
+def on_chunk(chunk: str):
+    print(chunk, end="", flush=True)
+
+result = await model.generate_text(
+    prompt=chat_prompt,
+    on_chunk=on_chunk
+)
 ```
 ## Function Calling
 ```python
-from microsoft.teams.ai import Function
-
-# Define functions for the model to call
-get_weather = Function(
-    name="get_weather",
-    description="Get the weather for a location",
-    parameters={
-        "type": "object",
-        "properties": {
-            "location": {"type": "string"}
-        },
-        "required": ["location"]
-    }
-)
+from microsoft.teams.ai import Agent, Function
+from pydantic import BaseModel
-# Add functions to your prompt
-prompt = ChatPrompt(
-    instructions="You are a helpful assistant.",
-    functions=[get_weather]
-)
+class GetWeatherParams(BaseModel):
+    location: str
+
+async def get_weather_handler(params: GetWeatherParams) -> str:
+    return f"Weather in {params.location}: sunny, 72°F"
+# Configure agent with function
 model = OpenAICompletionsAIModel(
     api_key="your-api-key",
     model="gpt-4"
 )
+
+agent = Agent(model=model)
+agent.with_function(
+    Function(
+        name="get_weather",
+        description="Get the weather for a location",
+        parameter_schema=GetWeatherParams,
+        handler=get_weather_handler
+    )
+)
 ```
\ No newline at end of file

From 19e1e5ec0348bd8e78d87e1d0759fa4c86b12ae1 Mon Sep 17 00:00:00 2001
From: "copilot-swe-agent[bot]" <198982749+Copilot@users.noreply.github.com>
Date: Fri, 31 Oct 2025 01:10:54 +0000
Subject: [PATCH 05/15] Fix remaining API details in README examples

Co-authored-by: heyitsaamir <48929123+heyitsaamir@users.noreply.github.com>
---
 packages/a2aprotocol/README.md | 19 +++++++++++--------
 packages/openai/README.md      |  3 ++-
 2 files changed, 13 insertions(+), 9 deletions(-)

diff --git a/packages/a2aprotocol/README.md b/packages/a2aprotocol/README.md
index 562799fb..ffd22707 100644
--- a/packages/a2aprotocol/README.md
+++ b/packages/a2aprotocol/README.md
@@ -37,14 +37,20 @@ pip install microsoft-teams-a2a
 ```python
 from microsoft.teams.apps import App
-from microsoft.teams.a2a import A2APlugin
+from microsoft.teams.a2a import A2APlugin, A2APluginOptions
+from a2a.types import AgentCard
 app = App()
-# Configure A2A plugin
+# Configure A2A plugin with agent card
+agent_card = AgentCard(
+    name="My Agent",
+    description="A helpful agent",
+    capabilities={}
+)
+
 a2a_plugin = A2APlugin(
-    port=5000,
-    host="0.0.0.0"
+    A2APluginOptions(agent_card=agent_card)
 )
 # Register the plugin with your app
@@ -59,12 +65,9 @@ app.use(a2a_plugin)
 from microsoft.teams.api import MessageActivity
 from microsoft.teams.apps import ActivityContext
-# Your Teams agent can now receive requests from other A2A agents
-# and respond according to your configured prompts and actions
-
+# Define message handler for both Teams and A2A requests
 @app.on_message
 async def handle_message(ctx: ActivityContext[MessageActivity]):
-    # Handle both Teams messages and A2A requests
     await ctx.send(f"Received: {ctx.activity.text}")
 ```
diff --git a/packages/openai/README.md b/packages/openai/README.md
index 892aaa97..9eeb68f0 100644
--- a/packages/openai/README.md
+++ b/packages/openai/README.md
@@ -72,6 +72,7 @@
 ```python
 from microsoft.teams.openai import OpenAIResponsesAIModel
+from microsoft.teams.ai import UserMessage
 # Use OpenAIResponsesAIModel for streaming support
 model = OpenAIResponsesAIModel(
@@ -84,7 +85,7 @@ def on_chunk(chunk: str):
     print(chunk, end="", flush=True)
 result = await model.generate_text(
-    prompt=chat_prompt,
+    input=UserMessage(content="Tell me a story"),
     on_chunk=on_chunk
 )
 ```

From 5c2155432fe89342764ba01126ec85418037be0a Mon Sep 17 00:00:00 2001
From: "copilot-swe-agent[bot]" <198982749+Copilot@users.noreply.github.com>
Date: Fri, 31 Oct 2025 07:20:16 +0000
Subject: [PATCH 06/15] Simplify READMEs to show only basic use cases

Co-authored-by: heyitsaamir <48929123+heyitsaamir@users.noreply.github.com>
---
 packages/a2aprotocol/README.md | 41 +--------------
 packages/ai/README.md          | 79 ++---------------------------
 packages/devtools/README.md    | 57 +--------------------
 packages/mcpplugin/README.md   | 35 +------------
 packages/openai/README.md      | 91 +---------------------------------
 5 files changed, 10 insertions(+), 293 deletions(-)

diff --git a/packages/a2aprotocol/README.md b/packages/a2aprotocol/README.md
index ffd22707..9a926bb8 100644
--- a/packages/a2aprotocol/README.md
+++ b/packages/a2aprotocol/README.md
@@ -16,24 +16,13 @@ Enables Teams agents to communicate and collaborate with other AI agents using s
-## Features
-
-- **Agent Communication**: Enable Teams agents to communicate with other A2A-compatible agents
-- **HTTP Server Support**: Built-in HTTP server for A2A protocol endpoints
-- **Prompt Integration**: Seamless integration with Teams AI prompt system
-- **Standardized Protocol**: Uses the A2A SDK for standard agent communication
-
 ## Installation
 ```bash
-# Using uv (recommended)
 uv add microsoft-teams-a2a
-
-# Using pip
-pip install microsoft-teams-a2a
 ```
-## Quick Start
+## Usage
 ```python
 from microsoft.teams.apps import App
-from microsoft.teams.ai import PromptManager
-from microsoft.teams.a2a import A2APlugin, A2APluginOptions
 from a2a.types import AgentCard
 app = App()
-# Configure A2A plugin with agent card
 agent_card = AgentCard(
     name="My Agent",
     description="A helpful agent",
     capabilities={}
 )
-a2a_plugin = A2APlugin(
-    A2APluginOptions(agent_card=agent_card)
-)
-
-# Register the plugin with your app
+a2a_plugin = A2APlugin(A2APluginOptions(agent_card=agent_card))
 app.use(a2a_plugin)
-
-# Your Teams agent is now accessible via A2A protocol
 ```
-
-## Agent Integration
-
-```python
-from microsoft.teams.api import MessageActivity
-from microsoft.teams.apps import ActivityContext
-
-# Define message handler for both Teams and A2A requests
-@app.on_message
-async def handle_message(ctx: ActivityContext[MessageActivity]):
-    await ctx.send(f"Received: {ctx.activity.text}")
-```
-
-## Use Cases
-
-- Multi-agent collaboration systems
-- Agent orchestration and delegation
-- Cross-platform agent communication
-- Distributed AI workflows
diff --git a/packages/ai/README.md b/packages/ai/README.md
index 21843484..5d671e5c 100644
--- a/packages/ai/README.md
+++ b/packages/ai/README.md
@@ -16,94 +16,23 @@ Provides prompt management, action planning, and model integration for building
-## Features
-
-- **Prompt Management**: Template-based prompt system with variable substitution
-- **Action Planning**: AI-driven action execution with validation
-- **Model Integration**: Compatible with OpenAI, Azure OpenAI, and custom models
-- **Memory Management**: Conversation history and state management
-- **Function Calling**: Structured actions and tool use
-
 ## Installation
 ```bash
-# Using uv (recommended)
 uv add microsoft-teams-ai
-
-# Using pip
-pip install microsoft-teams-ai
 ```
-## Quick Start
+## Usage
 ```python
 from microsoft.teams.ai import Agent
 from microsoft.teams.openai import OpenAICompletionsAIModel
-# Create AI model
-model = OpenAICompletionsAIModel(
-    api_key="your-api-key",
-    model="gpt-4"
-)
-
-# Create agent
+model = OpenAICompletionsAIModel(api_key="your-api-key", model="gpt-4")
 agent = Agent(model=model)
-# Use in Teams app
-@app.on_message
-async def handle_message(ctx: ActivityContext[MessageActivity]):
-    result = await agent.send(
-        input=ctx.activity.text,
-        instructions="You are a helpful assistant."
-    )
-    await ctx.send(result.response.content)
-```
-
-## Prompt Instructions
-
-```python
-# Define custom instructions for your agent
 result = await agent.send(
-    input="What can you help me with?",
-    instructions="You are a helpful assistant for {{company}}. Be professional and courteous."
-)
-```
-
-## Function Calling
-
-```python
-from microsoft.teams.ai import Function
-from pydantic import BaseModel
-
-class GetWeatherParams(BaseModel):
-    location: str
-
-async def get_weather_handler(params: GetWeatherParams) -> str:
-    # Fetch weather data
-    return f"Weather in {params.location}: sunny, 72°F"
-
-# Register function with agent
-agent.with_function(
-    Function(
-        name="get_weather",
-        description="Get weather for a location",
-        parameter_schema=GetWeatherParams,
-        handler=get_weather_handler
-    )
+    input="Hello!",
+    instructions="You are a helpful assistant."
 )
-```
-
-## Memory Management
-
-```python
-from microsoft.teams.ai import ListMemory, UserMessage
-
-# Create memory for conversation history
-memory = ListMemory()
-
-# Add messages to memory
-await memory.push(UserMessage(content="Hello"))
-
-# Retrieve conversation history
-messages = await memory.get_all()
 ```
\ No newline at end of file
diff --git a/packages/devtools/README.md b/packages/devtools/README.md
index 7893f45b..bc25a594 100644
--- a/packages/devtools/README.md
+++ b/packages/devtools/README.md
@@ -18,76 +18,21 @@ Developer tools for locally testing and debugging Teams applications. Streamline
-## Features
-
-- **Local Testing**: Test Teams apps locally without deployment
-- **Bot Emulator**: Simulate Teams conversations and interactions
-- **Web Interface**: Browser-based UI for testing bot responses
-- **Activity Inspector**: View and inspect incoming/outgoing activities
-- **No Tunneling Required**: Works entirely locally without ngrok or similar tools
-
 ## Installation
 ```bash
-# Using uv (recommended)
 uv add microsoft-teams-devtools
-
-# Using pip
-pip install microsoft-teams-devtools
 ```
-## Quick Start
+## Usage
 ```python
 from microsoft.teams.apps import App
 from microsoft.teams.devtools import DevToolsPlugin
 app = App()
-
-# Add DevTools plugin (automatically enabled in development)
 app.use(DevToolsPlugin(port=3979))
-# Start your app
 await app.start()
-
 # Open http://localhost:3979/devtools in your browser
 ```
-
-## Using the Web Interface
-
-Once your app is running with DevTools enabled:
-
-1. Navigate to `http://localhost:3979/devtools`
-2. Send messages to your bot through the interface
-3. View bot responses and activity logs in real-time
-4. Inspect activity payloads and debug issues
-
-## Configuration
-
-```python
-# Customize DevTools settings
-devtools = DevToolsPlugin(
-    port=3979,         # DevTools UI port
-    host="localhost",  # Bind address
-    auto_open=True     # Open browser automatically
-)
-
-app.use(devtools)
-```
-
-## Environment-Based Activation
-
-```python
-import os
-
-# Only enable DevTools in development
-if os.getenv("ENVIRONMENT") == "development":
-    app.use(DevToolsPlugin())
-```
-
-## Debugging Tips
-
-- Use the activity inspector to examine message payloads
-- Test different message types (text, cards, attachments)
-- Verify authentication flows locally
-- Debug action handlers without Teams client
diff --git a/packages/mcpplugin/README.md b/packages/mcpplugin/README.md
index 14afaef4..87890794 100644
--- a/packages/mcpplugin/README.md
+++ b/packages/mcpplugin/README.md
@@ -16,24 +16,13 @@ Enables Teams bots to use MCP servers as tools and resources.
-## Features
-
-- **MCP Server Integration**: Connect to MCP servers for extended capabilities
-- **Tool Execution**: Execute MCP tools from within Teams conversations
-- **Resource Access**: Access MCP resources in your bot logic
-- **FastMCP Support**: Compatible with FastMCP server implementations
-
 ## Installation
 ```bash
-# Using uv (recommended)
 uv add microsoft-teams-mcpplugin
-
-# Using pip
-pip install microsoft-teams-mcpplugin
 ```
-## Quick Start
+## Usage
 ```python
 from microsoft.teams.apps import App
 from microsoft.teams.mcpplugin import McpClientPlugin
 app = App()
-# Configure MCP client plugin
 mcp_plugin = McpClientPlugin(
     server_command="uvx",
     server_args=["mcp-server-time"]
 )
-# Register the plugin
 app.use(mcp_plugin)
-
-# MCP tools are now available to your AI agent
 ```
-
-## Using MCP Resources
-
-```python
-# Access MCP resources through the plugin
-# The plugin integrates with your Teams AI agent
-# and makes MCP tools available as functions
-
-# MCP tools can be called by the AI model
-# when using the Teams AI framework
-```
-
-## Supported MCP Servers
-
-This plugin works with any MCP-compliant server, including:
-- Built-in MCP servers (filesystem, git, etc.)
-- Custom MCP servers
-- FastMCP-based servers
diff --git a/packages/openai/README.md b/packages/openai/README.md
index 9eeb68f0..f8979952 100644
--- a/packages/openai/README.md
+++ b/packages/openai/README.md
@@ -16,105 +16,18 @@ Supports OpenAI and OpenAI-compatible APIs for chat completions and embeddings.
-## Features
-
-- **Chat Completions**: GPT-3.5, GPT-4, and compatible models
-- **Streaming Support**: Real-time response streaming
-- **Function Calling**: Native support for OpenAI function calling
-- **OpenAI-Compatible APIs**: Works with Azure OpenAI, LM Studio, and other compatible services
-
 ## Installation
 ```bash
-# Using uv (recommended)
 uv add microsoft-teams-openai
-
-# Using pip
-pip install microsoft-teams-openai
 ```
-## Quick Start
+## Usage
 ```python
 from microsoft.teams.openai import OpenAICompletionsAIModel
-
-# Configure OpenAI model
-model = OpenAICompletionsAIModel(
-    api_key="your-api-key",
-    model="gpt-4",
-    temperature=0.7
-)
-
-# Use with Teams AI Agent
 from microsoft.teams.ai import Agent
+model = OpenAICompletionsAIModel(api_key="your-api-key", model="gpt-4")
 agent = Agent(model=model)
-result = await agent.send(
-    input="Hello!",
-    instructions="You are a helpful assistant."
-)
-```
-
-## Azure OpenAI Support
-
-```python
-from microsoft.teams.openai import OpenAICompletionsAIModel
-
-# Configure Azure OpenAI with base_url
-model = OpenAICompletionsAIModel(
-    api_key="your-azure-key",
-    base_url="https://your-resource.openai.azure.com/openai/deployments/your-deployment-name",
-    model="gpt-4"
-)
-```
-
-## Streaming Responses
-
-```python
-from microsoft.teams.openai import OpenAIResponsesAIModel
-from microsoft.teams.ai import UserMessage
-
-# Use OpenAIResponsesAIModel for streaming support
-model = OpenAIResponsesAIModel(
-    api_key="your-api-key",
-    model="gpt-4"
-)
-
-# Streaming is handled through callback
-def on_chunk(chunk: str):
-    print(chunk, end="", flush=True)
-
-result = await model.generate_text(
-    input=UserMessage(content="Tell me a story"),
-    on_chunk=on_chunk
-)
-```
-
-## Function Calling
-
-```python
-from microsoft.teams.ai import Agent, Function
-from pydantic import BaseModel
-
-class GetWeatherParams(BaseModel):
-    location: str
-
-async def get_weather_handler(params: GetWeatherParams) -> str:
-    return f"Weather in {params.location}: sunny, 72°F"
-
-# Configure agent with function
-model = OpenAICompletionsAIModel(
-    api_key="your-api-key",
-    model="gpt-4"
-)
-
-agent = Agent(model=model)
-agent.with_function(
-    Function(
-        name="get_weather",
-        description="Get the weather for a location",
-        parameter_schema=GetWeatherParams,
-        handler=get_weather_handler
-    )
-)
 ```
\ No newline at end of file

From f146a688788745b115878b4743d5fc73bad9b5bf Mon Sep 17 00:00:00 2001
From: "copilot-swe-agent[bot]" <198982749+Copilot@users.noreply.github.com>
Date: Fri, 31 Oct 2025 15:58:52 +0000
Subject: [PATCH 07/15] Address feedback: Add ChatPrompt, functions, MCP
 server/client, A2A server/client examples and docs links

Co-authored-by: heyitsaamir <48929123+heyitsaamir@users.noreply.github.com>
---
 packages/a2aprotocol/README.md | 29 +++++++++++++++++++++++++--
 packages/ai/README.md          | 34 ++++++++++++++++++++++++++------
 packages/devtools/README.md    |  4 +---
 packages/mcpplugin/README.md   | 34 ++++++++++++++++++++++++++------
 4 files changed, 84 insertions(+), 17 deletions(-)

diff --git a/packages/a2aprotocol/README.md b/packages/a2aprotocol/README.md
index 9a926bb8..940a010f 100644
--- a/packages/a2aprotocol/README.md
+++ b/packages/a2aprotocol/README.md
@@ -24,6 +24,8 @@ uv add microsoft-teams-a2a
 ## Usage
+### A2A Server (Expose Agent)
+
 ```python
 from microsoft.teams.apps import App
 from microsoft.teams.a2a import A2APlugin, A2APluginOptions
@@ -31,12 +33,35 @@ from a2a.types import AgentCard
 app = App()
+# Expose your Teams agent via A2A protocol
 agent_card = AgentCard(
     name="My Agent",
     description="A helpful agent",
     capabilities={}
 )
-a2a_plugin = A2APlugin(A2APluginOptions(agent_card=agent_card))
-app.use(a2a_plugin)
+a2a_server = A2APlugin(A2APluginOptions(agent_card=agent_card))
+app.use(a2a_server)
+```
+
+### A2A Client (Use Other Agents)
+
+```python
+from microsoft.teams.a2a import A2AClientPlugin, A2APluginUseParams
+from microsoft.teams.ai import ChatPrompt
+from microsoft.teams.openai import OpenAICompletionsAIModel
+
+model = OpenAICompletionsAIModel(api_key="your-api-key", model="gpt-4")
+
+# Connect to another A2A agent
+a2a_client = A2AClientPlugin()
+a2a_client.on_use_plugin(
+    A2APluginUseParams(
+        key="weather-agent",
+        base_url="http://localhost:4000/a2a",
+        card_url=".well-known/agent-card.json"
+    )
+)
+
+prompt = ChatPrompt(model, plugins=[a2a_client])
 ```
diff --git a/packages/ai/README.md b/packages/ai/README.md
index 5d671e5c..9fce6b53 100644
--- a/packages/ai/README.md
+++ b/packages/ai/README.md
@@ -12,9 +12,7 @@ AI-powered conversational experiences for Microsoft Teams applications.
 Provides prompt management, action planning, and model integration for building intelligent Teams bots.
-
-
-
+[📖 Documentation](https://microsoft.github.io/teams-ai/python/in-depth-guides/ai/)
 ## Installation
@@ -24,15 +22,39 @@ uv add microsoft-teams-ai
 ## Usage
+### ChatPrompt
+
 ```python
-from microsoft.teams.ai import Agent
+from microsoft.teams.ai import ChatPrompt, Function
 from microsoft.teams.openai import OpenAICompletionsAIModel
+from pydantic import BaseModel
 model = OpenAICompletionsAIModel(api_key="your-api-key", model="gpt-4")
-agent = Agent(model=model)
+# Create a ChatPrompt
+prompt = ChatPrompt(model)
-result = await agent.send(
+result = await prompt.send(
     input="Hello!",
     instructions="You are a helpful assistant."
 )
+```
+
+### Function Calling
+
+```python
+class GetWeatherParams(BaseModel):
+    location: str
+
+async def get_weather(params: GetWeatherParams) -> str:
+    return f"The weather in {params.location} is sunny"
+
+weather_function = Function(
+    name="get_weather",
+    description="Get weather for a location",
+    parameter_schema=GetWeatherParams,
+    handler=get_weather
+)
+
+prompt = ChatPrompt(model, functions=[weather_function])
 ```
\ No newline at end of file
diff --git a/packages/devtools/README.md b/packages/devtools/README.md
index bc25a594..53c1f200 100644
--- a/packages/devtools/README.md
+++ b/packages/devtools/README.md
@@ -14,9 +14,7 @@ Developer tools for locally testing and debugging Teams applications.
 Streamlines the development process by eliminating the need to deploy apps or expose public endpoints during development.
-
-
-
+[📖 Documentation](https://microsoft.github.io/teams-ai/developer-tools/devtools/)
 ## Installation
diff --git a/packages/mcpplugin/README.md b/packages/mcpplugin/README.md
index 87890794..3c694618 100644
--- a/packages/mcpplugin/README.md
+++ b/packages/mcpplugin/README.md
@@ -10,11 +10,9 @@

 Model Context Protocol (MCP) integration for Microsoft Teams AI applications.
-Enables Teams bots to use MCP servers as tools and resources.
+Enables Teams bots to both expose tools as MCP servers and use MCP servers as clients.
-
-
-
+[📖 Documentation](https://microsoft.github.io/teams-ai/python/in-depth-guides/ai/mcp/)
 ## Installation
@@ -24,16 +22,40 @@ uv add microsoft-teams-mcpplugin
 ## Usage
+### MCP Client (Use MCP Servers)
+
 ```python
 from microsoft.teams.apps import App
 from microsoft.teams.mcpplugin import McpClientPlugin
 app = App()
-# Configure MCP client plugin
-mcp_plugin = McpClientPlugin(
+# Connect to an MCP server
+mcp_client = McpClientPlugin(
     server_command="uvx",
     server_args=["mcp-server-time"]
 )
-app.use(mcp_plugin)
+app.use(mcp_client)
+```
+
+### MCP Server (Expose Tools)
+
+```python
+from microsoft.teams.apps import App
+from microsoft.teams.mcpplugin import McpServerPlugin
+from microsoft.teams.ai import Function
+from pydantic import BaseModel
+
+app = App()
+
+class EchoParams(BaseModel):
+    message: str
+
+async def echo_handler(params: EchoParams) -> str:
+    return f"Echo: {params.message}"
+
+# Expose app as MCP server
+mcp_server = McpServerPlugin(name="my-mcp-server")
+app.use(mcp_server)
 ```

From eb5e629dfd791ab557e0a9831bf41d5e540c9793 Mon Sep 17 00:00:00 2001
From: "copilot-swe-agent[bot]" <198982749+Copilot@users.noreply.github.com>
Date: Fri, 31 Oct 2025 16:06:46 +0000
Subject: [PATCH 08/15] Fix MCP Server example to match actual test
 implementation with use_tool()

Co-authored-by: heyitsaamir <48929123+heyitsaamir@users.noreply.github.com>
---
 packages/mcpplugin/README.md | 22 ++++++++++++++++------
 1 file changed, 16 insertions(+), 6 deletions(-)

diff --git a/packages/mcpplugin/README.md b/packages/mcpplugin/README.md
index 3c694618..d8830c41 100644
--- a/packages/mcpplugin/README.md
+++ b/packages/mcpplugin/README.md
@@ -47,15 +47,25 @@ from microsoft.teams.mcpplugin import McpServerPlugin
 from microsoft.teams.ai import Function
 from pydantic import BaseModel
-app = App()
+# Create MCP server plugin
+mcp_server = McpServerPlugin(name="my-mcp-server")
+# Define a tool
 class EchoParams(BaseModel):
-    message: str
+    input: str
 async def echo_handler(params: EchoParams) -> str:
-    return f"Echo: {params.message}"
+    return f"You said {params.input}"
+
+# Register tool with MCP server
+mcp_server.use_tool(
+    Function(
+        name="echo",
+        description="Echo back whatever you said",
+        parameter_schema=EchoParams,
+        handler=echo_handler
+    )
+)
-# Expose app as MCP server
-mcp_server = McpServerPlugin(name="my-mcp-server")
-app.use(mcp_server)
+app = App(plugins=[mcp_server])
 ```

From 0adb7efed701d4fb591d476699b3c5e2c88a1699 Mon Sep 17 00:00:00 2001
From: "copilot-swe-agent[bot]" <198982749+Copilot@users.noreply.github.com>
Date: Fri, 31 Oct 2025 16:27:18 +0000
Subject: [PATCH 09/15] Fix MCP Client example to use use_mcp_server() with
 URL instead of invalid constructor params

Co-authored-by: heyitsaamir <48929123+heyitsaamir@users.noreply.github.com>
---
 packages/mcpplugin/README.md | 16 +++++++++-------
 1 file changed, 9 insertions(+), 7 deletions(-)

diff --git a/packages/mcpplugin/README.md b/packages/mcpplugin/README.md
index d8830c41..02351430 100644
--- a/packages/mcpplugin/README.md
+++ b/packages/mcpplugin/README.md
@@ -27,16 +27,18 @@ uv add microsoft-teams-mcpplugin
 ```python
 from microsoft.teams.apps import App
 from microsoft.teams.mcpplugin import McpClientPlugin
+from microsoft.teams.ai import ChatPrompt
+from microsoft.teams.openai import OpenAICompletionsAIModel
-app = App()
+# Create MCP client plugin
+mcp_client = McpClientPlugin()
-# Connect to an MCP server
-mcp_client = McpClientPlugin(
-    server_command="uvx",
-    server_args=["mcp-server-time"]
-)
+# Connect to MCP servers
+mcp_client.use_mcp_server("https://example.com/mcp")
-app.use(mcp_client)
+# Use with ChatPrompt
+model = OpenAICompletionsAIModel(api_key="your-api-key", model="gpt-4")
+prompt = ChatPrompt(model, plugins=[mcp_client])
 ```
 ### MCP Server (Expose Tools)

From c5599a40107fce7c0919df2b50f9b30c4a5b1d6a Mon Sep 17 00:00:00 2001
From: "copilot-swe-agent[bot]" <198982749+Copilot@users.noreply.github.com>
Date: Fri, 31 Oct 2025 16:32:31 +0000
Subject: [PATCH 10/15] Fix A2A title and OpenAI example to use ChatPrompt

Co-authored-by: heyitsaamir <48929123+heyitsaamir@users.noreply.github.com>
---
 packages/a2aprotocol/README.md | 2 +-
 packages/openai/README.md      | 4 ++--
 2 files changed, 3 insertions(+), 3 deletions(-)

diff --git a/packages/a2aprotocol/README.md b/packages/a2aprotocol/README.md
index 940a010f..e6992b79 100644
--- a/packages/a2aprotocol/README.md
+++ b/packages/a2aprotocol/README.md
@@ -1,4 +1,4 @@
-# Microsoft Teams A2A Protocol
+# Microsoft Teams A2A

diff --git a/packages/openai/README.md b/packages/openai/README.md
index f8979952..a8eb9ba7 100644
--- a/packages/openai/README.md
+++ b/packages/openai/README.md
@@ -26,8 +26,8 @@ uv add microsoft-teams-openai
 ## Usage
 ```python
 from microsoft.teams.openai import OpenAICompletionsAIModel
-from microsoft.teams.ai import Agent
+from microsoft.teams.ai import ChatPrompt
 model = OpenAICompletionsAIModel(api_key="your-api-key", model="gpt-4")
-agent = Agent(model=model)
+prompt = ChatPrompt(model)
 ```
\ No newline at end of file

From 235f3401fef790bee41ef9f6520ddebaa3bfe98f Mon Sep 17 00:00:00 2001
From: "copilot-swe-agent[bot]" <198982749+Copilot@users.noreply.github.com>
Date: Fri, 31 Oct 2025 17:06:16 +0000
Subject: [PATCH 11/15] Remove port specification from DevTools example

Co-authored-by: heyitsaamir <48929123+heyitsaamir@users.noreply.github.com>
---
 packages/devtools/README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/packages/devtools/README.md b/packages/devtools/README.md
index 53c1f200..ef393384 100644
--- a/packages/devtools/README.md
+++ b/packages/devtools/README.md
@@ -29,7 +29,7 @@ from microsoft.teams.apps import App
 from microsoft.teams.devtools import DevToolsPlugin
 app = App()
-app.use(DevToolsPlugin(port=3979))
+app.use(DevToolsPlugin())
 await app.start()
 # Open http://localhost:3979/devtools in your browser
 ```

From 80baf097d94c205b5fae8e0f55116fa2b792d709 Mon Sep 17 00:00:00 2001
From: "copilot-swe-agent[bot]" <198982749+Copilot@users.noreply.github.com>
Date: Fri, 31 Oct 2025 17:10:59 +0000
Subject: [PATCH 12/15] Update A2A examples to match test implementation with
 proper AgentCard setup

Co-authored-by: heyitsaamir <48929123+heyitsaamir@users.noreply.github.com>
---
 packages/a2aprotocol/README.md | 19 ++++++++++++-------
 1 file changed, 12 insertions(+), 7 deletions(-)

diff --git a/packages/a2aprotocol/README.md b/packages/a2aprotocol/README.md
index e6992b79..4d72c58c 100644
--- a/packages/a2aprotocol/README.md
+++ b/packages/a2aprotocol/README.md
@@ -29,19 +29,24 @@ uv add microsoft-teams-a2a
 ```python
 from microsoft.teams.apps import App
 from microsoft.teams.a2a import A2APlugin, A2APluginOptions
-from a2a.types import AgentCard
+from a2a.types import AgentCard, AgentCapabilities
 app = App()
-# Expose your Teams agent via A2A protocol
+# Define agent card with capabilities
 agent_card = AgentCard(
-    name="My Agent",
-    description="A helpful agent",
-    capabilities={}
+    name="weather_agent",
+    description="An agent that can tell you the weather",
+    url="http://localhost:4000/a2a/",
+    version="0.0.1",
+    protocol_version="0.3.0",
+    capabilities=AgentCapabilities(),
+    default_input_modes=[],
+    default_output_modes=[]
 )
 a2a_server = A2APlugin(A2APluginOptions(agent_card=agent_card))
-app.use(a2a_server)
+app = App(plugins=[a2a_server])
 ```
 ### A2A Client (Use Other Agents)
@@ -57,7 +62,7 @@ model = OpenAICompletionsAIModel(api_key="your-api-key", model="gpt-4")
 a2a_client = A2AClientPlugin()
 a2a_client.on_use_plugin(
     A2APluginUseParams(
-        key="weather-agent",
+        key="my-weather-agent",
         base_url="http://localhost:4000/a2a",
         card_url=".well-known/agent-card.json"
     )

From a67178f1e654f26b52d79bb7e39a32e525e81b1 Mon Sep 17 00:00:00 2001
From: Aamir Jawaid <48929123+heyitsaamir@users.noreply.github.com>
Date: Wed, 5 Nov 2025 16:30:23 -0800
Subject: [PATCH 13/15] Update documentation link for Teams SDK

---
 packages/ai/README.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/packages/ai/README.md b/packages/ai/README.md
index 9fce6b53..4d0c02f7
100644
--- a/packages/ai/README.md
+++ b/packages/ai/README.md
@@ -12,7 +12,7 @@ AI-powered conversational experiences for Microsoft Teams applications.
 Provides prompt management, action planning, and model integration for building intelligent Teams bots.
-[📖 Documentation](https://microsoft.github.io/teams-ai/python/in-depth-guides/ai/)
+[📖 Documentation](https://microsoft.github.io/teams-sdk/python/in-depth-guides/ai/)
 ## Installation
@@ -57,4 +57,4 @@ weather_function = Function(
 )
 prompt = ChatPrompt(model, functions=[weather_function])
-```
\ No newline at end of file
+```

From 16ff31bef5f507e8f97d894d691390a2dc4565b2 Mon Sep 17 00:00:00 2001
From: Aamir Jawaid <48929123+heyitsaamir@users.noreply.github.com>
Date: Wed, 5 Nov 2025 16:38:59 -0800
Subject: [PATCH 14/15] Update README.md

---
 packages/devtools/README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/packages/devtools/README.md b/packages/devtools/README.md
index ef393384..36ef3e32 100644
--- a/packages/devtools/README.md
+++ b/packages/devtools/README.md
@@ -14,7 +14,7 @@ Developer tools for locally testing and debugging Teams applications.
 Streamlines the development process by eliminating the need to deploy apps or expose public endpoints during development.
-[📖 Documentation](https://microsoft.github.io/teams-ai/developer-tools/devtools/)
+[📖 Documentation](https://microsoft.github.io/teams-sdk/developer-tools/devtools/)
 ## Installation

From 6fa2d3e5be8ab64ebc2281fe84f1b85784cb8b52 Mon Sep 17 00:00:00 2001
From: Aamir Jawaid <48929123+heyitsaamir@users.noreply.github.com>
Date: Wed, 5 Nov 2025 16:39:44 -0800
Subject: [PATCH 15/15] Update documentation link for MCP integration

---
 packages/mcpplugin/README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/packages/mcpplugin/README.md b/packages/mcpplugin/README.md
index 02351430..9b624467 100644
--- a/packages/mcpplugin/README.md
+++ b/packages/mcpplugin/README.md
@@ -12,7 +12,7 @@ Model Context Protocol (MCP) integration for Microsoft Teams AI applications.
 Enables Teams bots to both expose tools as MCP servers and use MCP servers as clients.
-[📖 Documentation](https://microsoft.github.io/teams-ai/python/in-depth-guides/ai/mcp/)
+[📖 Documentation](https://microsoft.github.io/teams-sdk/python/in-depth-guides/ai/mcp/)
 ## Installation