⭐ If you like this sample, star it on GitHub, it helps a lot!
Overview • Architecture • Getting started • Deploy to Azure • Run locally • MCP tools • Resources
This project demonstrates how to build AI agents that can interact with real-world APIs using the Model Context Protocol (MCP). It features a complete burger ordering system with a serverless API, web interfaces, and an MCP server that enables AI agents to browse menus, place orders, and track order status. The agent uses LangChain.js to handle LLM reasoning and tool calling. The system consists of multiple interconnected services, as detailed in the Architecture section below.
The system is hosted on Azure Static Web Apps (web apps) and Azure Functions (API and MCP servers), with Azure Cosmos DB for NoSQL for data storage. You can use it as a starting point for building your own AI agents.
- LangChain.js agent with tool calling via MCP (Streamable HTTP transport)
- Multi-service, end-to-end architecture (web UIs, APIs, MCP server)
- User authentication with session history
- 100% serverless architecture, for cost-effective scaling
- Single-command deployment using Infrastructure as Code (IaC)
Building AI applications can be complex and time-consuming, but LangChain.js and Azure serverless technologies greatly simplify the process. This application is an AI agent that can be accessed through different interfaces (web app, CLI) and that calls tools through MCP to interact with a burger ordering API.
The application is made from these main components:
| Component | Folder | Purpose |
|---|---|---|
| Agent Web App | `packages/agent-webapp` | Chat interface + conversation rendering |
| Agent API | `packages/agent-api` | LangChain.js agent + chat state + MCP client |
| Burger API | `packages/burger-api` | Core burger & order management web API |
| Burger MCP Server | `packages/burger-mcp` | Exposes burger API as MCP tools |
| Burger Web App | `packages/burger-webapp` | Live orders visualization |
| Infrastructure | `infra` | Bicep templates (IaC) |
Additionally, these support components are included:
| Component | Folder | Purpose |
|---|---|---|
| Agent CLI | `packages/agent-cli` | Command-line LangChain.js agent and MCP client |
| Data generation | `packages/burger-data` | Scripts to (re)generate burger data & images |
There are multiple ways to get started with this project. The quickest way is to use GitHub Codespaces that provides a preconfigured environment for you. Alternatively, you can set up your local environment following the instructions below.
You can run this project directly in your browser by using GitHub Codespaces, which will open a web-based VS Code:
A similar option to Codespaces is VS Code Dev Containers, that will open the project in your local VS Code instance using the Dev Containers extension.
You will also need to have Docker installed on your machine to run the container.
You need to install the following tools to work on your local machine:
- Node.js LTS
- Azure Developer CLI 1.19+
- Git
- PowerShell 7+ (for Windows users only)
  - Important: Ensure you can run `pwsh.exe` from a PowerShell command. If this fails, you likely need to upgrade PowerShell.
  - Instead of PowerShell, you can also use Git Bash or WSL to run the Azure Developer CLI commands.
Then you can get the project code:
1. Fork the project to create your own copy of this repository.
2. On your forked repository, select the Code button, then the Local tab, and copy the URL of your forked repository.
3. Open a terminal and run this command to clone the repo:

   ```bash
   git clone <your-repo-url>
   ```
- Azure account: If you're new to Azure, sign up for a free Azure account to get free credits to get started
- Azure account permissions: Your Azure account must have `Microsoft.Authorization/roleAssignments/write` permissions, such as Role Based Access Control Administrator, User Access Administrator, or Owner
- Open a terminal and navigate to the root of the project
- Authenticate with Azure by running `azd auth login`
- Run `azd up` to deploy the application to Azure. This will provision Azure resources and deploy all services:
  - You will be prompted to select a base location for the resources
  - The deployment process will take a few minutes
Once deployment is complete, you'll see the URLs of all deployed services in the terminal.
Pricing varies per region and usage, so it isn't possible to predict exact costs in advance. However, you can use the pre-configured estimations to get an idea of the costs: Azure Pricing Calculator.
To clean up all the Azure resources created by this sample:
```bash
azd down --purge
```

After setting up your environment and provisioning the Azure resources, you can run the entire application locally:
```bash
# Install dependencies for all services
npm install

# Start all services locally
npm start
```

Starting the different services may take some time; wait until you see the message `All services ready` in the terminal.
This will start:
- Agent Web App: http://localhost:4280
- Agent API: http://localhost:7072
- Burger Web App: http://localhost:5173
- Burger API: http://localhost:7071
- Burger MCP Server: http://localhost:3000
> [!NOTE]
> When running locally without having deployed the application, the servers will use in-memory storage, so any data will be lost when you stop the servers. After a successful deployment, the servers will use Azure Cosmos DB for persistent storage.
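The storage fallback described above follows a common pattern: pick the backend from the environment at startup. Here is a minimal sketch of that idea; the interface, class names, and the `AZURE_COSMOSDB_ENDPOINT` variable are illustrative assumptions, not the actual service code.

```typescript
// Hypothetical sketch of an environment-driven storage fallback.
interface OrderStore {
  save(id: string, order: unknown): Promise<void>;
  get(id: string): Promise<unknown | undefined>;
}

// In-memory store used for local runs: data is lost when the process stops.
class InMemoryOrderStore implements OrderStore {
  private orders = new Map<string, unknown>();
  async save(id: string, order: unknown): Promise<void> {
    this.orders.set(id, order);
  }
  async get(id: string): Promise<unknown | undefined> {
    return this.orders.get(id);
  }
}

// Choose the backend from configuration; the env variable name is assumed.
function createOrderStore(env: Record<string, string | undefined>): OrderStore {
  if (env.AZURE_COSMOSDB_ENDPOINT) {
    // A Cosmos DB-backed store would be returned here after deployment.
    throw new Error('Cosmos DB store omitted in this sketch');
  }
  return new InMemoryOrderStore();
}
```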
You can then open the Agent web app and ask things like:
- What spicy burgers do you have?
- Order two Classic Cheeseburgers with extra bacon.
- Show my recent orders
The agent will decide which MCP tool(s) to call, then compose a response from the results.
This project uses npm workspaces to manage multiple packages in a single repository. You can run scripts from the root folder that will apply to all packages, or you can run scripts for individual packages as indicated in their respective README files.
Common scripts (run from repo root):
| Action | Command |
|---|---|
| Start everything | npm start |
| Build all | npm run build |
| Lint | npm run lint |
| Fix lint | npm run lint:fix |
| Format | npm run format |
The Burger MCP server provides these tools for AI agents:
| Tool Name | Description |
|---|---|
| `get_burgers` | Get a list of all burgers in the menu |
| `get_burger_by_id` | Get a specific burger by its ID |
| `get_toppings` | Get a list of all toppings in the menu |
| `get_topping_by_id` | Get a specific topping by its ID |
| `get_topping_categories` | Get a list of all topping categories |
| `get_orders` | Get a list of all orders in the system |
| `get_order_by_id` | Get a specific order by its ID |
| `place_order` | Place a new order with burgers (requires `userId`, optional `nickname`) |
| `delete_order_by_id` | Cancel an order if it has not yet been started (status must be `pending`, requires `userId`) |
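Under the hood, each tool invocation travels as a JSON-RPC 2.0 `tools/call` message. The sketch below builds such a request for `place_order`; the exact `arguments` shape (the `items` field in particular) is an assumption based on the table above, so check the tool's input schema via `tools/list` for the real contract.

```typescript
// Sketch of the JSON-RPC 2.0 message an MCP client sends to invoke a tool.
type ToolCallRequest = {
  jsonrpc: '2.0';
  id: number;
  method: 'tools/call';
  params: { name: string; arguments: Record<string, unknown> };
};

// Builds a place_order call; the argument field names are assumptions.
function buildPlaceOrderRequest(
  userId: string,
  items: unknown[],
  nickname?: string,
): ToolCallRequest {
  return {
    jsonrpc: '2.0',
    id: 1,
    method: 'tools/call',
    params: {
      name: 'place_order',
      // nickname is optional per the table above, so omit it when unset.
      arguments: { userId, items, ...(nickname ? { nickname } : {}) },
    },
  };
}
```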
You can test the MCP server using the MCP Inspector:
1. Install and start MCP Inspector:

   ```bash
   npx -y @modelcontextprotocol/inspector
   ```

2. In your browser, open the MCP Inspector (the URL will be shown in the terminal)

3. Configure the connection:
   - Transport: Streamable HTTP or SSE
   - URL: `http://localhost:3000/mcp` (for Streamable HTTP) or `http://localhost:3000/sse` (for legacy SSE)

4. Click Connect and explore the available tools
To use the MCP server in local mode with GitHub Copilot, create a local `.vscode/mcp.json` configuration file in your project root:

```json
{
  "servers": {
    "burger-mcp": {
      "type": "stdio",
      "command": "npm",
      "args": ["run", "start:local", "--workspace=burger-mcp"]
    }
  }
}
```

If you open that file in VS Code, you can start the server directly from the editor.
Then, you can use GitHub Copilot in agent mode to interact with the MCP server. For example, you can ask questions like "What burgers are available?" or "Place an order for a vegan burger" and Copilot will use the MCP server to provide answers or perform actions.
> [!TIP]
> Copilot models can behave differently regarding tool usage, so if you don't see it calling the burger-mcp tools, you can explicitly mention the Burger MCP server by adding #burger-mcp in your prompt.
Here are some resources to learn more about the technologies used in this project:
- Model Context Protocol - More about the MCP protocol
- MCP for Beginners - A beginner-friendly introduction to MCP
- Generative AI with JavaScript - Learn how to build Generative AI applications with JavaScript
- Azure AI Travel Agents with LlamaIndex.TS and MCP - Sample for building AI agents using LlamaIndex.TS and MCP
- Serverless AI Chat with RAG using LangChain.js - Sample for building a serverless AI chat grounded on your own data with LangChain.js
You can also find more Azure AI samples here.
If you encounter issues while running or deploying this sample:
- Dependencies: Ensure all required tools are installed and up to date
- Ports: Make sure required ports (3000, 4280, 5173, 5174, 7071, 7072) are not in use
- Azure Developer CLI: Verify you're authenticated with `azd auth login`
- Node.js version: Ensure you're using Node.js 22 or higher
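If you suspect a port conflict, this small Node.js helper (a sketch using only built-ins, not part of the project) checks whether the sample's ports are free:

```typescript
import net from 'node:net';

// Try to bind the port briefly: success means it is free, an error
// (typically EADDRINUSE) means something is already listening on it.
function isPortFree(port: number): Promise<boolean> {
  return new Promise((resolve) => {
    const server = net.createServer();
    server.once('error', () => resolve(false));
    server.once('listening', () => server.close(() => resolve(true)));
    server.listen(port, '127.0.0.1');
  });
}

// Check all the ports the sample needs.
const ports = [3000, 4280, 5173, 5174, 7071, 7072];
for (const port of ports) {
  isPortFree(port).then((free) => console.log(`${port}: ${free ? 'free' : 'in use'}`));
}
```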
For more detailed troubleshooting, check the individual README files in each service directory.
This project has been optimized for use with AI agents like GitHub Copilot. This includes:
- Built-in context engineering provided with AGENTS.md files to help AI agents understand and extend the codebase.
- Reusable prompts for common tasks.
- Custom instructions tailored for each service of the project.
- Custom Codebase Explorer chat mode for Copilot, to help you explore and understand the codebase.
To learn how to set up and use GitHub Copilot with this repository, check out the docs/copilot.md guide.
If you get stuck or have any questions about building AI apps, join:
If you have product feedback or encounter errors while building, visit: