34 changes: 22 additions & 12 deletions src/docs.json
@@ -194,12 +194,17 @@
]
},
{
"group": "Use in production",
"group": "Agent development",
"pages": [
"oss/python/langchain/studio",
"oss/python/langchain/test",
"oss/python/langchain/ui"
]
},
{
"group": "Deploy with LangSmith",
"pages": [
"oss/python/langchain/studio",
"oss/python/langchain/test",
"oss/python/langchain/deploy",
"oss/python/langchain/ui",
"oss/python/langchain/observability"
]
}
@@ -242,10 +247,10 @@
"group": "Production",
"pages": [
"oss/python/langgraph/application-structure",
"oss/python/langgraph/studio",
"oss/python/langgraph/test",
"oss/python/langgraph/deploy",
"oss/python/langgraph/studio",
"oss/python/langgraph/ui",
"oss/python/langgraph/deploy",
"oss/python/langgraph/observability"
]
},
@@ -530,12 +535,17 @@
]
},
{
"group": "Use in production",
"group": "Agent development",
"pages": [
"oss/javascript/langchain/studio",
"oss/javascript/langchain/test",
"oss/javascript/langchain/ui"
]
},
{
"group": "Deploy with LangSmith",
"pages": [
"oss/javascript/langchain/studio",
"oss/javascript/langchain/test",
"oss/javascript/langchain/deploy",
"oss/javascript/langchain/ui",
"oss/javascript/langchain/observability"
]
}
Expand Down Expand Up @@ -578,10 +588,10 @@
"group": "Production",
"pages": [
"oss/javascript/langgraph/application-structure",
"oss/javascript/langgraph/studio",
"oss/javascript/langgraph/test",
"oss/javascript/langgraph/deploy",
"oss/javascript/langgraph/studio",
"oss/javascript/langgraph/ui",
"oss/javascript/langgraph/deploy",
"oss/javascript/langgraph/observability"
]
},
9 changes: 5 additions & 4 deletions src/oss/langchain/deploy.mdx
@@ -1,17 +1,18 @@
---
title: Deploy
title: LangSmith Deployment
sidebarTitle: Deployment
---

import deploy from '/snippets/oss/deploy.mdx';

LangSmith is the fastest way to turn agents into production systems. Traditional hosting platforms are built for stateless, short-lived web apps, while LangGraph is **purpose-built for stateful, long-running agents**, so you can go from repo to reliable cloud deployment in minutes.
When you're ready to deploy your LangChain agent to production, LangSmith provides a managed hosting platform designed for agent workloads. Traditional hosting platforms are built for stateless, short-lived web applications, while LangGraph is **purpose-built for stateful, long-running agents** that require persistent state and background execution. LangSmith handles the infrastructure, scaling, and operational concerns so you can deploy directly from your repository.

## Prerequisites

Before you begin, ensure you have the following:

* A [GitHub account](https://github.com/)
* A [LangSmith account](https://smith.langchain.com/) (free to sign up)
- A [GitHub account](https://github.com/)
- A [LangSmith account](https://smith.langchain.com/) (free to sign up)

## Deploy your agent

18 changes: 9 additions & 9 deletions src/oss/langchain/observability.mdx
@@ -1,18 +1,22 @@
---
title: Observability
title: LangSmith Observability
sidebarTitle: Observability
---

import observability from '/snippets/oss/observability.mdx';

Observability is crucial for understanding how your agents behave in production. With LangChain's @[`create_agent`], you get built-in observability through [LangSmith](https://smith.langchain.com/) - a powerful platform for tracing, debugging, evaluating, and monitoring your LLM applications.
As you build and run agents with LangChain, you need visibility into how they behave: which [tools](/oss/langchain/tools) they call, what prompts they generate, and how they make decisions. LangChain agents built with @[`create_agent`] automatically support tracing through [LangSmith](/langsmith/home), a platform for capturing, debugging, evaluating, and monitoring LLM application behavior.

Traces capture every step your agent takes, from the initial user input to the final response, including all tool calls, model interactions, and decision points. This enables you to debug your agents, evaluate performance, and monitor usage.
[_Traces_](/langsmith/observability-concepts#traces) record every step of your agent's execution, from the initial user input to the final response, including all tool calls, model interactions, and decision points. This execution data helps you debug issues, evaluate performance across different inputs, and monitor usage patterns in production.

This guide shows you how to enable tracing for your LangChain agents and use LangSmith to analyze their execution.

## Prerequisites

Before you begin, ensure you have the following:

* A [LangSmith account](https://smith.langchain.com/) (free to sign up)
- **A LangSmith account**: Sign up (for free) or log in at [smith.langchain.com](https://smith.langchain.com).
- **A LangSmith API key**: Follow the [Create an API key](/langsmith/create-account-api-key#create-an-api-key) guide.

## Enable tracing

@@ -23,11 +27,7 @@ export LANGSMITH_TRACING=true
export LANGSMITH_API_KEY=<your-api-key>
```
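If you prefer to configure this in code (for example, in a notebook), you can set the same variables with `os.environ` before your agent runs. A minimal sketch:

```python
import os

# Equivalent to the shell exports above; set these before creating or invoking your agent.
os.environ["LANGSMITH_TRACING"] = "true"
os.environ["LANGSMITH_API_KEY"] = "<your-api-key>"
```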

<Info>
You can get your API key from your [LangSmith settings](https://smith.langchain.com/settings).
</Info>

## Quick start
## Quickstart

No extra code is needed to log a trace to LangSmith. Just run your agent code as you normally would:
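For illustration, a traced run might look like the following sketch. The original page's example is collapsed in this diff, so the tool, model string, and prompt below are placeholders; any agent built with @[`create_agent`] is traced the same way.

```python
from langchain.agents import create_agent

def get_weather(city: str) -> str:
    """Return a canned weather report for a city."""
    return f"It's always sunny in {city}!"

# Placeholder agent; swap in your own model, tools, and prompt.
agent = create_agent(
    model="openai:gpt-4o-mini",
    tools=[get_weather],
    system_prompt="You are a helpful weather assistant.",
)

# With LANGSMITH_TRACING=true and LANGSMITH_API_KEY set, this call is traced automatically.
agent.invoke({"messages": [{"role": "user", "content": "What's the weather in Paris?"}]})
```

Once the run finishes, it should appear as a trace in your LangSmith project.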

3 changes: 2 additions & 1 deletion src/oss/langchain/studio.mdx
@@ -1,5 +1,6 @@
---
title: Studio
title: LangSmith Studio
sidebarTitle: LangSmith Studio
---

import Studio from '/snippets/oss/studio.mdx';
10 changes: 5 additions & 5 deletions src/oss/langgraph/application-structure.mdx
@@ -2,13 +2,13 @@
title: Application structure
---



## Overview

A LangGraph application consists of one or more graphs, a configuration file (`langgraph.json`), a file that specifies dependencies, and an optional `.env` file that specifies environment variables.

This guide shows a typical structure of an application and shows how the required information to deploy an application using the LangSmith is specified.
This guide shows a typical application structure and how to provide the configuration required to deploy it with [LangSmith Deployment](/langsmith/deployments).

<Info>
LangSmith Deployment is a managed hosting platform for deploying and scaling LangGraph agents. It handles the infrastructure, scaling, and operational concerns so you can deploy your stateful, long-running agents directly from your repository. Learn more in the [Deployment documentation](/langsmith/deployments).
</Info>

## Key Concepts

4 changes: 2 additions & 2 deletions src/oss/langgraph/deploy.mdx
@@ -1,10 +1,10 @@
---
title: Deploy
title: LangSmith Deployment
---

import deploy from '/snippets/oss/deploy.mdx';

LangSmith is the fastest way to turn agents into production systems. Traditional hosting platforms are built for stateless, short-lived web apps, while LangGraph is **purpose-built for stateful, long-running agents**, so you can go from repo to reliable cloud deployment in minutes.
When you're ready to deploy your agent to production, LangSmith provides a managed hosting platform designed for agent workloads. Traditional hosting platforms are built for stateless, short-lived web applications, while LangGraph is **purpose-built for stateful, long-running agents** that require persistent state and background execution. LangSmith handles the infrastructure, scaling, and operational concerns so you can deploy directly from your repository.

## Prerequisites

5 changes: 3 additions & 2 deletions src/oss/langgraph/observability.mdx
@@ -1,5 +1,5 @@
---
title: Observability
title: LangSmith Observability
---

import observability from '/snippets/oss/observability.mdx';
@@ -15,7 +15,8 @@ Traces are a series of steps that your application takes to go from input to out

Before you begin, ensure you have the following:

* A [LangSmith account](https://smith.langchain.com/) (free to sign up)
- **A LangSmith account**: Sign up (for free) or log in at [smith.langchain.com](https://smith.langchain.com).
- **A LangSmith API key**: Follow the [Create an API key](/langsmith/create-account-api-key#create-an-api-key) guide.

## Enable tracing

16 changes: 13 additions & 3 deletions src/oss/langgraph/overview.mdx
@@ -117,9 +117,19 @@ LangGraph provides low-level supporting infrastructure for *any* long-running, s

While LangGraph can be used standalone, it also integrates seamlessly with any LangChain product, giving developers a full suite of tools for building agents. To improve your LLM application development, pair LangGraph with:

* [LangSmith](http://www.langchain.com/langsmith) — Helpful for agent evals and observability. Debug poor-performing LLM app runs, evaluate agent trajectories, gain visibility in production, and improve performance over time.
* [LangGraph](/oss/langgraph/overview) — Deploy and scale agents effortlessly with a purpose-built deployment platform for long running, stateful workflows. Discover, reuse, configure, and share agents across teams — and iterate quickly with visual prototyping in [Studio](/langsmith/studio).
* [LangChain](/oss/langchain/overview) - Provides integrations and composable components to streamline LLM application development. Contains agent abstractions built on top of LangGraph.
<Columns cols={1}>
<Card title="LangSmith" icon="chart-line" href="http://www.langchain.com/langsmith" arrow cta="Learn more">
Trace requests, evaluate outputs, and monitor deployments in one place. Prototype locally with LangGraph, then move to production with integrated observability and evaluation to build more reliable agent systems.
</Card>

<Card title="LangGraph" icon="server" href="/langsmith/agent-server" arrow cta="Learn more">
Deploy and scale agents effortlessly with a purpose-built deployment platform for long-running, stateful workflows. Discover, reuse, configure, and share agents across teams — and iterate quickly with visual prototyping in Studio.
</Card>

<Card title="LangChain" icon="link" href="/oss/langchain/overview" arrow cta="Learn more">
Provides integrations and composable components to streamline LLM application development. Contains agent abstractions built on top of LangGraph.
</Card>
</Columns>

## Acknowledgements

2 changes: 1 addition & 1 deletion src/oss/langgraph/studio.mdx
@@ -1,5 +1,5 @@
---
title: Studio
title: LangSmith Studio
---

import Studio from '/snippets/oss/studio.mdx';
72 changes: 42 additions & 30 deletions src/snippets/oss/studio.mdx
@@ -1,27 +1,23 @@
This guide will walk you through how to use **Studio** to visualize, interact, and debug your agent locally.
When building agents with LangChain locally, it's helpful to visualize what's happening inside your agent, interact with it in real-time, and debug issues as they occur. **LangSmith Studio** is a free visual interface for developing and testing your LangChain agents from your local machine.

Studio is our free-to-use, powerful agent IDE that integrates with [LangSmith](/langsmith/home) to enable tracing, evaluation, and prompt engineering. See exactly how your agent thinks, trace every decision, and ship smarter, more reliable agents.
Studio connects to your locally running agent to show you each step your agent takes: the prompts sent to the model, tool calls and their results, and the final output. You can test different inputs, inspect intermediate states, and iterate on your agent's behavior without additional code or deployment.

<Frame>
<iframe
className="w-full aspect-video rounded-xl"
src="https://www.youtube.com/embed/Mi1gSlHwZLM?si=zA47TNuTC5aH0ahd"
title="Studio"
frameBorder="0"
allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture"
allowFullScreen
></iframe>
</Frame>
This page describes how to set up Studio with your local LangChain agent.

## Prerequisites

Before you begin, ensure you have the following:
* An API key for [LangSmith](https://smith.langchain.com/settings) (free to sign up)

## Setup local Agent server
- **A LangSmith account**: Sign up (for free) or log in at [smith.langchain.com](https://smith.langchain.com).
- **A LangSmith API key**: Follow the [Create an API key](/langsmith/create-account-api-key#create-an-api-key) guide.
- If you don't want data [traced](/langsmith/observability-concepts#traces) to LangSmith, set `LANGSMITH_TRACING=false` in your application's `.env` file. With tracing disabled, no data leaves your local server.

## Set up local Agent server

### 1. Install the LangGraph CLI

The [LangGraph CLI](/langsmith/cli) provides a local development server (also called [Agent Server](/langsmith/agent-server)) that connects your agent to Studio.

:::python
```shell
# Python >= 3.11 is required.
@@ -37,7 +33,7 @@ npx @langchain/langgraph-cli

### 2. Prepare your agent

We'll use the following simple agent as an example:
If you already have a LangChain agent, you can use it directly. This example uses a simple email agent:

```python title="agent.py"
from langchain.agents import create_agent
@@ -62,10 +58,10 @@ agent = create_agent(

### 3. Environment variables

Create a `.env` file in the root of your project and fill in the necessary API keys. We'll need to set the `LANGSMITH_API_KEY` environment variable to the API key you get from [LangSmith](https://smith.langchain.com/settings).
Studio requires a LangSmith API key to connect to your local agent. Create a `.env` file in the root of your project and add your API key from [LangSmith](https://smith.langchain.com/settings).

<Warning>
Be sure not to commit your `.env` to version control systems such as Git!
Ensure your `.env` file is not committed to version control systems such as Git.
</Warning>

```bash .env
@@ -74,7 +70,7 @@ LANGSMITH_API_KEY=lsv2...

### 4. Create a LangGraph config file

Inside your app's directory, create a configuration file `langgraph.json`:
The LangGraph CLI uses a configuration file to locate your agent and manage dependencies. Create a `langgraph.json` file in your app's directory:

```json title="langgraph.json"
{
@@ -86,13 +82,13 @@ Inside your app's directory, create a configuration file `langgraph.json`:
}
```

@[`create_agent`] automatically returns a compiled LangGraph graph that we can pass to the `graphs` key in our configuration file.
The @[`create_agent`] function automatically returns a compiled LangGraph graph, which is what the `graphs` key expects in the configuration file.

<Info>
See the [LangGraph configuration file reference](/langsmith/cli#configuration-file) for detailed explanations of each key in the JSON object of the configuration file.
For detailed explanations of each key in the JSON object of the configuration file, refer to the [LangGraph configuration file reference](/langsmith/cli#configuration-file).
</Info>

So far, our project structure looks like this:
At this point, the project structure will look like this:

```bash
my-app/
@@ -105,7 +101,7 @@ my-app/
### 5. Install dependencies

:::python
In the root of your new LangGraph app, install the dependencies:
Install your project dependencies from the root directory:

<CodeGroup>
```shell pip
@@ -125,7 +121,7 @@ yarn install

### 6. View your agent in Studio

Start your Agent server:
Start the development server to connect your agent to Studio:

:::python
```shell
@@ -143,18 +139,34 @@ npx @langchain/langgraph-cli dev
Safari blocks `localhost` connections to Studio. To work around this, run the above command with `--tunnel` to access Studio via a secure tunnel.
</Warning>

Your agent will be accessible via API (`http://127.0.0.1:2024`) and the Studio UI `https://smith.langchain.com/studio/?baseUrl=http://127.0.0.1:2024`:
Once the server is running, your agent is accessible both via API at `http://127.0.0.1:2024` and through the Studio UI at `https://smith.langchain.com/studio/?baseUrl=http://127.0.0.1:2024`:

<Frame>
![Agent view in the Studio UI](/oss/images/studio_create-agent.png)
</Frame>
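To confirm the local API is up independently of the Studio UI, you can also call it from Python with the LangGraph SDK. This is a sketch: it assumes the `langgraph-sdk` package is installed and that `langgraph.json` registers the graph under the name `agent`, as in the example above.

```python
from langgraph_sdk import get_sync_client

# Point the SDK at the local Agent Server started by the dev command above.
client = get_sync_client(url="http://127.0.0.1:2024")

# Run the graph registered as "agent" and block until it finishes.
result = client.runs.wait(
    None,  # thread_id=None performs a stateless run (no thread is created)
    "agent",
    input={"messages": [{"role": "user", "content": "Hello!"}]},
)
print(result)
```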

Studio makes each step of your agent easily observable. Replay any input and inspect the exact prompt, tool arguments, return values, and token/latency metrics. If a tool throws an exception, Studio records it with surrounding state so you can spend less time debugging.
With Studio connected to your local agent, you can iterate quickly on your agent's behavior. Run a test input and inspect the full execution trace, including prompts, tool arguments, return values, and token/latency metrics. When something goes wrong, Studio captures the exception along with the surrounding state to help you understand what happened.

Keep your dev server running, edit prompts or tool signatures, and watch Studio hot-reload. Re-run the conversation thread from any step to verify behavior changes. See [Manage threads](/langsmith/use-studio#edit-thread-history) for more details.
The development server supports hot-reloading—make changes to prompts or tool signatures in your code, and Studio reflects them immediately. Re-run conversation threads from any step to test your changes without starting over. This workflow scales from simple single-tool agents to complex multi-node graphs.

As your agent grows, the same view scales from a single-tool demo to multi-node graphs, keeping decisions legible and reproducible.
For more information on how to run Studio, refer to the following guides in the [LangSmith docs](/langsmith/home):

<Tip>
For an in-depth look at Studio, check out the [overview page](/langsmith/studio).
</Tip>
- [Run application](/langsmith/use-studio#run-application)
- [Manage assistants](/langsmith/use-studio#manage-assistants)
- [Manage threads](/langsmith/use-studio#manage-threads)
- [Iterate on prompts](/langsmith/observability-studio)
- [Debug LangSmith traces](/langsmith/observability-studio#debug-langsmith-traces)
- [Add node to dataset](/langsmith/observability-studio#add-node-to-dataset)

## Video guide

<Frame>
<iframe
className="w-full aspect-video rounded-xl"
src="https://www.youtube.com/embed/Mi1gSlHwZLM?si=zA47TNuTC5aH0ahd"
title="Studio"
frameBorder="0"
allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture"
allowFullScreen
></iframe>
</Frame>