Merged
24 changes: 5 additions & 19 deletions src/langsmith/trace-with-langchain.mdx
````diff
@@ -35,9 +35,7 @@ pnpm add @langchain/openai @langchain/core
 
 ### 1. Configure your environment
 
-<CodeGroup>
-
-```bash Python
+```bash wrap
 export LANGSMITH_TRACING=true
 export LANGSMITH_API_KEY=<your-api-key>
 # This example uses OpenAI, but you can use any LLM provider of choice
@@ -46,26 +44,14 @@ export OPENAI_API_KEY=<your-openai-api-key>
 export LANGSMITH_WORKSPACE_ID=<your-workspace-id>
 ```
 
-```bash TypeScript
-export LANGSMITH_TRACING=true
-export LANGSMITH_API_KEY=<your-api-key>
-# This example uses OpenAI, but you can use any LLM provider of choice
-export OPENAI_API_KEY=<your-openai-api-key>
-# For LangSmith API keys linked to multiple workspaces, set the LANGSMITH_WORKSPACE_ID environment variable to specify which workspace to use.
-export LANGSMITH_WORKSPACE_ID=<your-workspace-id>
-```
-
-</CodeGroup>
-
 <Info>
-If you are using LangChain.js with LangSmith and are not in a serverless environment, we also recommend setting the following explicitly to reduce latency:
-
-`export LANGCHAIN_CALLBACKS_BACKGROUND=true`
+    If you are using LangChain.js with LangSmith and are not in a serverless environment, we also recommend setting the following explicitly to reduce latency:
 
-If you are in a serverless environment, we recommend setting the reverse to allow tracing to finish before your function ends:
+    `export LANGCHAIN_CALLBACKS_BACKGROUND=true`
 
-`export LANGCHAIN_CALLBACKS_BACKGROUND=false`
+    If you are in a serverless environment, we recommend setting the reverse to allow tracing to finish before your function ends:
+
+    `export LANGCHAIN_CALLBACKS_BACKGROUND=false`
 </Info>
````
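The `<Info>` note's tradeoff can be captured in a small shell sketch. This is illustrative only: using `AWS_LAMBDA_FUNCTION_NAME` and `VERCEL` as serverless markers is an assumption of this sketch, not part of the documented setup.

```shell
# Sketch: choose the callback mode based on where the app runs.
# AWS_LAMBDA_FUNCTION_NAME / VERCEL as serverless markers are assumptions.
if [ -n "${AWS_LAMBDA_FUNCTION_NAME:-}" ] || [ -n "${VERCEL:-}" ]; then
  # Serverless: block on callbacks so traces finish before the function ends
  export LANGCHAIN_CALLBACKS_BACKGROUND=false
else
  # Long-running server: background callbacks to reduce request latency
  export LANGCHAIN_CALLBACKS_BACKGROUND=true
fi
echo "LANGCHAIN_CALLBACKS_BACKGROUND=$LANGCHAIN_CALLBACKS_BACKGROUND"
```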
Comment on lines 47 to 55

Copilot AI (Nov 3, 2025): The content within the `<Info>` component should not be indented. This is inconsistent with the established pattern used throughout the codebase. Remove the 4-space indentation from all lines within the `<Info>` tags.
### 2. Log a trace
48 changes: 12 additions & 36 deletions src/langsmith/trace-with-langgraph.mdx
````diff
@@ -39,9 +39,7 @@ pnpm add @langchain/openai @langchain/langgraph
 
 ### 2. Configure your environment
 
-<CodeGroup>
-
-```bash Python
+```bash wrap
 export LANGSMITH_TRACING=true
 export LANGSMITH_API_KEY=<your-api-key>
 # This example uses OpenAI, but you can use any LLM provider of choice
@@ -50,27 +48,16 @@ export OPENAI_API_KEY=<your-openai-api-key>
 export LANGSMITH_WORKSPACE_ID=<your-workspace-id>
 ```
 
-```bash TypeScript
-export LANGSMITH_TRACING=true
-export LANGSMITH_API_KEY=<your-api-key>
-# This example uses OpenAI, but you can use any LLM provider of choice
-export OPENAI_API_KEY=<your-openai-api-key>
-# For LangSmith API keys linked to multiple workspaces, set the LANGSMITH_WORKSPACE_ID environment variable to specify which workspace to use.
-export LANGSMITH_WORKSPACE_ID=<your-workspace-id>
-```
-
-</CodeGroup>
-
 <Info>
-If you are using LangChain.js with LangSmith and are not in a serverless environment, we also recommend setting the following explicitly to reduce latency:
+    If you are using LangChain.js with LangSmith and are not in a serverless environment, we also recommend setting the following explicitly to reduce latency:
 
-`export LANGCHAIN_CALLBACKS_BACKGROUND=true`
+    `export LANGCHAIN_CALLBACKS_BACKGROUND=true`
 
-If you are in a serverless environment, we recommend setting the reverse to allow tracing to finish before your function ends:
+    If you are in a serverless environment, we recommend setting the reverse to allow tracing to finish before your function ends:
 
-`export LANGCHAIN_CALLBACKS_BACKGROUND=false`
+    `export LANGCHAIN_CALLBACKS_BACKGROUND=false`
 
-See [this LangChain.js guide](https://js.langchain.com/docs/how_to/callbacks_serverless) for more information.
+    See [this LangChain.js guide](https://js.langchain.com/docs/how_to/callbacks_serverless) for more information.
 </Info>
````
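The latency note above can also be illustrated with a generic sketch of background versus inline callback delivery. This is not LangChain.js internals: `emitTrace`, `pending`, and `handler` are hypothetical names, and the final flush only plays the role that `awaitAllCallbacks()` from `@langchain/core` plays in the linked guide.

```typescript
// Sketch of background vs. inline trace delivery (hypothetical names).
const pending: Promise<void>[] = [];

function emitTrace(event: string, background: boolean): Promise<void> {
  // Simulate a network export of a trace event.
  const send = new Promise<void>((resolve) => setTimeout(resolve, 10));
  if (background) {
    // LANGCHAIN_CALLBACKS_BACKGROUND=true: don't delay the caller.
    pending.push(send);
    return Promise.resolve();
  }
  // LANGCHAIN_CALLBACKS_BACKGROUND=false: caller waits for the export.
  return send;
}

async function handler(background: boolean): Promise<string> {
  await emitTrace("llm_call", background);
  // A serverless runtime may freeze as soon as the handler returns, so
  // backgrounded sends must be flushed before exiting (awaitAllCallbacks()
  // plays this role in @langchain/core):
  await Promise.all(pending);
  return "done";
}
```

In a long-running server the backgrounded mode returns control immediately; in a serverless function the inline mode (or an explicit flush) keeps traces from being lost when the sandbox is frozen.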
Comment on lines 51 to 61

Copilot AI (Nov 3, 2025): The content within the `<Info>` component should not be indented. This is inconsistent with the established pattern used throughout the codebase (see src/langsmith/administration-overview.mdx, src/langsmith/env-var.mdx, src/langsmith/multi-turn-simulation.mdx, etc.). Remove the 4-space indentation from all lines within the `<Info>` tags.
### 3. Log a trace
````diff
@@ -243,34 +230,23 @@ pnpm add openai langsmith @langchain/langgraph
 
 ### 2. Configure your environment
 
-<CodeGroup>
-
-```bash Python
+```bash wrap
 export LANGSMITH_TRACING=true
 export LANGSMITH_API_KEY=<your-api-key>
 # This example uses OpenAI, but you can use any LLM provider of choice
 export OPENAI_API_KEY=<your-openai-api-key>
 ```
 
-```bash TypeScript
-export LANGSMITH_TRACING=true
-export LANGSMITH_API_KEY=<your-api-key>
-# This example uses OpenAI, but you can use any LLM provider of choice
-export OPENAI_API_KEY=<your-openai-api-key>
-```
-
-</CodeGroup>
-
 <Info>
-If you are using LangChain.js with LangSmith and are not in a serverless environment, we also recommend setting the following explicitly to reduce latency:
+    If you are using LangChain.js with LangSmith and are not in a serverless environment, we also recommend setting the following explicitly to reduce latency:
 
-`export LANGCHAIN_CALLBACKS_BACKGROUND=true`
+    `export LANGCHAIN_CALLBACKS_BACKGROUND=true`
 
-If you are in a serverless environment, we recommend setting the reverse to allow tracing to finish before your function ends:
+    If you are in a serverless environment, we recommend setting the reverse to allow tracing to finish before your function ends:
 
-`export LANGCHAIN_CALLBACKS_BACKGROUND=false`
+    `export LANGCHAIN_CALLBACKS_BACKGROUND=false`
 
-See [this LangChain.js guide](https://js.langchain.com/docs/how_to/callbacks_serverless) for more information.
+    See [this LangChain.js guide](https://js.langchain.com/docs/how_to/callbacks_serverless) for more information.
 </Info>
````
Comment on lines 240 to 250

Copilot AI (Nov 3, 2025): The content within the `<Info>` component should not be indented. This is inconsistent with the established pattern used throughout the codebase. Remove the 4-space indentation from all lines within the `<Info>` tags.

### 3. Log a trace