remove redundant codegroup for langsmith quickstart #1257
````diff
@@ -39,9 +39,7 @@ pnpm add @langchain/openai @langchain/langgraph

 ### 2. Configure your environment

-<CodeGroup>
-
-```bash Python
+```bash wrap
 export LANGSMITH_TRACING=true
 export LANGSMITH_API_KEY=<your-api-key>
 # This example uses OpenAI, but you can use any LLM provider of choice
@@ -50,27 +48,16 @@ export OPENAI_API_KEY=<your-openai-api-key>
 export LANGSMITH_WORKSPACE_ID=<your-workspace-id>
 ```

-```bash TypeScript
-export LANGSMITH_TRACING=true
-export LANGSMITH_API_KEY=<your-api-key>
-# This example uses OpenAI, but you can use any LLM provider of choice
-export OPENAI_API_KEY=<your-openai-api-key>
-# For LangSmith API keys linked to multiple workspaces, set the LANGSMITH_WORKSPACE_ID environment variable to specify which workspace to use.
-export LANGSMITH_WORKSPACE_ID=<your-workspace-id>
-```
-
-</CodeGroup>
-
 <Info>
-If you are using LangChain.js with LangSmith and are not in a serverless environment, we also recommend setting the following explicitly to reduce latency:
+    If you are using LangChain.js with LangSmith and are not in a serverless environment, we also recommend setting the following explicitly to reduce latency:

-`export LANGCHAIN_CALLBACKS_BACKGROUND=true`
+    `export LANGCHAIN_CALLBACKS_BACKGROUND=true`

-If you are in a serverless environment, we recommend setting the reverse to allow tracing to finish before your function ends:
+    If you are in a serverless environment, we recommend setting the reverse to allow tracing to finish before your function ends:

-`export LANGCHAIN_CALLBACKS_BACKGROUND=false`
+    `export LANGCHAIN_CALLBACKS_BACKGROUND=false`

-See [this LangChain.js guide](https://js.langchain.com/docs/how_to/callbacks_serverless) for more information.
+    See [this LangChain.js guide](https://js.langchain.com/docs/how_to/callbacks_serverless) for more information.
 </Info>

 ### 3. Log a trace
````

Review comment on lines 51 to 61.
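As background on what the remaining `bash wrap` block configures: the LangSmith SDK reads these variables from the process environment. The helper below is purely illustrative (it is not part of the SDK) and only sketches how the flags in the diff would resolve, using a dummy key:

```python
# Hypothetical helper, NOT part of the LangSmith SDK: shows how the quickstart's
# environment variables would resolve. LANGSMITH_WORKSPACE_ID is only relevant
# when the API key is linked to multiple workspaces.
def tracing_config(env):
    return {
        "tracing_enabled": env.get("LANGSMITH_TRACING", "false").lower() == "true",
        "api_key": env.get("LANGSMITH_API_KEY"),
        "workspace_id": env.get("LANGSMITH_WORKSPACE_ID"),  # optional
    }

cfg = tracing_config({
    "LANGSMITH_TRACING": "true",
    "LANGSMITH_API_KEY": "dummy-key",  # placeholder, not a real key
})
```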
````diff
@@ -243,34 +230,23 @@ pnpm add openai langsmith @langchain/langgraph

 ### 2. Configure your environment

-<CodeGroup>
-
-```bash Python
+```bash wrap
 export LANGSMITH_TRACING=true
 export LANGSMITH_API_KEY=<your-api-key>
 # This example uses OpenAI, but you can use any LLM provider of choice
 export OPENAI_API_KEY=<your-openai-api-key>
 ```

-```bash TypeScript
-export LANGSMITH_TRACING=true
-export LANGSMITH_API_KEY=<your-api-key>
-# This example uses OpenAI, but you can use any LLM provider of choice
-export OPENAI_API_KEY=<your-openai-api-key>
-```
-
-</CodeGroup>
-
 <Info>
-If you are using LangChain.js with LangSmith and are not in a serverless environment, we also recommend setting the following explicitly to reduce latency:
+    If you are using LangChain.js with LangSmith and are not in a serverless environment, we also recommend setting the following explicitly to reduce latency:

-`export LANGCHAIN_CALLBACKS_BACKGROUND=true`
+    `export LANGCHAIN_CALLBACKS_BACKGROUND=true`

-If you are in a serverless environment, we recommend setting the reverse to allow tracing to finish before your function ends:
+    If you are in a serverless environment, we recommend setting the reverse to allow tracing to finish before your function ends:

-`export LANGCHAIN_CALLBACKS_BACKGROUND=false`
+    `export LANGCHAIN_CALLBACKS_BACKGROUND=false`

-See [this LangChain.js guide](https://js.langchain.com/docs/how_to/callbacks_serverless) for more information.
+    See [this LangChain.js guide](https://js.langchain.com/docs/how_to/callbacks_serverless) for more information.
 </Info>

 ### 3. Log a trace
````

Review comment on lines 240 to 250.
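The `<Info>` callout kept by both hunks boils down to one boolean choice. A minimal sketch of that decision (assumed logic for illustration, not LangChain.js source):

```python
import os

# Sketch of the <Info> block's advice, not actual LangChain.js code:
# long-lived servers can fire trace callbacks in the background to cut latency,
# while serverless functions must run them inline so traces finish flushing
# before the function is frozen or torn down.
def callbacks_background(is_serverless: bool) -> str:
    return "false" if is_serverless else "true"

# Example: a long-lived server process
os.environ["LANGCHAIN_CALLBACKS_BACKGROUND"] = callbacks_background(is_serverless=False)
```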
The content within the `<Info>` component should not be indented. This is inconsistent with the established pattern used throughout the codebase. Remove the 4-space indentation from all lines within the `<Info>` tags.