@@ -12,10 +12,6 @@ This is an experimental community implementation, and it is not officially supported

## Setup

import IntegrationInstallTooltip from '/snippets/javascript-integrations/integration-install-tooltip.mdx';

<IntegrationInstallTooltip/>

```bash npm
npm install @langchain/community @langchain/core
```
@@ -26,10 +26,6 @@ UPSTASH_REDIS_REST_TOKEN="****"

Next, you will need to install Upstash Ratelimit and `@langchain/community`:

import IntegrationInstallTooltip from '/snippets/javascript-integrations/integration-install-tooltip.mdx';

<IntegrationInstallTooltip/>

```bash npm
npm install @upstash/ratelimit @langchain/community @langchain/core
```
4 changes: 0 additions & 4 deletions src/oss/javascript/integrations/chat/alibaba_tongyi.mdx
@@ -10,10 +10,6 @@ You'll need to sign up for an Alibaba API key and set it as an environment variable

Then, you'll need to install the [`@langchain/community`](https://www.npmjs.com/package/@langchain/community) package:

import IntegrationInstallTooltip from '/snippets/javascript-integrations/integration-install-tooltip.mdx';

<IntegrationInstallTooltip/>

```bash npm
npm install @langchain/community @langchain/core
```
17 changes: 10 additions & 7 deletions src/oss/javascript/integrations/chat/anthropic.md
@@ -45,16 +45,19 @@ If you want to get automated tracing of your model calls you can also set your [

The LangChain `ChatAnthropic` integration lives in the `@langchain/anthropic` package:

```{=mdx}

import IntegrationInstallTooltip from "@mdx_components/integration_install_tooltip.mdx";
<IntegrationInstallTooltip></IntegrationInstallTooltip>
<CodeGroup>
```bash npm
npm install @langchain/anthropic @langchain/core
```

<Npm2Yarn>
@langchain/anthropic @langchain/core
</Npm2Yarn>
```bash yarn
yarn add @langchain/anthropic @langchain/core
```

> **Collaborator:** We might be able to create a re-usable component for this to save some typing and make sure that all package managers are always included.
>
> `<jsInstalls packages=["@langchain/anthropic", "@langchain/core"]/>`
>
> (just an idea, not meant to be blocking since it's very low priority)

> **Contributor (author):** That's definitely the more appropriate/long-term fix. This is kind of a patch/bandaid until we can do that.

```bash pnpm
pnpm add @langchain/anthropic @langchain/core
```
</CodeGroup>
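The reusable component floated in the review thread above could be sketched roughly as follows. Everything here is hypothetical — `installCommands`, the manager table, and the `<jsInstalls>` name come from the review suggestion, not from any existing codebase — and only the command-generation logic such a component would need is shown:

```typescript
// Hypothetical sketch of the logic behind a <jsInstalls packages=[...]/>
// component: one template per supported package manager, so every manager
// is always included and the tabs can never drift apart.
const MANAGERS: Record<string, (pkgs: string[]) => string> = {
  npm: (pkgs) => `npm install ${pkgs.join(" ")}`,
  yarn: (pkgs) => `yarn add ${pkgs.join(" ")}`,
  pnpm: (pkgs) => `pnpm add ${pkgs.join(" ")}`,
};

// Produce one install command per package manager for a given package list.
function installCommands(packages: string[]): Record<string, string> {
  return Object.fromEntries(
    Object.entries(MANAGERS).map(([name, render]) => [name, render(packages)]),
  );
}
```

Rendering each entry as one tab of a `<CodeGroup>` would then replace the hand-written npm/yarn/pnpm blocks repeated throughout this PR.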

## Instantiation

16 changes: 10 additions & 6 deletions src/oss/javascript/integrations/chat/arcjet.md
@@ -20,15 +20,19 @@ The Arcjet Redact object is not a chat model itself; instead, it wraps an LLM. It

Install the Arcjet Redaction Library:

```{=mdx}
import IntegrationInstallTooltip from "@mdx_components/integration_install_tooltip.mdx";
<IntegrationInstallTooltip></IntegrationInstallTooltip>
<CodeGroup>
```bash npm
npm install @arcjet/redact
```

<Npm2Yarn>
@arcjet/redact
</Npm2Yarn>
```bash yarn
yarn add @arcjet/redact
```

```bash pnpm
pnpm add @arcjet/redact
```
</CodeGroup>

And install LangChain Community:

17 changes: 10 additions & 7 deletions src/oss/javascript/integrations/chat/azure.md
@@ -56,16 +56,19 @@ If you want to get automated tracing of your model calls you can also set your [

The LangChain AzureChatOpenAI integration lives in the `@langchain/openai` package:

```{=mdx}

import IntegrationInstallTooltip from "@mdx_components/integration_install_tooltip.mdx";
<IntegrationInstallTooltip></IntegrationInstallTooltip>
<CodeGroup>
```bash npm
npm install @langchain/openai @langchain/core
```

<Npm2Yarn>
@langchain/openai @langchain/core
</Npm2Yarn>
```bash yarn
yarn add @langchain/openai @langchain/core
```

```bash pnpm
pnpm add @langchain/openai @langchain/core
```
</CodeGroup>

## Instantiation

4 changes: 0 additions & 4 deletions src/oss/javascript/integrations/chat/baidu_qianfan.mdx
@@ -6,10 +6,6 @@ title: ChatBaiduQianfan

You'll first need to install the [`@langchain/baidu-qianfan`](https://www.npmjs.com/package/@langchain/baidu-qianfan) package:

import IntegrationInstallTooltip from '/snippets/javascript-integrations/integration-install-tooltip.mdx';

<IntegrationInstallTooltip/>

```bash npm
npm install @langchain/baidu-qianfan @langchain/core
```
4 changes: 0 additions & 4 deletions src/oss/javascript/integrations/chat/baidu_wenxin.mdx
@@ -11,10 +11,6 @@ Use the [`@langchain/baidu-qianfan`](/oss/integrations/chat/baidu_qianfan/) package

LangChain.js supports Baidu's ERNIE-bot family of models. Here's an example:

import IntegrationInstallTooltip from '/snippets/javascript-integrations/integration-install-tooltip.mdx';

<IntegrationInstallTooltip/>

```bash npm
npm install @langchain/community @langchain/core
```
16 changes: 10 additions & 6 deletions src/oss/javascript/integrations/chat/bedrock_converse.md
@@ -41,15 +41,19 @@ If you want to get automated tracing of your model calls you can also set your [

The LangChain `ChatBedrockConverse` integration lives in the `@langchain/aws` package:

```{=mdx}
import IntegrationInstallTooltip from "@mdx_components/integration_install_tooltip.mdx";
<IntegrationInstallTooltip></IntegrationInstallTooltip>
<CodeGroup>
```bash npm
npm install @langchain/aws @langchain/core
```

<Npm2Yarn>
@langchain/aws @langchain/core
</Npm2Yarn>
```bash yarn
yarn add @langchain/aws @langchain/core
```

```bash pnpm
pnpm add @langchain/aws @langchain/core
```
</CodeGroup>

## Instantiation

17 changes: 10 additions & 7 deletions src/oss/javascript/integrations/chat/cerebras.md
@@ -53,16 +53,19 @@ If you want to get automated tracing of your model calls you can also set your [

The LangChain ChatCerebras integration lives in the `@langchain/cerebras` package:

```{=mdx}

import IntegrationInstallTooltip from "@mdx_components/integration_install_tooltip.mdx";
<IntegrationInstallTooltip></IntegrationInstallTooltip>
<CodeGroup>
```bash npm
npm install @langchain/cerebras @langchain/core
```

<Npm2Yarn>
@langchain/cerebras @langchain/core
</Npm2Yarn>
```bash yarn
yarn add @langchain/cerebras @langchain/core
```

```bash pnpm
pnpm add @langchain/cerebras @langchain/core
```
</CodeGroup>

## Instantiation

16 changes: 10 additions & 6 deletions src/oss/javascript/integrations/chat/cloudflare_workersai.md
@@ -36,15 +36,19 @@ Passing a binding within a Cloudflare Worker is not yet supported.

The LangChain ChatCloudflareWorkersAI integration lives in the `@langchain/cloudflare` package:

```{=mdx}
import IntegrationInstallTooltip from "@mdx_components/integration_install_tooltip.mdx";
<IntegrationInstallTooltip></IntegrationInstallTooltip>
<CodeGroup>
```bash npm
npm install @langchain/cloudflare @langchain/core
```

<Npm2Yarn>
@langchain/cloudflare @langchain/core
</Npm2Yarn>
```bash yarn
yarn add @langchain/cloudflare @langchain/core
```

```bash pnpm
pnpm add @langchain/cloudflare @langchain/core
```
</CodeGroup>

## Instantiation

17 changes: 10 additions & 7 deletions src/oss/javascript/integrations/chat/cohere.md
@@ -48,16 +48,19 @@ If you want to get automated tracing of your model calls you can also set your [

The LangChain ChatCohere integration lives in the `@langchain/cohere` package:

```{=mdx}

import IntegrationInstallTooltip from "@mdx_components/integration_install_tooltip.mdx";
<IntegrationInstallTooltip></IntegrationInstallTooltip>
<CodeGroup>
```bash npm
npm install @langchain/cohere @langchain/core
```

<Npm2Yarn>
@langchain/cohere @langchain/core
</Npm2Yarn>
```bash yarn
yarn add @langchain/cohere @langchain/core
```

```bash pnpm
pnpm add @langchain/cohere @langchain/core
```
</CodeGroup>

## Instantiation

4 changes: 0 additions & 4 deletions src/oss/javascript/integrations/chat/deep_infra.mdx
@@ -5,10 +5,6 @@ title: ChatDeepInfra
LangChain supports chat models hosted by [Deep Infra](https://deepinfra.com/) through the `ChatDeepInfra` wrapper.
First, you'll need to install the `@langchain/community` package:

import IntegrationInstallTooltip from '/snippets/javascript-integrations/integration-install-tooltip.mdx';

<IntegrationInstallTooltip/>

```bash npm
npm install @langchain/community @langchain/core
```
17 changes: 10 additions & 7 deletions src/oss/javascript/integrations/chat/fireworks.md
@@ -45,16 +45,19 @@ If you want to get automated tracing of your model calls you can also set your [

The LangChain `ChatFireworks` integration lives in the `@langchain/community` package:

```{=mdx}

import IntegrationInstallTooltip from "@mdx_components/integration_install_tooltip.mdx";
<IntegrationInstallTooltip></IntegrationInstallTooltip>
<CodeGroup>
```bash npm
npm install @langchain/community @langchain/core
```

<Npm2Yarn>
@langchain/community @langchain/core
</Npm2Yarn>
```bash yarn
yarn add @langchain/community @langchain/core
```

```bash pnpm
pnpm add @langchain/community @langchain/core
```
</CodeGroup>

## Instantiation

4 changes: 0 additions & 4 deletions src/oss/javascript/integrations/chat/friendli.mdx
@@ -10,10 +10,6 @@ This tutorial guides you through integrating `ChatFriendli` for chat applications

Ensure the `@langchain/community` package is installed.

import IntegrationInstallTooltip from '/snippets/javascript-integrations/integration-install-tooltip.mdx';

<IntegrationInstallTooltip/>

```bash npm
npm install @langchain/community @langchain/core
```
22 changes: 10 additions & 12 deletions src/oss/javascript/integrations/chat/google_vertex_ai.md
@@ -65,21 +65,19 @@ If you want to get automated tracing of your model calls you can also set your [

The LangChain `ChatVertexAI` integration lives in the `@langchain/google-vertexai` package:

```{=mdx}
import IntegrationInstallTooltip from "@mdx_components/integration_install_tooltip.mdx";
<IntegrationInstallTooltip></IntegrationInstallTooltip>

<Npm2Yarn>
@langchain/google-vertexai @langchain/core
</Npm2Yarn>

Or if using in a web environment like a [Vercel Edge function](https://vercel.com/blog/edge-functions-generally-available):
<CodeGroup>
```bash npm
npm install @langchain/google-vertexai @langchain/core
```

<Npm2Yarn>
@langchain/google-vertexai-web @langchain/core
</Npm2Yarn>
```bash yarn
yarn add @langchain/google-vertexai @langchain/core
```

```bash pnpm
pnpm add @langchain/google-vertexai @langchain/core
```
</CodeGroup>

## Instantiation

17 changes: 10 additions & 7 deletions src/oss/javascript/integrations/chat/groq.md
@@ -46,16 +46,19 @@ If you want to get automated tracing of your model calls you can also set your [

The LangChain ChatGroq integration lives in the `@langchain/groq` package:

```{=mdx}

import IntegrationInstallTooltip from "@mdx_components/integration_install_tooltip.mdx";
<IntegrationInstallTooltip></IntegrationInstallTooltip>
<CodeGroup>
```bash npm
npm install @langchain/groq @langchain/core
```

<Npm2Yarn>
@langchain/groq @langchain/core
</Npm2Yarn>
```bash yarn
yarn add @langchain/groq @langchain/core
```

```bash pnpm
pnpm add @langchain/groq @langchain/core
```
</CodeGroup>

## Instantiation

16 changes: 10 additions & 6 deletions src/oss/javascript/integrations/chat/ibm.md
@@ -111,15 +111,19 @@ If you want to get automated tracing of your model calls you can also set your [

The LangChain IBM watsonx.ai integration lives in the `@langchain/community` package:

```{=mdx}
import IntegrationInstallTooltip from "@mdx_components/integration_install_tooltip.mdx";
<IntegrationInstallTooltip></IntegrationInstallTooltip>
<CodeGroup>
```bash npm
npm install @langchain/community @langchain/core
```

<Npm2Yarn>
@langchain/community @langchain/core
</Npm2Yarn>
```bash yarn
yarn add @langchain/community @langchain/core
```

```bash pnpm
pnpm add @langchain/community @langchain/core
```
</CodeGroup>

## Instantiation
