
Enable support for Browsers, Cloudflare Workers, Next.js Browser/Serverless/Edge #632

Merged
merged 83 commits into from Apr 10, 2023

83 commits
8b2cd13
Add test-exports-cf package to test using LangChain on Cloudflare Wor…
nfcampos Apr 5, 2023
b899fb3
Add readme
nfcampos Apr 5, 2023
541be01
Remove wrangler from root package
nfcampos Apr 5, 2023
a407078
Add support for lite entrypoints
nfcampos Apr 5, 2023
40b3c42
Improve the test
nfcampos Apr 5, 2023
ff4228c
Replace usage of is-binary-path because it uses node:path
nfcampos Apr 5, 2023
f55a1da
Add a try:catch around fs import in unstructured loader
nfcampos Apr 5, 2023
7f88a59
Move extname to its own file
nfcampos Apr 5, 2023
ef3fcb6
Fix lite entrypoint definition, add 2 more lite entrypoints
nfcampos Apr 5, 2023
0d40202
More thorough cf test
nfcampos Apr 5, 2023
56082bd
Finish setting up test-exports-cf package scripts
nfcampos Apr 5, 2023
c721a5f
Add sideEffects flag
nfcampos Apr 5, 2023
0f64067
Update setup instructions for csv loader
nfcampos Apr 5, 2023
722d052
Update entrypoints
nfcampos Apr 6, 2023
2fdb457
More thorough test
nfcampos Apr 6, 2023
9085258
Fix entrypoints, types needs to contain the full interface, othewise …
nfcampos Apr 6, 2023
a566b2b
Remove fs/load functionality from lite entrypoints
nfcampos Apr 6, 2023
241480d
Run build
nfcampos Apr 6, 2023
7b73ae1
Prevent typeorm from being included in every bundle
nfcampos Apr 6, 2023
bd0033c
Reorganise document loaders, remove any document loaders that use fil…
nfcampos Apr 7, 2023
03cdb5f
Nc/test exports browser (#652)
nfcampos Apr 7, 2023
a145858
Lint
nfcampos Apr 8, 2023
03dc9b6
Fix d3-dsv install instruction
nfcampos Apr 8, 2023
722e3c8
Rename test-exports to test-exports-esm
nfcampos Apr 8, 2023
73eb914
Add missing deps in test-exports-esm
nfcampos Apr 8, 2023
ef89125
Add command to run export tests with docker
nfcampos Apr 8, 2023
64074f9
Add test:exports:docker to CI
nfcampos Apr 8, 2023
0ece50b
Make tiktoken a required dependency
nfcampos Apr 8, 2023
e649df6
Rename CI workflow
nfcampos Apr 8, 2023
c50cd2e
Remove yaml dependency from prompts entrypoint
nfcampos Apr 9, 2023
7ebfbbe
If tiktoken fails fallback to approx count
nfcampos Apr 9, 2023
a5c854f
Fix cra test
nfcampos Apr 9, 2023
41eef02
Place load functions into separate /load entrypoints
nfcampos Apr 9, 2023
503808f
Fix more index imports
nfcampos Apr 9, 2023
e180f36
Create granular entrypoints for /embeddings
nfcampos Apr 9, 2023
1e60f75
Add note
nfcampos Apr 9, 2023
a5ad8b8
Add mroe comments
nfcampos Apr 9, 2023
b2aef79
Remove test for node-only exports
nfcampos Apr 9, 2023
d0cf418
Try to fix test in ci in windows
nfcampos Apr 9, 2023
2472c40
Create granular entrypoints for /llms
nfcampos Apr 9, 2023
cab09f5
Add granular entrypoints for /chat_models
nfcampos Apr 9, 2023
74a07e1
Remove internal things from generated docs
nfcampos Apr 9, 2023
c5b318a
Update typedoc
nfcampos Apr 9, 2023
93a589b
Move src/agents/tools to src/tools/
nfcampos Apr 9, 2023
b611e41
Add generated entrypoint test files
nfcampos Apr 9, 2023
345a1f1
Move /vectorstores to granular entrypoints
nfcampos Apr 9, 2023
1340ceb
Update after rebase
nfcampos Apr 9, 2023
b156d8c
Move /retrievers to granular endpoints
nfcampos Apr 9, 2023
632d07b
Move /document_loaders to granular entrypoints
nfcampos Apr 9, 2023
7ad4ae3
Merge pull request #682 from hwchase17/nc/granular-entrypoints
nfcampos Apr 9, 2023
b3b850d
Lint
nfcampos Apr 9, 2023
a5f09fe
Add package to test exports on vercel / next.js
nfcampos Apr 6, 2023
eb9dc11
Move Calculator tool to its own entrypoint because Vercel Edge doesn'…
nfcampos Apr 9, 2023
240c156
Add streaming to vercel edge example
nfcampos Apr 9, 2023
99e4cdc
Add vercel nextjs frontend example
nfcampos Apr 9, 2023
516caed
Add vercel to env tests
nfcampos Apr 9, 2023
edaf1ae
Update install instructions
nfcampos Apr 9, 2023
98fa8c9
Update pinecone peer dep
nfcampos Apr 9, 2023
b820770
Finish update pinecone peer dep
nfcampos Apr 9, 2023
4ba8535
Fix publish command
nfcampos Apr 9, 2023
c311c6c
Release 0.0.52-0
nfcampos Apr 9, 2023
320f9fb
Merge branch 'main' into nc/test-exports-cf
nfcampos Apr 9, 2023
5987015
Add install instructions for vercel
nfcampos Apr 9, 2023
15e132d
Add update instructions
nfcampos Apr 9, 2023
bde2c97
Reword vercel instructions
nfcampos Apr 9, 2023
b97c1d4
Import entrypoints file in vercel frontend
nfcampos Apr 9, 2023
3f3cdd3
Add deprecation notice to deprecated entrypoints, update remaining do…
nfcampos Apr 9, 2023
51fcf1f
Update tiktoken
nfcampos Apr 9, 2023
1f8df25
Update anthropic sdk
nfcampos Apr 9, 2023
7b0bde0
Update import paths in integrations tests
nfcampos Apr 9, 2023
09bb7ae
Remove unused dev deps
nfcampos Apr 9, 2023
9cb6171
Add back zod in examples deps because of eslint
nfcampos Apr 9, 2023
9fd3a6c
Merge branch 'main' into nc/test-exports-cf
agola11 Apr 10, 2023
fc0f5ef
Code review
nfcampos Apr 10, 2023
48a7ef9
Update upgrade instructions
nfcampos Apr 10, 2023
dce825d
Update pdf loader docs
nfcampos Apr 10, 2023
e728224
Add test-exports-vite, update docusaurus
nfcampos Apr 10, 2023
0ae48ef
Add Deno section
nfcampos Apr 10, 2023
7deaaed
Merge pull request #705 from hwchase17/nc/env-docs
nfcampos Apr 10, 2023
16d214a
Merge branch 'main' into nc/test-exports-cf
nfcampos Apr 10, 2023
299d647
Add node-only labels to docs
nfcampos Apr 10, 2023
9245c7b
Add one more alternative for loading PDFLoader
nfcampos Apr 10, 2023
3369f7d
Smaller node label in docs sidebar
nfcampos Apr 10, 2023
21 changes: 19 additions & 2 deletions .github/workflows/ci.yml
@@ -1,7 +1,7 @@
# This workflow will do a clean installation of node dependencies, cache/restore them, build the source code and run tests across different versions of node
# For more information see: https://docs.github.com/en/actions/automating-builds-and-tests/building-and-testing-nodejs

name: Node.js CI
name: CI

on:
push:
@@ -55,7 +55,7 @@ jobs:
run: yarn run build

test:
name: Test
name: Unit Tests
strategy:
matrix:
os: [macos-latest, windows-latest, ubuntu-latest]
@@ -75,3 +75,20 @@ jobs:
run: yarn run build --filter="!docs"
- name: Test
run: yarn run test:unit

test-exports:

Separate CI job for the environment tests using docker

The environment tests are essentially templates of the various environments in which people can use LangChain:

  • Node.js with CJS
  • Node.js with ESM
  • Cloudflare Workers
  • Browser with Create React App

The Docker test ensures that each environment's own build and test scripts pass.

name: Environment Tests
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- name: Use Node.js 18.x
uses: actions/setup-node@v3
with:
node-version: 18.x
cache: "yarn"
- name: Install dependencies
run: yarn install --immutable
- name: Build
run: yarn run build --filter="!docs"
- name: Test Exports
run: yarn run test:exports:docker
4 changes: 3 additions & 1 deletion .vscode/settings.json
@@ -3,8 +3,10 @@
"./langchain",
"./examples",
"./docs",
"./test-exports-vercel",
"./test-exports-cra",
],
"yaml.schemas": {
"https://json.schemastore.org/github-workflow.json": "./.github/workflows/deploy.yml"
}
}
}
20 changes: 15 additions & 5 deletions CONTRIBUTING.md
@@ -162,6 +162,14 @@ To run only integration tests, run:
yarn test:int
```

**Environment tests** test whether LangChain works across different JS environments, including Node.js (both ESM and CJS), Edge environments (e.g. Cloudflare Workers), and browsers (using Webpack).

To run the environment tests with Docker, run:

```bash
yarn test:exports:docker
```

### Building

To build the project, run:
@@ -183,21 +191,23 @@ level of the repo.

### Adding an Entrypoint

Langchain exposes multiple multiple subpaths the user can import from, e.g.
LangChain exposes multiple subpaths the user can import from, e.g.

```ts
import { OpenAI } from "langchain/llms";
import { OpenAI } from "langchain/llms/openai";
```

We call these subpaths "entrypoints". In general, you should create a new entrypoint if you are adding a new integration with a 3rd party library. If you're adding self-contained functionality without any external dependencies, you can add it to an existing entrypoint.

In order to declare a new entrypoint that users can import from, you
should edit the `langchain/create-entrypoints.js` script. To add an
entrypoint `tools` that imports from `agents/tools/index.ts` you could add
should edit the `langchain/scripts/create-entrypoints.js` script. To add an
entrypoint `tools` that imports from `tools/index.ts` you'd add
the following to the `entrypoints` variable:

```ts
const entrypoints = {
// ...
tools: "agents/tools/index.ts",
tools: "tools/index",
};
```
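For illustration, the kind of `package.json` `exports` entries such an entrypoint map produces can be sketched in TypeScript. This is a hypothetical, simplified sketch — the actual generation logic lives in `langchain/scripts/create-entrypoints.js`, and the exact file-extension layout below is an assumption:

```typescript
// Hypothetical sketch: derive package.json "exports" entries from an
// entrypoint map. Not the real create-entrypoints.js implementation.
const entrypoints: Record<string, string> = {
  tools: "tools/index",
  "llms/openai": "llms/openai",
};

interface ExportEntry {
  types: string;
  import: string;
  require: string;
}

function toExports(entries: Record<string, string>): Record<string, ExportEntry> {
  const out: Record<string, ExportEntry> = {};
  for (const key of Object.keys(entries)) {
    // Each subpath gets a type declaration, an ESM build, and a CJS build.
    out[`./${key}`] = {
      types: `./${key}.d.ts`,
      import: `./${key}.js`,
      require: `./${key}.cjs`,
    };
  }
  return out;
}
```

Under this sketch, adding `tools` to the map would let users write `import { DynamicTool } from "langchain/tools"` in ESM or `require("langchain/tools")` in CJS.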

2 changes: 1 addition & 1 deletion README.md
@@ -12,7 +12,7 @@ Please fill out [this form](https://forms.gle/57d8AmXBYp8PP8tZA) and we'll set u
`yarn add langchain`

```typescript
import { OpenAI } from "langchain/llms";
import { OpenAI } from "langchain/llms/openai";
```

## 🤔 What is this?
56 changes: 56 additions & 0 deletions docker-compose.yml
@@ -0,0 +1,56 @@
version: '3'
services:
test-exports-esm:
image: node:18
working_dir: /app
volumes:
- ./test-exports-esm:/package
- ./langchain:/langchain
- ./scripts:/scripts
command: bash /scripts/docker-ci-entrypoint.sh
test-exports-cjs:
image: node:18
working_dir: /app
volumes:
- ./test-exports-cjs:/package
- ./langchain:/langchain
- ./scripts:/scripts
command: bash /scripts/docker-ci-entrypoint.sh
test-exports-cra:
image: node:18
working_dir: /app
volumes:
- ./test-exports-cra:/package
- ./langchain:/langchain
- ./scripts:/scripts
command: bash /scripts/docker-ci-entrypoint.sh
test-exports-cf:
image: node:18
working_dir: /app
volumes:
- ./test-exports-cf:/package
- ./langchain:/langchain
- ./scripts:/scripts
command: bash /scripts/docker-ci-entrypoint.sh
test-exports-vercel:
image: node:18
working_dir: /app
volumes:
- ./test-exports-vercel:/package
- ./langchain:/langchain
- ./scripts:/scripts
command: bash /scripts/docker-ci-entrypoint.sh
success:
image: alpine:3.14
command: echo "Success"
depends_on:
test-exports-esm:
condition: service_completed_successfully
test-exports-cjs:
condition: service_completed_successfully
test-exports-cra:
condition: service_completed_successfully
test-exports-cf:
condition: service_completed_successfully
test-exports-vercel:
condition: service_completed_successfully
2 changes: 1 addition & 1 deletion docs/docs/getting-started/guide-chat.mdx
@@ -22,7 +22,7 @@ To get started, follow the [installation instructions](./install) to install Lan
This section covers how to get started with chat models. The interface is based around messages rather than raw text.

```typescript
import { ChatOpenAI } from "langchain/chat_models";
import { ChatOpenAI } from "langchain/chat_models/openai";
import { HumanChatMessage, SystemChatMessage } from "langchain/schema";

const chat = new ChatOpenAI({ temperature: 0 });
11 changes: 6 additions & 5 deletions docs/docs/getting-started/guide-llm.mdx
@@ -32,7 +32,7 @@ The most basic building block of LangChain is calling an LLM on some input. Let'
In order to do this, we first need to import the LLM wrapper.

```typescript
import { OpenAI } from "langchain";
import { OpenAI } from "langchain/llms/openai";
```

We will then need to set the environment variable for the OpenAI key. Three options here:
@@ -110,7 +110,7 @@ The most core type of chain is an LLMChain, which consists of a PromptTemplate a
Extending the previous example, we can construct an LLMChain which takes user input, formats it with a PromptTemplate, and then passes the formatted response to an LLM.

```typescript
import { OpenAI } from "langchain/llms";
import { OpenAI } from "langchain/llms/openai";
import { PromptTemplate } from "langchain/prompts";

const model = new OpenAI({ temperature: 0.9 });
@@ -165,9 +165,10 @@ SERPAPI_API_KEY="..."
Now we can get started!

```typescript
import { OpenAI } from "langchain";
import { OpenAI } from "langchain/llms/openai";
import { initializeAgentExecutor } from "langchain/agents";
import { SerpAPI, Calculator } from "langchain/tools";
import { SerpAPI } from "langchain/tools";
import { Calculator } from "langchain/tools/calculator";

const model = new OpenAI({ temperature: 0 });
const tools = [new SerpAPI(), new Calculator()];
@@ -203,7 +204,7 @@ LangChain provides several specially created chains just for this purpose. This
By default, the `ConversationChain` has a simple type of memory that remembers all previous inputs/outputs and adds them to the context that is passed. Let's take a look at using this chain.

```typescript
import { OpenAI } from "langchain/llms";
import { OpenAI } from "langchain/llms/openai";
import { BufferMemory } from "langchain/memory";
import { ConversationChain } from "langchain/chains";

64 changes: 49 additions & 15 deletions docs/docs/getting-started/install.md
@@ -4,13 +4,15 @@ sidebar_position: 1

# Setup and Installation

## Quickstart
:::info
Updating from <0.0.52? See [this section](#updating-from-0052) for instructions.
:::

To get started with Langchain, you'll need to initialize a new Node.js project and configure some scripts to build, format, and compile your code.
## Quickstart

If you just want to get started quickly, [clone this repository](https://github.com/domeccleston/langchain-ts-starter) and follow the README instructions for a boilerplate project with those dependencies set up.
If you want to get started quickly on using LangChain in Node.js, [clone this repository](https://github.com/domeccleston/langchain-ts-starter) and follow the README instructions for a boilerplate project with those dependencies set up.

If you'd prefer to set things up yourself, read on for instructions.
If you prefer to set things up yourself, or you want to run LangChain in other environments, read on for instructions.

## Installation

@@ -20,20 +22,18 @@ To get started, install LangChain with the following command:
npm install -S langchain
```

We currently support LangChain on Node.js 18 and 19. Go [here](https://github.com/hwchase17/langchainjs/discussions/152) to vote on the next environment we should support.

### TypeScript

LangChain is written in TypeScript and provides type definitions for all of its public APIs.

## Loading the library

### ESM in Node.js
### ESM

LangChain provides an ESM build targeting Node.js environments. You can import it using the following syntax:

```typescript
import { OpenAI } from "langchain/llms";
import { OpenAI } from "langchain/llms/openai";
```

If you are using TypeScript in an ESM project we suggest updating your `tsconfig.json` to include the following:
@@ -48,25 +48,59 @@ If you are using TypeScript in an ESM project we suggest updating your `tsconfig
}
```

### CommonJS in Node.js
### CommonJS

LangChain provides a CommonJS build targeting Node.js environments. You can import it using the following syntax:

```typescript
const { OpenAI } = require("langchain/llms");
const { OpenAI } = require("langchain/llms/openai");
```

### Cloudflare Workers

LangChain can be used in Cloudflare Workers. You can import it using the following syntax:

```typescript
import { OpenAI } from "langchain/llms/openai";
```

### Other environments
### Vercel / Next.js

LangChain can be used in Vercel / Next.js. We support using LangChain in frontend components, in Serverless functions and in Edge functions. You can import it using the following syntax:

```typescript
import { OpenAI } from "langchain/llms/openai";
```

If you want to use LangChain in frontend `pages`, you need to add the following to your `next.config.js` to enable support for WebAssembly modules (which is required by the tokenizer library `@dqbd/tiktoken`):

```js
const nextConfig = {
webpack(config) {
config.experiments = {
asyncWebAssembly: true,
layers: true,
};

return config;
},
};
```

LangChain currently supports only Node.js-based environments. This includes Vercel Serverless functions (but not Edge functions), as well as other serverless environments, like AWS Lambda and Google Cloud Functions.
## Updating from <0.0.52

We currently do not support running LangChain in the browser. We are listening to the community on additional environments that we should support. Go [here](https://github.com/hwchase17/langchainjs/discussions/152) to vote and discuss the next environments we should support.
If you are updating from a version of LangChain prior to 0.0.52, you will need to update your imports to use the new path structure. For example, if you were previously importing `OpenAI` from `langchain/llms`, you will now need to import it from `langchain/llms/openai`. This applies to all imports from the following entrypoints:

Please see [Deployment](../production/deployment.md) for more information on deploying LangChain applications.
- `langchain/llms`, see [LLMs](../modules/models/llms/integrations)
- `langchain/chat_models`, see [Chat Models](../modules/models/chat/integrations)
- `langchain/embeddings`, see [Embeddings](../modules/models/embeddings/integrations)
- `langchain/vectorstores`, see [Vector Stores](../modules/indexes/vector_stores/integrations/)
- `langchain/document_loaders`, see [Document Loaders](../modules/indexes/document_loaders/examples/)
- `langchain/retrievers`, see [Retrievers](../modules/indexes/retrievers/)
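To make the change concrete, the old barrel imports map onto granular ones roughly as sketched below. The specific module suffixes (`openai`, `hnswlib`) are examples taken from the docs above, not an exhaustive mapping — check the linked integration pages for the entrypoint matching your integration:

```typescript
// Illustrative mapping from pre-0.0.52 barrel entrypoints to the new
// granular ones. Only a few example targets are shown; the right suffix
// depends on which integration you actually use.
const migrations: Record<string, string> = {
  "langchain/llms": "langchain/llms/openai",
  "langchain/chat_models": "langchain/chat_models/openai",
  "langchain/embeddings": "langchain/embeddings/openai",
  "langchain/vectorstores": "langchain/vectorstores/hnswlib",
};

function migrateImport(path: string): string {
  // Paths that were not barrel entrypoints are left unchanged.
  return migrations[path] ?? path;
}

// Before: import { OpenAI } from "langchain/llms";
// After:  import { OpenAI } from "langchain/llms/openai";
```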

## Unsupported: Node.js 16

We do not support Node.js 16, but if you want to run LangChain on Node.js 16, you will need to follow the instructions in this section. We do not guarantee that these instructions will continue to work in the future.
We do not support Node.js 16, but if you still want to run LangChain on Node.js 16, you will need to follow the instructions in this section. We do not guarantee that these instructions will continue to work in the future.

You will have to make `fetch` available globally, either:

5 changes: 3 additions & 2 deletions docs/docs/modules/agents/executor/getting-started.md
@@ -24,9 +24,10 @@ SERPAPI_API_KEY="..."
Now we can get started!

```typescript
import { OpenAI } from "langchain";
import { OpenAI } from "langchain/llms/openai";
import { initializeAgentExecutor } from "langchain/agents";
import { SerpAPI, Calculator } from "langchain/tools";
import { SerpAPI } from "langchain/tools";
import { Calculator } from "langchain/tools/calculator";

const model = new OpenAI({ temperature: 0 });
const tools = [new SerpAPI(), new Calculator()];
2 changes: 1 addition & 1 deletion docs/docs/modules/agents/toolkits/examples/json.md
@@ -3,9 +3,9 @@
This example shows how to load and use an agent with a JSON toolkit.

```typescript
import { OpenAI } from "langchain";
import * as fs from "fs";
import * as yaml from "js-yaml";
import { OpenAI } from "langchain/llms/openai";
import { JsonSpec, JsonObject } from "langchain/tools";
import { JsonToolkit, createJsonAgent } from "langchain/agents";

2 changes: 1 addition & 1 deletion docs/docs/modules/agents/toolkits/examples/openapi.md
@@ -5,9 +5,9 @@ This example shows how to load and use an agent with an OpenAPI toolkit.
```typescript
import * as fs from "fs";
import * as yaml from "js-yaml";
import { OpenAI } from "langchain/llms/openai";
import { JsonSpec, JsonObject } from "langchain/tools";
import { createOpenApiAgent, OpenApiToolkit } from "langchain/agents";
import { OpenAI } from "langchain";

export const run = async () => {
let data: JsonObject;
6 changes: 3 additions & 3 deletions docs/docs/modules/agents/toolkits/examples/vectorstore.md
@@ -3,9 +3,9 @@
This example shows how to load and use an agent with a vectorstore toolkit.

```typescript
import { OpenAI } from "langchain";
import { HNSWLib } from "langchain/vectorstores";
import { OpenAIEmbeddings } from "langchain/embeddings";
import { OpenAI } from "langchain/llms/openai";
import { HNSWLib } from "langchain/vectorstores/hnswlib";
import { OpenAIEmbeddings } from "langchain/embeddings/openai";
import { RecursiveCharacterTextSplitter } from "langchain/text_splitter";
import * as fs from "fs";
import {
6 changes: 3 additions & 3 deletions docs/docs/modules/agents/tools/agents_with_vectorstores.md
@@ -7,12 +7,12 @@ The recommended method for doing so is to create a VectorDBQAChain and then use
First, you'll want to import the relevant modules:

```typescript
import { OpenAI } from "langchain";
import { OpenAI } from "langchain/llms/openai";
import { initializeAgentExecutor } from "langchain/agents";
import { SerpAPI, Calculator, ChainTool } from "langchain/tools";
import { VectorDBQAChain } from "langchain/chains";
import { HNSWLib } from "langchain/vectorstores";
import { OpenAIEmbeddings } from "langchain/embeddings";
import { HNSWLib } from "langchain/vectorstores/hnswlib";
import { OpenAIEmbeddings } from "langchain/embeddings/openai";
import { RecursiveCharacterTextSplitter } from "langchain/text_splitter";
import * as fs from "fs";
```
2 changes: 1 addition & 1 deletion docs/docs/modules/agents/tools/index.mdx
@@ -47,7 +47,7 @@ The `DynamicTool` class takes as input a name, a description, and a function. Im
See below for an example of defining and using `DynamicTool`s.

```typescript
import { OpenAI } from "langchain";
import { OpenAI } from "langchain/llms/openai";
import { initializeAgentExecutor } from "langchain/agents";
import { DynamicTool } from "langchain/tools";

Expand Down