Add support for Cloudflare Workers / Vercel Edge Functions #62

Closed
10 of 11 tasks
dqbd opened this issue Feb 19, 2023 · 31 comments · Fixed by #632
Labels
env/packaging Issues related to packaging/bundling

Comments

@dqbd
Collaborator

dqbd commented Feb 19, 2023

So far there appear to be three culprits that may block us from running LangChain.js on the Edge.

This is the error output when attempting to deploy the examples/src/chains/llm_chain.ts via Vercel:

[Screenshot: error output from the Vercel deploy, 2023-02-19]

@jasongill
Contributor

I'm also investigating this, seeing if I can clean up or pare down the package a bit to make it work in Cloudflare Workers. In theory, node_compat = true in wrangler.toml should fix many of those missing modules by adding polyfills, but I think we will also need to replace Node's crypto with WebCrypto.
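For reference, a minimal sketch of what that WebCrypto swap could look like, assuming a SHA-256 hash is what's needed (the sha256Hex helper is hypothetical, not code from this repo): crypto.subtle is available natively in Workers and Edge Functions, with the caveat that it is async where Node's createHash is not.

// Sketch: SHA-256 via the Web Crypto API, available on edge runtimes.
// Replaces a Node-style createHash("sha256") call; note it is async.
async function sha256Hex(text: string): Promise<string> {
  const data = new TextEncoder().encode(text);
  const digest = await crypto.subtle.digest("SHA-256", data);
  return [...new Uint8Array(digest)]
    .map((b) => b.toString(16).padStart(2, "0"))
    .join("");
}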

@nfcampos
Collaborator

Another thing to fix here is adding support for at least one external docstore, since there is no filesystem support on Edge to load one from a file.
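As a rough sketch of what an external docstore could look like (names here are hypothetical, not LangChain API): fetch documents over HTTP instead of reading them from disk.

// Hypothetical sketch: an edge-friendly docstore backed by HTTP
// rather than the filesystem, which edge runtimes don't provide.
interface Doc {
  id: string;
  pageContent: string;
}

class HttpDocstore {
  constructor(private baseUrl: string) {}

  // Look up a document by id from a remote store
  async search(id: string): Promise<Doc> {
    const res = await fetch(`${this.baseUrl}/docs/${encodeURIComponent(id)}`);
    if (!res.ok) throw new Error(`Document ${id} not found`);
    return (await res.json()) as Doc;
  }
}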

@nfcampos
Collaborator

One option for the OpenAI adapter is mentioned here: openai/openai-node#30 (comment)
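(That thread is about swapping the axios-based client for plain fetch, which edge runtimes provide natively. A minimal sketch of the idea, not the eventual LangChain implementation:)

// Sketch: calling the OpenAI completions endpoint with fetch,
// which works on edge runtimes where axios/Node http do not.
async function complete(prompt: string, apiKey: string): Promise<string> {
  const res = await fetch("https://api.openai.com/v1/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify({ model: "text-davinci-003", prompt, max_tokens: 256 }),
  });
  if (!res.ok) throw new Error(`OpenAI request failed: ${res.status}`);
  const json = await res.json();
  return json.choices[0].text;
}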

@nfcampos
Collaborator

@dqbd thanks so much for getting this started; I've updated your issue above to keep track of the progress being made.

@nfcampos
Collaborator

Discussion on the next environments to support is here: #152

@jasongill
Contributor

@nfcampos @dqbd I would be happy to pay to encourage support for Cloudflare Workers in LangChain. If there's a place/dev I can send $500 to contribute to getting this done, please let me know! LangChain at the edge is very exciting.

@hwchase17
Collaborator

@jasongill Trying to understand the use case here. This may sound like a really stupid question, but what exactly is it about LangChain at the edge that excites you so much?

@jasongill
Contributor

We run a large "AI writing" tool (think Jasper competitor), and one big issue we have is slow responses from LLMs. With a "server-side" model, these slow responses consume resources, be it webserver connection slots, open file descriptors, etc., not to mention a small amount of CPU cycles, which can add up with tens of thousands of simultaneous users.

Being able to build our prompts and chains into a serverless environment lets us stop worrying about scalability issues: each user request, straight from the browser, can be handled by a Cloudflare Worker and can take as long as it needs without consuming any of our server time or resources, returning a response to our frontend interface when it's done.

Basically, having LangChain at the edge will allow us to get rid of servers and worry less about stability and more about just building great prompts and user experiences.

@nfcampos
Collaborator

nfcampos commented Mar 1, 2023

Btw, if someone wants to take this on, I started on the remaining issues on this branch: main...nc/test-envs

This:

  1. Adds a way to test against CF Workers and Vercel Edge (to surface all the remaining issues)
  2. Fixes (or starts to fix, I can't remember just now) the crypto issue mentioned in the issue at the top

@jasongill
Contributor

I'm still happy to put a bounty on getting Cloudflare Workers support added, or to donate to a charity of the dev's choice, to get this done. Unfortunately, actually doing the coding is above my skill level, but I'm happy to contribute to the project to encourage this feature!

@nfcampos added the env/packaging (Issues related to packaging/bundling) label on Mar 18, 2023
@isaiahtaylor

FYI: if you are trying to use the Edge runtime because it supports streaming, you can now stream from Serverless: https://twitter.com/isaiah_p_taylor/status/1638216571507331078
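(For context, the web-standard mechanism behind both Edge and Serverless streaming is returning a Response wrapped around a ReadableStream; a minimal, framework-agnostic sketch, with placeholder chunks standing in for model tokens:)

// Sketch: an incrementally streamed Response using web-standard APIs.
export default async function handler(_req: Request): Promise<Response> {
  const encoder = new TextEncoder();
  const stream = new ReadableStream<Uint8Array>({
    async start(controller) {
      // In a real handler, enqueue tokens as they arrive from the model
      for (const chunk of ["Hello", ", ", "world"]) {
        controller.enqueue(encoder.encode(chunk));
        await new Promise((resolve) => setTimeout(resolve, 100));
      }
      controller.close();
    },
  });
  return new Response(stream, {
    headers: { "Content-Type": "text/plain; charset=utf-8" },
  });
}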


@yvesloy

yvesloy commented Apr 3, 2023

Do you already know when you can get to task 2 (better tree-shaking support)?
Can I help with this?

@nfcampos
Collaborator

nfcampos commented Apr 4, 2023

@yvesloy task 2 has actually been completed already; I had forgotten to update the list above. The next step is to set up a build specifically for edge.

@treyhuffine

Very excited about this and happy to contribute as needed to get it working on Vercel Edge

@nfcampos
Collaborator

Hi, edge/browser support has been published as a prerelease; you can install it with npm i langchain@next to try it out before we release to everyone.

See the upgrade instructions here: https://langchainjs-docs-git-nc-test-exports-cf-langchain.vercel.app/docs/getting-started/install. You'll need to update import paths; see the link for more details.

If you test it, let me know about any issues!

@treyhuffine

Thanks @nfcampos I'll start testing!

@nfcampos
Collaborator

Support for browsers, Cloudflare Workers, Next.js/Vercel (browser/serverless/edge), Deno, and Supabase Edge Functions has been released; see https://blog.langchain.dev/js-envs/ for details. See the docs for install/upgrade instructions, including some breaking changes: https://js.langchain.com/docs/getting-started/install

If you hit any issues, let me know.

@abhagsain

abhagsain commented Apr 12, 2023

(quoting @nfcampos's release announcement above)

I'm really excited about this update. However, importing langchain significantly increases the bundle size for Cloudflare Workers. I just imported it to try it out in my app and received a warning.

Before

Total Upload: 928.13 KiB / gzip: 187.40 KiB

After

Total Upload: 4178.92 KiB / gzip: 1457.85 KiB
▲ [WARNING] We recommend keeping your script less than 1MiB (1024 KiB) after gzip. 
Exceeding past this can affect cold start time

I haven't used langchain yet, so I followed the docs and only imported these.

import { OpenAI } from 'langchain/llms/openai'
import { PromptTemplate } from 'langchain/prompts'
import { LLMChain } from 'langchain/chains'

Any recommendations @nfcampos ?

Edit: it's cold start time, so it won't be that bad after the first request; I think I don't need to worry.
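(For anyone landing here, a minimal sketch of the chain those three imports compose, following the getting-started docs of this era:)

import { OpenAI } from 'langchain/llms/openai'
import { PromptTemplate } from 'langchain/prompts'
import { LLMChain } from 'langchain/chains'

// Sketch per the era's docs: prompt -> model -> chain
const model = new OpenAI({ temperature: 0.9 })
const prompt = new PromptTemplate({
  template: 'What is a good name for a company that makes {product}?',
  inputVariables: ['product'],
})
const chain = new LLMChain({ llm: model, prompt })
const res = await chain.call({ product: 'colorful socks' })
console.log(res.text)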

@beljand

beljand commented Apr 15, 2023

(quoting @abhagsain's bundle-size report above)

Same issue here; you will not be able to deploy this worker, since it exceeded the size limit.

@abhagsain

(quoting @abhagsain's report and @beljand's reply above)

I haven't tried deploying it yet, but the limit is 5 MB after compression, right? So it shouldn't be a problem.

@captainjapeng

The 5 MB limit only applies to paid plans; on the Free Tier it's only 1 MB. Also, the warning says it could affect start-up time.

It would help if we could reduce this or use tree-shaking, but I don't know whether the CF Workers toolchain is already doing that.

@abhagsain

(quoting @captainjapeng's reply above)

Got it. I asked about this in the CF Workers Discord; they said it will only affect the cold start time, which only happens if your worker has been asleep for a long time. After the first request it will be faster, so they said not to worry about it.
But I do agree about the limits on the free and paid tiers: some people would need to upgrade to a paid plan just to use langchain on Workers.

@nfcampos
Collaborator

I've opened an issue to investigate bundle size (#809); anyone who wants to help out there is very welcome.

@jaredpalmer

jaredpalmer commented May 8, 2023

Jared here from Vercel. Can we please re-open this issue? The most basic OpenAI x Next.js call with the Edge Runtime still fails because of TikToken. We would like to send folks toward LangChain in our documentation, but serverless is not as cost-effective or as performant for our customers.

I have created an example that reproduces this issue: https://github.com/jaredpalmer/nextjs-langchain-bug/tree/main

[Screenshot: error output from the reproduction, 2023-05-08]

@nfcampos
Collaborator

nfcampos commented May 8, 2023

cc @dqbd, this one might be best for you.
@jaredpalmer thanks for the repro; we'll look into this ASAP.

@jaredpalmer

jaredpalmer commented May 8, 2023

Update: I was able to get it working in Next.js and Vercel with the following changes to Next.js's webpack config:

/** @type {import('next').NextConfig} */
const nextConfig = {
  experimental: {
    // Enable the (then-experimental) server actions feature
    serverActions: true,
  },
  webpack(config) {
    config.experiments = {
      // asyncWebAssembly lets webpack load @dqbd/tiktoken's WASM module
      asyncWebAssembly: true,
      // layers is required by the serverActions experiment
      layers: true,
    };

    return config;
  },
};

module.exports = nextConfig;

@nfcampos
Collaborator

nfcampos commented May 8, 2023

We actually have a similar config in our install instructions in the docs; see https://js.langchain.com/docs/getting-started/install#vercel--nextjs. Would you like us to reword it or change it in any way?

@jaredpalmer

If you want to use LangChain in frontend pages, you need to add the following to your next.config.js to enable support for WebAssembly modules (which is required by the tokenizer library @dqbd/tiktoken):

should change to something like

To use LangChain with Next.js (either with app/ or pages/), add the following to your next.config.js to enable support for WebAssembly modules (which is required by LangChain's tokenizer library @dqbd/tiktoken):

@nfcampos
Collaborator

nfcampos commented May 8, 2023

Cool, I've updated it here #1165

@nfcampos
Collaborator

@jaredpalmer hi, we've merged PR #1239 which removes the usage of a WASM tokeniser library, and so removes the need for any special config for Next.js/Vercel. This will be released today, and should make using langchain from Next.js "just work".
