OpenAI module not found/failed to load module in Supabase edge functions #3242
Comments
🤖 Hi @j1philli, great to see you again! I'm diving into the issue you've reported right now. Rest assured, I'll get back to you with a detailed response as soon as possible.
🤖 Based on the issue you've described, it seems like you're encountering a problem with the OpenAI module not being found or failing to load in your Supabase edge functions. This issue has come up before in the langchainjs repository, and there are a few potential solutions you could try.

One solution suggested in a similar issue (ChatOpenAI breaks edge streaming on Vercel Edge functions) was to bump the OpenAI dependency; that was resolved by the pull request "Bump OpenAI dep". Another related issue is "OpenAI import fails in Vercel deployment". Lastly, if you're having import problems with Deno on Supabase edge functions, you could try importing via esm.

If OpenAI no longer supports ES modules for edge functions, you can use CommonJS require statements instead of ES6 import statements. Here's how you can modify your code:

const { ChatOpenAI } = require("langchain/chat_models/openai");
const { HumanMessage } = require("langchain/schema");
const { SerpAPI } = require("langchain/tools");

const model = new ChatOpenAI({
  temperature: 0.9,
  openAIApiKey: "YOUR-API-KEY", // In Node.js defaults to process.env.OPENAI_API_KEY
});

// You can also pass tools or functions to the model; learn more here:
// https://platform.openai.com/docs/guides/gpt/function-calling
const modelForFunctionCalling = new ChatOpenAI({
  modelName: "gpt-4",
  temperature: 0,
});

// Top-level await is not available in CommonJS, so wrap the call in an async function.
(async () => {
  await modelForFunctionCalling.predictMessages(
    [new HumanMessage("What is the weather in New York?")],
    { tools: [new SerpAPI()] }
    // Tools will be automatically formatted as functions in the OpenAI format
  );
})();

In this code, we're using CommonJS require statements in place of ES module imports. Unfortunately, I wasn't able to find the exact version of the OpenAI SDK that langchainjs is currently using; you might want to check the langchainjs repository directly to confirm. I hope this helps! If you have any other questions or need further clarification, feel free to ask.

Sources
This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research. Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant. If you want to continue the conversation, start your reply with @dosu-bot.
For now, you can downgrade to 0.0.181, which should work - unfortunately there is not much we can do until the underlying SDK fixes it.
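A minimal sketch of what that pin might look like in a Supabase edge function, assuming esm.sh-style URL imports as used elsewhere in this thread (the exact module path and environment-variable name are illustrative assumptions, not taken from the reporter's code):

```typescript
// Illustrative sketch: pin langchain to 0.0.181 in the esm.sh import URL so
// the resolved dependency tree predates the OpenAI SDK's ESM packaging change.
import { ChatOpenAI } from "https://esm.sh/langchain@0.0.181/chat_models/openai";

const model = new ChatOpenAI({
  temperature: 0.9,
  // Deno edge functions read secrets from the environment rather than process.env.
  openAIApiKey: Deno.env.get("OPENAI_API_KEY"),
});
```

Pinning the version in the URL keeps Deno from resolving a newer transitive openai release; treat the path as a starting point and adjust it to match your existing imports.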
@jacoblee93 the comment linked in the issue sounds like they are no longer publishing ESM builds
Yeah :( hopefully they will start again soon. |
Hopefully fixed by the above ^ |
Symptoms
Getting this log output when running my edge functions in Supabase
Failed to load module: "https://esm.sh/v133/openai@4.17.4/esnext/resources.js" - Import 'https://esm.sh/v133/openai@4.17.4/esnext/resources.js' failed, not found.
VS Code throws this error when caching dependencies:
Module not found "https://esm.sh/v133/openai@4.17.4/denonext/resources.js". at https://esm.sh/v133/openai@4.17.4/denonext/openai.mjs:7:13034
Looks like OpenAI isn't supporting esm.sh anymore for edge functions:
openai/openai-node#393 (comment)
Deno 1.38.1
Langchain 0.0.186
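Given the Deno 1.38 environment above, one alternative worth noting (an assumption on my part, not something confirmed in this thread) is that recent Deno runtimes can import npm packages directly via npm: specifiers, bypassing esm.sh's ESM re-bundling entirely:

```typescript
// Hypothetical alternative: import via Deno's npm: specifier instead of esm.sh,
// so Deno consumes the package's own published build rather than an
// esm.sh-generated ESM bundle. Version matches the one reported in this issue.
import { ChatOpenAI } from "npm:langchain@0.0.186/chat_models/openai";

const model = new ChatOpenAI({ temperature: 0 });
```

Whether Supabase's hosted runtime supported npm: specifiers at the time of this issue is uncertain; verify against the current Supabase Edge Functions documentation before relying on this.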