
SDK fails in ESM mode in combination with openai #12414

Open
3 tasks done
Tracked by #12485 ...
Xhale1 opened this issue Jun 7, 2024 · 14 comments

@Xhale1

Xhale1 commented Jun 7, 2024

Is there an existing issue for this?

How do you use Sentry?

Sentry Saas (sentry.io)

Which SDK are you using?

@sentry/node

SDK Version

8.8.0

Framework Version

Node 20.14.0

Link to Sentry event

https://trainwell.sentry.io/issues/5463070600/?project=6067364&query=is%3Aunresolved+issue.priority%3A%5Bhigh%2C+medium%5D&referrer=issue-stream&statsPeriod=14d&stream_index=1

SDK Setup

Sentry.init({
  dsn: "[REDACTED]",
});

Steps to Reproduce

  1. Add the openai package
  2. Instrument using an instrumentation.js file
  3. Run with node --import ./dist/instrumentation.js dist/index.js
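
For reference, a minimal instrumentation.js for step 2 would look something like the following sketch, which just mirrors the SDK Setup above (the DSN is a placeholder):

```javascript
// dist/instrumentation.js — preloaded via
//   node --import ./dist/instrumentation.js dist/index.js
// so that Sentry can hook module loading before the app code runs.
import * as Sentry from "@sentry/node";

Sentry.init({
  dsn: "[REDACTED]", // placeholder, as in the report above
});
```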

The addition of the following code is what triggers the issue:

import OpenAI from "openai";

const openAI = new OpenAI({
  apiKey: "[REDACTED]",
});

Expected Result

App builds with sentry instrumentation and no errors.

Actual Result

I receive the following error:

TypeError: API.Completions is not a constructor
    at new OpenAI (file:///[REDACTED]/node_modules/.pnpm/openai@4.49.0/node_modules/openai/index.mjs:46:28)
    at file:///[REDACTED]
    at ModuleJob.run (node:internal/modules/esm/module_job:222:25)
    at async ModuleLoader.import (node:internal/modules/esm/loader:316:24)
    at async asyncRunEntryPointWithESMLoader (node:internal/modules/run_main:123:5)

This might be related to #12237 however it appears to be a unique issue unrelated to initialization.

@github-actions github-actions bot added the Package: node Issues related to the Sentry Node SDK label Jun 7, 2024
@Lms24
Member

Lms24 commented Jun 10, 2024

Hey @Xhale1 thanks for reporting!

Just to rule something out: Which version of openai are you using? It seems there were related issues in an older version. Thanks!

@hudovisk

Same issue here, using openai 4.47.3:

"openai": "^4.47.3",

@Xhale1
Author

Xhale1 commented Jun 10, 2024

Thanks for checking in! Just confirmed that my issue exists with version 4.49.1, which appears to be the latest version offered by openai.

Let me know if a reproduction repo would help.

@Lms24
Member

Lms24 commented Jun 10, 2024

A reproduction would be greatly appreciated, thanks :)

@Xhale1
Author

Xhale1 commented Jun 10, 2024

This is my first time making an issue reproduction repo, let me know if it works for you: https://github.com/Xhale1/sentry-openai

@Lms24
Member

Lms24 commented Jun 10, 2024

Hmm yeah, I can reproduce this myself, thanks for the excellent repro app!

I already tried switching the import style to a namespace or named import, but it makes no difference. It looks like API.Completions within the OpenAI package is, for some reason, undefined rather than a class. My bet is that import-in-the-middle plays a role here; I also wonder whether IITM mishandles namespace imports within node_modules. Not sure though.

Update: Narrowed it down to the import * as API from "./resources/index.mjs"
statement in /openai@4.49.1/node_modules/openai/index.mjs. For some reason, this doesn't include Completions.

@timfish when you have a minute, would you mind taking a look?

@timfish
Collaborator

timfish commented Jun 12, 2024

It looks like openai/resources/index.mjs exports two different Completions classes: one for /completions and one for /chat/completions.

In import-in-the-middle we had deduced that, for ESM, duplicate named exports are not exported at all, but that doesn't appear to be the case here.

@NatoBoram

I made a minimal reproduction here:

It shows that all you have to do to make your app crash is new OpenAI({ apiKey: OPENAI_API_KEY }) when both Sentry and OpenAI are installed and Sentry is preloaded with --import @sentry/node/preload.

@timfish
Collaborator

timfish commented Jun 13, 2024

I've opened a PR for import-in-the-middle to fix this:
nodejs/import-in-the-middle#103

@timfish
Collaborator

timfish commented Jun 15, 2024

This should hopefully have been fixed by the numerous PRs recently merged into import-in-the-middle.

While we wait for this to be released, there is a patch available that combines all the fixes. If anyone can confirm this patch fixes this issue that would be super helpful!

@danilofuchs

@timfish from my local testing, the patch seems to address the issue!

@Xhale1
Author

Xhale1 commented Jun 24, 2024

Confirming that the patch also appears to be working for us, thank you!

@NatoBoram

Confirmed as fixed in Sentry v8.13.0

@timfish
Collaborator

timfish commented Jun 28, 2024

There have been some remaining issues reported with openai due to the way its "shims" work internally.

For example, someone has reported issues with this code:

import OpenAI from "openai";

const openai = new OpenAI({
  apiKey: "fake-api-key",
});

async function doWork() {
  const response = await openai.chat.completions.create({
    model: "gpt-3.5-turbo",
    messages: [
      {
        role: "user",
        content: "Hello, how are you?",
      },
    ],
  });

  console.log(response);
}

doWork().catch((err) => {
  console.error(err);
});

8 participants