
chat.completions.create return null on browser #232

Closed
1 task done
aleluff opened this issue Aug 22, 2023 · 33 comments
Labels
bug Something isn't working

Comments

@aleluff

aleluff commented Aug 22, 2023

Confirm this is a Node library issue and not an underlying OpenAI API issue

  • This is an issue with the Node library

Describe the bug

While developing a web app on localhost:
everything works like a charm with 3.6, but after upgrading to v4 and adapting the code,

chat.completions.create always returns null with dangerouslyAllowBrowser: true.

The same code works fine on the backend.

To Reproduce

  1. Use demo code in browser
  2. Add dangerouslyAllowBrowser: true to the client constructor options
  3. Run and get error

Code snippets

No response

OS

macOS

Node version

Node v18.16

Library version

openai v4

@aleluff aleluff added the bug Something isn't working label Aug 22, 2023
@rattrayalex
Collaborator

Can you provide sample code for reproduction?

@aleluff
Author

aleluff commented Aug 25, 2023

import { Component } from "@angular/core";
import { OpenAI } from "openai";

@Component({
	selector: 'app-ai',
	templateUrl: './ai.component.html',
	styleUrls: ['./ai.component.scss']
})
export class AiComponent {

	openai?: OpenAI;

	constructor(
	) {
		this.openai = new OpenAI({apiKey: "API_KEY", dangerouslyAllowBrowser: true});
	}

	async sendMessage(txt: string) {
		try {
			const response = await this.openai!.chat.completions.create({
				model: 'gpt-3.5-turbo',
				messages: [
					{role: "system", content: "Explain me the purpose of life"},
				],
			});
			console.log(response);
		} catch (error: any) {
			console.log(error);
		}
	}
}


@rattrayalex
Collaborator

I can't reproduce; for me, response prints out a chat completion object.

Can you share more details, such as the build system you are using, the browser you are running in, the output of the program, whether other kinds of requests from the API work in the browser, whether the same request works for you with the same API Key in Node, etc?

@aleluff
Author

aleluff commented Aug 25, 2023 via email

@rattrayalex
Collaborator

I was not using Angular, no. Can you provide a GitHub repo with minimal repro?

@aleluff
Author

aleluff commented Aug 26, 2023

OK, so it seems that the library intercepts the OPTIONS request instead of the real POST one.

@aleluff
Author

aleluff commented Aug 26, 2023

@rattrayalex
Collaborator

rattrayalex commented Aug 26, 2023

Running that locally, it works just fine (my only change was adding an API key):

Screenshot 2023-08-26 at 10 54 17 AM

Can you include a screenshot of the Network tab in devtools, showing the request being made and the response? I suspect this may be a networking issue; perhaps check OpenAI's list of supported countries.

@aleluff
Author

aleluff commented Aug 26, 2023

OK, so after some investigation,
I solved it by using .then instead of await.
I guess it comes from Angular 14 / zone.js 0.12.
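
For reference, a minimal sketch of that workaround applied to the component posted earlier in this thread (same component and names; only sendMessage changes):

	sendMessage(txt: string) {
		// Chain with .then()/.catch() instead of awaiting the call directly;
		// under zone.js this avoids the null result described in this issue.
		this.openai!.chat.completions.create({
			model: 'gpt-3.5-turbo',
			messages: [
				{role: "system", content: "Explain me the purpose of life"},
			],
		}).then(response => {
			console.log(response);
		}).catch(error => {
			console.log(error);
		});
	}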

@vahid1975

Great! Using .then instead of await in Angular was the solution for me as well.

@rattrayalex
Collaborator

This does seem related to the way Zone.js handles promises; angular/angular#31730 may be related.

You might consider posting there (and may want to note that this library subclasses Promise).

@aleluff
Author

aleluff commented Aug 29, 2023

I don't think so; it works with v15, so I guess it's already fixed.

@piotrek-k

I can confirm this issue using Angular 16. Running GPT inference in the browser results in create() returning null.

@rattrayalex
Collaborator

Thank you @piotrek-k . Please report this issue to the Angular project.

@maxmumford

maxmumford commented Nov 10, 2023

We're running Angular 16 and zone.js 0.13.0. Upgrading zone.js to ^0.14.2 at least allowed me to use .then; still no luck with await, though.

I also had to change the zone.js import in polyfills.ts to import 'zone.js';.
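
For anyone following along, a sketch of that polyfills.ts change (the old deep-import path shown here is an assumption about what older Angular CLI projects typically contained):

// polyfills.ts
// Before (older Angular CLI projects typically used the deep import):
// import 'zone.js/dist/zone';
// After (newer zone.js versions are imported from the package root):
import 'zone.js';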

@kuncevic

I am having the same issue after migrating from openai v3 to openai v4. I wonder if this is related: https://discord.com/channels/974519864045756446/1176330586621755492.

Angular v17; no problems with the v3 module, but once migrated to v4 the completion is null:

async createCompletionViaOpenAI(
    messages: OpenAI.Chat.ChatCompletionMessageParam[]
  ) {
    return await this.openai.chat.completions.create(
      {
        model: 'gpt-3.5-turbo',
        messages: messages,
      },
      {
        headers: {
          'Content-Type': 'application/json',
          'X-User-Agent': 'OpenAPI-Generator/1.0/Javascript',
        },
      }
    );
  }

 const completion = await this.chatService.createCompletionViaOpenAI(
        this.messages
      );
 console.log(completion); //returns null

@rattrayalex
Collaborator

Probably – did you try upgrading zone to ^0.14.2 and using .then() instead of await, as suggested here?

This bug report belongs on the Angular / Zone project.

@kuncevic

@rattrayalex my project uses zone.js@0.14.2 and it still seems to be an issue. Yes, .then() helped. I wonder if this was reported to the Angular team?

@rattrayalex
Collaborator

I encourage you to open an issue with them!

@gsans

gsans commented Dec 2, 2023

This is not an Angular-related issue: it breaks async/await code, as seen in the issue opened in the LangChain codebase.

Within Angular it's fine that there's a workaround when calling the API directly, but when this library is pulled in as a dependency (via LangChainJS), Angular has nothing to do with it.

Given that it breaks async/await code, I would rather investigate that than ask every maintainer to change their async/await code to .then syntax instead.

@rattrayalex
Collaborator

rattrayalex commented Dec 2, 2023

This bug only appears in Angular's Zones project. I have filed an issue there: angular/angular#53380

I don't know which Langchain issue you're referring to, but I suggest linking to that issue there.

@gsans

gsans commented Dec 3, 2023

This has been reported in both the Angular and LangChainJS repositories. In any case, the root of the issue seems to reside here.

These are the issues filed and the context around them:

angular/angular-cli#26204

langchain-ai/langchainjs#3105 (comment)

As explained previously in this same thread, the cause of this problem seems to have been introduced by OpenAI.

@rattrayalex
Collaborator

It's definitely not a langchain bug; the problem is that zone.js can't handle the way that this library subclasses Promise.

If the zone.js maintainers have a reasonable recommendation for how we could make that easier for their tooling, I'm willing to take it into consideration, but this library isn't doing anything disallowed by the spec, so I consider this a bug in Angular Zones.

Until this is resolved, I recommend not using Angular Zones if you wish to use the OpenAI library in the browser.
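
For Angular users who want to follow that recommendation, one way to opt out of zone.js is to bootstrap with the 'noop' NgZone. A minimal sketch for an NgModule-based app (module and file names are assumptions; with 'noop' you have to trigger change detection yourself, e.g. via ChangeDetectorRef):

// main.ts (sketch; assumes an NgModule app whose root module is AppModule)
import { platformBrowserDynamic } from '@angular/platform-browser-dynamic';
import { AppModule } from './app/app.module';

platformBrowserDynamic()
  .bootstrapModule(AppModule, { ngZone: 'noop' }) // run without zone.js-driven change detection
  .catch(err => console.error(err));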

@gsans

gsans commented Dec 4, 2023

It's difficult to defend your position when we have seen a change from working to not working with the same versions of Angular and LangChainJS, where the only change was in the openAI package.

@JiaLiPassion

Here is a workaround:

https://github.com/angular/angular/issues/53380

And the reason for the zone.js conflict with openai is here:

resolve(null as any);

I need more time to find a solution.
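
To make that pointer a bit more concrete, here is a minimal illustration of the general pattern (my own sketch with invented names, not the library's actual code): a Promise subclass whose base promise is resolved with null immediately and whose overridden then() produces the real value later. Native promises route await through then(), but zone.js swaps Promise for its own ZoneAwarePromise, which can end up exposing the eager null instead.

class LazyPromise<T> extends Promise<T> {
	constructor(private readonly producer: () => Promise<T>) {
		// The underlying promise settles with null right away...
		super((resolve) => resolve(null as any));
	}

	// ...and the real work only happens when something chains onto it.
	override then<R1 = T, R2 = never>(
		onfulfilled?: ((value: T) => R1 | PromiseLike<R1>) | null,
		onrejected?: ((reason: any) => R2 | PromiseLike<R2>) | null,
	): Promise<R1 | R2> {
		return this.producer().then(onfulfilled, onrejected);
	}
}

const p = new LazyPromise(() => Promise.resolve("real value"));
p.then(console.log); // "real value" with native promises; under zone.js the intermediate null can leak out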

@gsans

gsans commented Dec 5, 2023

Thanks a lot for providing a bit more context on this. My issue at the moment is that the LangChainJS codebase is not using this patch, so LangChainJS/OpenAI is currently not working for any Angular project, which is quite an impact.

As this is a dependency, we can't apply any patches unless we drop LangChainJS altogether, or freeze the dependency at the last version before the breaking change (resolve(null)) and miss all updates to LangChainJS from that point forward.

@rattrayalex
Collaborator

@gsans please share your concerns with the Angular team. This library does not prioritize supporting zone.js.

@gsans

gsans commented Dec 6, 2023

I don't know how to explain it to you. The issue is not with Angular alone but with the combination of Angular/LangChainJS/openAI.

There's a version of LangChainJS/openAI that works perfectly fine in Angular. The issue was introduced by a change in OpenAI.

There's no way we can change, from Angular, how the LangChainJS codebase uses OpenAI. The Angular team can't do anything about LangChainJS or OpenAI.

If you are supportive of the ecosystem, you will make sure these play well together, but don't ignore the fact that this issue was introduced here in the first place, not in Angular.

@aleluff
Author

aleluff commented Dec 6, 2023

@gsans is right, the problem appeared with the version published a few days before I opened the issue.

If you need a working workaround: https://gitlab.com/web-db/app/-/blob/main/front/src/app/right/ai/ai.component.ts?ref_type=heads#L203

That said, maybe the problem really does come from Angular/LangChain but has gone unnoticed there.

@sbbeez

sbbeez commented Mar 27, 2024

I'm still facing the same issue in Next.js.

Framework: Next.js 14.2
OpenAI npm: 4.29.2

Following is my code:

    const openai = new OpenAI({
      apiKey: apiKey,
      dangerouslyAllowBrowser: true,
    });

    const response = await openai.chat.completions.create({
      ...openaiConstants,
      messages: [
        {
          role: "system",
          content: prompt.replace("{{content}}", content),
        },
        {
          role: "user",
          content,
        },
      ],
    });
    const output = response.choices?.[0]?.message?.content
      ?.trim()
      .replace(/^"/g, "")
      .replace(/"$/g, "");

In my case the response is returned immediately as null, but I can see the API call still loading in the network tab.

The weird thing I'm noticing is that the error only happens on the first load of the app; after I refresh the screen once, things seem to work normally.

@gsans

gsans commented Mar 27, 2024

Thanks @JiaLiPassion for following up. Much appreciated. It seems that with more knowledge about the root cause, LangChainJS could have remedied the openAI API quirk for all clients. Zones are handling this now but are apparently still causing issues, as reported by @sbbeez. Note the workarounds mentioned in this thread.

@rattrayalex
Collaborator

@sbbeez the issue you're referring to is different from the one discussed here, which is Angular-specific.

I'm not sure what could be causing it; could you open a separate issue? Ideally with a minimal repro in a codesandbox or equivalent?

@aleluff
Author

aleluff commented Mar 27, 2024

You have to use the old promise-resolution style with .then(), like this:

const stream = this.openai!.chat.completions.create(<ChatCompletionCreateParamsStreaming>body);
const response = await new Promise(resolve => {
  stream.then(async str => {
    let response = "";
    for await (const part of str) {
      response += part.choices[0]?.delta?.content || '';
    }
    resolve(response);
  }).catch(error => {
    resolve(error.message || `An error occurred during OpenAI request: ` + error);
  });
});
