
generateContentStream ERROR: TypeError: response.body.pipeThrough is not a function #43

Closed
stackifyit opened this issue Jan 23, 2024 · 10 comments

stackifyit commented Jan 23, 2024

This works:

const genAI = new GoogleGenerativeAI(apiKey);
const genModel = genAI.getGenerativeModel({ model: 'gemini-pro' });
const result = await genModel.generateContent(content);

But when I stream, I get the error TypeError: response.body.pipeThrough is not a function:

const genAI = new GoogleGenerativeAI(apiKey);
const genModel = genAI.getGenerativeModel({ model: 'gemini-pro' });
//error occurs here
const result = await genModel.generateContentStream(content);

I am using Node v21.3.0 with @google/generative-ai "^0.1.3" from npm.

Thanks in advance!

@stackifyit (Author)

Per this answer, this is the correct place for this issue.

123jimin commented Jan 26, 2024

I've been encountering the same problem with ChatSession's sendMessageStream (Node v20.5.1 and v20.11.0, @google/generative-ai 0.1.3, on Windows 10).

I'm not having the problem on Debian (Node v20.10.0, @google/generative-ai 0.1.3).

https://github.com/google/generative-ai-js/blob/main/packages/main/src/requests/stream-reader.ts#L38
At this line, it seems that response.body is an instance of zlib.Gunzip, not a ReadableStream.

https://github.com/google/generative-ai-js/blob/main/packages/main/src/requests/request.ts#L68
The request was created at this line.
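
For context, a minimal sketch (not SDK code; the URL is just an example) of the difference between the two response bodies:

import nodeFetch from 'node-fetch';

// The built-in fetch (undici, Node 18+) exposes a WHATWG ReadableStream body:
const nativeRes = await fetch('https://example.com');
console.log(typeof nativeRes.body.pipeThrough); // "function"

// node-fetch exposes a Node.js Readable instead (a zlib.Gunzip when the
// response is gzip-compressed), which has .pipe() but no .pipeThrough():
const shimmedRes = await nodeFetch('https://example.com');
console.log(typeof shimmedRes.body.pipeThrough); // "undefined"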

@123jimin

I found the source of the error. It's not a bug in this repo.

Another library I was using (@mistralai/mistralai) was incorrectly shimming fetch using node-fetch. node-fetch doesn't seem to be compatible with the Google AI SDK.

After I fixed it, the streaming seems to work well.
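
For illustration, a hypothetical shim of the kind that causes this (the snippet below is assumed, not taken from @mistralai/mistralai):

import nodeFetch from 'node-fetch';

// An unconditional global override like this makes every other package,
// including @google/generative-ai, receive node-fetch responses whose
// bodies lack pipeThrough():
globalThis.fetch = nodeFetch;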

@hoantran-it

I found the source of the error. It's not a bug in this repo.

Another library I was using (@mistralai/mistralai) was incorrectly shimming fetch using node-fetch. node-fetch doesn't seem to be compatible with the Google AI SDK.

After I fixed it, the streaming seems to work well.

Can I confirm that you can use sendMessageStream without error? I don't have mistralai but I still have the error 😢

123jimin commented Jan 26, 2024

Can I confirm that you can use sendMessageStream without error? I don't have mistralai but I still have the error 😢

Yes, I can use sendMessageStream without a problem.

const genModel = client.getGenerativeModel({model: 'gemini-pro'});
const chat = genModel.startChat({history: []});

const result = await chat.sendMessageStream("What is the answer to the ultimate question of life, the universe, and everything?");
for await(const chunk of result.stream) {
    console.log(chunk.text()); // prints "42"
}

Same for generateContentStream.

const genModel = client.getGenerativeModel({model: 'gemini-pro'});
const result = await genModel.generateContentStream("What is the answer to the ultimate question of life, the universe, and everything?");
for await(const chunk of result.stream) {
    console.log(chunk.text()); // also prints "42"
}
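
For completeness, the client setup both snippets assume (a sketch matching the original report; the environment variable name is just an example):

import { GoogleGenerativeAI } from '@google/generative-ai';

const apiKey = process.env.GOOGLE_API_KEY; // assumed variable name
const client = new GoogleGenerativeAI(apiKey);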

There are a few ways you can check whether you are having the same problem as me:

  • Check whether the lockfile (package-lock.json, pnpm-lock.yaml, or other files depending on the package manager you're using) contains node-fetch as a dependency.
  • Use npm ls --all (warning: the output is quite long) or a similar command for your package manager to check which library is using node-fetch.
  • Just before calling sendMessageStream, put console.log(fetch) and check what it prints. On Node.js, it would print [Function: fetch] unless a polyfill is involved.
  • Use some other hacky way to check whether fetch was modified; one example is putting globalThis.native_fetch = fetch at the very start of your program (before importing any other package) and checking whether globalThis.native_fetch === fetch just before calling sendMessageStream (see the sketch after this list).
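
Put together, the last two checks look roughly like this (native_fetch is just an arbitrary property name for the snapshot):

// Very first lines of the program, before importing any other package:
globalThis.native_fetch = fetch;

// ... imports, client setup, etc. ...

// Just before calling sendMessageStream / generateContentStream:
console.log(fetch);                             // prints the built-in fetch if nothing replaced it
console.log(globalThis.native_fetch === fetch); // false => something replaced fetch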

@hoantran-it

Yes, I can use sendMessageStream without a problem. […]

Yes, I have node-fetch in the source code; I use it in another file like this:

import fetch from 'node-fetch';

So the problem is that the streaming call gets node-fetch instead of Node.js's original fetch, right? That's a good clue for debugging the issue; I will try this tomorrow. Thanks for the support @123jimin 🙏

@hoantran-it

Yes, you're correct @123jimin, the issue is caused by cohere-ai/cohere-typescript#113.

I will follow up there, thank you! 🙏

stackifyit (Author) commented Jan 31, 2024

So once the cohere-typescript fix lands in 7.7.4 and projects that use it alongside @google/generative-ai update their dependencies, the issue will be fixed?

@hoantran-it

@stackifyit Yes, correct. You can debug the code to see whether anything modifies the original fetch; in my case, the async function shown is the original one.

[screenshot attached]

raslasarslas commented Mar 3, 2024

I found the source of the error. It's not a bug in this repo.

Another library I was using (@mistralai/mistralai) was incorrectly shimming fetch using node-fetch. node-fetch doesn't seem to be compatible with the Google AI SDK.

After I fixed it, the streaming seems to work well.

Same for me. I modified the content of @mistralai/mistralai and it works now.

hsubox76 closed this as completed Mar 4, 2024