
Quietly exit for no reason #54

Closed
dany-nonstop opened this issue Dec 8, 2022 · 6 comments

Comments

@dany-nonstop

Thanks for making the API, it works pretty well until a very annoying bug surfaces:

If I make multiple queries consecutively, after several queries a call to api.sendMessage() makes the program exit without any error output at the console.

I'm using node v16.18.1, if that matters.

@dany-nonstop
Author

After some testing, it seems that when I have made too many requests, it also exits without any message -- though the error is visible if I try ChatGPT's web interface, which states: "Too many requests in 1 hour. Try again later." It would be nice to have these error messages surfaced to the user as well, for example as an exception.

@transitive-bullshit
Owner

Can you provide a code snippet of how you are invoking api.sendMessage? Please also include the stdout and stderr output.

> If I make multiple queries consecutively, after several queries a call to api.sendMessage() makes the program exit without any error output at the console.

There really shouldn't be any way for this library to exit the process. The only real cause I can see is that an Error is thrown, you're not handling it, and for some reason the process's stderr isn't displaying that error / stack trace to you. It really depends on your environment, though.
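One generic way to check for this (a plain Node.js sketch, nothing specific to chatgpt-api) is to log unhandled rejections and uncaught exceptions yourself, since Node 15+ otherwise ends the process on an unhandled promise rejection:

// Illustrative sketch only: log rejected promises and uncaught exceptions
// explicitly, so a thrown Error stays visible even if the default output is lost.
process.on('unhandledRejection', (reason) => {
  console.error('Unhandled rejection:', reason)
})
process.on('uncaughtException', (err) => {
  console.error('Uncaught exception:', err)
})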

NOTE: I strongly recommend against having multiple api.sendMessage calls running concurrently. While it may be possible and work for a bit, this will severely degrade the robustness of your program since the current unofficial API isn't really meant to be used this way. I recommend adding a reasonable delay in between sendMessage calls to mitigate rate limits.

Here are a few of the special responses I'm looking for from chatgpt: https://github.com/transitive-bullshit/chatgpt-twitter-bot/blob/main/src/respond-to-new-mentions.ts#L258-L269

@dany-nonstop
Author

dany-nonstop commented Dec 8, 2022

Thanks for looking at the problem. The code is very straightforward and there is no concurrent execution; however, after some time it just stops with nothing to report:

for (const uid of uids) {
    const prompt = 'Generate a summary of the article\n\n' + text // article text for this uid, loaded elsewhere
    const response = await api.sendMessage(prompt) //, {conversationId: uid})
    console.log(new Date().toLocaleTimeString() + ' uid ' + uid)
}

Output:

$ node test.js
...
11:58:09 PM uid 151
11:58:49 PM uid 152
$

There is no error message at all, and I have many more items in the uids array. I can, however, resume from uid 153 after an hour. So I believe it's chatgpt-api's problem: it doesn't recognize the error and quietly exits.

@transitive-bullshit
Owner

@dany-nonstop is your code open source? It's likely that at some point, a message fails to send and throws an error.

I recommend adding a delay between sendMessage requests to avoid running into 503/429 errors. Example: https://github.com/transitive-bullshit/chatgpt-twitter-bot/blob/main/src/respond-to-new-mentions.ts#L157-L158
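For illustration only (the delay helper and the try/catch wrapper are a sketch under my own assumptions, not part of this library), the loop from the earlier snippet could look something like this:

// Sketch: wait between requests and surface any thrown error, so 429/503
// failures get logged instead of silently ending the loop.
const delay = (ms) => new Promise((resolve) => setTimeout(resolve, ms))

for (const uid of uids) {
  const prompt = 'Generate a summary of the article\n\n' + text // article text for this uid (assumed)
  try {
    const response = await api.sendMessage(prompt)
    console.log(new Date().toLocaleTimeString() + ' uid ' + uid)
  } catch (err) {
    console.error(new Date().toLocaleTimeString() + ' sendMessage failed for uid ' + uid, err)
  }
  await delay(60 * 1000) // pause ~60s between calls; tune as needed
}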

@transitive-bullshit
Owner

@dany-nonstop let me know if there's anything else I can do to try and help. Otherwise, I'm closing this as it doesn't seem to be affecting other people.

Thanks

@dany-nonstop
Author

Thank you @transitive-bullshit, it's still there. The smallest reproduction I have is below, if it helps you debug. I'm already adding 60s between calls. It just stays there.

async function extract() {
  for (const row of content) {
    const prompt = 'summarize this paragraph\n\n' + row
    const response = await api.sendMessage(prompt)
    // (60s delay between calls omitted from this minimal repro)
  }
}
