
New resend command #89

Merged
merged 2 commits into from
Mar 29, 2023
Conversation

@VolunPotest (Contributor) commented Mar 20, 2023

Hi!
I would like to add a /resend command.
It will send the user's latest question/prompt to OpenAI again.

Useful case: sometimes OpenAI returns network errors. You can always copy/paste the message, but I think the faster way is using this simple command.
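For illustration, the /resend idea can be sketched as a minimal stand-in class (the `last_message` name follows the PR; the class and `sent` list here are hypothetical, not the bot's actual code):

```python
# Minimal sketch of the /resend idea: remember the last prompt per chat
# and replay it on demand. This is a simplified stand-in, not the real bot.

class ResendableBot:
    def __init__(self):
        self.last_message = {}  # chat_id -> last prompt text
        self.sent = []          # record of prompts actually dispatched

    def prompt(self, chat_id, text):
        # Remember the prompt so /resend can replay it after a network error.
        self.last_message[chat_id] = text
        self.sent.append((chat_id, text))

    def resend(self, chat_id):
        # Replay the most recent prompt for this chat, if any.
        if chat_id not in self.last_message:
            return False
        self.prompt(chat_id, self.last_message[chat_id])
        return True
```

Calling `resend(chat_id)` simply re-dispatches whatever `prompt` last stored for that chat.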

bot/telegram_bot.py (two resolved review threads, outdated)
Comment on lines 104 to 145
with update.message._unfrozen() as message:
message.text = self.last_message[chat_id]
Contributor
I'm not super stoked about using a private api here, but it does seem to be the easiest way to mutate the update. A cleaner approach would be to split the prompt method up such that we don't have to pass a full Update object here. @n3d1117 what do you think?
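One way the suggested split could look, so no full `Update` has to be passed or mutated (a sketch only; every method name other than `prompt`/`resend` is invented here, not from the PR):

```python
# Sketch: delegate the real work to a text-based helper so /resend can skip
# the frozen Update entirely. Hypothetical structure, not the PR's code.

class Bot:
    def __init__(self):
        self.last_message = {}  # chat_id -> last prompt text

    async def prompt(self, update, context):
        # Thin wrapper: unpack the Update once, then delegate.
        message = update.message
        await self._handle_prompt(message.chat_id, message.text, context)

    async def resend(self, update, context):
        chat_id = update.message.chat_id
        last = self.last_message.get(chat_id)
        if last is not None:
            # No frozen-Update mutation needed: pass the stored text directly.
            await self._handle_prompt(chat_id, last, context)

    async def _handle_prompt(self, chat_id, text, context):
        self.last_message[chat_id] = text
        # ... call OpenAI and reply via context.bot here ...
```

This keeps `_unfrozen()` out of the picture at the cost of a small refactor of `prompt`.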

Contributor Author (@VolunPotest)

There were 2 ways to do it: copy the object and change it, or change only one field. I think changing one field is the most efficient way, and we won't lose message.text anyway, because it will just be "resend".

with update.message._unfrozen() as message:
message.text = self.last_message[chat_id]
await self.prompt(update=update, context=context)

async def reset(self, update: Update, context: ContextTypes.DEFAULT_TYPE):
Contributor

should the last message be reset here as well?

Contributor Author (@VolunPotest)

It will be reset automatically in the self.prompt function.

@carlsverre (Contributor) left a comment

In testing this seems to work; however, it's a bit weird since the history can contain all of the resent messages. To make this more correct, the previous message should be truncated from the chat history to truly "resend" the prompt.
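The truncation being suggested could look like this (a sketch; the list-of-role/content-dicts history shape is an assumption about the bot's internals, not taken from the PR):

```python
# Sketch: drop the trailing question/answer pair from the chat history before
# resending, so the history doesn't accumulate duplicate prompts.
# The {"role": ..., "content": ...} history shape is an assumption.

def truncate_last_exchange(history):
    """Remove the trailing assistant reply (if any) and the user prompt before it."""
    if history and history[-1]["role"] == "assistant":
        history.pop()
    if history and history[-1]["role"] == "user":
        history.pop()
    return history
```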

An alternative solution to the issue of resending on error is to make it an explicit configuration. Something like RETRIES_ON_ERROR=&lt;int&gt; would cause the regular prompt loop to retry up to that many times using something like exponential backoff. In my experience the primary annoyance is hitting the OpenAI rate limit.
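The retry-with-backoff idea might be sketched like this (RETRIES_ON_ERROR is the config name proposed above; the helper function itself is hypothetical, not from the PR):

```python
import random
import time

# Sketch of the suggested RETRIES_ON_ERROR behavior: retry a failing call
# with exponential backoff plus jitter. Hypothetical helper, not the PR's code.

def call_with_retries(fn, retries_on_error=3, base_delay=1.0):
    """Call fn(); on failure, retry up to retries_on_error times with backoff."""
    for attempt in range(retries_on_error + 1):
        try:
            return fn()
        except Exception:
            if attempt == retries_on_error:
                raise
            # Backoff: base_delay * 2^attempt, plus a little jitter.
            time.sleep(base_delay * (2 ** attempt) + random.uniform(0, 0.1))
```

The prompt loop would wrap its OpenAI call in something like this instead of failing on the first network error.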

We could also simply add a minimum-time-between-requests config to reduce the chance of being rate limited in the first place. Although it looks like they are increasing capacity pretty quickly based on their rate limit docs, so this might not be much of an issue over time.
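A minimum-interval guard could be as simple as (a sketch; the class and its name are invented for illustration):

```python
import time

# Sketch of a "minimum time between requests" guard. Hypothetical class,
# not from the PR: each call to wait() sleeps just long enough to keep
# consecutive requests at least min_interval seconds apart.

class RequestThrottle:
    def __init__(self, min_interval=1.0):
        self.min_interval = min_interval
        self._last_request = 0.0

    def wait(self):
        elapsed = time.monotonic() - self._last_request
        if elapsed < self.min_interval:
            time.sleep(self.min_interval - elapsed)
        self._last_request = time.monotonic()
```

The bot would call `throttle.wait()` immediately before each OpenAI request.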

@VolunPotest (Contributor Author)

You can close the PR if you think this feature is useless. Sometimes I get network errors, and for me it's easier to click the resend command than to highlight the message and copy/paste it.

@n3d1117 (Owner) commented Mar 25, 2023

Thank you @VolunPotest! I like this, the _unfrozen() call is fine in this context imo.

As @carlsverre was saying, should we pop elements from the chat history before resending? What do you think?

@VolunPotest (Contributor Author) commented Mar 28, 2023

> Thank you @VolunPotest! I like this, the _unfrozen() call is fine in this context imo.
>
> As @carlsverre was saying, should we pop elements from the chat history before resending? What do you think?

We're clearing/replacing the chat history in the prompt function, so basically, popping will not affect anything.

Why is it useful? Because if the prompt isn't sent (network/OpenAI errors), we can try to resend it.

But I can clean the message before resending, no problem.

@VolunPotest (Contributor Author)

Done

@n3d1117 (Owner) commented Mar 29, 2023

Thanks @VolunPotest!

@n3d1117 n3d1117 merged commit fec6694 into n3d1117:main Mar 29, 2023