New resend command #89
Conversation
bot/telegram_bot.py (outdated)

    with update.message._unfrozen() as message:
        message.text = self.last_message[chat_id]
I'm not super stoked about using a private API here, but it does seem to be the easiest way to mutate the update. A cleaner approach would be to split the prompt method up so that we don't have to pass a full Update object here. @n3d1117 what do you think?
There were two ways to do it: copy the object and change it, or change only one field. I think changing one field is the most efficient way, and we won't lose message.text anyway, because it will be = "resend"
    with update.message._unfrozen() as message:
        message.text = self.last_message[chat_id]
    await self.prompt(update=update, context=context)

    async def reset(self, update: Update, context: ContextTypes.DEFAULT_TYPE):
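The `_unfrozen()` trick above only works because the bot keeps the last prompt per chat. A minimal, standalone sketch of that bookkeeping (the `ResendTracker` name and its methods are mine for illustration; the real PR stores `self.last_message[chat_id]` directly on the bot class, and the telegram types are left out here so the idea runs on its own):

```python
class ResendTracker:
    """Hypothetical stand-in for the bot's per-chat last_message dict."""

    def __init__(self):
        # chat_id -> text of the most recent user prompt
        self.last_message = {}

    def remember(self, chat_id, text):
        # Called whenever a prompt is handled, so /resend has something to replay
        self.last_message[chat_id] = text

    def recall(self, chat_id):
        # Returns None when the chat has no prompt to resend yet
        return self.last_message.get(chat_id)
```

On `/resend`, the handler would then look up `recall(chat_id)`, write it back into the update's message text, and re-enter the normal prompt path.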
should the last message be reset here as well?
It will be reset automatically in the self.prompt function
In testing this seems to work, however it's a bit weird since the history can contain all of the resent messages. To make this more correct, the previous message should be truncated from the chat history to truly "resend" the prompt.
An alternative solution to the issue of resending on error is to make it an explicit configuration, something like RETRIES_ON_ERROR=int. This would cause the regular prompt loop to retry up to int times using something like exponential backoff. In my experience the primary annoyance is hitting the OpenAI rate limit.
We could also simply add a minimum time between requests config to reduce the chance of being rate limited in the first place. Although it looks like they are increasing capacity pretty quick based on their rate limit docs so this might not be much of an issue over time.
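For what it's worth, the retry idea above could look roughly like this. This is only a sketch under the assumptions in the comment: `RETRIES_ON_ERROR` is a proposed config name, not an existing setting, and `call_with_retries` is a hypothetical helper, not part of the bot:

```python
import random
import time


def call_with_retries(fn, retries_on_error=3, base_delay=1.0):
    """Retry fn() with exponential backoff plus jitter.

    retries_on_error mirrors the proposed RETRIES_ON_ERROR=int config
    from the discussion; base_delay is an extra knob for illustration.
    """
    for attempt in range(retries_on_error + 1):
        try:
            return fn()
        except Exception:
            if attempt == retries_on_error:
                raise  # out of retries, surface the original error
            # Sleep base_delay * 2^attempt, with jitter so concurrent
            # chats don't all retry at the same instant
            time.sleep(base_delay * (2 ** attempt) + random.uniform(0, base_delay))
```

The same helper doubles as a crude rate-limit guard if the prompt loop funnels all OpenAI calls through it, since transient 429s would simply be retried after a growing delay.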
You can close the PR if you think this feature is useless. Sometimes I get network errors, and for me it's easier to click the resend command than to highlight the message and copy-paste it.
Thank you @VolunPotest! I like this. As @carlsverre was saying, should we pop elements from the chat history before resending? What do you think?
We're clearing/replacing the chat history in the prompt function, so popping will not affect anything. Why is it useful? But I can clear the message before resending, no problem
Done

Thanks @VolunPotest!
Hi!
I would like to add a /resend command.
It will send the user's latest question/prompt to OpenAI.
Useful case: sometimes OpenAI returns network errors.
You can always copy/paste the prompt, but I think the faster way is using this simple command