Add Streaming Support for Inline-Query Callback Responses #235

Merged 2 commits into n3d1117:main on Apr 16, 2023

Conversation

bugfloyd (Contributor) commented Apr 15, 2023:

This PR introduces streaming support for inline-query responses when the call-to-action button is used. While reviewing the code, please consider the following notes:

  • Message streaming has been implemented to improve the user experience. The chunk-management code is omitted for inline queries, since new messages cannot be sent in this context.
  • A thorough review of the token management and async coroutine behavior would be appreciated, to ensure optimal performance and adherence to best practices.
  • The signature of the wrap_with_indicator function has been modified to accommodate the handling of inline queries (a sketch follows below).

Feel free to provide feedback and suggest any additional enhancements to the implementation.
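
For illustration, a minimal sketch of what the modified wrap_with_indicator signature could look like, assuming a python-telegram-bot v20-style API; the exact parameters and body in the PR may differ:

    import asyncio

    async def wrap_with_indicator(update, context, coroutine,
                                  chat_action, is_inline=False):
        # Run the coroutine while periodically showing a chat action.
        # Inline queries have no chat to send an action to, so the new
        # is_inline flag skips that step.
        task = context.application.create_task(coroutine, update=update)
        while not task.done():
            if not is_inline:
                context.application.create_task(
                    update.effective_chat.send_action(chat_action)
                )
            try:
                await asyncio.wait_for(asyncio.shield(task), 4.5)
            except asyncio.TimeoutError:
                pass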

    text=f'{query}\n\n_{answer_tr}:_\n{content}',
    is_inline=True)
except:
    continue
jvican (Contributor):

Why not log?

bugfloyd (Contributor, Author):

@jvican good question. I used the logic from the main prompt function as the base for the inline-query streaming.
I think we can add logging to both, and also try to combine the semi-duplicated code.

n3d1117 (Owner):

+1 for future logic isolation! Yes, we can add a log here, no problem. I didn't do that initially because these operations (delete + send initial message) failed very rarely in my testing, and even when they did, they were simply retried when the next chunk came in.
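
For illustration, a hedged sketch of what that logging could look like inside the streaming loop (edit_message_with_retry, inline_message_id, and the surrounding structure are assumptions, not necessarily the PR's actual code):

    import logging

    async for content, tokens in stream_response:
        try:
            await edit_message_with_retry(
                context, chat_id=None, message_id=inline_message_id,
                text=f'{query}\n\n_{answer_tr}:_\n{content}',
                is_inline=True)
        except Exception as e:
            # Log the failure instead of swallowing it silently; the
            # edit is effectively retried when the next chunk arrives.
            logging.warning(f'Failed to edit inline message: {e}')
            continue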

backoff = 0
async for content, tokens in stream_response:
    if len(content.strip()) == 0:
        continue
jvican (Contributor):

Why not log?

bugfloyd (Contributor, Author):

It's the same here as in the other comment.

n3d1117 (Owner):

No need to log anything here. Sometimes the OpenAI stream response is just whitespace at the beginning; I simply ignore it and wait for the next chunk.

n3d1117 (Owner) commented Apr 15, 2023:

Thank you @bugfloyd!
Earlier today I was exploring the possibility of isolating some of the stream-response handling logic, but I couldn't find a nice way to do it. The telegram_bot.py file is now 1000 lines long, and I don't like that 😃 some refactoring might be needed in the future to keep it readable.

Anyway, some notes on the PR:

  • I think we can use the faster cutoff values instead of the group-chat ones. From my testing they seem to work just fine.
  • You're right, there can't be any chunking in inline mode, so for very long answers I think we need to include only the first 4096 characters and ignore the rest (see the sketch below).

I've pushed these minor fixes to your PR branch (hope you're ok with it). Looks good to me, let me know what you think.
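
For reference, a minimal sketch of the truncation mentioned above; the names are illustrative, and 4096 is Telegram's per-message character limit:

    # Inline responses can't be split into follow-up chunks, so a very
    # long answer is simply cut at Telegram's message-length limit.
    TELEGRAM_MESSAGE_LIMIT = 4096

    def truncate_for_inline(text: str) -> str:
        return text[:TELEGRAM_MESSAGE_LIMIT]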

bugfloyd (Contributor, Author):

@n3d1117 Thanks for the fixes. Based on the current changes, I believe this PR is good to go for now. I also have some suggestions to improve the code and reduce the size of this class; if you don't get to it first, I plan to propose a PR implementing them next week.

n3d1117 (Owner) commented Apr 16, 2023:

That would be awesome @bugfloyd!

n3d1117 merged commit 1ac0b24 into n3d1117:main on Apr 16, 2023