Loading state when awaiting llm responses #19

Open · wants to merge 2 commits into base: master

Conversation

@hunter547 (Contributor)

I noticed that backroad only supports synchronous LLM responses, which isn't realistic given that LLM requests don't return immediately. This was tough to implement, and I'm not sure this is the best strategy, but it gets the job done and looks great in the UI. I'm open to making changes, as I'm sure I missed a pattern or two. Thanks!
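For context, a minimal sketch of the blocking pattern this PR targets, assuming a Streamlit-style backroad app. The component calls (`br.chatInput`, `br.write`) and the `fakeLlmCall` helper are assumptions for illustration and may not match backroad's actual API:

```ts
// Sketch only: the backroad calls below are assumptions, not a
// confirmed API surface.
import { run } from '@backroad/backroad';

run(async (br) => {
  const prompt = br.chatInput({ id: 'prompt' });
  if (prompt) {
    // Nothing renders in this slot while the request is in flight;
    // that gap is what the PR's loading state fills.
    const reply = await fakeLlmCall(prompt);
    br.write({ body: reply });
  }
});

// Hypothetical stand-in for a slow LLM request.
async function fakeLlmCall(prompt: string): Promise<string> {
  await new Promise((resolve) => setTimeout(resolve, 2000));
  return `(model reply to: ${prompt})`;
}
```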

@hunter547 (Contributor, Author)

[screen recording: opera_TmM31pzjI3]

@sudo-vaibhav (Collaborator)

Thanks for the PR. Give me some time to go through the code changes, and I'll get back to you with my feedback.

@hunter547 (Contributor, Author)

@sudo-vaibhav Sounds good, thanks

@hunter547 (Contributor, Author)

@sudo-vaibhav Checking in to see if you’ve had a chance to review this PR. Thanks!

@sudo-vaibhav (Collaborator)

This is a nice idea, and I appreciate the contribution. I see two ways ahead:

  1. The code works, as your screen recording shows, and I understand the immediate value it adds to the LLM experience. Here I propose diving slightly deeper to see if we can enhance it further.

  2. The real core ask, as I understand it, is a better way of handling blocking actions across backroad: a standardized way of handling the UI when part of it is blocked on an await. For LLMs this manifests as a loader in the message box, but the clean programmer in me (or whatever is left of that disillusioned old man within me) wonders whether we can make a core change that enables these loading experiences across the board, for more than just this use case.

This second idea might create a subpar experience for individual components (perhaps the most a generic approach can achieve is a loader over the whole LLM container), but it would cater to a much broader array of use cases.

Since you've worked hands-on with this interesting problem while implementing the LLM loader, do you have any input on how we might implement this loading concept once, framework-wide, so that the individual component loaders can then be built in far fewer lines of code? @hunter547
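For concreteness, here is one hypothetical shape such a framework-wide primitive could take. Every name below (`LoaderState`, `withLoader`, `render`) is invented for the sake of discussion and is not part of backroad's API:

```ts
// Invented sketch of a generic loading primitive: any blocking action
// is wrapped so its UI slot shows a loader until the promise settles.
type LoaderState<T> =
  | { status: 'pending' }
  | { status: 'resolved'; value: T };

// `render` stands in for whatever mechanism backroad would use to
// (re)draw a single slot of the UI.
async function withLoader<T>(
  slotId: string,
  action: () => Promise<T>,
  render: (slotId: string, state: LoaderState<T>) => void,
): Promise<T> {
  render(slotId, { status: 'pending' }); // e.g. spinner in the message box
  const value = await action();
  render(slotId, { status: 'resolved', value }); // swap loader for content
  return value;
}

// The PR's LLM loader would then reduce to roughly:
// const reply = await withLoader('llm-message', () => callLlm(prompt), render);
```

Under this scheme, each component's loader is just a choice of placeholder for the `pending` state, which is exactly the generic-loader trade-off described above.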

@sudo-vaibhav (Collaborator)

LGTM for now, merging to alpha; we can prioritise refactoring later.

@sudo-vaibhav (Collaborator) commented Feb 26, 2024

Good work @hunter547! I'd urge you to also raise a PR on the docs repo, so that people know how to use this new addition: https://github.com/sudomakes/backroad-documentation/blob/main/src/content/docs/docs/API%20Reference/LLM.mdx
