
Dev UI chatting capability #196

Merged
merged 1 commit into quarkiverse:main on Jan 3, 2024

Conversation

@jmartisk (Collaborator) commented Jan 2, 2024

Fixes #195

Adds a page for chatting when a chat model is available.

It assumes that only one chat model is registered in CDI and that a single chat memory provider will always be available (are these assumptions OK?).

It lets you hold a conversation that uses chat memory, with a button to restart the conversation and drop the memory.

@jmartisk requested a review from a team as a code owner, January 2, 2024 13:08
@jmartisk (Collaborator, Author) commented Jan 2, 2024

@geoand, NoAiServicesTest is complaining because it assumes that no AI services are found in the application, but with this PR we always add one that is found during the build... We need the AI service to be able to use chat memory... Any suggestions on what to do?
Is there a way to have the AI service discovered only in dev mode, so it isn't scanned when building for prod/test mode? Or should we add a specific exception that ignores this one based on its class name?

@geoand (Collaborator) commented Jan 2, 2024

This sounds very cool!!!

I'll have a look tomorrow when I'm back.

What I do think we should add (if it's not already in) is the ability to set a system message. This way users can experiment quickly with different prompts.

As a follow-up, I think it would also be nice to be able to tweak the model settings, thus giving users even more flexibility when interacting with the model.
And finally, if we also add some kind of history, it would allow users to see what the different options resulted in.

@jmartisk (Collaborator, Author) commented Jan 2, 2024

> What I do think we should add (if it's not already in) is the ability to set a system message. This way users can experiment quickly with different prompts.

Is it possible to set a system message dynamically at runtime? As far as I can see, it can only be set via the @SystemMessage annotation on the AI service, so it has to be fixed at compile time.

> As a follow-up, I think it would also be nice to be able to tweak the model settings, thus giving users even more flexibility when interacting with the model.

What exactly do you mean by tweaking the model? We just inject the pre-configured model from CDI.

> And finally, if we also add some kind of history, it would allow users to see what the different options resulted in.

The next step might be to implement "chatting" with AI services, but that will be trickier because it involves (de)serialization of model classes and parameters, as opposed to raw usage of a ChatLanguageModel, where the whole prompt and response are simply two strings.
I guess a "history" to compare different conversations might be more relevant to this approach.

@geoand (Collaborator) commented Jan 2, 2024

> Is it possible to set a system message dynamically at runtime? As far as I can see, it can only be set via the @SystemMessage annotation on the AI service, so it has to be fixed at compile time.

Yeah, the ChatLanguageModel allows you to send any kind of message you like, so we would just add a SystemMessage as the first message.

> What exactly do you mean by tweaking the model? We just inject the pre-configured model from CDI.

Right, my point is that we would take that model and copy it, while using whatever model parameters the user configured in this view.

> I guess a "history" to compare different conversations might be more relevant to this approach.

+1

@geoand (Collaborator) commented Jan 3, 2024

> NoAiServicesTest is complaining because it assumes that no AI services are found in the application, but with this PR we always add one that is found during the build... We need the AI service to be able to use chat memory... Any suggestions on what to do?

Do we really need an AI service for this feature?

@jmartisk (Collaborator, Author) commented Jan 3, 2024

> Do we really need an AI service for this feature?

Right, probably not. I'll rework it, hold on.

@jmartisk (Collaborator, Author) commented Jan 3, 2024

I've removed the usage of the AI service now.

@geoand (Collaborator) left a review comment

Nice!

@geoand merged commit a67a589 into quarkiverse:main on Jan 3, 2024
2 checks passed
@jmartisk deleted the devui-chat branch on January 3, 2024 13:57
@geoand (Collaborator) commented Jan 3, 2024

I think we should also do something similar for images, now that we have an ImageModel.

@jmartisk (Collaborator, Author) commented Jan 3, 2024

Great idea, I've reported #202

@geoand (Collaborator) commented Jan 3, 2024

Thanks!
