
A "real time" chat with Open AI #6

Closed
dec opened this issue Mar 26, 2023 · 3 comments
Labels
question Further information is requested

Comments

dec (Contributor) commented Mar 26, 2023

Hello to all!

First of all, thanks very much for your work on DelphiOpenAI; I find it really useful and very easy to implement. I have a question about a possible "conversation" chat with OpenAI.

By a "conversation" (sorry for my English) I mean maintaining a chat like we can do at https://chat.openai.com/chat

If we start a chat at the above URL and ask the question below:

"Hello! My name is David, how are you?"

... we get a response similar to the following:

"Hello David! As an artificial intelligence language model, I don't have emotions, but I'm functioning optimally and ready to assist you with any questions or tasks you have. How can I assist you today?"

If, after that response, we send another question like the one below:

"What is my name?"

... the OpenAI chat responds with something like:

"Your name is David, as you mentioned in your previous message."

However, when I reproduce the above "conversation" using DelphiOpenAI, after the "What is my name?" question what I get is something like this:

"As an AI language model, I don't have access to your personal information or your name. Can you please tell me your name?"

Maybe I am missing something? It's very probable, so sorry if that's the case! I am using your chat stream sample "as is", and I can't find a way to get the same kind of conversation with DelphiOpenAI as with the OpenAI chat.

Thanks in advance for any help, and thanks again for your work!

HemulGM (Owner) commented Mar 26, 2023

That is because the OpenAI servers do not store the conversation history; only the chat service itself does that. To hold the same kind of dialogue with the AI, you need to send the entire conversation history with every request.
For this I made a special class that stores the history, hands it over for sending, and trims it when it gets too long (models have a token limit).

The class is in the OpenAI.Utils.ChatHistory module.

Create a TChatHistory instance and add messages to it:
MyChatHistory.New(TMessageRole.User, 'Your text'); // when you send a message to the bot
and
MyChatHistory.New(TMessageRole.Assistant, 'AI answer'); // when the bot answers you

And use:

var Chat := API.Chat.Create(
    procedure(Params: TChatParams)
    begin
      Params.Messages(MyChatHistory.ToArray);
      Params.MaxTokens(MAX_TOKENS);
    end);
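
For reference, a minimal end-to-end sketch that ties these pieces together: it sends the first question, stores the assistant's answer back into the history, and then asks the follow-up so the model can recall the name. It assumes the TOpenAI client (API) and the MAX_TOKENS constant from the snippet above already exist, that OpenAI, OpenAI.Chat and OpenAI.Utils.ChatHistory are in the uses clause, and that the answer text can be read as Chat.Choices[0].Message.Content; those details may differ between library versions, so adjust to your setup.

var
  MyChatHistory: TChatHistory;
  Chat: TChat;
begin
  MyChatHistory := TChatHistory.Create;
  try
    // First question goes into the history and is sent as the whole message list
    MyChatHistory.New(TMessageRole.User, 'Hello! My name is David, how are you?');
    Chat := API.Chat.Create(
      procedure(Params: TChatParams)
      begin
        Params.Messages(MyChatHistory.ToArray);
        Params.MaxTokens(MAX_TOKENS);
      end);
    try
      // Store the answer so it is part of the history sent with the next request
      MyChatHistory.New(TMessageRole.Assistant, Chat.Choices[0].Message.Content);
    finally
      Chat.Free;
    end;

    // Follow-up question: the full history (including the first answer) is sent
    // again, so the model can reply with "Your name is David ..."
    MyChatHistory.New(TMessageRole.User, 'What is my name?');
    Chat := API.Chat.Create(
      procedure(Params: TChatParams)
      begin
        Params.Messages(MyChatHistory.ToArray);
        Params.MaxTokens(MAX_TOKENS);
      end);
    try
      Writeln(Chat.Choices[0].Message.Content);
    finally
      Chat.Free;
    end;
  finally
    MyChatHistory.Free;
  end;
end;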

HemulGM added the question (Further information is requested) label on Mar 26, 2023
dec (Contributor, Author) commented Mar 26, 2023

Hello @HemulGM!

Many thanks for your fast and useful answer, sir!

Keep up the good work!

dec (Contributor, Author) commented Mar 26, 2023

Hello again, @HemulGM!

Just to say that I tested it right now and it works like a charm! Thanks very much again for your work and your fast help!

HemulGM closed this as completed on Mar 26, 2023