
Enable OpenAI model streaming and fix num_tokens_from_messages #184

Merged
merged 9 commits into master from zecheng_chat_agent_stream
Jun 30, 2023

Conversation

zechengz
Member

@zechengz zechengz commented Jun 26, 2023

Description

  • Enable stream for OpenAI model and handle stream mode in ChatAgent
  • Improve the translator example
  • Fix num_tokens_from_messages

The following is an example for the translator:

python3 translation/translator.py --single --language chinese --stream

with a simple input JSON file:

{
	"message_1": {"content": "My name is Zecheng and I am from China."},
	"message_2": {"content": "I love the CAMEL Python package and now I am a contributor to the package!"},
	"message_3": {"content": "This is a CAMEL translator example :)."},
	"num_messages": 3
}
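One way stream mode can be handled in ChatAgent is to accumulate the per-chunk deltas of a streamed chat completion into a full message before building the ChatAgentResponse. A minimal sketch, assuming the 2023 OpenAI ChatCompletion stream chunk layout; `accumulate_stream` and the fake chunks are illustrative, not the actual CAMEL code:

```python
from typing import Dict, Iterable, List


def accumulate_stream(chunks: Iterable[Dict]) -> str:
    """Concatenate the content deltas of a chat-completion stream."""
    pieces: List[str] = []
    for chunk in chunks:
        delta = chunk["choices"][0]["delta"]
        # The first chunk typically carries only the role; content may be absent.
        pieces.append(delta.get("content", ""))
    return "".join(pieces)


# Synthetic chunks standing in for what the streaming API would yield.
fake_chunks = [
    {"choices": [{"delta": {"role": "assistant"}}]},
    {"choices": [{"delta": {"content": "Hello"}}]},
    {"choices": [{"delta": {"content": ", world"}}]},
]
print(accumulate_stream(fake_chunks))  # -> Hello, world
```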

Motivation and Context

Parts of #178

Some changes for future PRs:

  • Support other agents with the stream mode and check whether stream mode works for role playing etc.
  • Deprecate print in other places if possible.

Types of changes

What types of changes does your code introduce? Put an x in all the boxes that apply:

  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds core functionality)
  • Breaking change (fix or feature that would cause existing functionality to change)
  • Documentation (update in the documentation)
  • Example (update in the folder of example)

Checklist

  • I have read the CONTRIBUTION guide. (required)
  • My change requires a change to the documentation. (minor change)
  • I have updated the tests accordingly. (required for a bug fix or a new feature) (Will add unit tests after first round of reviews.)
  • I have updated the documentation accordingly.

@zechengz zechengz added enhancement New feature or request Agent Related to camel agents Example labels Jun 26, 2023
@zechengz zechengz requested a review from lightaime June 26, 2023 05:04
@zechengz zechengz self-assigned this Jun 26, 2023
Collaborator

@Obs01ete Obs01ete left a comment


A library can NOT use print (and input) as part of its core functionality. RolePlaying's step should be split into single steps, and step should return an iterator that yields words or chunks as they are received as deltas. All printing should be done in user code: examples, apps, etc. Even though CAMEL currently uses print and input, we are going to move away from this toward a better API. I suggest not merging this.

@zechengz
Member Author

zechengz commented Jun 27, 2023

@Obs01ete Several questions:

  • Are we going to support any terminal experience in the future, with responses printed directly? If not, I will remove print entirely. If yes, I may keep it as an option so users can choose whether responses are printed directly in the terminal.
  • step() currently returns a ChatAgentResponse with msgs as List[ChatMessage]. If we are going to change the return value to an iterator, I will move the functions that handle model output (for batch and streaming) somewhere else, such as utils.

Wdyt? Thanks in advance.

@Obs01ete
Collaborator

@zechengz thanks for your comments. We are not going to support terminal prints for displaying answers in the near future. Most likely we will rework ChatMessage into a future object supporting .get() and a chunk generator for streaming mode.
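The generator-based design discussed here could look roughly like the following sketch, where the library yields chunks and all printing stays in user code. `StreamingAgent` and `stream_step` are hypothetical names for illustration, not CAMEL's actual API:

```python
from typing import Iterator


class StreamingAgent:
    """Hypothetical agent whose step yields chunks instead of printing."""

    def __init__(self, reply: str, chunk_size: int = 4) -> None:
        self._reply = reply
        self._chunk_size = chunk_size

    def stream_step(self) -> Iterator[str]:
        # Yield the reply in small chunks, as a model stream would.
        for i in range(0, len(self._reply), self._chunk_size):
            yield self._reply[i:i + self._chunk_size]


agent = StreamingAgent("My name is Zecheng and I am from China.")
# Printing happens in user code (an example or app), not inside the library.
collected = []
for chunk in agent.stream_step():
    collected.append(chunk)
print("".join(collected))
```

The key design point is that the library returns an iterator; a terminal app can print each chunk as it arrives, while other consumers can simply join them.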

@zechengz zechengz changed the title Enable OpenAI model streaming and print animated text "on-the-fly" for a ChatAgent Enable OpenAI model streaming and fix num_tokens_from_messages Jun 29, 2023
@zechengz zechengz added the bug Something isn't working label Jun 29, 2023
@zechengz zechengz marked this pull request as ready for review June 29, 2023 11:22
@zechengz
Member Author

@Obs01ete I have updated the PR. For now I still parse the streaming output the same as the batch output. For other modifications, please refer to the PR description. Please take another round of review if possible. Thanks in advance!

@zechengz zechengz requested a review from Obs01ete June 29, 2023 11:26
Member

@lightaime lightaime left a comment


Thanks @zechengz! Looks great overall. Left some comments.

@Obs01ete if we do not support any logging, how is stream supposed to be useful?

Review threads (outdated, resolved) on camel/configs.py and camel/agents/chat_agent.py.
Comment on lines +368 to +371
encoding = get_model_encoding(self.model.value_for_tiktoken)
completion_tokens = 0
for message in output_messages:
    completion_tokens += len(encoding.encode(message.content))
Member


What is the difference between this and num_tokens_from_messages?

Member Author


It seems that the OpenAI usage count for completion tokens does not include the tokens_per_message overhead etc. that num_tokens_from_messages adds.
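To illustrate the difference: the widely used tiktoken-based recipe for num_tokens_from_messages adds a fixed overhead per message (tokens_per_message, commonly 3 for gpt-3.5-turbo-0613-style models, plus 3 priming tokens for the reply), while the API's reported completion_tokens is just the encoded length of the generated content. A sketch of the arithmetic with a stand-in encoder (whitespace split instead of tiktoken, so it runs without the library); the constants follow the common OpenAI cookbook recipe, not CAMEL's exact code:

```python
def encode(text: str) -> list:
    # Stand-in for tiktoken's Encoding.encode; one "token" per word here.
    return text.split()


def num_tokens_from_messages(messages, tokens_per_message=3):
    num_tokens = 0
    for message in messages:
        num_tokens += tokens_per_message
        for value in message.values():
            num_tokens += len(encode(value))
    num_tokens += 3  # every reply is primed with <|start|>assistant<|message|>
    return num_tokens


def completion_token_count(output_messages):
    # What the API reports as completion_tokens: content only, no overhead.
    return sum(len(encode(m["content"])) for m in output_messages)


msgs = [{"content": "My name is Zecheng"}]
print(num_tokens_from_messages(msgs))  # 4 content tokens + 3 + 3 = 10
print(completion_token_count(msgs))    # 4 content tokens only
```

This is why counting completion tokens by encoding each output message directly, rather than reusing num_tokens_from_messages, matches the usage numbers the API returns.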

camel/configs.py Outdated Show resolved Hide resolved
@zechengz zechengz requested a review from lightaime June 30, 2023 10:02
Member

@lightaime lightaime left a comment


LGTM! Thanks @zechengz

@zechengz zechengz enabled auto-merge (squash) June 30, 2023 20:45
@lightaime lightaime disabled auto-merge June 30, 2023 23:16
@lightaime lightaime merged commit e92705f into master Jun 30, 2023
@lightaime lightaime deleted the zecheng_chat_agent_stream branch June 30, 2023 23:16
@Obs01ete
Collaborator

Obs01ete commented Jul 3, 2023

Thanks @zechengz! Looks great overall. Left some comments.

@Obs01ete if we do not support any logging, how is stream supposed to be useful?

The Agent API will have a streaming mode. This way an app like Agents can print the chat in real time as the OpenAI API generates it.
