
Prompt token count from num_tokens_from_messages differs from the openai.ChatCompletion return when "name" is in messages #613

@TTurn

Description


I have a problem with the file "examples/How_to_format_inputs_to_ChatGPT_models.ipynb".

I use the method "num_tokens_from_messages" from that notebook to calculate the prompt token count, and the result differs from what openai.ChatCompletion.create returns when there is a "function" role in messages and the model is "gpt-3.5-turbo-0613".

When a "function" role appears, the length will differ by 2,

I think this is because the parameter "tokens_per_name" should be set to -1 instead of 1 for this case.

Example:

message = [{'role': 'function', 'name': 'get_info_from_web', 'content': '87°F'},
           {'role': 'user', 'content': 'What is the weather like in Hangzhou today?'}]

The prompt_tokens value in the response from openai.ChatCompletion.create is 26, but num_tokens_from_messages returns 28.
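To make the arithmetic concrete, here is a minimal sketch of the cookbook function's bookkeeping with the actual tokenizer stubbed out (the stub `token_len` is a hypothetical stand-in for `len(encoding.encode(value))`, so the absolute totals are not meaningful). It shows that flipping tokens_per_name from +1 to -1 changes the total by exactly 2 for a message list containing one "name" field, which matches the 28-vs-26 discrepancy above:

```python
def num_tokens(messages, token_len, tokens_per_message=3, tokens_per_name=1):
    """Bookkeeping of the cookbook's num_tokens_from_messages, tokenizer stubbed."""
    total = 0
    for message in messages:
        total += tokens_per_message          # fixed overhead per message
        for key, value in message.items():
            total += token_len(value)        # would be len(encoding.encode(value))
            if key == "name":
                total += tokens_per_name     # extra adjustment when "name" is present
    return total + 3  # every reply is primed with <|start|>assistant<|message|>

messages = [
    {"role": "function", "name": "get_info_from_web", "content": "87°F"},
    {"role": "user", "content": "What is the weather like in Hangzhou today?"},
]

stub = lambda s: 1  # hypothetical: pretend every field encodes to one token

with_plus_one = num_tokens(messages, stub, tokens_per_name=1)
with_minus_one = num_tokens(messages, stub, tokens_per_name=-1)
print(with_plus_one - with_minus_one)  # 2 -- one "name" field, (+1) - (-1) = 2
```

Since there is exactly one message carrying a "name" key, the two settings differ by 2 tokens, independent of the tokenizer.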

Labels: Stale, bug (Something isn't working)
