GPT-4-0613 Finish_Reason Null Issue in Streaming Mode #558

@pratikchhapolika

Description

Describe the bug

I am facing an issue while using the latest OpenAI gpt-4-0613 model in streaming mode. Specifically, I have noticed that in some cases the finish_reason field in the final chunk of the response comes back as null instead of the expected value 'stop'. I provide the necessary context along with my question in the prompt. A similar question was asked on the OpenAI forum: https://community.openai.com/t/completion-finish-reason-is-missing-when-stream-true/90526

My understanding is that the finish_reason field should be 'stop' on the last chunk when using streaming mode. In certain cases, however, it is null, which is inconsistent with the expected behavior of the model.

Notice: gpt-4-0314, by contrast, always returns finish_reason as 'stop' for the last chunk.

To Reproduce

Use a prompt that contains both context and a question. (Make the prompt roughly 5,200 tokens long.)

Code snippets

import openai

response = openai.ChatCompletion.create(
    model='gpt-4-0613',
    messages=[
        {'role': 'user', 'content': "Context along with question"}
    ],
    temperature=0,
    stream=True,
)

for chunk in response:
    print(chunk)  # the final chunk should carry finish_reason == 'stop'
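To make the inconsistency concrete, here is a minimal sketch that scans a stream of chunk-shaped dicts for a terminal finish_reason. The helper name final_finish_reason and the simulated chunk lists are hypothetical, not part of the openai library; the chunk shape mirrors the Chat Completion streaming format (choices / delta / finish_reason).

```python
# Hypothetical helper: return the last non-null finish_reason seen in a
# stream of Chat Completion chunks, or None if every chunk left it null.
def final_finish_reason(chunks):
    reason = None
    for chunk in chunks:
        for choice in chunk.get("choices", []):
            if choice.get("finish_reason") is not None:
                reason = choice["finish_reason"]
    return reason

# Simulated streams: the expected gpt-4-0314 behavior, which terminates
# with finish_reason == 'stop', versus the reported gpt-4-0613 behavior,
# where finish_reason stays null even on the final chunk.
ok_stream = [
    {"choices": [{"delta": {"content": "Hi"}, "finish_reason": None}]},
    {"choices": [{"delta": {}, "finish_reason": "stop"}]},
]
buggy_stream = [
    {"choices": [{"delta": {"content": "Hi"}, "finish_reason": None}]},
    {"choices": [{"delta": {}, "finish_reason": None}]},
]

print(final_finish_reason(ok_stream))     # → stop
print(final_finish_reason(buggy_stream))  # → None
```

A consumer that waits for 'stop' before flushing its buffer would hang (or mislabel the completion as truncated) on the second stream, which is why the null is more than cosmetic.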

OS

macOS

Python version

Python3.9

Library version

openai from azure

Labels

bug (Something isn't working)