Opaque output truncation as max_tokens is not explicitly included in the __init__ for GPT3 (OpenAI)  #730

@harrysalmon

Description

Currently the output tokens for OpenAI are set to 75, half of the max_tokens value of 150 specified here in gpt3.py:

```python
...
self.kwargs = {
    "temperature": 0.0,
    "max_tokens": 150,
    "top_p": 1,
    "frequency_penalty": 0,
    "presence_penalty": 0,
    "n": 1,
    **kwargs,
}  # TODO: add kwargs above for </s>
...
```

As this is only settable through kwargs, there's no indication to the user in the OpenAI class of why the output is being truncated or how to change it.
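To illustrate the mechanism, here is a minimal sketch of how the defaults above are merged with user-supplied kwargs. The class name and constructor signature are simplified assumptions for illustration, not the full API of the wrapper in gpt3.py:

```python
class GPT3:
    """Simplified stand-in for the wrapper in gpt3.py (assumed shape)."""

    def __init__(self, **kwargs):
        # Defaults come first, so any user-supplied kwargs override them.
        self.kwargs = {
            "temperature": 0.0,
            "max_tokens": 150,
            "top_p": 1,
            "frequency_penalty": 0,
            "presence_penalty": 0,
            "n": 1,
            **kwargs,
        }


# Today the only way to raise the limit is to know the kwarg exists:
lm = GPT3(max_tokens=1024)
print(lm.kwargs["max_tokens"])  # → 1024

# With no kwarg, the silent default of 150 applies:
print(GPT3().kwargs["max_tokens"])  # → 150
```

Because `max_tokens` is not a named `__init__` parameter, it never shows up in the signature or in editor autocomplete, which is what makes the truncation opaque.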

I might be missing a reason for this, but I'll open a PR to modify it and see what folks think.
