Bug: Token limit exceeded, 16k model should be used #907

@Bit0r

Description

Chat2DB Version

Linux-v3.0.14

Describe the bug

  • Operating system: Kubuntu 22.04
  • Software version: 3.0.14
  • AI type: Open AI
  • Agent: API2D

The database contains multiple tables. When using the chat dialog to generate SQL, the token limit is exceeded. The returned error log is as follows:

An exception occurred; see the detailed log in Help: {"object":"error","message":"Streaming output failed {\"error\":{\"message\":\"This model's maximum context length is 4097 tokens. However, you requested 4146 tokens (2098 in the messages, 2048 in the completion). Please reduce the length of the messages or completion.\",\"type\":\"invalid_request_error\",\"param\":\"messages\",\"code\":\"context_length_exceeded\"}}","code":50099}
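The numbers in the log explain the failure: 2098 message tokens plus the 2048-token completion budget total 4146, which overflows the 4097-token context window. The requested fix, falling back to a 16k-context model when the budget does not fit, can be sketched as follows. This is a minimal illustration, not Chat2DB's actual code; the model names and limits are assumptions, and a real client would measure token counts with a tokenizer such as tiktoken rather than receive them as arguments.

```python
# Illustrative context-window limits; real values should come from the
# provider's model documentation, not be hard-coded like this.
CONTEXT_LIMITS = {
    "gpt-3.5-turbo": 4097,       # the limit reported in the error log above
    "gpt-3.5-turbo-16k": 16385,  # hypothetical larger-context fallback
}

def pick_model(prompt_tokens: int, completion_tokens: int) -> str:
    """Return the smallest model whose context window fits the request."""
    needed = prompt_tokens + completion_tokens
    for model, limit in CONTEXT_LIMITS.items():
        if needed <= limit:
            return model
    raise ValueError(f"request of {needed} tokens exceeds all known models")

# The failing request from the log: 2098 message tokens + 2048 completion.
print(pick_model(2098, 2048))  # → gpt-3.5-turbo-16k
```

With this kind of check in place, the request above would be routed to the 16k model instead of erroring out; a request that fits in 4097 tokens would still use the cheaper default model.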

Labels: bug