Context window below 50% but LLM provider error: Error code: 400 - {'error': {'message': 'Invalid request: Your request exceeded model token limit: 262144 (requested: 269030)' #2011

@creatiVision

Description

What version of Kimi Code CLI is running?

kimi, version 1.37.0

Which open platform/subscription were you using?

moonshot.ai

Which model were you using?

kimi-2.5

What platform is your computer?

Linux 6.8.0-110-generic x86_64 x86_64

What issue are you seeing?

```
LLM provider error: Error code: 400 - {'error': {'message': 'Invalid request: Your request exceeded model token limit: 262144 (requested: 269030)', 'type': 'invalid_request_error'}}
If this persists, run kimi export and send the exported data to support for assistance. Please do not share the exported file publicly.

✨ again

LLM provider error: Error code: 400 - {'error': {'message': 'Invalid request: Your request exceeded model token limit: 262144 (requested: 313014)', 'type': 'invalid_request_error'}}
If this persists, run kimi export and send the exported data to support for assistance. Please do not share the exported file publicly.

LLM provider error: Error code: 400 - {'error': {'message': 'Invalid request: Your request exceeded model token limit: 262144 (requested: 269053)', 'type': 'invalid_request_error'}}
If this persists, run kimi export and send the exported data to support for assistance. Please do not share the exported file publicly.

Exported 7 messages to ~/kimi-export-599a0bb3-20260423-003025.md
Note: The exported file may contain sensitive information. Please be cautious when sharing it externally.

✨ now?

LLM provider error: Error code: 400 - {'error': {'message': 'Invalid request: Your request exceeded model token limit: 262144 (requested: 313016)', 'type': 'invalid_request_error'}}
If this persists, run kimi export and send the exported data to support for assistance. Please do not share the exported file publicly.
```

What steps can reproduce the bug?

Don't know.

What is the expected behavior?

The request should go through, or the CLI should compact the context first. Instead, no matter what I type, the error comes back:

```
LLM provider error: Error code: 400 - {'error': {'message': 'Invalid request: Your request exceeded model token limit: 262144 (requested: 269030)', 'type': 'invalid_request_error'}}
If this persists, run kimi export and send the exported data to support for assistance. Please do not share the exported file publicly.
```
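For context, here is a minimal sketch of one way the mismatch in the title can arise. This is an assumption about the symptom, not Kimi CLI internals: if the context meter counts only the visible conversation, but the actual request also carries hidden overhead (system prompt, tool schemas, attached file content), the true request size can exceed the model limit while the meter still reads under 50%. All numbers except the 262144 limit and the 269030 "requested" figure from the error are illustrative.

```python
# Hypothetical sketch of the meter-vs-limit mismatch; not Kimi CLI code.
MODEL_TOKEN_LIMIT = 262_144  # limit quoted in the error message

def request_exceeds_limit(visible_tokens: int, hidden_tokens: int,
                          limit: int = MODEL_TOKEN_LIMIT) -> bool:
    """True when visible + hidden tokens together exceed the model limit."""
    return visible_tokens + hidden_tokens > limit

# The error reports 269030 requested tokens. Suppose the meter showed ~46%:
visible = 120_000           # hypothetical tokens counted by the context meter
hidden = 269_030 - visible  # overhead the meter would not display
print(request_exceeds_limit(visible, hidden))  # True: 269030 > 262144
```

Under this assumption the fix would be for the meter (or a pre-send check) to count the full request payload, not just the visible history.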

Additional information

No response

Labels: bug (Something isn't working)