Prompt token count of exceeds the limit of 128000 #11624

@DanielProkhorov17

Description

I get this error when a model like Claude tries to read file attachments that are images (.png, .jpeg); the request then fails because it exceeds the token limit.

It would be great to perform a token count before the request is sent to the model. If the count exceeds the limit, the images should be compressed to a reasonable size.

Plugins

No response

OpenCode version

1.1.48

Steps to reproduce

  1. Let the model download an image (e.g. via MCP, a Python script invocation, or curl)
  2. The AI model tries to read the downloaded image(s)

Screenshot and/or share link

No response

Operating System

Windows 11

Terminal

Windows Terminal

Metadata

Labels

bug (Something isn't working), perf (Indicates a performance issue or need for optimization), windows
