Opencode Forces Compaction #4

@null-axiom

Description

I noticed that after changing the execute_threshold_percentage configuration, Opencode shows an error toast saying I’ve exceeded the model’s context window.

To preface, I’m using gpt-5.3-codex, which has a 400k context window, and magic-context has already been shrinking the context over time. Even at a 30% threshold, that’s still around 120k tokens (30% of 400k), so I wouldn’t expect that limit to be hit.
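For reference, the change was roughly the following. The surrounding structure is a sketch from memory and may not match the exact config schema; only the execute_threshold_percentage key is the relevant part:

```json
{
  "magic-context": {
    "execute_threshold_percentage": 30
  }
}
```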

Afterward, it starts compacting with Opencode’s built-in compaction instead of magic-context compaction.

Once I reverted the configuration change, everything seemed to work fine again.

One more thing I noticed: in the TUI popup for /ctx-status, the execute threshold still shows 65% and doesn’t reflect the configured value.

