Problem
The `experimental.chat.system.transform` hook currently receives `{ sessionID, model }` in its input, but does not include the user's message text. This makes it impossible for plugins to do targeted context injection based on the user's current prompt.
Use Case
I've built a memory recall plugin that automatically injects relevant context from a persistent knowledge base into the system prompt. The ideal flow:
- User types a message about topic X
- The `system.transform` hook reads the message
- Plugin queries the knowledge base for topic X
- Plugin pushes relevant context into `output.system`
- The LLM sees the context on the same turn the user asked about it
Currently, the user's message text is only available in `chat.message`, which fires after `system.transform`, so the targeted context can only be injected on the next turn.
Current Workaround
Two-phase approach:
- Phase 1 (`session.created`): broad recall using the project name
- Phase 2 (`chat.message`): captures the user text, does targeted recall, and rewrites an instructions file that OpenCode re-reads on the next turn
This means turn 1 gets generic context and turn 2+ gets targeted context. It works, but it adds a one-turn delay.
Proposed Change
Include the user's message content in the `system.transform` input:

```ts
await Plugin.trigger("experimental.chat.system.transform", {
  sessionID: input.sessionID,
  model: input.model,
  userMessage: currentUserMessage, // add this
}, { system });
```
Or include the full messages array:

```ts
await Plugin.trigger("experimental.chat.system.transform", {
  sessionID: input.sessionID,
  model: input.model,
  messages: msgs, // the messages being sent to the LLM
}, { system });
```
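With the first variant, a plugin-side handler could consume the new field roughly like this. A minimal sketch: the `userMessage` field is the proposed addition, and the `SystemTransformInput`/`SystemTransformOutput` types and `recallForQuery` helper are assumptions for illustration, not the real OpenCode plugin API.

```ts
// Hypothetical handler for the proposed hook input shape.
type SystemTransformInput = {
  sessionID: string;
  model: string;
  userMessage?: string; // proposed addition
};

type SystemTransformOutput = {
  system: string[]; // system prompt segments the plugin may append to
};

// Stand-in for a knowledge-base query; a real plugin would hit a
// vector store or knowledge graph here.
function recallForQuery(query: string): string[] {
  return [`Recalled context for: ${query}`];
}

export function onSystemTransform(
  input: SystemTransformInput,
  output: SystemTransformOutput,
): void {
  if (!input.userMessage) return; // nothing to target without the text
  for (const fact of recallForQuery(input.userMessage)) {
    output.system.push(fact);
  }
}
```

This gets the targeted context into the system prompt on the same turn, with no instructions-file rewrite and no one-turn delay.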
Verification
Confirmed via `strings` analysis of the OpenCode 1.2.26 binary that:

- `experimental.chat.system.transform` exists and fires during prompt construction
- `chat.message` exists and fires after the system prompt is built
- The user message is available in memory at `system.transform` time but isn't passed to the hook
Environment
- OpenCode 1.2.26
- Plugin: TypeScript, uses the `event`, `chat.message`, and `experimental.chat.system.transform` hooks
- Use case: Persistent memory integration (cognitive-stack with vector store + knowledge graph)