Merged
11 changes: 7 additions & 4 deletions AGENTS.md
@@ -36,8 +36,11 @@
<!-- lore:019d9ac2-0bdf-711e-b563-9ca6c851604d -->
* **OpenCode repo moved to anomalyco/opencode with Node conditional imports**: OpenCode repo moved from `sst/opencode` to `anomalyco/opencode`. Still uses Bun as the default runtime (`packageManager: bun@1.3.11`) but added Node support via a conditional-imports pattern in `packages/opencode/package.json`: `"#db"`, `"#pty"`, and `"#hono"` each have `bun`/`node`/`default` variants. Uses Drizzle ORM over both `bun:sqlite` and `node:sqlite` (via `drizzle-orm/bun-sqlite` and `drizzle-orm/node-sqlite`). Dependencies include `@hono/node-server`, `@hono/node-ws`, and `@agentclientprotocol/sdk@0.16.1`. This dual-runtime pattern validates Lore's planned conditional-imports approach.
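
  The conditional-imports shape described above can be sketched as follows — the target paths are illustrative placeholders, not copied from the repo:

  ```json
  {
    "imports": {
      "#db": {
        "bun": "./src/db/index.bun.ts",
        "node": "./src/db/index.node.ts",
        "default": "./src/db/index.node.ts"
      },
      "#pty": {
        "bun": "./src/pty/index.bun.ts",
        "node": "./src/pty/index.node.ts",
        "default": "./src/pty/index.node.ts"
      }
    }
  }
  ```

  Each runtime picks the matching condition at import time, so `import "#db"` resolves to a different file under Bun than under Node with no build-time branching.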

<!-- lore:019da1f4-cebd-711e-808e-6ebce4f48c48 -->
* **opencode-lore ships raw TS — build script is a no-op**: The `opencode-lore` package has no build step; OpenCode's plugin loader runs it under Bun, which executes TS directly. The package.json `build` script is an intentional no-op echo so `bun --filter '*' build` succeeds uniformly. Only `@loreai/core` has a real esbuild build (`packages/core/script/build.ts`) producing `dist/node/` and `dist/bun/` bundles with per-target conditional driver selection. Declarations are emitted via `tsc -p tsconfig.build.json` into `dist/types/`, then the full tree is copied to both target dirs so barrel re-exports resolve for published consumers.
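
  The no-op build script amounts to something like this (the echo text is illustrative):

  ```json
  {
    "scripts": {
      "build": "echo 'opencode-lore: no build step'"
    }
  }
  ```

  This keeps `bun --filter '*' build` uniform across the workspace while only `@loreai/core` does real bundling.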

<!-- lore:019d9af0-d691-77c7-9a8b-cc5a21037b0c -->
* **SQLite #db/driver subpath import for Bun/Node dual-runtime**: Core package uses Node subpath imports (`#db/driver` in `package.json`) to resolve `bun:sqlite` or `node:sqlite` at runtime. `packages/core/src/db/driver.bun.ts` re-exports `Database` from `bun:sqlite` + `sha256` via `node:crypto`. `packages/core/src/db/driver.node.ts` extends `DatabaseSync` from `node:sqlite` adding a `.query(sql)` method with WeakMap-based statement caching — providing API parity with `bun:sqlite`'s `.query()`. All 99+ `.query()` call sites in core work unchanged. `db.ts` imports `{ Database, sha256 } from "#db/driver"`. Tests run under Bun (which picks `driver.bun.ts`); esbuild bundles will use `conditions: ["node"]` or `["bun"]` to select the right driver.
* **SQLite #db/driver subpath import for Bun/Node dual-runtime**: Core package uses Node subpath imports (`#db/driver` in `package.json`) to resolve `bun:sqlite` or `node:sqlite` at runtime. `driver.bun.ts` re-exports `Database` from `bun:sqlite` + `sha256` via `node:crypto`. `driver.node.ts` extends `DatabaseSync` from `node:sqlite` adding a `.query(sql)` shim with WeakMap-based statement caching — API parity with `bun:sqlite`'s `.query()`. All 99+ `.query()` call sites work unchanged. Tests run under Bun; esbuild bundles use `conditions: ["node"]` or `["bun"]` to select the driver. API differences: `.query()` vs `.prepare()`, `{ create: true }` is Bun-only. FTS5, transactions, pragmas work identically. `node:sqlite` is stable without flags in Node 22.5+. No native addons needed; Drizzle adoption is orthogonal — FTS5/BM25 queries don't benefit from ORM.
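
  The `.query()` shim pattern described above can be sketched like this. `BaseDb` stands in for `node:sqlite`'s `DatabaseSync` (the real driver delegates `prepare()` to the native class); the names here are illustrative, not the repo's actual code:

  ```typescript
  // Minimal sketch of a .query() shim with per-instance statement caching,
  // mirroring bun:sqlite's behavior of reusing prepared statements.

  interface Statement {
    all(...args: unknown[]): unknown[];
  }

  class BaseDb {
    prepare(_sql: string): Statement {
      // Placeholder: the real implementation compiles the SQL natively.
      return { all: () => [] };
    }
  }

  // One statement cache per live database; a WeakMap lets closed databases
  // (and their cached statements) be garbage-collected.
  const stmtCache = new WeakMap<BaseDb, Map<string, Statement>>();

  class Database extends BaseDb {
    query(sql: string): Statement {
      let perDb = stmtCache.get(this);
      if (!perDb) {
        perDb = new Map();
        stmtCache.set(this, perDb);
      }
      let stmt = perDb.get(sql);
      if (!stmt) {
        stmt = this.prepare(sql);
        perDb.set(sql, stmt);
      }
      return stmt;
    }
  }
  ```

  Repeated `.query(sql)` calls with the same SQL string return the same cached statement, which is what lets the 99+ existing call sites run unchanged on Node.
  
  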

<!-- lore:019d8c54-e51c-7fe8-87ba-273269c39b7a -->
* **Worker session prompt helper with agent-not-found retry**: src/worker.ts owns workerSessionIDs Set, isWorkerSession(), and promptWorker(). promptWorker() calls session.prompt() and uses the return value directly (no redundant session.messages() call). On 'agent not found' errors (detected via regex on JSON.stringify(result.error)), it retries once without the agent parameter on a fresh session. All callers (distillation×2, curator×2, search×1) use this shared helper. Session rotation (deleting from the caller's Map) happens after every call. The retry creates a new child session via client.session.create() and registers its ID in workerSessionIDs.
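
  The retry logic described above can be sketched as follows — the client surface here is a simplified, hypothetical stand-in, not the real OpenCode SDK shape:

  ```typescript
  // Prompt a worker session; on an "agent not found" error, retry once on a
  // fresh child session without the agent parameter.

  type PromptResult = { error?: unknown; text?: string };

  interface WorkerClient {
    prompt(sessionID: string, agent: string | undefined): Promise<PromptResult>;
    createSession(): Promise<string>;
  }

  const workerSessionIDs = new Set<string>();

  async function promptWorker(
    client: WorkerClient,
    sessionID: string,
    agent: string,
  ): Promise<string | undefined> {
    const result = await client.prompt(sessionID, agent);
    if (result.error && /agent.*not found/i.test(JSON.stringify(result.error))) {
      // Retry once without the agent on a fresh child session, and register
      // the new ID so isWorkerSession() still matches it.
      const fresh = await client.createSession();
      workerSessionIDs.add(fresh);
      return (await client.prompt(fresh, undefined)).text;
    }
    return result.text;
  }
  ```

  
  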
@@ -47,9 +50,6 @@
<!-- lore:019c904b-7924-7187-8471-8ad2423b8946 -->
* **Curator prompt scoped to code-relevant knowledge only**: CURATOR_SYSTEM in src/prompt.ts now explicitly excludes: general ecosystem knowledge available online, business strategy and marketing positioning, product pricing models, third-party tool details not needed for development, and personal contact information. This was added after the curator extracted entries about OpenWork integration strategy (including an email address), Lore Cloud pricing tiers, and AGENTS.md ecosystem facts — none of which help an agent write code. The curatorUser() function also appends guidance to prefer updating existing entries over creating new ones for the same concept, reducing duplicate creation.

<!-- lore:019d15de-e2e4-777f-8e00-fe21198117ad -->
* **Lore plugin cannot use native Node addons — pure bun:sqlite only**: Lore uses Node conditional imports (`#db/driver`) to swap `bun:sqlite` ↔ `node:sqlite` at runtime — two driver files behind a subpath alias in `packages/core/package.json`. The Node driver extends `DatabaseSync` adding a `.query()` shim with statement caching. `bun:sqlite` and `node:sqlite` APIs differ: `.query()` vs `.prepare()`, `{ create: true }` option exists only in Bun. FTS5, transactions, pragmas work identically in both. `node:sqlite` is stable without flags in Node 22.5+. Drizzle adoption is orthogonal — Lore's FTS5/BM25 queries wouldn't benefit from ORM. No native addons needed.

<!-- lore:019d92cd-67d5-705b-b60d-4537fcf4f054 -->
* **Lore standalone ACP server using Pi as agentic engine**: Lore is planned to become a standalone ACP (Agent Client Protocol) server, independent of OpenCode. Architecture: Lore speaks ACP to editors (Zed, JetBrains), uses Pi (`@mariozechner/pi-coding-agent`) internally as the agentic loop engine, and layers its memory system via Pi extensions. ACP proxy approach was rejected because proxies cannot modify the downstream agent's internal message array or system prompt — losing gradient context management and LTM injection, Lore's most valuable features. As a full ACP agent, Lore owns the LLM interaction with full control. Pi was chosen for its extension hooks (message injection, history filtering, custom compaction, custom tools) that map to Lore's existing OpenCode hooks. Requires a research spike first to verify Pi's extension API compatibility.

@@ -67,6 +67,9 @@
<!-- lore:019c8f4f-67ca-7212-a8c4-8a75b230ceea -->
* **Test DB isolation via LORE_DB_PATH and Bun test preload**: Lore test suite uses isolated temp DB via `packages/core/test/setup.ts` preload (`bunfig.toml` at repo root: `preload = ["./packages/core/test/setup.ts"]`). Preload sets `LORE_DB_PATH` to `mkdtempSync` path before any imports of `src/db.ts`; `afterAll` cleans up. `src/db.ts` checks `LORE_DB_PATH` first. `agents-file.test.ts` needs `beforeEach` cleanup for intra-file isolation and `TEST_UUIDS` cleanup in `afterAll` (shared with `ltm.test.ts`). Tests covering OpenCode-specific code (plugin init, recovery functions) live in `packages/opencode/test/`. Driver-level tests in `packages/core/test/db-driver.test.ts`.
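
  A sketch of what that preload file might look like — the real `setup.ts` isn't shown here, and it uses Bun's `afterAll` rather than the process exit hook used below:

  ```typescript
  // Set LORE_DB_PATH to a fresh temp directory before anything imports
  // src/db.ts, and remove the directory when the process exits.
  import { mkdtempSync, rmSync } from "node:fs";
  import { tmpdir } from "node:os";
  import { join } from "node:path";

  const dir = mkdtempSync(join(tmpdir(), "lore-test-"));
  process.env.LORE_DB_PATH = join(dir, "lore.db");

  process.on("exit", () => rmSync(dir, { recursive: true, force: true }));
  ```

  Because preloads run before test modules are imported, `src/db.ts` sees the override the first time it reads `LORE_DB_PATH`.
  
  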

<!-- lore:019da1f4-ceb2-7b5d-8a7a-180bbec8a1f7 -->
* **TypeScript can't resolve @loreai/core via Bun conditions — use tsconfig paths**: tsc doesn't understand the `bun` export condition, so in workspace typecheck it falls to `default` → `dist/node/index.js`, which doesn't exist pre-build. Fix: add `compilerOptions.paths` in `packages/opencode/tsconfig.json` mapping `@loreai/core` → `../core/src/index.ts`. This lets tsc resolve to source without requiring a build step in CI. Removing `types` from the core exports map alone is insufficient — tsc needs explicit path mapping. Also: the core build must copy the entire `dist/types/` tree (not just `index.d.ts`) into `dist/node/` and `dist/bun/`, because the barrel's `export * as foo from './foo'` statements need peer `.d.ts` files to resolve for published consumers.
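
  The tsconfig fix described above would look roughly like this (surrounding compiler options are omitted; `baseUrl` is included because older TypeScript versions require it for `paths`):

  ```json
  {
    "compilerOptions": {
      "baseUrl": ".",
      "paths": {
        "@loreai/core": ["../core/src/index.ts"]
      }
    }
  }
  ```

  With this mapping, `tsc` resolves `@loreai/core` straight to source, so workspace typecheck passes without a prior build.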

### Pattern

<!-- lore:019cb050-ef48-7cbe-8e58-802f17c34591 -->
1 change: 0 additions & 1 deletion bun.lock

Some generated files are not rendered by default.

1 change: 0 additions & 1 deletion packages/core/package.json
@@ -28,7 +28,6 @@
"zod": "^4.3.6"
},
"devDependencies": {
"@opencode-ai/sdk": "^1.1.39",
"@types/mdast": "^4.0.4"
},
"files": [
64 changes: 13 additions & 51 deletions packages/core/src/curator.ts
@@ -1,10 +1,9 @@
import type { createOpencodeClient } from "@opencode-ai/sdk";
import { config } from "./config";
import * as temporal from "./temporal";
import * as ltm from "./ltm";
import * as log from "./log";
import { CURATOR_SYSTEM, curatorUser, CONSOLIDATION_SYSTEM, consolidationUser } from "./prompt";
import { workerSessionIDs, promptWorker } from "./worker";
import type { LLMClient } from "./types";

/**
* Maximum length (chars) for a single knowledge entry's content.
@@ -14,25 +13,6 @@ import { workerSessionIDs, promptWorker } from "./worker";
*/
const MAX_ENTRY_CONTENT_LENGTH = 1200;

type Client = ReturnType<typeof createOpencodeClient>;

const workerSessions = new Map<string, string>();

async function ensureWorkerSession(
client: Client,
parentID: string,
): Promise<string> {
const existing = workerSessions.get(parentID);
if (existing) return existing;
const session = await client.session.create({
body: { parentID, title: "lore curator" },
});
const id = session.data!.id;
workerSessions.set(parentID, id);
workerSessionIDs.add(id);
return id;
}

type CuratorOp =
| {
op: "create";
@@ -71,7 +51,7 @@ function parseOps(text: string): CuratorOp[] {
const lastCuratedAt = new Map<string, number>();

export async function run(input: {
client: Client;
llm: LLMClient;
projectPath: string;
sessionID: string;
model?: { providerID: string; modelID: string };
@@ -98,21 +78,12 @@ export async function run(input: {
messages: text,
existing: existingForPrompt,
});
const workerID = await ensureWorkerSession(input.client, input.sessionID);
const model = input.model ?? cfg.model;
const parts = [
{ type: "text" as const, text: `${CURATOR_SYSTEM}\n\n${userContent}` },
];

const responseText = await promptWorker({
client: input.client,
workerID,
parts,
agent: "lore-curator",
model,
sessionMap: workerSessions,
sessionKey: input.sessionID,
});
const responseText = await input.llm.prompt(
CURATOR_SYSTEM,
userContent,
{ model, workerID: "lore-curator" },
);
if (!responseText) return { created: 0, updated: 0, deleted: 0 };

const ops = parseOps(responseText);
@@ -188,7 +159,7 @@ export function resetCurationTracker(sessionID?: string) {
* Only "update" and "delete" ops are applied — consolidation never creates entries.
*/
export async function consolidate(input: {
client: Client;
llm: LLMClient;
projectPath: string;
sessionID: string;
model?: { providerID: string; modelID: string };
@@ -210,21 +181,12 @@ export async function consolidate(input: {
entries: entriesForPrompt,
targetMax: cfg.curator.maxEntries,
});
const workerID = await ensureWorkerSession(input.client, input.sessionID);
const model = input.model ?? cfg.model;
const parts = [
{ type: "text" as const, text: `${CONSOLIDATION_SYSTEM}\n\n${userContent}` },
];

const responseText = await promptWorker({
client: input.client,
workerID,
parts,
agent: "lore-curator",
model,
sessionMap: workerSessions,
sessionKey: input.sessionID,
});
const responseText = await input.llm.prompt(
CONSOLIDATION_SYSTEM,
userContent,
{ model, workerID: "lore-curator" },
);
if (!responseText) return { updated: 0, deleted: 0 };

const ops = parseOps(responseText);
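
The curator and distillation diffs both replace the OpenCode SDK client with an injected `llm` abstraction. A sketch of the `LLMClient` interface implied by the new call sites — `src/types.ts` is not shown in this diff, so the exact shape is inferred:

```typescript
// Inferred from calls like:
//   input.llm.prompt(CURATOR_SYSTEM, userContent, { model, workerID: "lore-curator" })
export interface LLMClient {
  prompt(
    system: string,
    user: string,
    opts: {
      model?: { providerID: string; modelID: string };
      workerID: string;
    },
  ): Promise<string | undefined>;
}
```

This moves session creation, worker registration, and retry behind the interface, so curator.ts and distillation.ts no longer manage worker sessions themselves — which is why `ensureWorkerSession` and the per-module `workerSessions` maps are deleted.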
71 changes: 17 additions & 54 deletions packages/core/src/distillation.ts
@@ -1,4 +1,3 @@
import type { createOpencodeClient } from "@opencode-ai/sdk";
import { db, ensureProject } from "./db";
import { config } from "./config";
import * as temporal from "./temporal";
@@ -11,32 +10,14 @@ import {
recursiveUser,
} from "./prompt";
import { needsUrgentDistillation } from "./gradient";
import { workerSessionIDs, promptWorker } from "./worker";
import { workerSessionIDs } from "./worker";
import type { LLMClient } from "./types";

// Re-export for backwards compat — index.ts and others may still import from here.
export { workerSessionIDs };

type Client = ReturnType<typeof createOpencodeClient>;
type TemporalMessage = temporal.TemporalMessage;

// Worker sessions keyed by parent session ID — hidden children, one per source session
const workerSessions = new Map<string, string>();

async function ensureWorkerSession(
client: Client,
parentID: string,
): Promise<string> {
const existing = workerSessions.get(parentID);
if (existing) return existing;
const session = await client.session.create({
body: { parentID, title: "lore distillation" },
});
const id = session.data!.id;
workerSessions.set(parentID, id);
workerSessionIDs.add(id);
return id;
}

// Segment detection: group related messages together
function detectSegments(
messages: TemporalMessage[],
@@ -286,7 +267,7 @@ function resetOrphans(projectPath: string, sessionID: string): number {

// Main distillation entry point — called on session.idle or when urgent
export async function run(input: {
client: Client;
llm: LLMClient;
projectPath: string;
sessionID: string;
model?: { providerID: string; modelID: string };
@@ -320,7 +301,7 @@ export async function run(input: {
const segments = detectSegments(pending, cfg.distillation.maxSegment);
for (const segment of segments) {
const result = await distillSegment({
client: input.client,
llm: input.llm,
projectPath: input.projectPath,
sessionID: input.sessionID,
messages: segment,
@@ -339,7 +320,7 @@
cfg.distillation.metaThreshold
) {
await metaDistill({
client: input.client,
llm: input.llm,
projectPath: input.projectPath,
sessionID: input.sessionID,
model: input.model,
@@ -355,7 +336,7 @@
}

async function distillSegment(input: {
client: Client;
llm: LLMClient;
projectPath: string;
sessionID: string;
messages: TemporalMessage[];
@@ -378,21 +359,12 @@ async function distillSegment(input: {
messages: text,
});

const workerID = await ensureWorkerSession(input.client, input.sessionID);
const model = input.model ?? config().model;
const parts = [
{ type: "text" as const, text: `${DISTILLATION_SYSTEM}\n\n${userContent}` },
];

const responseText = await promptWorker({
client: input.client,
workerID,
parts,
agent: "lore-distill",
model,
sessionMap: workerSessions,
sessionKey: input.sessionID,
});
const responseText = await input.llm.prompt(
DISTILLATION_SYSTEM,
userContent,
{ model, workerID: "lore-distill" },
);
if (!responseText) return null;

const result = parseDistillationResult(responseText);
@@ -416,7 +388,7 @@
}

async function metaDistill(input: {
client: Client;
llm: LLMClient;
projectPath: string;
sessionID: string;
model?: { providerID: string; modelID: string };
@@ -426,21 +398,12 @@ async function metaDistill(input: {

const userContent = recursiveUser(existing);

const workerID = await ensureWorkerSession(input.client, input.sessionID);
const model = input.model ?? config().model;
const parts = [
{ type: "text" as const, text: `${RECURSIVE_SYSTEM}\n\n${userContent}` },
];

const responseText = await promptWorker({
client: input.client,
workerID,
parts,
agent: "lore-distill",
model,
sessionMap: workerSessions,
sessionKey: input.sessionID,
});
const responseText = await input.llm.prompt(
RECURSIVE_SYSTEM,
userContent,
{ model, workerID: "lore-distill" },
);
if (!responseText) return null;

const result = parseDistillationResult(responseText);