Problem
Two tools produce high failure rates and trigger agent retry loops because their input validation and error messages do not match their parameter contracts.
altimate_core_validate
Contract mismatch. The parameter schema declares schema_path and schema_context as .optional(), but the runtime hard-gates on their absence and returns a terse error before the call even reaches the core library:
"No schema provided. Provide schema_context or schema_path so table/column references can be resolved."
The Rust core (altimate-core) accepts an empty schema just fine (there is already a schemaOrEmpty helper for exactly this case), so the hard-gate is unnecessary. Because the schema looks optional to the caller, calls without schema are the common path and every one of them fails identically.
Expected behavior: the tool should run with an empty schema when neither schema_path nor schema_context is provided, surface a clear warning in the output that table/column references cannot be resolved, and still return semantic findings (syntax errors, dialect issues) that do not require a schema.
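The fallback could look like the sketch below. ValidateParams mirrors the tool's optional schema fields; everything else (runValidate, the injected core.validate, the result shape) is a hypothetical outline, not the actual handler. Loading schema_path from disk is elided.

```typescript
// Sketch of the no-hard-gate path for altimate_core_validate:
// run with an empty schema, attach a warning, report has_schema.
interface ValidateParams {
  sql: string
  schema_path?: string
  schema_context?: string
}

interface ValidateResult {
  findings: string[]
  warnings: string[]
  metadata: { has_schema: boolean }
}

function runValidate(
  params: ValidateParams,
  core: { validate: (sql: string, schema: string) => string[] },
): ValidateResult {
  const hasSchema = Boolean(params.schema_context ?? params.schema_path)
  const warnings: string[] = []
  if (!hasSchema) {
    // No hard-gate: proceed with an empty schema and tell the caller
    // which checks that disables.
    warnings.push(
      "No schema provided: table/column references were not resolved; " +
        "schema-dependent checks were skipped.",
    )
  }
  const schema = params.schema_context ?? "" // schema_path loading elided
  return {
    findings: core.validate(params.sql, schema),
    warnings,
    metadata: { has_schema: hasSchema },
  }
}
```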
sql_explain
Missing input validation and verbatim DB errors. Several issues stack on top of each other:
- No pre-execution validation. execute() passes the sql string straight into `${explainPrefix} ${params.sql}` without checking whether it is empty, whitespace-only, or a bare placeholder character. An LLM-generated parameterized query like `SELECT * FROM users WHERE id = ?` sent through sql_explain ends up as `EXPLAIN ?`, which produces:
  "OperationFailedError: SQL compilation error: syntax error line 1 at position 8 unexpected ?."
- No warehouse name validation. If the warehouse parameter is an empty string or a placeholder, the Registry lookup returns:
  "Error: Connection ? not found. Available: (none)"
  The error does not list the configured warehouses, so the caller has no way to recover.
- Error messages are not actionable. Both errors above are surfaced verbatim from Snowflake or the Registry with no translation. A calling agent cannot tell whether the problem was a placeholder character or a real SQL syntax issue, so it retries the same broken call.
- EXPLAIN is not dialect-aware. The handler hard-codes the EXPLAIN and EXPLAIN ANALYZE prefixes. Snowflake does not support EXPLAIN ANALYZE (it uses EXPLAIN USING TEXT / USING JSON), and BigQuery requires a dry-run API call rather than an EXPLAIN statement at all.
Expected behavior: reject empty/placeholder inputs with an actionable error before hitting the warehouse, translate common DB errors into guidance that explains how to fix the call, and use dialect-specific EXPLAIN syntax.
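The pre-execution check can be a small guard that runs before any warehouse call. A minimal sketch, assuming a standalone helper (validateSqlInput and its return contract are hypothetical names, not the actual handler API):

```typescript
// Returns an actionable error message, or null when the SQL is acceptable.
const BARE_PLACEHOLDER = /^\s*(\?|:\w+|\$\d+)\s*$/

function validateSqlInput(sql: string): string | null {
  if (sql.trim().length === 0) {
    return "sql is empty. Provide a complete SQL statement to explain."
  }
  if (BARE_PLACEHOLDER.test(sql)) {
    return (
      "sql is a bare placeholder. sql_explain does not support " +
      "parameterized queries; inline literal values before calling it."
    )
  }
  // Placeholders embedded in an otherwise complete statement also fail
  // at the warehouse, so flag those too.
  if (/(^|[\s(,=])(\?|:\w+|\$\d+)([\s),;]|$)/.test(sql)) {
    return (
      "sql contains placeholder characters (?, :var, $1). sql_explain " +
      "does not support parameterized queries; replace placeholders " +
      "with literal values."
    )
  }
  return null
}
```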
Scope
Both fixes live in packages/opencode/src/altimate. The Rust core (altimate-core-internal) does not need changes — its validate() function already accepts empty schemas, and sql_explain does not call into Rust at all.
Files affected:
packages/opencode/src/altimate/tools/sql-explain.ts (input validation)
packages/opencode/src/altimate/native/connections/register.ts (dialect-aware EXPLAIN, error translation)
packages/opencode/src/altimate/tools/altimate-core-validate.ts (remove hard-gate, add warning path)
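The dialect-aware EXPLAIN change could center on a small prefix map. A sketch under stated assumptions: the dialect names are illustrative and may not match the Registry's actual type strings, and a null return is used here to signal "no EXPLAIN statement exists for this warehouse" (BigQuery's dry-run path):

```typescript
type Dialect = "snowflake" | "postgres" | "mysql" | "bigquery"

// Map each dialect to the EXPLAIN prefix it actually supports.
function explainPrefix(dialect: Dialect, analyze: boolean): string | null {
  switch (dialect) {
    case "snowflake":
      // Snowflake has no EXPLAIN ANALYZE; EXPLAIN USING TEXT covers both modes.
      return "EXPLAIN USING TEXT"
    case "postgres":
      return analyze ? "EXPLAIN (ANALYZE, BUFFERS)" : "EXPLAIN"
    case "mysql":
      return analyze ? "EXPLAIN ANALYZE" : "EXPLAIN"
    case "bigquery":
      // BigQuery requires a dry-run API call, not an EXPLAIN statement.
      return null
  }
}
```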
Acceptance Criteria
- sql_explain rejects empty SQL, whitespace-only SQL, and bare placeholder characters (?, :var, $1) with a clear, actionable error message before executing any warehouse call
- sql_explain translates "Connection X not found" errors into messages that include the list of configured warehouses (or a suggestion to run warehouse_add)
- sql_explain translates SQL compilation errors caused by placeholder characters into guidance that explicitly says "sql_explain does not support parameterized queries"
- sql_explain uses the correct EXPLAIN prefix for Snowflake (EXPLAIN USING TEXT), PostgreSQL (EXPLAIN (ANALYZE, BUFFERS)), MySQL, and other supported warehouses
- altimate_core_validate runs with an empty schema when neither schema_path nor schema_context is provided, returning semantic findings and a clear warning that schema-dependent checks were skipped
- altimate_core_validate exposes has_schema in metadata so callers can distinguish full validation from schema-less runs
- Unit tests cover the above paths for both tools
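The two error-translation criteria above can be sketched as one mapping layer. The message patterns come from the errors quoted in the Problem section; the function name and signature are illustrative:

```typescript
// Translate raw Snowflake/Registry errors into actionable guidance;
// unrecognized messages pass through unchanged.
function translateWarehouseError(message: string, available: string[]): string {
  if (/Connection .* not found/.test(message)) {
    return available.length > 0
      ? `Warehouse not found. Configured warehouses: ${available.join(", ")}.`
      : "Warehouse not found and none are configured. Run warehouse_add to register one."
  }
  if (/SQL compilation error/.test(message) && /unexpected '?\?'?/.test(message)) {
    return (
      "sql_explain does not support parameterized queries; replace " +
      "placeholder characters (?, :var, $1) with literal values."
    )
  }
  return message
}
```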