Merged
2 changes: 1 addition & 1 deletion docs/ai.md
Original file line number Diff line number Diff line change
@@ -53,7 +53,7 @@ With the database connected, three additional skills become available for schema
To manage database connections later, use `storm db` for the global connection library and `storm mcp` for project-level configuration. See [Database Connections & MCP](database-and-mcp.md) for the full guide.

:::tip Looking for a database MCP server for Python, Go, Ruby, or any other language?
The Storm MCP server works standalone — no Storm ORM required. Run `npx @storm-orm/cli mcp init` to set up schema access and optional read-only data queries without installing Storm rules or skills. See [Using Without Storm ORM](database-and-mcp.md#using-without-storm-orm).
The Storm MCP server works standalone — no Storm ORM required. Run `npx @storm-orm/cli mcp` to set up schema access and optional read-only data queries without installing Storm rules or skills. See [Using Without Storm ORM](database-and-mcp.md#using-without-storm-orm).
:::

---
10 changes: 5 additions & 5 deletions docs/database-and-mcp.md
@@ -108,7 +108,7 @@ Run `storm mcp remove reporting` to remove an alias from the project. This unreg

### Re-registering connections

If your AI tool's MCP configuration gets out of sync (for example, after switching branches or resetting editor config files), run `storm mcp` without arguments. This re-registers all connections from `databases.json` for every configured AI tool.
If your AI tool's MCP configuration gets out of sync (for example, after switching branches or resetting editor config files), run `storm mcp update`. This re-registers all connections from `databases.json` for every configured AI tool.

---

@@ -191,7 +191,7 @@ Global connections are stored in `~/.storm/connections/`. Project-level configur
The Storm MCP server is a standalone database tool — it does not require Storm ORM in your project. If you use Python, Go, Ruby, or any other language and just want your AI tool to have schema awareness and optional data access, run:

```bash
npx @storm-orm/cli mcp init
npx @storm-orm/cli mcp
```

This walks you through:
@@ -324,11 +324,11 @@ Excluded tables still appear in `list_tables` and can be described with `describ

### `storm mcp` — Project MCP servers

#### `storm mcp init`
#### `storm mcp`

Standalone setup for the MCP database server, intended for projects that do not use Storm ORM. Walks you through AI tool selection, database connections, data access, and MCP registration. No Storm rules or language-specific configuration is installed.
Set up an MCP database server (the default when no subcommand is given). Walks you through AI tool selection, database connections, data access, and MCP registration. Works standalone — no Storm ORM required. `storm mcp init` is kept as an alias for this command.

#### `storm mcp`
#### `storm mcp update`

Re-register all MCP servers defined in `.storm/databases.json` with your AI tools. Useful after switching branches, resetting editor config files, or when MCP registrations get out of sync.

2 changes: 1 addition & 1 deletion docs/index.md
@@ -9,7 +9,7 @@ import TabItem from '@theme/TabItem';
# ST/ORM

:::tip Give your AI tool access to your database schema
Storm includes a schema-aware MCP server that exposes your table definitions, column types, and foreign keys to AI coding tools like Claude Code, Cursor, Copilot and Codex. Run `npx @storm-orm/cli` for full Storm ORM support including AI skills, conventions, and schema access. Using Python, Go, Ruby, or another language? Run `npx @storm-orm/cli mcp init` to set up the MCP server standalone.
Storm includes a schema-aware MCP server that exposes your table definitions, column types, and foreign keys to AI coding tools like Claude Code, Cursor, Copilot and Codex. Run `npx @storm-orm/cli` for full Storm ORM support including AI skills, conventions, and schema access. Using Python, Go, Ruby, or another language? Run `npx @storm-orm/cli mcp` to set up the MCP server standalone.
:::

**Storm** is a modern, high-performance ORM for Kotlin 2.0+ and Java 21+, built around a powerful SQL template engine. It focuses on simplicity, type safety, and predictable performance through immutable models and compile-time metadata.
2 changes: 1 addition & 1 deletion pom.xml
@@ -3,7 +3,7 @@
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<properties>
<revision>1.11.2</revision>
<revision>1.11.3</revision>
<java.version>21</java.version>
<maven.compiler.source>${java.version}</maven.compiler.source>
<maven.compiler.target>${java.version}</maven.compiler.target>
2 changes: 1 addition & 1 deletion storm-cli/package.json
@@ -1,6 +1,6 @@
{
"name": "@storm-orm/cli",
"version": "1.11.2",
"version": "1.11.3",
"description": "Storm ORM - AI assistant configuration tool",
"type": "module",
"bin": {
127 changes: 69 additions & 58 deletions storm-cli/storm.mjs
@@ -9,7 +9,7 @@ import { basename, join, dirname } from 'path';
import { homedir } from 'os';
import { execSync, spawn } from 'child_process';

const VERSION = '1.11.2';
const VERSION = '1.11.3';

// ─── ANSI ────────────────────────────────────────────────────────────────────

@@ -905,13 +905,18 @@ async function fetchSkill(name) {
}
}

function installSkill(name, content, toolConfig, created) {
function installSkill(name, content, toolConfig, created, appended) {
const cwd = process.cwd();
const fullPath = join(cwd, toolConfig.skillPath(name));
if (existsSync(fullPath) && readFileSync(fullPath, 'utf-8') === content) return;
const exists = existsSync(fullPath);
if (exists && readFileSync(fullPath, 'utf-8') === content) return;
mkdirSync(dirname(fullPath), { recursive: true });
writeFileSync(fullPath, content);
created.push(toolConfig.skillPath(name));
if (exists && appended) {
appended.push(toolConfig.skillPath(name));
} else {
created.push(toolConfig.skillPath(name));
}
}
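The new `appended` parameter lets `installSkill` report overwritten files separately from newly created ones in the setup summary. A reduced sketch of that bookkeeping, with the filesystem replaced by an in-memory map (the function and variable names here are illustrative, not from storm.mjs):

```javascript
// Track created vs. overwritten skill files without touching the disk:
// an existing path with changed content is recorded as "appended",
// a new path as "created", and identical content is a no-op.
function recordInstall(files, path, content, created, appended) {
  const exists = Object.prototype.hasOwnProperty.call(files, path);
  if (exists && files[path] === content) return; // unchanged — skip
  files[path] = content;
  if (exists && appended) appended.push(path);
  else created.push(path);
}

const files = {};
const created = [];
const appended = [];
recordInstall(files, 'skills/storm-query.md', 'v1', created, appended); // created
recordInstall(files, 'skills/storm-query.md', 'v1', created, appended); // no-op
recordInstall(files, 'skills/storm-query.md', 'v2', created, appended); // appended
```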

function cleanStaleSkills(toolConfigs, installedSkillNames, skipped) {
@@ -1389,6 +1394,18 @@ function resolveColumnName(validColumns, name) {
return null;
}

function coerceArray(value) {
if (value == null) return value;
if (Array.isArray(value)) return value;
if (typeof value === 'string') {
var trimmed = value.trim();
if (trimmed.charAt(0) === '[') {
try { var parsed = JSON.parse(trimmed); if (Array.isArray(parsed)) return parsed; } catch (e) { /* fall through */ }
}
}
return value;
}
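`coerceArray` defends against MCP clients that serialize array-valued tool arguments as JSON strings instead of real arrays. A self-contained sketch of the same guard (the helper is re-declared here for illustration):

```javascript
// Some MCP clients pass array parameters as JSON strings; parse them
// back into real arrays, and pass everything else through untouched.
function coerceArray(value) {
  if (value == null) return value;
  if (Array.isArray(value)) return value;
  if (typeof value === 'string' && value.trim().charAt(0) === '[') {
    try {
      const parsed = JSON.parse(value.trim());
      if (Array.isArray(parsed)) return parsed;
    } catch (e) { /* not valid JSON — fall through */ }
  }
  return value;
}

// A stringified array becomes usable again; other values are untouched.
const columns = coerceArray('["id", "email"]');
const scalar = coerceArray(42);
const malformed = coerceArray('[broken');
```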

async function selectData(args) {
if (excludedTables.has((args.table || '').toLowerCase())) throw new Error('Data access is excluded for table: ' + args.table);
var tables = await listTables();
@@ -1397,7 +1414,7 @@

var validColumns = await resolveColumns(tableName);

var columns = args.columns;
var columns = coerceArray(args.columns);
if (columns && columns.length > 0) {
for (var i = 0; i < columns.length; i++) {
var resolved = resolveColumnName(validColumns, columns[i]);
@@ -1410,10 +1427,11 @@
var sql = 'SELECT ' + selectClause + ' FROM ' + quoteIdentifier(tableName);
var params = [];

if (args.where && args.where.length > 0) {
var where = coerceArray(args.where);
if (where && where.length > 0) {
var conditions = [];
for (var i = 0; i < args.where.length; i++) {
var w = args.where[i];
for (var i = 0; i < where.length; i++) {
var w = where[i];
var resolvedCol = resolveColumnName(validColumns, w.column);
if (!resolvedCol) throw new Error('Unknown column: ' + w.column + ' in table ' + tableName);
w.column = resolvedCol;
@@ -1425,6 +1443,7 @@
} else if (op === 'IS NOT NULL') {
conditions.push(col + ' IS NOT NULL');
} else if (op === 'IN') {
w.value = coerceArray(w.value);
if (!Array.isArray(w.value)) throw new Error('IN operator requires an array value');
var placeholders = w.value.map(function(v) { params.push(v); return ph(params.length); });
conditions.push(col + ' IN (' + placeholders.join(', ') + ')');
@@ -1436,10 +1455,11 @@
sql += ' WHERE ' + conditions.join(' AND ');
}

if (args.orderBy && args.orderBy.length > 0) {
var orderBy = coerceArray(args.orderBy);
if (orderBy && orderBy.length > 0) {
var orderParts = [];
for (var i = 0; i < args.orderBy.length; i++) {
var o = args.orderBy[i];
for (var i = 0; i < orderBy.length; i++) {
var o = orderBy[i];
var resolvedOrderCol = resolveColumnName(validColumns, o.column);
if (!resolvedOrderCol) throw new Error('Unknown column: ' + o.column + ' in table ' + tableName);
var dir = (o.direction || 'ASC').toUpperCase();
@@ -1453,7 +1473,7 @@
var offset = Math.max(0, Math.floor(args.offset || 0));
if (dbType === 'mssql') {
if (offset > 0) {
if (!args.orderBy || args.orderBy.length === 0) {
if (!orderBy || orderBy.length === 0) {
sql += ' ORDER BY (SELECT NULL)';
}
sql += ' OFFSET ' + offset + ' ROWS FETCH NEXT ' + limit + ' ROWS ONLY';
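SQL Server rejects OFFSET…FETCH without an ORDER BY clause, which is why the code above injects the no-op `ORDER BY (SELECT NULL)` when the caller supplied no ordering. A minimal sketch of just that pagination suffix (the function name is an illustrative assumption, not the actual storm.mjs control flow):

```javascript
// SQL Server: OFFSET ... FETCH is only legal after an ORDER BY, so a
// constant no-op ordering is injected when the caller requested none.
function mssqlPagination(sql, hasOrderBy, limit, offset) {
  if (offset > 0) {
    if (!hasOrderBy) sql += ' ORDER BY (SELECT NULL)';
    return sql + ' OFFSET ' + offset + ' ROWS FETCH NEXT ' + limit + ' ROWS ONLY';
  }
  return sql; // offset-free limiting is handled by a separate code path
}
```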
@@ -1601,6 +1621,28 @@ rl.on('line', async function(line) {
const MARKER_START = '<!-- STORM:START -->';
const MARKER_END = '<!-- STORM:END -->';

function installSchemaRules(filePath, schemaRules, appended) {
if (!existsSync(filePath)) return;
const existing = readFileSync(filePath, 'utf-8');
const endMarker = existing.indexOf(MARKER_END);
if (endMarker === -1) return;
const cleanRules = schemaRules.replace('\n' + STORM_SKILL_MARKER, '');
const schemaStart = existing.indexOf('## Database Schema Access');
if (schemaStart !== -1 && schemaStart < endMarker) {
// Replace existing schema rules (from start to just before MARKER_END).
const updated = existing.substring(0, schemaStart) + cleanRules + '\n' + existing.substring(endMarker);
if (updated !== existing) {
writeFileSync(filePath, updated);
if (!appended.includes(filePath.replace(process.cwd() + '/', ''))) appended.push(filePath.replace(process.cwd() + '/', ''));
}
} else {
// First time — insert before MARKER_END.
const updated = existing.substring(0, endMarker) + '\n' + cleanRules + '\n' + existing.substring(endMarker);
writeFileSync(filePath, updated);
if (!appended.includes(filePath.replace(process.cwd() + '/', ''))) appended.push(filePath.replace(process.cwd() + '/', ''));
}
}
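The new `installSchemaRules` helper centralizes the insert-or-replace logic that was previously duplicated in `update()`, `setup()`, and `demo()`. A string-level sketch of the same idea, with file I/O stripped out (the function name is illustrative; the marker and heading strings mirror the constants above):

```javascript
const MARKER_END = '<!-- STORM:END -->';
const SCHEMA_HEADING = '## Database Schema Access';

// Insert schema rules just before the STORM end marker, or replace the
// existing schema section if one already sits inside the STORM block.
function upsertSchemaRules(text, rules) {
  const endMarker = text.indexOf(MARKER_END);
  if (endMarker === -1) return text; // no STORM block — leave untouched
  const schemaStart = text.indexOf(SCHEMA_HEADING);
  if (schemaStart !== -1 && schemaStart < endMarker) {
    // Replace everything from the schema heading up to the end marker.
    return text.substring(0, schemaStart) + rules + '\n' + text.substring(endMarker);
  }
  // First run — insert before the end marker.
  return text.substring(0, endMarker) + '\n' + rules + '\n' + text.substring(endMarker);
}

const base = '<!-- STORM:START -->\nrules\n<!-- STORM:END -->';
const once = upsertSchemaRules(base, '## Database Schema Access\nv1');
const twice = upsertSchemaRules(once, '## Database Schema Access\nv2'); // replaces v1
```

Unlike the pre-change code, running this twice updates the schema section in place instead of skipping it whenever the heading is already present.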

function installRulesBlock(filePath, content, created, appended) {
const block = `${MARKER_START}\n${content.trim()}\n${MARKER_END}`;
mkdirSync(dirname(filePath), { recursive: true });
@@ -1998,7 +2040,7 @@ async function update() {

for (const config of skillToolConfigs) {
for (const [name, content] of fetchedSkills) {
installSkill(name, content, config, created);
installSkill(name, content, config, created, appended);
}
}

@@ -2008,18 +2050,7 @@
for (const toolId of tools) {
const config = TOOL_CONFIGS[toolId];
if (config.rulesFile && schemaRules) {
const rulesPath = join(process.cwd(), config.rulesFile);
if (existsSync(rulesPath)) {
const existing = readFileSync(rulesPath, 'utf-8');
if (!existing.includes('Database Schema Access')) {
const endMarker = existing.indexOf(MARKER_END);
if (endMarker !== -1) {
const updated = existing.substring(0, endMarker) + '\n' + schemaRules.replace('\n' + STORM_SKILL_MARKER, '') + '\n' + existing.substring(endMarker);
writeFileSync(rulesPath, updated);
if (!appended.includes(config.rulesFile)) appended.push(config.rulesFile);
}
}
}
installSchemaRules(join(process.cwd(), config.rulesFile), schemaRules, appended);
}
}

@@ -2028,7 +2059,7 @@ async function update() {
if (!content) { skipped.push(skillName + ' (fetch failed)'); continue; }
installedSkillNames.push(skillName);
for (const config of skillToolConfigs) {
installSkill(skillName, content, config, created);
installSkill(skillName, content, config, created, appended);
}
}
}
@@ -2040,6 +2071,7 @@ async function update() {
// Update MCP server script if databases are configured.
if (Object.keys(readDatabases()).length > 0) {
ensureGlobalDir();
appended.push('~/.storm/server.mjs');
}

const uniqueCreated = [...new Set(created)];
@@ -2549,8 +2581,10 @@ async function updateMcp(subArgs) {
mcpList();
} else if (subcommand === 'remove' || subcommand === 'rm') {
await mcpRemove(subArgs[1]);
} else {
} else if (subcommand === 'update') {
await mcpReregisterAll();
} else {
await mcpInit();
}
}
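The reordered dispatch above is what flips the default: explicit subcommands match first, `update` takes over the old bare-command behavior, and everything else (including the `init` alias) falls through to interactive setup. A condensed sketch of the routing (the `add`/`list` branches are assumptions based on the CLI help text; `list`, `remove`, and `update` mirror the diff):

```javascript
// Route a `storm mcp <subcommand>` invocation to its handler name.
// Bare `storm mcp` and `storm mcp init` both reach the interactive setup.
function routeMcp(subcommand) {
  if (subcommand === 'add') return 'mcpAdd';
  if (subcommand === 'list' || subcommand === 'ls') return 'mcpList';
  if (subcommand === 'remove' || subcommand === 'rm') return 'mcpRemove';
  if (subcommand === 'update') return 'mcpReregisterAll';
  return 'mcpInit'; // default: interactive setup ('init' stays an alias)
}
```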

@@ -2649,7 +2683,7 @@ async function setup() {
// Install fetched skills into each tool's directory.
for (const config of skillToolConfigs) {
for (const [name, content] of fetchedSkills) {
installSkill(name, content, config, created);
installSkill(name, content, config, created, appended);
}
}
}
@@ -2726,19 +2760,7 @@ async function setup() {
for (const toolId of tools) {
const config = TOOL_CONFIGS[toolId];
if (config.rulesFile && schemaRules) {
const rulesPath = join(process.cwd(), config.rulesFile);
if (existsSync(rulesPath)) {
const existing = readFileSync(rulesPath, 'utf-8');
// Insert schema rules inside the STORM block if not already present.
if (!existing.includes('Database Schema Access')) {
const endMarker = existing.indexOf(MARKER_END);
if (endMarker !== -1) {
const updated = existing.substring(0, endMarker) + '\n' + schemaRules.replace('\n' + STORM_SKILL_MARKER, '') + '\n' + existing.substring(endMarker);
writeFileSync(rulesPath, updated);
appended.push(config.rulesFile);
}
}
}
installSchemaRules(join(process.cwd(), config.rulesFile), schemaRules, appended);
}
}

@@ -2749,7 +2771,7 @@ async function setup() {
if (!content) { skipped.push(skillName + ' (fetch failed)'); continue; }
installedSkillNames.push(skillName);
for (const config of skillToolConfigs) {
installSkill(skillName, content, config, created);
installSkill(skillName, content, config, created, appended);
}
}
}
@@ -2873,7 +2895,7 @@ async function demo() {
installedSkillNames.push(skillName);
}
for (const [name, content] of fetchedSkills) {
installSkill(name, content, config, created);
installSkill(name, content, config, created, appended);
}
}

@@ -2930,18 +2952,7 @@ async function demo() {
// Fetch and install schema rules into the rules block.
const schemaRules = await fetchSkill('storm-schema-rules');
if (config.rulesFile && schemaRules) {
const rulesPath = join(cwd, config.rulesFile);
if (existsSync(rulesPath)) {
const existing = readFileSync(rulesPath, 'utf-8');
if (!existing.includes('Database Schema Access')) {
const endMarker = existing.indexOf(MARKER_END);
if (endMarker !== -1) {
const updated = existing.substring(0, endMarker) + '\n' + schemaRules.replace('\n' + STORM_SKILL_MARKER, '') + '\n' + existing.substring(endMarker);
writeFileSync(rulesPath, updated);
appended.push(config.rulesFile);
}
}
}
installSchemaRules(join(cwd, config.rulesFile), schemaRules, appended);
}

// Fetch and install schema-dependent skills.
@@ -2950,7 +2961,7 @@
for (const skillName of schemaSkillNames) {
const content = await fetchSkill(skillName);
if (!content) { skipped.push(skillName + ' (fetch failed)'); continue; }
installSkill(skillName, content, config, created);
installSkill(skillName, content, config, created, appended);
installedSkillNames.push(skillName);
}
}
@@ -3031,8 +3042,8 @@ async function run() {
storm db add [name] Add a global database connection
storm db remove [name] Remove a global database connection
storm db config [name] Configure data access and table exclusions
storm mcp init Set up MCP database server (no Storm ORM required)
storm mcp Re-register MCP servers for configured tools
storm mcp Set up MCP database server (default, no Storm ORM required)
storm mcp update Re-register MCP servers for configured tools
storm mcp add [alias] Add a database connection to this project
storm mcp list List project database connections
storm mcp remove [alias] Remove a database connection
2 changes: 2 additions & 0 deletions website/static/skills/storm-query-java.md
@@ -20,6 +20,8 @@ The `Operator` enum is in `st.orm` and contains: `EQUALS`, `NOT_EQUALS`, `LESS_T

Ask what data they need, filters, ordering, or pagination.

**DI preference:** In Spring Boot projects, repositories should be constructor-injected (see /storm-repository-java). Use `orm.entity(T.class)` and `orm.repository(T.class)` lookups only in standalone (non-DI) contexts and tests. In DI environments, write queries on injected repository instances.

## API Design: Builder Methods vs Convenience Methods

Repository/entity methods fall into two categories:
24 changes: 24 additions & 0 deletions website/static/skills/storm-query-kotlin.md
@@ -20,6 +20,30 @@ All infix predicate operators (`eq`, `neq`, `like`, `greater`, `less`, `inList`,

Ask what data they need, filters, ordering, or pagination.

**Repository rule:** All database queries must live in repository interfaces, not inline in services or other classes. In Spring Boot or Ktor projects, repositories are constructor-injected (see /storm-repository-kotlin). Use `orm.entity<T>()` and `orm.repository<T>()` lookups only in standalone (non-DI) contexts and tests.

**Code-first WHERE clauses:** Always express WHERE conditions using metamodel-based predicates (`eq`, `isFalse()`, `isNotNull()`, etc.) instead of template strings. Only fall back to template expressions for conditions that predicates cannot express (e.g., `COALESCE`, date arithmetic, aggregate functions). When a WHERE clause mixes expressible and inexpressible conditions, split it: use code-based predicates for what you can, templates only for what you must. Multiple `where()` calls are AND-combined automatically. FK paths through the object graph (e.g., `User_.city eq city`) do not require explicit joins.

```kotlin
// ❌ Wrong — WHERE conditions as a template string
fun findActiveWithEmail(city: City, minAge: Int): List<User> =
select {
where {
"""${User_.city} = ${city.id()}
AND ${User_.active} = true
AND ${User_.email} IS NOT NULL
AND TIMESTAMPDIFF(YEAR, ${User_.birthDate}, CURDATE()) >= $minAge"""
}
}.resultList

// ✅ Correct — code-based predicates where possible, template only when no alternative exists
fun findActiveWithEmail(city: City, minAge: Int): List<User> =
select {
where((User_.city eq city) and (User_.active.isTrue()) and (User_.email.isNotNull()))
where { "TIMESTAMPDIFF(YEAR, ${User_.birthDate}, CURDATE()) >= $minAge" }
}.resultList
```

## Kotlin Infix Predicate Operators

All operators are extension functions on `Metamodel<T, V>` (generated metamodel fields like `User_.name`):
Expand Down