This repository has been archived by the owner on Mar 4, 2019. It is now read-only.

feat: deep insert into related tables
dmfay committed Nov 11, 2017
1 parent f1e1610 commit 6bb4c6b
Showing 9 changed files with 295 additions and 48 deletions.
2 changes: 1 addition & 1 deletion README.md
@@ -15,7 +15,7 @@ Here are some of the high points:
* **Dynamic query generation**: Massive features a versatile query builder with support for a wide variety of operators, all generated from a simple criteria object.
* **Do as much, or as little, as you need**: if you're coming from an ORM background, you might be expecting to have to create or load an entity instance before you can write it to the database. You don't. Your tables are _tables_, and you can insert or update directly into them.
* **Document storage**: PostgreSQL's JSONB storage type makes it possible to blend relational and document strategies. Massive offers a robust API to simplify working with documents: objects in, objects out, with document metadata managed for you.
* **Result decomposition**: while Massive does not traverse relationships or build model graphs, the [`decompose` option](https://dmfay.github.io/massive-js/options.html#decomposition-schemas) allows you to map the results of complex views and scripts to nested object trees.
* **Relational awareness**: while Massive does not traverse relationships or build model graphs, [deep inserts](https://dmfay.github.io/massive-js/persistence.html#deep-insert) can create related entities and junctions transactionally, and the [`decompose` option](https://dmfay.github.io/massive-js/options.html#decomposition-schemas) allows you to map the results of complex views and scripts to nested object trees.
* **Postgres everything**: committing to a single RDBMS allows us to leverage it to the fullest extent possible. Massive supports array fields and operations, JSON storage, foreign tables, and many, many more features found in PostgreSQL but not in other databases.

## Full Documentation
37 changes: 35 additions & 2 deletions docs/persistence.md
@@ -40,7 +40,9 @@ db.tests.insert({

```javascript
});
```

You can insert multiple rows at once -- just pass an array, and you'll receive an array containing all your new records:
#### Multiple Records

You can insert multiple records at once -- just pass an array, and you'll receive an array containing all your new data:

```javascript
db.tests.insert([{
```
@@ -55,7 +57,38 @@ db.tests.insert([{
```javascript
});
```

Records in the array may omit keys if the database default value is intended; however, fields having `NOT NULL` constraints must be included or the insert will fail.
Records must be consistently formed, having the same fields in the same iteration order. Fields may be omitted if the database default value is intended; however, fields having `NOT NULL` constraints must be included or the insert will fail.

#### Deep Insert

Inserting into multiple related tables at once happens fairly frequently, especially when you're dealing with many-to-many relationships through a junction table. For these cases, you can pass arrays of related records into `insert` as a property named for the related table.

Entries in each related table array have the same conditions as those for normal inserts: they must be consistently formed, and nullable fields may be omitted if no value is intended. The one difference is that the foreign key to the original record must be explicitly `undefined`.

```javascript
db.tests.insert({
name: 'homepage',
version: 1,
priority: 'low',
user_tests: [{
test_id: undefined,
user_id: 1,
role: 'primary'
}, {
test_id: undefined,
user_id: 2,
role: 'auxiliary'
}]
}).then(tests => {
  // as with regular inserts, you only get the test record back;
  // if user_tests has a primary key, you can query db.user_tests
  // to retrieve the new junction rows.
});
```

Deep insert is _only_ supported when inserting single records. Attempting to deep insert an array of records will raise an exception.

#### Options

[Query options](/options) for `INSERT` statements and results processing may be used with `insert`:

7 changes: 6 additions & 1 deletion lib/database.js
@@ -275,7 +275,12 @@ Database.prototype.createDocumentTable = function (location) {
table: tableName,
index: indexName
}).then(() =>
this.attach(Table, this.$p.resolve([{schema: schemaName, pk: 'id', name: tableName}]))
this.attach(Table, this.$p.resolve([{
schema: schemaName,
name: tableName,
columns: ['id', 'body', 'search', 'created_at'],
pk: 'id'
}]))
).then(added => {
this.tables.push(added[0]);

51 changes: 41 additions & 10 deletions lib/scripts/tables.sql
@@ -11,28 +11,59 @@
-- override blacklisted tables.

SELECT * FROM (
SELECT tc.table_schema AS schema, tc.table_name AS name, NULL AS parent, kc.column_name AS pk, TRUE AS is_insertable_into
SELECT tc.table_schema AS schema,
tc.table_name AS name,
NULL AS parent,
kc.column_name AS pk,
TRUE AS is_insertable_into,
array_agg(c.column_name::text) AS columns
FROM information_schema.table_constraints tc
JOIN information_schema.key_column_usage kc
ON kc.table_name = tc.table_name
AND kc.constraint_schema = tc.table_schema
ON kc.constraint_schema = tc.table_schema
AND kc.table_name = tc.table_name
AND kc.constraint_name = tc.constraint_name
JOIN information_schema.columns c
ON c.table_schema = tc.table_schema
AND c.table_name = tc.table_name
WHERE tc.constraint_type = 'PRIMARY KEY'
GROUP BY tc.table_schema, tc.table_name, kc.column_name

UNION
SELECT tc.table_schema AS schema, c.relname AS name, p.relname AS parent, kc.column_name AS pk, TRUE AS is_insertable_into

SELECT tc.table_schema AS schema,
child.relname AS name,
parent.relname AS parent,
kc.column_name AS pk,
TRUE AS is_insertable_into,
array_agg(c.column_name::text) AS columns
FROM pg_catalog.pg_inherits
JOIN pg_catalog.pg_class AS c ON (inhrelid = c.oid)
JOIN pg_catalog.pg_class AS p ON (inhparent = p.oid)
JOIN information_schema.table_constraints tc ON tc.table_name = p.relname
JOIN pg_catalog.pg_class AS child ON (inhrelid = child.oid)
JOIN pg_catalog.pg_class AS parent ON (inhparent = parent.oid)
JOIN information_schema.table_constraints tc ON tc.table_name = parent.relname
JOIN information_schema.key_column_usage kc
ON kc.table_name = tc.table_name
AND kc.constraint_schema = tc.table_schema
ON kc.constraint_schema = tc.table_schema
AND kc.table_name = tc.table_name
AND kc.constraint_name = tc.constraint_name
JOIN information_schema.columns c
ON c.table_schema = tc.table_schema
AND c.table_name = tc.table_name
WHERE tc.constraint_type = 'PRIMARY KEY'
GROUP BY tc.table_schema, child.relname, parent.relname, kc.column_name

UNION
SELECT t.table_schema AS schema, t.table_name AS name, NULL AS parent, NULL AS pk, CASE t.is_insertable_into WHEN 'YES' THEN TRUE ELSE FALSE END AS is_insertable_into

SELECT t.table_schema AS schema,
t.table_name AS name,
NULL AS parent,
NULL AS pk,
CASE t.is_insertable_into WHEN 'YES' THEN TRUE ELSE FALSE END AS is_insertable_into,
array_agg(c.column_name::text) AS columns
FROM information_schema.tables t
JOIN information_schema.columns c
ON c.table_schema = t.table_schema
AND c.table_name = t.table_name
WHERE table_type = 'FOREIGN TABLE'
GROUP BY t.table_schema, t.table_name, t.is_insertable_into
) tables
WHERE CASE
WHEN $(whitelist) <> '' THEN
60 changes: 56 additions & 4 deletions lib/statement/insert.js
@@ -13,7 +13,7 @@ const prepareParams = require('../util/prepare-params');
* @param {Object} [options] - {@link https://dmfay.github.io/massive-js/options.html|Insert options}.
*/
const Insert = function (source, record, options = {}) {
this.source = source.delimitedFullName;
this.source = source;
this.only = options.only;
this.build = options.build;
this.document = options.document;
@@ -34,8 +34,29 @@ const Insert = function (source, record, options = {}) {
return set;
}, new Set())];

this.columns = fields.map(f => `"${f}"`);
this.params = prepareParams(fields, this.records);
this.columns = _.intersection(fields, this.source.columns);
this.junctions = _.difference(fields, this.source.columns);

const recordParams = prepareParams(this.columns, this.records);

if (this.junctions.length) {
if (this.records.length > 1) {
throw new Error('Deep insert is only supported for single records');
}

// append all junction params (that aren't stubbing out the foreign keys)
// to the insert's parameter list
// TODO generate junction field set to allow more flexibility between
// junction records for the same relationship
this.params = _.reduce(this.junctions, (allParams, j) => {
const junction = this.records[0][j];
const junctionParams = prepareParams(Object.keys(junction[0]), junction);

return allParams.concat(junctionParams.filter(v => v !== undefined));
}, recordParams);
} else {
this.params = recordParams;
}
};

/**
@@ -44,7 +65,9 @@
* @return {String} A SQL INSERT statement.
*/
Insert.prototype.format = function () {
let sql = `INSERT INTO ${this.source} (${this.columns.join(', ')}) VALUES `;
const quotedColumns = this.columns.map(f => `"${f}"`);

let sql = `INSERT INTO ${this.source.delimitedFullName} (${quotedColumns.join(', ')}) VALUES `;

let offset = 1;
const values = this.records.reduce((acc) => {
@@ -63,6 +86,35 @@ Insert.prototype.format = function () {

sql += `RETURNING *`;

if (this.junctions.length) {
const junctionQueries = _.reduce(this.junctions, (queries, j, idx) => {
return queries.concat(this.records[0][j].map((r, jdx) => {
// separate out keyColumns so they are consistently positioned in the
// CTE since they won't necessarily be ordered in the source map
const keyColumns = [];
const valColumns = [];

_.keys(r).forEach(k => {
if (r[k] === undefined) {
keyColumns.push(k);
} else {
valColumns.push(k);
}
});

const allQuotedColumns = keyColumns.concat(valColumns).map(k => `"${k}"`);

const rValues = _.range(offset, offset + valColumns.length).map(n => `$${n}`);

offset += valColumns.length;

return `q_${idx}_${jdx} AS (INSERT INTO ${j} (${allQuotedColumns.join(', ')}) SELECT "${this.source.pk}", ${rValues.join(', ')} FROM inserted)`;
}));
}, []);

sql = `WITH inserted AS (${sql}), ${junctionQueries.join(', ')} SELECT * FROM inserted`;
}

return sql;
};

3 changes: 2 additions & 1 deletion lib/table.js
@@ -18,13 +18,14 @@ const where = require('./statement/where');
* @param {Object} spec - An {@linkcode Entity} specification representing a table:
* @param {Object} spec.db - A {@linkcode Database}.
* @param {String} spec.name - The table or view's name.
* @param {String} spec.schema - The name of the schema owning the table or
* @param {String} spec.schema - The name of the schema owning the table.
* @param {String} spec.pk - The table's primary key column.
*/
const Table = function (spec) {
Queryable.apply(this, arguments);

this.pk = spec.pk;
this.columns = spec.columns;
this.pkRegexp = new RegExp(`^${this.pk}[^\\w\\d]?`);
this.insertable = spec.is_insertable_into || true;
};
48 changes: 23 additions & 25 deletions test/helpers/scripts/data-products-orders/schema.sql
@@ -2,34 +2,32 @@ CREATE SCHEMA public;

CREATE EXTENSION IF NOT EXISTS pgcrypto;

create table products(
id serial primary key,
name varchar(50) NOT NULL,
price decimal(10,2) default 0.00 not null,
description text,
in_stock boolean,
specs jsonb,
created_at timestamptz default now() not null,
tags character varying(255)[]
CREATE TABLE products(
id SERIAL PRIMARY KEY,
name VARCHAR(50) NOT NULL,
price DECIMAL(10,2) DEFAULT 0.00 NOT NULL,
description TEXT,
in_stock BOOLEAN,
specs JSONB,
created_at TIMESTAMPTZ DEFAULT now() NOT NULL,
tags CHARACTER VARYING(255)[]
);

create table orders(
id uuid primary key default gen_random_uuid(),
product_id int,
user_id int,
notes character varying(255),
ordered_at date default now() not null
CREATE TABLE orders(
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
product_id INT,
user_id INT,
notes CHARACTER VARYING(255),
ordered_at DATE DEFAULT now() NOT NULL
);

INSERT INTO products(name, price, description, specs, tags, in_stock)
VALUES ('Product 1', 12.00, 'Product 1 description', NULL, NULL, TRUE),
('Product 2', 24.00, 'Product 2 description', '{"weight": 20, "dimensions": {"length": 15, "width": 12}}', '{tag1,tag2}', TRUE),
('Product 3', 35.00, 'Product 3 description', '{"weight": 30, "sizes": [10, 15, 20]}', '{tag2,tag3}', FALSE),
('Product 4', 40.00, 'Product 4 description', '["why", "not", "have", "an", "array"]', '{tag4,tag''quote,"tag,comma","tag{brace}"}', FALSE);

insert into products(name, price, description, specs, tags, in_stock)
values ('Product 1', 12.00, 'Product 1 description', null, null, true),
('Product 2', 24.00, 'Product 2 description', '{"weight": 20, "dimensions": {"length": 15, "width": 12}}', '{tag1,tag2}', true),
('Product 3', 35.00, 'Product 3 description', '{"weight": 30, "sizes": [10, 15, 20]}', '{tag2,tag3}', false),
('Product 4', 40.00, 'Product 4 description', '["why", "not", "have", "an", "array"]', '{tag4,tag''quote,"tag,comma","tag{brace}"}', false);


insert into orders(product_id, user_id, notes)
values (1, 1, 'user 1 ordered product 1'),
INSERT INTO orders(product_id, user_id, notes)
VALUES (1, 1, 'user 1 ordered product 1'),
(2, 1, 'user 1 ordered product 2'),
(4, 1, 'user 1 ordered product 4');
(4, 1, 'user 1 ordered product 4');
