
MySQL insert with JSON doesn't work / ER_WRONG_VALUE_COUNT_ON_ROW #1800

Closed
elevenE opened this issue Nov 23, 2016 · 13 comments

@elevenE

elevenE commented Nov 23, 2016

Hi,

I'm using knex with MySQL and inserting JSON values but it fails and gives me this error:

{ [Error: insert into `AuditLog` (`ApiGatewayRequestId`, `EventName`, `EventTime`, `RequestData`, `SourceIp`, `Username`) values ('35a01f58-b1b2-11e6-91d6-11494db70b97', 'login', '2016-11-23T19:22:44.858Z', '{"body-json":{},"params":{"path":{},"querystring":{},"header":{"Accept":"*/*","Accept-Encoding":"gzip, deflate, sdch, br","Accept-Language":"en-US,en;q=0.8","Cache-Control":"no-cache","CloudFront-Forwarded-Proto":"https","CloudFront-Is-Desktop-Viewer":"true","CloudFront-Is-Mobile-Viewer":"false","CloudFront-Is-SmartTV-Viewer":"false","CloudFront-Is-Tablet-Viewer":"false","CloudFront-Viewer-Country":"CA","Content-Type":"application/json","Host":"example.com","Postman-Token":"533739b6-df39-32e4-1e80-0b4650673846","User-Agent":"Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/54.0.2840.99 Safari/537.36","Via":"1.1 d5da2738774b6f83465e13845679d084.cloudfront.net (CloudFront)","X-Amz-Cf-Id":"J5oRQGLNt-gBSHRnifBVuwGYtmctVMaA1vcglRwU6pDBYDLBI8gOPA==","X-Forwarded-For":"127.0.0.1, 127.0.0.2","X-Forwarded-Port":"443","X-Forwarded-Proto":"https","X-Password":"***","X-Username":"abc@example.com"}}}', '127.00.93.178', 'abc@example.com') - ER_WRONG_VALUE_COUNT_ON_ROW: Column count doesn't match value count at row 1] code: 'ER_WRONG_VALUE_COUNT_ON_ROW', errno: 1136, sqlState: '21S01', index: 0 }

However, if I execute the generated query (logged by knex.js with debug: true) directly against MySQL, it works without any issues.

Sample code:

var mysql = knex({
    client: 'mysql',
    debug: true,
    connection: {
        host: 'abc',
        port: 3306,
        user: 'root',
        password: 'abc',
        database: 'abc'
    }
});

var request = {
    EventTime: new Date().toISOString(),
    EventName: 'login',
    Username: 'abc@example.com',
    ApiGatewayRequestId: 'request-id',
    SourceIp: '127.0.0.1',
    RequestData: {
        "body-json": {},
        "params": {
            "path": {},
            "querystring": {},
            "header": {
                "Accept": "*/*",
                "Accept-Encoding": "gzip, deflate, sdch, br",
                "Accept-Language": "en-US,en;q=0.8",
                "Cache-Control": "no-cache",
                "CloudFront-Forwarded-Proto": "https",
                "CloudFront-Is-Desktop-Viewer": "true",
                "CloudFront-Is-Mobile-Viewer": "false",
                "CloudFront-Is-SmartTV-Viewer": "false",
                "CloudFront-Is-Tablet-Viewer": "false",
                "CloudFront-Viewer-Country": "CA",
                "Content-Type": "application/json",
                "Host": "example.com",
                "Postman-Token": "533739b6-df39-32e4-1e80-0b4650673846",
                "User-Agent": "Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/54.0.2840.99 Safari/537.36",
                "Via": "1.1 d5da2738774b6f83465e13845679d084.cloudfront.net (CloudFront)",
                "X-Amz-Cf-Id": "J5oRQGLNt-gBSHRnifBVuwGYtmctVMaA1vcglRwU6pDBYDLBI8gOPA==",
                "X-Forwarded-For": "127.0.0.1, 127.0.0.2",
                "X-Forwarded-Port": "443",
                "X-Forwarded-Proto": "https",
                "X-Password": "***",
                "X-Username": "abc@example.com"
            }
        }
    }
};

mysql(auditTable).insert(request).then((res) => {
    console.log('res:', res);
}).catch((err) => {
    console.log('err:', err);
});
@elhigu
Member

elhigu commented Nov 24, 2016

Check what mysql(auditTable).insert(request).toSQL() outputs. That way you will see how the query is passed to the mysql driver.
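
For example (a minimal sketch, reusing the mysql instance, auditTable and request from the snippet above):

// Inspect the compiled query and its parameter bindings before they are handed to the driver.
var compiled = mysql(auditTable).insert(request).toSQL();
console.log(compiled.sql);      // e.g. insert into `AuditLog` (...) values (?, ?, ?, ?, ?, ?)
console.log(compiled.bindings); // the values knex will pass to the mysql driver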

@elhigu
Member

elhigu commented Nov 24, 2016

Also, the SQL shown in the error doesn't seem to match your example... SourceIp in the error is '127.00.93.178' and other values are different as well; what is that about?

@elevenE
Author

elevenE commented Nov 24, 2016

@elhigu: This is the output from .toSQL():

{ method: 'insert', options: {}, timeout: false, cancelOnTimeout: false, bindings: [ '595fa531-b25a-11e6-85ef-f391c2ff5cf6', 'login', '2016-11-24T15:26:21.195Z', { 'body-json': {}, params: [Object] }, '127.0.0.1', 'abc@example.com' ], __knexQueryUid: 'c81f5ffb-8fdb-458f-9ac3-54cca702ff2e', sql: 'insert into `AuditLog` (`ApiGatewayRequestId`, `EventName`, `EventTime`, `RequestData`, `SourceIp`, `Username`) values (?, ?, ?, ?, ?, ?)' }

As for the values that are different, please ignore them. It was my (bad) attempt at removing any identifying/actual values from the output, and I forgot to make the code example match.

Also, it works fine when I JSON.stringify() the object passed for the RequestData column. I'm not sure why this is the case. Does that mean we have to pass the values for JSON-typed columns as strings? I was under the impression that knex would take care of that.
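
For reference, this is the workaround (a sketch based on the sample code above; only RequestData changes):

// Serialize the JSON column manually before handing the row to knex.
var row = Object.assign({}, request, {
    RequestData: JSON.stringify(request.RequestData)
});

mysql(auditTable).insert(row).then((res) => {
    console.log('res:', res);
}).catch((err) => {
    console.log('err:', err);
});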

@elhigu
Member

elhigu commented Nov 25, 2016

It looks like JSON data in MySQL (https://dev.mysql.com/doc/refman/5.7/en/json.html) is indeed just a string literal that can be parsed as JSON. IIRC it worked the same way with PostgreSQL, where it was necessary to stringify the value before inserting...

Knex doesn't know anything about the type of the column you are inserting into, so it cannot stringify the value automatically. E.g. if you have an array-typed column, JSON-stringifying the passed array would be incorrect behaviour.

There is still one thing that I don't understand... In that error message the JSON part seems to be stringified even though it was sent to the driver as a JavaScript object...

Currently I'm using an ORM, which has a simple schema for validating columns. That schema is also used to provide info about JSON columns, so the ORM can automatically convert JSON column data to a string before inserting.
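
Roughly like this (a rough sketch, not tied to any particular ORM; the schema object and helper name are made up for illustration):

// Hypothetical per-table schema that records which columns hold JSON.
const schema = {
  AuditLog: { jsonColumns: ['RequestData'] }
};

// Stringify only the columns the schema marks as JSON; leave everything else untouched.
function serializeJsonColumns(table, row) {
  const jsonColumns = (schema[table] && schema[table].jsonColumns) || [];
  const out = Object.assign({}, row);
  for (const col of jsonColumns) {
    if (out[col] !== undefined && typeof out[col] !== 'string') {
      out[col] = JSON.stringify(out[col]);
    }
  }
  return out;
}

// Usage: knex('AuditLog').insert(serializeJsonColumns('AuditLog', request))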

@kibertoad
Collaborator

There is proper support for JSON columns (given that you are using MySQL version 5.7.8+) in the latest knex versions, so I am resolving this.

@Nemoden

Nemoden commented Jan 31, 2019

While this has been marked as resolved, the issue is still there (unless I'm doing something wrong).

knex config:

const conf: ConfigType = {
  development: {
    client: config.DB_CLIENT, // mysql
    version: "8.0",
    connection: {
      user: config.DB_USER,
      host: config.DB_HOST,
      database: config.DB_DATABASE,
      password: config.DB_PASS,
      port: config.DB_PORT,
    },
  },
}

migration:

knex.schema.createTable('events', builder => {
    builder.uuid('id').primary()
    builder.timestamp('created_at').defaultTo(knex.fn.now())
    builder.json('command')
    builder.json('event')
    builder.boolean('is_ack').defaultTo(false)
    builder.dateTime('acknowledged_at')
})

Actual table as it was created in MySQL:

CREATE TABLE `events` (
  `id` char(36) NOT NULL,
  `created_at` timestamp NULL DEFAULT CURRENT_TIMESTAMP,
  `command` json DEFAULT NULL,
  `event` json DEFAULT NULL,
  `is_ack` tinyint(1) DEFAULT '0',
  `acknowledged_at` datetime DEFAULT NULL,
  PRIMARY KEY (`id`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_0900_ai_ci

the insert (the following executes against Postgres without any issues):

trx('events').insert({
    id: '450c5272-2905-47fd-a1e6-780a5492ce56',
    command: {
        name: 'CreateUserCommand',
        constructorArguments: { name: 'User', password: 'pass' }
    },
    event: {
        event_id: 'c687dba2-5239-4dc0-a86c-ea2fda77709b',
        type: 'UserCreatedEvent',
        created_at: '2019-01-31T01:55:56.263Z',
        version: '1',
        payload: { id: 1, name: 'User', password: 'pass' }
    }
})

query, fished out from knex:debug:

2019-01-31T01:55:56.275Z knex:query insert into `events` (`command`, `event`, `id`) values (?, ?, ?) trx1

Actual query being executed against the MySQL server (general_log = 1):

2019-01-31T01:55:56.282007Z 10 Query insert into events (command, event, id) values (name = 'CreateUserCommand', constructorArguments = '[object Object]', event_id = 'c687dba2-5239-4dc0-a86c-ea2fda77709b', type = 'UserCreatedEvent', created_at = '2019-01-31 01:55:56.263', actor = '[object Object]', version = '1', payload = '[object Object]', '450c5272-2905-47fd-a1e6-780a5492ce56')

knex version is 0.6.12
mysql version is 8.0.14
driver being used for mysql connectivity is "mysql": "^2.16.0" (as taken from package.json).
I'll check whether using mysql2 fixes it.

UPDATE. Same issue with "mysql2": "^1.6.4"

@elhigu
Member

elhigu commented Jan 31, 2019

knex version is 0.6.12

That's really, really old knex. It shouldn't support the json column type / dialect version attribute at all, so you are probably actually using 0.16.2 or something like that...

This issue only added schema support for creating json-typed columns. You need to check the mysql driver docs for the format in which it expects JSON input to be inserted. Maybe you need to call JSON.stringify() before the insert.

Knex doesn't know, nor does it care, which type of column you are inserting into. It only passes the data the end user gives it on to the DB driver.

@Nemoden

Nemoden commented Jan 31, 2019

Sorry, that was a typo: the "1" "slipped away" from the second position of the version string to the third.
The knex version I'm using is indeed 0.16.2.

Knex doesn't know, nor does it care, which type of column you are inserting into. It only passes the data the end user gives it on to the DB driver.

Yeah, fair enough. That was the reason I tried mysql2 instead of mysql. I think I'll go with JSON.stringify if the client is MySQL.

Not knex's concern.
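
Something along these lines should do it (a sketch; config.DB_CLIENT is the value from the config above, and the inserted objects are shortened from the earlier example):

// Stringify JSON columns only when the client is MySQL; the pg driver accepted plain objects.
const needsStringify = ['mysql', 'mysql2'].includes(config.DB_CLIENT);
const asJson = value => (needsStringify ? JSON.stringify(value) : value);

trx('events').insert({
    id: '450c5272-2905-47fd-a1e6-780a5492ce56',
    command: asJson({ name: 'CreateUserCommand', constructorArguments: { name: 'User', password: 'pass' } }),
    event: asJson({ event_id: 'c687dba2-5239-4dc0-a86c-ea2fda77709b', type: 'UserCreatedEvent' })
})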

@nagnathincresol

I am using
"mysql": "^2.17.1",

I am having the same issue.

@shikasta-kashti

shikasta-kashti commented Jul 22, 2019 via email

@jakubka

jakubka commented Aug 19, 2019

@kibertoad @elhigu

Knex doesn't know, nor does it care, which type of column you are inserting into. It only passes the data the end user gives it on to the DB driver.

This makes perfect sense. However, could knex offer some kind of hook where we could implement the JSON serialisation manually? The hook could check what data is being inserted and serialise the object to a string if the column name matches something. I understand it would only work when inserting data as an object (with the key being the column name and the value the data).

Would you accept a PR to do that (and offer some guidance in terms of where this code should go and how the API should look)? :)
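
In the meantime the same idea can be approximated outside knex (a hypothetical wrapper, not an actual knex hook; the column-matching rule is just an example):

// Hypothetical wrapper: stringify any value headed for a column that the predicate marks as JSON.
function insertWithJsonHook(knexInstance, table, row, shouldSerialize) {
  const prepared = {};
  for (const [column, value] of Object.entries(row)) {
    prepared[column] =
      shouldSerialize(table, column, value) && value !== null && typeof value === 'object'
        ? JSON.stringify(value)
        : value;
  }
  return knexInstance(table).insert(prepared);
}

// Usage sketch:
// insertWithJsonHook(knex, 'events', row, (table, column) => ['command', 'event'].includes(column));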

@shikasta-kashti

shikasta-kashti commented Aug 19, 2019 via email

@elhigu
Member

elhigu commented Aug 19, 2019

@jakubka that is a task for an ORM to implement. It would mean that we would need to start adding information about tables to knex, which would in fact be models. You might be able to tell the DB driver to do the serialization, but I'm pretty sure drivers don't support that unless knex were able to tell them the column types for the data that goes into the parameter bindings.

So one way to do it could be to add an option to insert that lists the column names that should be serialized, but then again... that is about as much work as just doing the serialization manually.
