
migration:create toggles columns between non-nullable and nullable back and forth for enum and custom type properties with a default value #1559

Closed
ddadaal opened this issue Mar 13, 2021 · 5 comments · Fixed by #1641

ddadaal commented Mar 13, 2021

Describe the bug
Using the mysql driver, for enum properties and custom type properties with default values, repeated migration:create runs keep generating new migration files that remove the properties' NOT NULL, add it back, remove it again, and so on back and forth.

To Reproduce
A repo with a clearer description and steps to reproduce: https://github.com/ddadaal/mikro-orm-migration-changing-nonnullable-to-nullable

Expected behavior
No new migration files should be generated after the first migration:create, since there is no change at all in the entity.

Versions

Dependency     Version
node           v14.15.1
typescript     4.2.3
mikro-orm      4.4.4
mysql-driver   4.4.4

B4nan (Member) commented Mar 13, 2021

Afaik in your custom type, "DECIMAL(19, 4)" should be "DECIMAL(19,4)" (no space) - or maybe it should be numeric, I am not sure right now - but you need to use the actual type name as the database stores it; one of those is just an alias.

Also, this has nothing to do with migrations, this is schema generator/diffing issue, migrations are just a layer on top of it.

Btw mapping decimal to numbers is not a very good idea, you can lose precision. You should map to strings or use some wrapper. In fact, your custom type will most probably return strings as it is written now.
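
As a minimal sketch of both points (a hypothetical entity, not code from this issue or the reproduction repo), the property could use the normalized, space-free type name and map to a string:

import { Entity, PrimaryKey, Property } from "@mikro-orm/core";

// Hypothetical entity, for illustration only. The lowercase, space-free
// 'decimal(19,4)' matches how the database reports the column type, so the
// schema diffing should not see a spurious change, and mapping to string
// avoids losing precision in JS numbers.
@Entity()
export class Account {

  @PrimaryKey()
  id!: number;

  @Property({ columnType: "decimal(19,4)" })
  balance!: string;

}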

ddadaal (Author) commented Mar 13, 2021

Thanks for the advice, but actually in my production code, DECIMAL(19, 4) works fine,

[screenshot: screenshot-decimal]

and I am mapping it into a bignumber.js instance like so:

import { Type } from "@mikro-orm/core";
import { BigNumber as Decimal } from "bignumber.js";

const precision = 4;

const moneyColumnType = `DECIMAL(19, ${precision})`;

Decimal.set({ DECIMAL_PLACES: precision });

export { Decimal };

export class DecimalType extends Type<Decimal | undefined, string | undefined> {
  convertToDatabaseValue(value: Decimal | string | undefined): string | undefined {
    if (!value) { return value; }
    return value.toString();
  }

  convertToJSValue(value: string | undefined): Decimal | undefined {
    if (!value) { return undefined; }
    return new Decimal(value);
  }

  getColumnType(): string {
    return moneyColumnType;
  }
}

And it seems to work fine as well. How the type is mapped is not the point 😄️, but I can update the code if necessary.
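
For example, the round trip through the type looks like this (a quick illustrative check, not from the repo; the import path is assumed):

import { Decimal, DecimalType } from "./DecimalType"; // path assumed

const type = new DecimalType();
// Decimal -> string for the database...
const dbValue = type.convertToDatabaseValue(new Decimal("19.9999")); // "19.9999"
// ...and string from the database back to a Decimal instance.
const jsValue = type.convertToJSValue(dbValue);
console.log(jsValue instanceof Decimal, jsValue?.toFixed(4)); // true "19.9999"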

So should I report the issue to umzug?

B4nan (Member) commented Mar 13, 2021

Thanks for the advice, but actually in my production code, DECIMAL(19, 4) works fine,

Have you tried it? This is not about "it works for creating the column", this is about diffing the column type, and for that the type needs to be normalized to what the information schema stores. And there I believe it has to be without the space - someone already reported this very same issue and without the space it was working as expected.

Umzug has nothing to do with this, and neither do the migrations - as I mentioned, this is about schema diffing inside SchemaGenerator.

ddadaal (Author) commented Mar 13, 2021

I added some tests in the reproduction repo covering the mapping and updating of the bignumber.js instance and the DECIMAL(19, 4) DB column, and I think they prove, to some extent, that the mapping works as expected.

And I am sorry I got confused back then and kept thinking it was about migrations. I fully understand now that it is about the SchemaGenerator.

ddadaal (Author) commented Mar 14, 2021

Removing the space and using defaultRaw fixes the issue with the custom type in my case. Thanks! I am sorry that I misunderstood you again and thought it was about mapping the value to the DB.
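
For reference, the adjusted property looks roughly like this (a property-level fragment inside the entity class; the name and the default are placeholders, not the exact code from my repo):

// Sketch of the workaround: space-free column type plus defaultRaw, so the
// diffing compares raw SQL defaults. Property name and default are placeholders.
@Property({ type: DecimalType, columnType: "decimal(19,4)", defaultRaw: "0.0000" })
amount?: Decimal;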

Here are the code lines that I found related to the enum issue:

Setting defaultRaw to AEnum.A makes the migration generate the correct result, but it fails when the schema has not been created yet, because in the create schema SQL the default value is not quoted, resulting in a SQL syntax error.

if (info.defaultValue && prop.defaultRaw) {
  const defaultValue = info.defaultValue.toString().replace(/\([?\d]+\)/, '').toLowerCase();
  const propDefault = prop.defaultRaw.toString().toLowerCase();
  const same = propDefault === info.defaultValue.toString().toLowerCase();
  const equal = same || propDefault === defaultValue;
  return equal || Object.keys(defaultValues).map(t => t.replace(/\([?\d]+\)/, '').toLowerCase()).includes(defaultValue);
}

If the default is set to AEnum.A, this code compares the defaultValue of the DB column with the defaultRaw of the property. The column info from the DB has a defaultValue of A, but the defaultRaw of the property is 'A', so they are not equal.

{
  name: 'a_enum',
  type: 'enum',
  ...
  defaultValue: 'A',
  ...
} {
  name: 'aEnum',
  ...
  defaultRaw: "'A'"
}
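
To make the tradeoff concrete, here is a hypothetical enum property showing both variants (a fragment inside the entity class, using @Enum from @mikro-orm/core; names are illustrative and may differ slightly from my repo):

enum AEnum { A = "A", B = "B" }

// Variant 1: default. MikroORM derives the quoted defaultRaw "'A'", which never
// equals the DB's reported defaultValue 'A', so every diff sees a change.
@Enum({ items: () => AEnum, default: AEnum.A })
aEnum: AEnum = AEnum.A;

// Variant 2: defaultRaw. The diff becomes stable, but the initial create schema
// SQL emits the default unquoted (default A), which is a syntax error.
// @Enum({ items: () => AEnum, defaultRaw: AEnum.A })
// aEnum: AEnum = AEnum.A;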

But I think the root cause is inside this function, or in knex, which actually alters the column:

private configureColumn<T>(meta: EntityMetadata<T>, prop: EntityProperty<T>, col: ColumnBuilder, columnName: string, pkProp = prop, alter?: IsSame) {
  const nullable = (alter && this.platform.requiresNullableForAlteringColumn()) || prop.nullable!;
  const sameNullable = alter && 'sameNullable' in alter && alter.sameNullable;
  const indexed = 'index' in prop ? prop.index : (![ReferenceType.SCALAR, ReferenceType.EMBEDDED].includes(prop.reference) && this.helper.indexForeignKeys());
  const index = indexed && !alter?.sameIndex;
  const indexName = this.getIndexName(meta, prop, 'index', [columnName]);
  const uniqueName = this.getIndexName(meta, prop, 'unique', [columnName]);
  const sameDefault = alter && 'sameDefault' in alter ? alter.sameDefault : !prop.defaultRaw;
  const composite = prop.fieldNames.length > 1;
  Utils.runIfNotEmpty(() => col.nullable(), !sameNullable && nullable);
  Utils.runIfNotEmpty(() => col.notNullable(), !sameNullable && !nullable);
  Utils.runIfNotEmpty(() => col.primary(), prop.primary && !meta.compositePK);
  Utils.runIfNotEmpty(() => col.unsigned(), pkProp.unsigned);
  Utils.runIfNotEmpty(() => col.index(indexName), !composite && index);
  Utils.runIfNotEmpty(() => col.unique(uniqueName), !composite && prop.unique && (!prop.primary || meta.compositePK));
  Utils.runIfNotEmpty(() => col.defaultTo(prop.defaultRaw ? this.knex.raw(prop.defaultRaw) : null), !sameDefault);
  Utils.runIfNotEmpty(() => col.comment(prop.comment!), prop.comment);
  return col;
}

For the enum property in the original code without the workaround, the diffing result is that only the default value differs, so only

Utils.runIfNotEmpty(() => col.defaultTo(prop.defaultRaw ? this.knex.raw(prop.defaultRaw) : null), !sameDefault);

is executed, which, however, somehow removes NOT NULL (or does not add it back) in the final result.
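
A minimal knex-only sketch (assuming a plain knex instance with the mysql2 client, outside MikroORM; connection values are placeholders) of why touching only the default can drop NOT NULL: knex's .alter() does not diff, it redefines the whole column, so any constraint that is not restated, such as .notNullable(), is lost.

import Knex from "knex";

// Assumption: plain knex, not MikroORM internals; table and column names are
// taken from the dump above purely for illustration.
const knex = Knex({
  client: "mysql2",
  connection: { host: "localhost", user: "root", database: "test" }, // placeholders
});

async function alterDefaultOnly(): Promise<void> {
  await knex.schema.alterTable("a", table => {
    // .alter() rebuilds the column definition from scratch; because
    // .notNullable() is not restated here, the resulting statement omits
    // NOT NULL and the column ends up nullable, even though the only
    // intended change was the default value.
    table.enum("a_enum", ["A", "B"]).defaultTo("A").alter();
  });
}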

B4nan added a commit that referenced this issue Apr 16, 2021
This commit introduces new schema diffing capabilities. Instead of shared code for
creating and altering tables, we now build schema objects from entities and the existing
schema, and compare those (this will later allow creating down migrations too).

- Better mapping - adds mapped types to abstract most of the column types (e.g. { type: t.decimal })
- FK diffing
- Proper index diffing (before we compared just names)
- Custom index expressions
- Comment diffing
- Column length diffing (e.g. numeric(10,2) or varchar(100))
- Changing PK types
- Schema/namespace diffing (postgres only)

Closes #1486
Closes #1518
Closes #579
Closes #1559
Closes #1602
Closes #1480
Closes #1687