
[Proposal] Adding the ability to automatically generate migration queries #980

Closed
huytd opened this issue Jun 30, 2017 · 5 comments

huytd commented Jun 30, 2017

Hi everybody, this is my proposal for a new feature of diesel_cli migrations.

What is the idea?

The basic idea is to provide a way to quickly model a database schema when starting a new project with Diesel.

Why do we need it?

As of now, we create a new project with Diesel and start modelling our database with the command:

diesel migration generate create_something_here

After this step, two new .sql files are created in the migrations/XXXX_create_something_here folder, and we have to write the actual SQL queries to create our tables.

But why do we need to actually write the SQL queries in the first place? Why don't we just have the ability to automatically generate them from something?

What we want to do here is design the database schema, not write lengthy queries.

What do you want to do?

So, instead of starting by generating a migration, we start by defining our database schema in a text file, say in YAML or JSON format:

YAML version: db.yaml

roles:
  - id: key
  - name
  - code

users:
  - id: key
  - username
  - email
  - password
  - enabled: int
  - role: roles

JSON version: db.json

{
  "roles": [
    { "id": "key" },
    "name",
    "code"
  ],
  "users": [
    { "id": "key" },
    "username",
    "email",
    "password",
    { "enabled": "int" },
    { "role": "roles" }
  ]
}

The example above illustrates a simple database with two tables, roles and users, and the relationship between them.

The default field type should be VARCHAR, and you can specify a different type with this syntax:

- field: TYPE

The db.yaml or db.json file should be stored in the project's root.
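To make the field syntax concrete, here is a minimal Rust sketch of how a field entry could be parsed. The function name `parse_field` and the string-based representation are hypothetical illustrations, not part of diesel_cli; an entry is either a bare name or `name: TYPE`, and a missing type means the VARCHAR default applies.

```rust
/// Hypothetical sketch (not diesel_cli API): parse one field entry of the
/// proposed schema format, either "name" or "name: TYPE", into a
/// (name, optional type) pair. `None` means the VARCHAR default applies.
fn parse_field(entry: &str) -> (String, Option<String>) {
    match entry.split_once(':') {
        // "enabled: int" -> ("enabled", Some("int"))
        Some((name, ty)) => (name.trim().to_string(), Some(ty.trim().to_string())),
        // "username" -> ("username", None), i.e. default VARCHAR
        None => (entry.trim().to_string(), None),
    }
}
```

A real implementation would of course use a YAML/JSON parser rather than splitting strings; this only shows the intended mapping from schema entries to typed columns.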

Once you are finished with your database schema, it's time to generate the migration with the new from <table_name> argument:

$ diesel migration generate create_roles from roles
$ diesel migration generate create_users from users

Then the magic happens: diesel_cli generates the new migration folder, derives the SQL queries from the schema, and populates the up.sql and down.sql files.

For example:

XXX_create_roles/up.sql

CREATE TABLE roles (
  id SERIAL PRIMARY KEY,
  name VARCHAR,
  code VARCHAR
);
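The generation step itself could look roughly like the following Rust sketch. The function `create_table` and the `(name, optional type)` field representation are assumptions for illustration, not existing diesel_cli code; it renders a CREATE TABLE statement with the Postgres flavor of the "key" type and the VARCHAR default.

```rust
/// Hypothetical sketch of the proposed `query_generator` module: render a
/// CREATE TABLE statement from (name, optional type) pairs. Untyped fields
/// default to VARCHAR; "key" becomes a SERIAL primary key (Postgres flavor).
fn create_table(table: &str, fields: &[(&str, Option<&str>)]) -> String {
    let cols: Vec<String> = fields
        .iter()
        .map(|(name, ty)| {
            let sql_ty = match *ty {
                None => "VARCHAR".to_string(),
                Some("key") => "SERIAL PRIMARY KEY".to_string(),
                Some("int") => "INTEGER".to_string(),
                // Pass any other explicit type through as-is, uppercased.
                Some(other) => other.to_uppercase(),
            };
            format!("  {} {}", name, sql_ty)
        })
        .collect();
    format!("CREATE TABLE {} (\n{}\n)", table, cols.join(",\n"))
}
```

Foreign-key fields like `role: roles` would need additional handling (a REFERENCES clause) that this sketch omits.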

How do you expect Diesel to support you with that?

I expect a new argument in diesel_cli:

from <table_name>

Which will read from db.yaml or db.json and generate the corresponding SQL query.

How do you think this can be added to Diesel?

Since the diesel_cli source code is very simple and straightforward, this feature could be added as a new module, for example query_generator.
Then we can call it from here:

  let migration_name = args.value_of("MIGRATION_NAME").unwrap();
+ let table_name = args.value_of("FROM_TABLE").unwrap();
...
- up.write_all(b"-- Your SQL goes here").unwrap();
+ up.write_all(query_generator::create_table(table_name).as_bytes()).unwrap();
...

We also need a parser to read the schema file.

For the query_generator, we need to handle the differences in data types between backends. For example, Postgres uses SERIAL, but MySQL uses INT with AUTO_INCREMENT.

What are the possible alternatives?

I could not find any alternatives to my proposal.

Are there any disadvantages?

  • It increases the complexity of the diesel_cli source code.
  • We would have to maintain different logic for different database backends.
sgrif (Member) commented Jul 4, 2017

This is semi-related to a feature we've discussed in the past, which is having a Rust DSL for migrations. If we do end up adding a secondary migration format, it'll almost certainly be that form and not JSON or YAML. At the moment this feature doesn't seem to add a lot of value. Database backends have tools for dumping their schema if you just want to generate a migration for an existing database. Being able to easily swap out backends is not a main goal of Diesel.


David-OConnor commented Oct 14, 2019

What about pulling it from the model structs/enums? It could use serde-style #[...] attributes when more info or non-default behavior is required.

weiznich (Member) commented

@David-OConnor There is no such thing as a model in Diesel. There are table definitions, which can be generated from the database, and there are query-specific structs.
For table! macros I do not see any advantage of generating migrations from them instead of generating the macro from a given database. Generating the macros from a database allows the user to use existing databases/whatever external migration scheme they want without being forced to write migrations for those databases.
For query specific structs I do not see how this could be done because a struct implementing Queryable could be used with multiple tables and contain fields from multiple tables. Additionally there could be multiple structs implementing Queryable/Insertable/AsChangeset for a single table.


David-OConnor commented Oct 14, 2019

I guess I don't understand why we need three separate, degenerate (in the physics sense) definitions for each database table, when we could use the struct (and diffs in it between migrations) to generate the other two. The normal DRY considerations apply: having to keep them in sync, three times the chance for a typo/error, updating friction, etc.

weiznich (Member) commented

There are only two separate definitions: the table! macro call and your SQL migration. The first can be generated from the second.
