
Delta deployment and migrations for SAP CAP database adapters.


cds-dbm is deprecated and will no longer be actively maintained. Starting with version 7.0.0, @sap/cds has native support for PostgreSQL. Please use the official database adapter @cap-js/postgres and check out the official documentation for more details.

 

cds-dbm (Core Data Services – Database Migrations)

cds-dbm is a package that extends the database tooling capabilities of the SAP Cloud Application Programming Model's Node.js Service SDK (@sap/cds) for relational databases other than the natively supported ones (SAP HANA and SQLite).

The library offers a set of command line tasks related to deploying a cds data model to the database (see the Commands section below).

It also contains an @sap/cds build task to create ready-to-deploy build fragments for SAP Business Technology Platform (BTP) Cloud Foundry (see the Custom Build Task section below).

Current status

cds-dbm is ready to be used!
The library contains all the necessary commands and functionality to build a cds model for and deploy it to the supported databases.

Database support

Internally, cds-dbm relies on the popular Java framework liquibase to handle most of the database activities. Liquibase by default supports a variety of relational databases, but currently cds-dbm offers support for the following ones:

  • PostgreSQL (in combination with the cds-pg database adapter)

Support for other databases is planned whenever a corresponding CDS adapter library is available.

There are some example apps already using cds-dbm in combination with cds-pg, and several blog posts in the SAP Community showcase the functionality of cds-dbm.

Why does cds-dbm (currently) not support SAP HANA?

As SAP HANA is a first-class citizen in CAP, SAP offers its own deployment solution (@sap/hdi-deploy). With CAP it is possible to directly compile a data model into SAP HANA fragments (.hdbtable, etc.), which can then be deployed by the hdi-deploy module, taking care of all the important stuff (delta handling, hdi-management on XSA or SAP BTP, etc.).
Nevertheless, it may be worthwhile to use the liquibase-hanadb adapter as an alternative deployment solution. If so, support might be added in the future.


How to use

Prerequisites

Since the project uses liquibase internally, a Java Runtime Environment (JRE), version 8 or higher, is required on your system.
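
You can verify that a suitable Java runtime is available on your system, for example by running:

java -version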

Usage in your CAP project

Simply add this package to your CAP project by running:

npm install cds-dbm

cds-dbm requires some additional configuration added to your package.json:

  "cds": {
    //...
    "migrations": {
      "db": {
        "schema": {
          "default": "public",
          "clone": "_cdsdbm_clone",
          "reference": "_cdsdbm_ref"
        },
        "deploy": {
          "tmpFile": "tmp/_autodeploy.json",
          "undeployFile": "db/undeploy.json"
        }
      }
    }
  }

Internally cds-dbm uses 3 different schemas during deployment:

  • default – The schema containing your currently deployed and thus running model
  • clone – A schema that will be a clone of the default schema during deployment
  • reference – The schema that will contain the CDS model
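
To inspect which of these schemas currently exist in your PostgreSQL database, you can, for example, list all schemas with psql (the database name is a placeholder; the clone and reference schemas may only be present during or right after a deployment run):

psql -d <your-database> -c "\dn"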

Automated delta deployments

TODO: Add full description

In the meantime, some notes on the delta processing:

  1. Clone current db schema cds.migrations.db.schema.default into cds.migrations.db.schema.clone.
  2. Drop all CDS-based views from the clone schema, because liquibase handles view updates via CREATE OR REPLACE VIEW, which does not work for this scenario (https://liquibase.jira.com/browse/CORE-2377).
  3. Deploy the full CDS model to the reference schema cds.migrations.db.schema.reference.
  4. Let liquibase create a diff between the clone and reference schema (including the recreation of the dropped views).
  5. Adjust the changelog (handle undeployments, fix the order of changes).
  6. Finally, deploy the changelog to the current default schema.
  7. Load data from .csv files (if requested).
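
To review the SQL that such a deployment would apply, without touching the default schema, the deploy command can be run in dry mode (see the deploy flags below):

npx cds-dbm deploy --dry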

Dropping tables/views

cds-dbm follows a safe approach and does not drop any database tables during deployment. Thus, old tables will still be around even if they are not part of your data model anymore.

You can either remove them manually or rely on cds-dbm to handle this for you.

Undeployment file

An undeployment file makes it possible to specifically list views and tables that should be dropped from the database schema during the deployment.

The undeployment file's path needs to be specified in the package.json configuration (cds.migrations.db.deploy.undeployFile):

# An example undeploy.json:

{
    "views": [],
    "tables": [
        "csw_beers",
        "csw_anotherTable"
    ]
}

auto-undeployment option

While an undeploy.json file gives you fine-grained control, it is also possible to automatically remove tables/views from the database schema. When using the auto-undeploy flag during deployment, cds-dbm will take the cds model as the reference and remove all other existing tables/views.
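
Since this can drop tables irrevocably, it is a good idea to combine the flag with a dry run first and review the generated SQL before applying it:

npx cds-dbm deploy --auto-undeploy --dry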

Commands

The following commands exist for working with cds-dbm in the automated delta deployment mode:

Currently all tasks must be called with npx:

npx cds-dbm <task>

deploy

Performs a delta deployment of the current cds data model to the database. By default, no CSV files are loaded. If data should be loaded, the load strategy has to be defined via the load-via flag.

Usage

cds-dbm deploy

Flags

  • create-db (boolean) - If set, the deploy task tries to create the database before actually deploying the data model. The deployment will not break if the database already exists.
  • auto-undeploy (boolean) - WARNING: Drops all tables not known to your data model from the database. This should only be used if your cds includes all tables/views in your db (schema). Otherwise it is highly recommended to use an undeployment file.
  • load-via (string) - Can be either full (truncate and insert) or delta (check for existing records, then update or insert)
  • dry (boolean) - Does not apply the SQL to the database but logs it to stdout

Examples

cds-dbm deploy
cds-dbm deploy --create-db
cds-dbm deploy --load-via delta
cds-dbm deploy --auto-undeploy
cds-dbm deploy --auto-undeploy --dry

load

Loads data from CSV files into the database. The following conventions apply (according to the default @sap/cds conventions at https://cap.cloud.sap/docs/guides/databases):

  • The files must be located in the folders db/csv, db/data, or db/src/csv.
  • They contain data for one entity each. File names must follow the pattern <namespace>-<entity>.csv, for example my.bookshop-Books.csv (an illustrative sample is shown after this list).
  • They must start with a header line that lists the needed element names.
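
An illustrative db/csv/my.bookshop-Books.csv for the entity mentioned above (the columns are made up for this sketch) could look like this:

ID,title,author
1,Wuthering Heights,Emily Brontë
2,Jane Eyre,Charlotte Brontë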

Unlike the default mechanism in @sap/cds (which supports only full loads), cds-dbm offers two loading strategies:

  • full – Truncates a table and then inserts all the data from the CSV file
  • delta – Checks if a record with the same key already exists and then performs an update (or an insert for new records). All other records in the table will be left untouched.

Usage

cds-dbm load --via full
cds-dbm load --via delta

Flags

  • via (string) - Can be either full (truncate and insert) or delta (check for existing records, then update or insert)

drop

Drops all tables and views in your data model from the database. If the all parameter is given, everything in the whole schema will be dropped, not only the cds-specific entities.

Usage

cds-dbm drop

Flags

  • all (boolean) - If set, the whole content of the database/schema is dropped.

Examples

cds-dbm drop
cds-dbm drop --all

diff

Generates a descriptive text containing all the differences between the defined cds model and the current status of the database. By default, the information will be logged to the console.

Usage

cds-dbm diff

Flags

  • to-file (string) - If set, the diff information will be written into the file and not logged in the console.

Examples

cds-dbm diff
cds-dbm diff --to-file db/diff.txt

Custom Build Task

Executes the defined cds build task, either from a .cdsrc or the package.json. cds-dbm comes with a pre-baked build task to deploy the data model to a PostgreSQL database on SAP BTP Cloud Foundry.

Example configuration:

"build": {
      "tasks": [
        {
          "use": "node-cf",
          "src": "srv"
        },
        {
          "use": "cds-dbm/dist/build/postgres-cf",
          "for": "postgres-cf",
          "src": "db",
          "options": {
            "deployCmd": "npx cds-dbm deploy --load-via delta --auto-undeploy"
          }
        }
      ]
    },

Usage via cds build

cds build

This will generate a specific set of files in the gen/db folder (or any other configured folder), which can then be deployed to the SAP BTP Cloud Foundry environment.

An example configuration for an mta.yml leveraging the build fragments:

  - name: devtoberfest-db-deployer
    type: custom
    path: gen/db
    parameters:
      buildpacks: [https://github.com/cloudfoundry/apt-buildpack#v0.2.2, nodejs_buildpack]
      no-route: true
      no-start: true
      disk-quota: 2GB
      memory: 512MB
      tasks:
      - name: deploy_to_postgresql
        command: ./deploy.sh
        disk-quota: 2GB
        memory: 512MB
    build-parameters:
      ignore: ["node_modules/"]
    requires:
      - name: devtoberfest-database

resources:
  - name: devtoberfest-database
    parameters:
      path: ./pg-options.json
      service: postgresql-db
      service-plan: trial
      skip-service-updates:
        parameters: true
      service-tags:
        - plain
    type: org.cloudfoundry.managed-service

Sponsors

Thank you to p36 (https://p36.io/) for sponsoring this project.
